UBC Theses and Dissertations

Numerical analysis for derivative-free optimization
Jarry-Bolduc, Gabriel

Abstract

In many optimization problems, the objective function is available only as the output of a blackbox. In this context, derivative-free optimization (DFO) methods can be used to solve the optimization problem. Derivative-free optimization methods may be classified into two main categories: direct search methods and model-based methods. This thesis presents novel theory and algorithms in both categories. Model-based methods often approximate the gradient or the Hessian of the objective function. First, the properties of the gradient approximation technique called the generalized simplex gradient are scrutinized. Second, two Hessian approximation techniques called the generalized simplex Hessian (GSH) and the generalized centered simplex Hessian (GCSH) are defined and analyzed. In particular, we investigate how to approximate partial Hessians and how to minimize the number of function evaluations required when employing the GSH/GCSH. A useful notion in direct search methods is that of a positive spanning set. In this thesis, we present the first deterministic algorithm to compute the cosine measure of any finite positive spanning set. We then investigate the structure of positive bases with maximal cosine measure, focusing on positive bases of intermediate sizes. Last, the main theoretical concepts discussed in this thesis are put to use in DFO algorithms: the GSH is employed in a derivative-free trust-region algorithm, and positive bases with high cosine measure are employed in a direct search algorithm. We examine how the theoretical advancements made in this thesis translate into gains in efficiency in DFO algorithms.
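As background for the terminology above, the following display gives one common formulation of the two central objects from the DFO literature (the thesis' exact conventions may differ). Here S = [s^1 ... s^m] is a matrix of sampling directions in R^n, the dagger denotes the Moore-Penrose pseudoinverse, and D is a finite set of nonzero directions:

\[
  \nabla_s f(x^0; S) = \bigl(S^\top\bigr)^{\dagger}\, \delta_f(x^0; S),
  \qquad
  \delta_f(x^0; S) =
  \begin{bmatrix}
    f(x^0 + s^1) - f(x^0) \\
    \vdots \\
    f(x^0 + s^m) - f(x^0)
  \end{bmatrix},
\]
\[
  \operatorname{cm}(D) = \min_{v \in \mathbb{R}^n,\; v \neq 0} \;\; \max_{d \in D} \frac{v^\top d}{\lVert v \rVert \, \lVert d \rVert}.
\]

The cosine measure is strictly positive exactly when D positively spans R^n, which is why it serves as a quality measure for the direction sets used in direct search methods.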

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International