UBC Theses and Dissertations


Variable metric methods without exact one-dimensional searches. Fitzpatrick, Gordon James

Abstract

The selection of updating formulas for the H matrix and the subproblem of one-dimensional search are considered for variable metric algorithms that do not use exact linear searches in the subproblem. It is argued that the convex class of updating formulas given by Fletcher is the logical choice for such algorithms. It is shown that direct linear searches are more efficient than linear searches using directional derivatives at each point, particularly for objective functions of many variables. Features of algorithms using the convex class of formulas without exact searches are discussed. It is proved that the effects on these algorithms of scaling the variables of the objective function are identical to the effects of transforming the initial H matrix.

It is predicted that regions of a function where the Hessian is not positive definite may be detrimental to the convergence of these algorithms, and a function is given for which Fletcher's recent algorithm (which uses no linear search for a minimum) fails to converge. Attempts were made to devise a test for non-convexity of a function, so that an algorithm could be developed that uses no linear search for a minimum in convex regions and an alternative strategy in non-convex regions, thereby overcoming this problem. Algorithms incorporating a test for non-convexity were coded, as were Fletcher's algorithm, an algorithm using a weak quadratic direct minimizing search, and an algorithm using the weak cubic minimizing search of the Fletcher-Powell method.

Numerical experiments were performed on functions from the literature and on functions developed by the author. Wherever the initial H matrix had small eigenvalues, Fletcher's algorithm was inefficient on all functions in comparison with the weak quadratic search algorithm; where Fletcher's algorithm was relatively efficient, the weak quadratic search was in all cases competitive. The value of a direct over a derivative linear search is demonstrated. The algorithms using a test for convexity were not effective, since even the best test was not generally effective in detecting non-convexity. It is concluded that algorithms without some form of weak one-dimensional minimizing search are not suitable for use on general minimization problems, and that the weak quadratic direct search proposed here is a more efficient and reliable alternative.
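The two ingredients named in the abstract can be sketched concretely. The following Python fragment is a hypothetical illustration, not the thesis's own code: it shows (a) the Broyden convex class of inverse-Hessian updates, in which a parameter phi in [0, 1] interpolates between the DFP (phi = 0) and BFGS (phi = 1) formulas, and (b) a weak quadratic direct line search that uses only function values, never directional derivatives. The function names, the trial-step pattern, and the acceptance rule are the author's illustrative assumptions.

```python
import numpy as np

def convex_class_update(H, s, y, phi=1.0):
    """Broyden convex-class update of the inverse-Hessian approximation H.

    phi = 0 gives the DFP formula, phi = 1 the BFGS formula; any phi in
    [0, 1] is a member of the convex class. Here s = x_new - x_old and
    y = grad_new - grad_old; s @ y must be positive.
    """
    Hy = H @ y
    sy = s @ y            # curvature condition: positive for a valid update
    yHy = y @ Hy
    dfp = H - np.outer(Hy, Hy) / yHy + np.outer(s, s) / sy
    v = np.sqrt(yHy) * (s / sy - Hy / yHy)
    return dfp + phi * np.outer(v, v)   # every phi preserves H_new @ y = s

def weak_quadratic_search(f, x, d, a0=1.0):
    """Weak direct search: fit a parabola to f at three trial steps along d
    and take its minimizer, using function values only (no derivatives)."""
    a = np.array([0.0, a0, 2.0 * a0])
    fv = np.array([f(x), f(x + a[1] * d), f(x + a[2] * d)])
    denom = ((a[2] - a[1]) * fv[0] + (a[0] - a[2]) * fv[1]
             + (a[1] - a[0]) * fv[2])
    if abs(denom) < 1e-12:              # degenerate (nearly linear) fit
        return a0
    alpha = 0.5 * ((a[2]**2 - a[1]**2) * fv[0]
                   + (a[0]**2 - a[2]**2) * fv[1]
                   + (a[1]**2 - a[0]**2) * fv[2]) / denom
    # "Weak" acceptance: keep the predicted step only if it decreases f
    return alpha if f(x + alpha * d) < fv[0] else a0
```

One can verify that every member of the convex class satisfies the secant (quasi-Newton) equation H_new @ y = s, since the correction vector v is orthogonal to y; this is the sense in which the whole class shares the key theoretical property the abstract relies on.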



License

For non-commercial purposes only, such as research, private study and education. Additional conditions apply; see Terms of Use: https://open.library.ubc.ca/terms_of_use.
