UBC Theses and Dissertations
Algorithm configuration landscapes : analysis and exploitation
Pushak, Yasha
Abstract
Algorithm designers are regularly faced with the tedious task of finding suitable default values for the parameters that affect the performance of their algorithms. Thoroughly evaluating even a single parameter configuration typically requires running the algorithm on a large number of problem instances, which can make the process very slow. To address this problem, many automated algorithm configuration procedures have been proposed. The vast majority of these are based on powerful meta-heuristics with strong diversification mechanisms, thereby ensuring that they sufficiently explore the parameter configuration space.

However, despite the prominence of automated algorithm configuration, relatively little is known about the algorithm configuration landscapes searched by these procedures, which relate parameter values to algorithm performance. As a result, while these strong diversification mechanisms make existing configurators robust, it is unclear whether they are actually required or whether they merely increase the running time of the configurators.

One particularly notable early work in the field presented evidence suggesting that the algorithm configuration landscapes of two optimization algorithms are, in fact, close to uni-modal. However, existing fitness landscape analysis techniques cannot account for the stochasticity in the performance measurements of algorithms in a statistically principled way, which is a major barrier to their application to algorithm configuration scenarios. We address this gap by developing the first statistically principled method for detecting significant deviations from uni-modality in a stochastic landscape.

We apply this method, along with other (new and existing) landscape analysis techniques, to a variety of algorithm configuration scenarios arising in automated machine learning (AutoML) and in the minimization of the running time of algorithms for solving NP-hard problems.
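The core idea behind such a test can be illustrated with a small sketch (an illustrative simplification under assumed details, not the thesis's actual method): given bootstrap confidence intervals on an algorithm's mean performance at configurations along a one-dimensional parameter slice, check whether any valley-shaped (uni-modal) sequence of true performance values can pass through every interval; if none can, the measurements deviate significantly from uni-modality.

```python
import numpy as np

def bootstrap_intervals(samples, n_boot=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence intervals for the mean performance
    at each configuration. `samples` is a list of 1-D arrays, one array
    of noisy performance measurements per configuration."""
    rng = np.random.default_rng(seed)
    lo, hi = [], []
    for s in samples:
        s = np.asarray(s, dtype=float)
        # Resample with replacement and take the mean of each resample.
        means = rng.choice(s, size=(n_boot, len(s)), replace=True).mean(axis=1)
        lo.append(np.quantile(means, alpha / 2))
        hi.append(np.quantile(means, 1 - alpha / 2))
    return np.array(lo), np.array(hi)

def valley_fits(lo, hi, m):
    """Can a sequence that decreases to index m and increases after it
    pass through every interval [lo[i], hi[i]]? Greedy check: keep the
    sequence as low as possible while moving away from the bottom."""
    x = lo[m]
    for i in range(m - 1, -1, -1):   # leftward: values must not drop
        x = max(x, lo[i])
        if x > hi[i]:
            return False
    x = lo[m]
    for i in range(m + 1, len(lo)):  # rightward: values must not drop
        x = max(x, lo[i])
        if x > hi[i]:
            return False
    return True

def consistent_with_unimodality(lo, hi):
    """True iff some uni-modal (valley-shaped) landscape is consistent
    with all of the confidence intervals."""
    return any(valley_fits(lo, hi, m) for m in range(len(lo)))
```

For example, intervals lo = [3, 1, 2], hi = [4, 2, 3] admit a valley with its bottom at the middle configuration, while lo = [1, 5, 1], hi = [2, 6, 2] (a significant interior peak) admit no valley at all, so the second slice would be flagged as deviating from uni-modality.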
We show that algorithm configuration landscapes are most often highly structured and relatively simple. Using the intuition from this analysis, we develop two prototype algorithm configuration procedures designed for AutoML. We show that the methods make assumptions that are too strong, leading to mixed results. However, we build on this intuition and develop another procedure for the configuration of NP-hard algorithms. Compared to state-of-the-art baselines, we show that our new method often finds similar or better configurations in the same or less time.
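As a rough illustration of why such structure matters for configurator design (a hedged sketch, not one of the procedures developed in the thesis): if a single numeric parameter's response is uni-modal and measurement noise is small, a simple bracketing method such as golden-section search can locate a good value with far fewer evaluations than an exhaustive sweep.

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Minimize a uni-modal function f on [a, b] by golden-section
    search; each iteration shrinks the bracket by the golden ratio
    and reuses one of the two interior evaluations."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:               # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                     # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2
```

Each iteration here costs a single new performance evaluation, which is why exploiting uni-modal structure, when it is present, can drastically reduce configuration time compared with diversification-heavy search.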
Item Metadata
Title | Algorithm configuration landscapes : analysis and exploitation
Creator | Pushak, Yasha
Publisher | University of British Columbia
Date Issued | 2022
Language | eng
Date Available | 2022-05-11
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0413565
Degree Grantor | University of British Columbia
Graduation Date | 2022-11
Scholarly Level | Graduate
Aggregated Source Repository | DSpace