This example illustrates how to create a custom optimizer using Hyperopt. Follow the patterns outlined below to use other sequential tuning algorithms with your project.
| File | Description |
|------|-------------|
| guild.yml | Project Guild file |
| train.py | Sample training script |
| tpe.py | Optimizer support using Tree of Parzen Estimators with Hyperopt |
| requirements.txt | List of required libraries |
An optimizer is a Guild operation that specializes in running multiple trials based on a batch prototype.
In this example, we create a custom optimizer named tpe. The optimizer is used to find optimal hyperparameter values using the Tree of Parzen Estimators algorithm from the Hyperopt library.
Here’s the Guild file that defines both the training operation and the optimizer:
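Below is a minimal sketch of what such a Guild file might look like. The flag, output-scalar pattern, and optimizer wiring are assumptions rather than the exact project file; check the Guild file reference for the syntax your version supports:

```yaml
train:
  description: Sample training script
  main: train                # implemented in train.py
  flags:
    x: -1.0                  # assumed flag read by train.py
  output-scalars:
    loss: 'loss: (\value)'   # assumed pattern for the loss scalar
  optimizers:
    tpe:
      default: yes           # assumed syntax for the default optimizer

tpe:
  description: Optimizer support using Tree of Parzen Estimators with Hyperopt
  main: tpe                  # implemented in tpe.py
```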
Note: Each operation uses the Python module corresponding to its name. If a module uses a different name, specify it using the main attribute.
Start an optimization batch by running:
guild run train x=[-2:2] --optimize
The train operation is configured to use the tpe optimizer by default (see above). The tpe optimizer is configured to run 10 trials by default. Change this value using the --max-trials option.
You can view operation help for tpe to show supported flags. These are set using -Fo options with guild run:
guild run tpe --help-op
The tpe operation is implemented in tpe.py. The implementation consists of the following steps:
- Read settings for the batch run and the batch prototype.
- Convert prototype flag values into a Hyperopt search space.
- Define a function to minimize that runs a batch trial using hyperparameter values provided by Hyperopt.
- Use Hyperopt fmin to start a sequential optimization process using the search space and the function to minimize.
Read Batch Settings
Use batch_util.batch_run() to get the current batch run. This is used to read the batch settings.
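Here's a minimal sketch of this step. batch_util.batch_run() is the entry point described above; reading max trials via a run attribute, as below, is an assumption that may vary by Guild version:

```python
from guild import batch_util

# Get the current batch run via Guild's batch utilities.
batch = batch_util.batch_run()

# Read batch settings. Reading max trials as a run attribute is an
# assumption; fall back to 10 trials, matching the operation default.
max_trials = batch.get("max_trials") or 10
```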
Convert Prototype Flags to Search Space
The batch proto run defines the flag values used for trials. These may contain search specs like uniform[-2:2]. This syntax represents a Guild search space function. Decode search space functions to create a Hyperopt search space.
Read the proto flags and use them to create a search space:
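For illustration, here's a sketch that builds the space from the prototype flags. Accessing the prototype via batch.batch_proto is an assumption, and _flag_to_hp is a hypothetical helper defined in the next snippet:

```python
# Read the prototype run's flags and convert each value into a
# Hyperopt search space expression (see _flag_to_hp below).
proto_flags = batch.batch_proto.get("flags") or {}
space = {name: _flag_to_hp(name, val) for name, val in proto_flags.items()}
```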
Decode each flag function to create a corresponding Hyperopt search space expression:
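The following hedged sketch assumes guild.flag_util.decode_flag_function, which parses specs like uniform[-2:2] into a function name and arguments; only two distributions are handled here:

```python
import math

from guild import flag_util
from hyperopt import hp

def _flag_to_hp(name, val):
    """Convert a Guild flag value to a Hyperopt space expression."""
    try:
        func_name, args = flag_util.decode_flag_function(val)
    except (ValueError, TypeError):
        return val  # a literal value - pass it through unchanged
    if func_name in (None, "uniform") and len(args) == 2:
        return hp.uniform(name, *args)
    if func_name == "loguniform" and len(args) == 2:
        # hp.loguniform expects its bounds in log space
        return hp.loguniform(name, math.log(args[0]), math.log(args[1]))
    raise ValueError("unsupported search spec %r for flag %s" % (val, name))
```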
Function to Minimize
Hyperopt calls a function with a dictionary of suggested flag values. Use batch_util.run_trial to generate a trial for the batch using the suggested values:
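Here's a sketch of that function. It assumes batch_util.run_trial returns the trial run; _trial_scalar is a hypothetical helper, since how scalars are read back from a run depends on the Guild version:

```python
objective = "loss"  # assumed default; `guild run` can specify another
trial_runs = []     # kept so summary actions can find the best trial

def _fn(suggested_flags):
    # Run one batch trial with the flag values Hyperopt suggests.
    trial_run = batch_util.run_trial(batch, suggested_flags)
    trial_runs.append(trial_run)
    # _trial_scalar is a hypothetical helper that reads the objective
    # scalar (e.g. loss) from the completed trial run.
    return _trial_scalar(trial_run, objective)
```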
The result is the objective specified in guild run (loss by default).
Minimize the Objective
Use Hyperopt fmin to minimize the objective:
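Here's a sketch of the fmin call, wiring together the pieces above. _tpe_algo is the local function described in the next section:

```python
from hyperopt import Trials, fmin

trials = Trials()  # records each trial's result for summary actions
best = fmin(
    fn=_fn,            # function to minimize - runs one batch trial
    space=space,       # search space built from the proto flags
    algo=_tpe_algo(),  # configured TPE suggest function (see below)
    max_evals=max_trials,
    trials=trials,
)
```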
Configure TPE Algorithm
The value for algo uses a local function that supports the TPE hyperparameters. The hyperparameters in this case are defined as module global variables. These are imported by Guild and can be customized using -Fo options with guild run.
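Here's a sketch of that pattern. The globals mirror tpe.suggest's keyword arguments and their Hyperopt defaults; because they're module globals, Guild imports them as flags that -Fo can override:

```python
import functools

from hyperopt import tpe

# TPE hyperparameters as module globals. Guild imports these as
# operation flags, so they can be set with -Fo on `guild run`.
gamma = 0.25
n_startup_jobs = 20
n_EI_candidates = 24

def _tpe_algo():
    # Bind the module-level hyperparameters into tpe.suggest.
    return functools.partial(
        tpe.suggest,
        gamma=gamma,
        n_startup_jobs=n_startup_jobs,
        n_EI_candidates=n_EI_candidates,
    )
```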
Perform Summary Actions
This implementation labels the "best" run. You can run any number of summary actions in your custom optimizer.
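As one example, here's a sketch that labels the best trial. It relies on the trial_runs list from the function to minimize, and using write_attr to set a run label is an assumption:

```python
def _label_best():
    # Pair each trial run with the loss it reported to Hyperopt and
    # label the run with the lowest loss as "best".
    losses = trials.losses()
    best_run = trial_runs[losses.index(min(losses))]
    best_run.write_attr("label", "best")  # assumption: runs expose write_attr

_label_best()
```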