Error while running gp optimizer for Hyperparameter optimization: Cannot find objective 'loss'


I am trying to use Guild to do hyperparameter optimization using the gp optimizer. The first three runs started using random initializations as expected. However, even after the 3rd run, it continues to perform random initializations. On examining the output, I noticed that the following information was posted:

INFO: [guild] Random start for optimization (cannot find objective 'loss')

Is this expected? Or am I missing something?

I looked at the scalars using guild runs info and found that all the scalars that I am logging using Tensorboard are displayed correctly.

I would appreciate any help in this matter.


By default the optimizers look for a scalar named 'loss' to minimize. If that's what you want, make sure that 'loss' shows up for the generated runs by running guild runs info. If you want to minimize (or maximize) a different scalar, specify its name with --minimize (or --maximize) for the run command.
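For example, a run command might look like the following — a sketch in which the operation name (train) and the scalar name (val_loss) are placeholders you would replace with your own:

```shell
# Use the gp optimizer and minimize the 'val_loss' scalar instead of 'loss'
# ('train' and 'val_loss' are placeholders - use your own operation and scalar names)
guild run train --optimizer gp --minimize val_loss --max-trials 20
```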


Thanks. I used --minimize and specified the scalar that I was logging with TensorBoard and wanted to minimize as part of the hyperparameter optimization. The error disappeared as a result.

Great! You can specify this scalar in the Guild file so you don’t need to remember to set it each time with the run command.

  objective: <name of scalar you want to minimize>

If you want to maximize, use a negative sign ('-') in front of the scalar name.
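Putting both cases together, a Guild file entry might look like this sketch, where the operation names (train, eval) and scalar names (val_loss, val_accuracy) are placeholders:

```yaml
# Hypothetical guild.yml - substitute your own operation and scalar names
train:
  objective: val_loss        # minimize the 'val_loss' scalar
eval:
  objective: -val_accuracy   # leading '-' maximizes 'val_accuracy' instead
```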
