I am trying to use Guild for hyperparameter optimization with the gp optimizer. The first three runs used random initializations, as expected. However, even after the third run it continues to perform random initializations. Examining the output, I noticed the following message:
INFO: [guild] Random start for optimization (cannot find objective 'loss')
Is this expected? Or am I missing something?
I inspected the scalars using guild runs info, and all the scalars I am logging via TensorBoard are displayed correctly.
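For reference, here is a simplified sketch of how my training script reports the loss (the real script logs scalars through TensorBoard; the function name, flags, and values here are made up for illustration). I also tried printing a "key: value" line to stdout, since Guild can capture scalars from output in that form:

```python
def train(lr=0.01, epochs=3):
    """Toy training loop standing in for the real script."""
    loss = 1.0
    for epoch in range(epochs):
        loss *= 0.5  # placeholder for an actual training step
        # Guild can capture scalars printed to stdout as "key: value" lines
        print(f"loss: {loss:.4f}")
    return loss

train()
```

My expectation was that a scalar named loss logged this way would be picked up as the optimizer's objective, but the gp optimizer still reports that it cannot find it.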
I would appreciate any help in this matter.