Guild and PyTorch Lightning integration

Both Guild AI and PyTorch Lightning aim to be lightweight wrappers.

I will try them out together and report here how well they fit. Has anyone else already done so? If so, how did it go?

There look to be some overlapping parts that might conflict, but otherwise they seem to complement each other well.

Lightning is a great candidate for Guild’s auto-support features. It’s very much in Guild’s spirit.

One thing PL does is aggregate args from several sources. I had no idea that argparse could do this. Here is the relevant snippet from Lightning's Argparser Best Practices:

from argparse import ArgumentParser

from pytorch_lightning import LightningModule


class LitModel(LightningModule):

    @staticmethod
    def add_model_specific_args(parent_parser):
        # Extend the shared parent parser with this model's own flags
        parser = ArgumentParser(parents=[parent_parser], add_help=False)
        parser.add_argument('--encoder_layers', type=int, default=12)
        parser.add_argument('--data_path', type=str, default='/some/path')
        return parser

So the args are defined across several source files. Will Guild capture all of them?
I like that this can be done. You can compose a project from several components, each providing its own args.
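To see the aggregation in isolation, here is a minimal argparse-only sketch (the flag names are made up for illustration) of how a parent parser and a component parser compose, mirroring the `add_model_specific_args` pattern above:

```python
from argparse import ArgumentParser

# Hypothetical project-level parser providing shared flags
parent_parser = ArgumentParser(add_help=False)
parent_parser.add_argument('--learning_rate', type=float, default=0.02)

# Component-specific parser extends the parent via parents=[...]
parser = ArgumentParser(parents=[parent_parser], add_help=False)
parser.add_argument('--encoder_layers', type=int, default=12)

# Parse defaults only; args from both sources are present
args = parser.parse_args([])
print(args.learning_rate, args.encoder_layers)  # -> 0.02 12
```

Each component module can expose such a function, and the entry script chains them to build the final parser.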

Yes, Guild supports parents in argparse. Future releases will also support flag definitions across different types of sources, e.g. from command line args as well as config files, globals, etc.
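As a sketch of how this could look in a Guild file (the operation and module names here are assumptions, not from the thread), an entry that lets Guild import the argparse-defined flags:

```yaml
# guild.yml - hypothetical operation definition
train:
  main: train          # assumed entry module that builds the composed parser
  flags-dest: args     # pass flag values as command line args
  flags-import: all    # let Guild detect flags from the argparse definitions
```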
