riteshghorse commented on code in PR #25561:
URL: https://github.com/apache/beam/pull/25561#discussion_r1122237012
##########
website/www/site/content/en/documentation/ml/large-language-modeling.md:
##########

@@ -47,9 +47,30 @@ torch.save(model.state_dict(), "path/to/save/state_dict.pth")
 You can view the code on [GitHub](https://github.com/apache/beam/tree/master/sdks/python/apache_beam/examples/inference/large_language_modeling/main.py)
-1. Locally on your machine: `python main.py --runner DirectRunner --model_state_dict_path <local or remote path to state_dict>`. You need to have 45 GB of disk space available to run this example.
-2. On Google Cloud using Dataflow: `python main.py --runner DataflowRunner --model_state_dict_path <gs://path/to/saved/state_dict.pth> --project <PROJECT_ID>
---region <REGION> --requirements_file requirements.txt --temp_location <gs://path/to/temp/location> --experiments "use_runner_v2,no_use_multiple_sdk_containers" --machine_type=n2-standard-16`. You can also pass other configuration parameters as described [here](https://cloud.google.com/dataflow/docs/guides/setting-pipeline-options#setting_required_options).
+1. Locally on your machine:
+```
+python main.py --runner DirectRunner \
+  --model_state_dict_path <local or remote path to state_dict>` \
+  --model_name t5-11b
+

Review Comment:
   add ```
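For context, the block added in this hunk opens a ``` fence that is never closed, which appears to be what the terse "add ```" is asking for. A sketch of how the local-run item might read once the closing fence is in place (dropping the stray backtick after `state_dict>` is an assumption here, not something this comment requests; the rest of the new item is not visible in the hunk):

    1. Locally on your machine:
    ```
    python main.py --runner DirectRunner \
      --model_state_dict_path <local or remote path to state_dict> \
      --model_name t5-11b
    ```

The DirectRunner invocation itself is taken verbatim from the diff; only the closing fence (and the leftover backtick) differ.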
