kennknowles opened a new issue, #19171:
URL: https://github.com/apache/beam/issues/19171
I cannot deploy an Apache Beam job to Cloud Dataflow that contains runtime value parameters.
The standard use case is with Cloud Dataflow Templates, which use RuntimeValueProvider to get template parameters.
When trying to call `get` on the parameter, I always get an error like:
```
apache_beam.error.RuntimeValueProviderError: RuntimeValueProvider(option: myparam, type: str, default_value: 'default-value').get() not called from a runtime context
```
A minimal example:
```
import argparse

import apache_beam as beam
from apache_beam.options.pipeline_options import (
    GoogleCloudOptions, PipelineOptions, SetupOptions, StandardOptions)


class UserOptions(PipelineOptions):
    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_value_provider_argument('--myparam', type=str,
                                           default='default-value')


def run(argv=None):
    parser = argparse.ArgumentParser()
    known_args, pipeline_args = parser.parse_known_args(argv)
    pipeline_options = PipelineOptions(pipeline_args)
    pipeline_options.view_as(SetupOptions).save_main_session = True
    google_cloud_options = pipeline_options.view_as(GoogleCloudOptions)
    # Insert Google Cloud options here, or pass them as arguments.
    standard_options = pipeline_options.view_as(StandardOptions)
    standard_options.runner = 'DataflowRunner'
    user_options = pipeline_options.view_as(UserOptions)

    p = beam.Pipeline(options=pipeline_options)
    param = user_options.myparam.get()  # This line is the issue.

    result = p.run()
    result.wait_until_finish()


if __name__ == '__main__':
    run()
```
I would expect that the runtime context would be ignored when running the
script locally.
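For comparison, the pattern that does work with templates is to pass the ValueProvider itself into the pipeline and defer the `.get()` call to execution time, e.g. inside a `DoFn`. A minimal sketch, assuming the `UserOptions` class above; the `FormatWithParam` DoFn and its input elements are purely illustrative:
```
class FormatWithParam(beam.DoFn):
    """Illustrative DoFn that reads the ValueProvider at execution time."""
    def __init__(self, myparam):
        # Keep the ValueProvider itself; do not call .get() here.
        self.myparam = myparam

    def process(self, element):
        # .get() succeeds here because it is called from a runtime context.
        yield '%s: %s' % (self.myparam.get(), element)


def run(argv=None):
    pipeline_options = PipelineOptions(argv)
    user_options = pipeline_options.view_as(UserOptions)
    with beam.Pipeline(options=pipeline_options) as p:
        (p
         | beam.Create(['a', 'b'])
         | beam.ParDo(FormatWithParam(user_options.myparam)))
```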
Imported from Jira
[BEAM-5466](https://issues.apache.org/jira/browse/BEAM-5466). Original Jira may
contain additional context.
Reported by: mackenzie-orange.