kumgaurav opened a new issue, #26755:
URL: https://github.com/apache/beam/issues/26755

   ### What happened?
   
   **This is my code for writing to Datastore:**

   ```python
   def write_to_datastore(gcloud_options, consumer_options, pipeline_options):
     """Creates a pipeline that writes entities to Cloud Datastore."""
     input_patterns = [consumer_options.input]
     logging.info("user_options.datastore_project : %s", consumer_options.datastore_project)
     logging.info("user_options.namespace : %s", consumer_options.namespace)
     logging.info("user_options.kind : %s", consumer_options.kind)
     logging.info("gcloud_options.project : %s", gcloud_options.project)
     with beam.Pipeline(options=pipeline_options) as p:
       _ = (
           p
           | 'read' >> ReadCsvFiles(input_patterns)
           | 'create entity' >> beam.ParDo(EntityWrapperPardo(consumer_options))
           | 'write to datastore' >> WriteToDatastore(consumer_options.datastore_project.get()))
   ```
   
   **using direct runner:**

   ```shell
   python -m testconsumer \
     --runner DirectRunner \
     --project itd-aia-dp \
     --input gs://itd-aia-dp/dataflow/pipelines/datastore/input/Cortex-Model-Coefficients-2.csv \
     --kind product \
     --namespace datapipeline \
     --datastore_project itd-aia-dp \
     --output gs://itd-aia-dp/dataflow/pipelines/datastore/output/
   ```
   
   **using dataflow runner:**

   ```shell
   python3 gcs_to_datastore_consumer_main.py \
     --project=itd-aia-dp \
     --runner=DataflowRunner \
     --staging_location=gs://itd-aia-dp/staging \
     --temp_location=gs://itd-aia-dp/temp \
     --template_location=gs://itd-aia-dp/templates \
     --requirements_file=requirements.txt \
     --region=REGION \
     --experiments=shuffle_mode=service \
     --setup_file ./setup.py \
     --input gs://itd-aia-dp/dataflow/pipelines/datastore/input/Cortex-Model-Coefficients.csv \
     --kind product \
     --output gs://itd-aia-dp/dataflow/pipelines/datastore/output/
   ```
   
   **stack trace:**

   ```
   ERROR:apache_beam.runners.runner:Error while visiting write to datastore/Write Batch to Datastore
   Traceback (most recent call last):
     File "/Users/gkumargaur/workspace/python/office/datastoreconsumer/pysparks/gcs_to_datastore_consumer_main.py", line 53, in <module>
       consumertemplate.run()
     File "/Users/gkumargaur/workspace/python/office/datastoreconsumer/pysparks/panw/paloalto/consumertemplate.py", line 176, in run
       write_to_datastore(gcloud_options, consumer_options, pipeline_options)
     File "/Users/gkumargaur/workspace/python/office/datastoreconsumer/pysparks/panw/paloalto/consumertemplate.py", line 131, in write_to_datastore
       _ = (
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/pipeline.py", line 600, in __exit__
       self.result = self.run()
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/pipeline.py", line 550, in run
       return Pipeline.from_runner_api(
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/pipeline.py", line 577, in run
       return self.runner.run_pipeline(self, self._options)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 524, in run_pipeline
       self.visit_transforms(pipeline, options)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/runners/runner.py", line 211, in visit_transforms
       pipeline.visit(RunVisitor(self))
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/pipeline.py", line 626, in visit
       self._root_transform().visit(visitor, self, visited)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/pipeline.py", line 1260, in visit
       part.visit(visitor, pipeline, visited)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/pipeline.py", line 1260, in visit
       part.visit(visitor, pipeline, visited)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/pipeline.py", line 1263, in visit
       visitor.visit_transform(self)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/runners/runner.py", line 206, in visit_transform
       self.runner.run_transform(transform_node, options)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/runners/runner.py", line 233, in run_transform
       return m(transform_node, options)
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 877, in run_ParDo
       step = self._add_step(
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 652, in _add_step
       [
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 653, in <listcomp>
       item.get_dict()
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/transforms/display.py", line 370, in get_dict
       self.is_valid()
     File "/Users/gkumargaur/opt/miniconda3/envs/beam/lib/python3.9/site-packages/apache_beam/transforms/display.py", line 336, in is_valid
       raise ValueError(
   ValueError: Invalid DisplayDataItem. Value RuntimeValueProvider(option: datastore_project, type: str, default_value: None) is of an unsupported type.
   ```
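The failure happens while the Dataflow runner builds display data for the transform: `DisplayDataItem` rejects the `RuntimeValueProvider` wrapping `datastore_project` as an unsupported type. As a possible workaround (a sketch only, not part of the Beam API; the helper `resolve_value_provider` and the fake provider class are hypothetical), the option can be unwrapped to a plain string before pipeline construction:

```python
class FakeValueProvider:
    """Stand-in for Beam's RuntimeValueProvider, for illustration only."""

    def __init__(self, value):
        self._value = value

    def get(self):
        return self._value


def resolve_value_provider(value):
    """Unwrap a ValueProvider-like object to its plain value.

    Hypothetical helper: anything exposing a callable .get() is unwrapped;
    already-resolved plain values pass through unchanged.
    """
    getter = getattr(value, "get", None)
    return getter() if callable(getter) else value


# Unwrapped before pipeline construction, the project is a plain str,
# so DisplayDataItem validation no longer sees a ValueProvider.
project = resolve_value_provider(FakeValueProvider("itd-aia-dp"))
```

Note this only helps when the value is already bound at construction time (e.g. supplied on the command line); with a real `RuntimeValueProvider` deferred to template runtime, `get()` itself would fail, so a check via `is_accessible()` would be needed first.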
   
   ### Issue Priority
   
   Priority: 2 (default / most bugs should be filed as P2)
   
   ### Issue Components
   
   - [X] Component: Python SDK
   - [ ] Component: Java SDK
   - [ ] Component: Go SDK
   - [ ] Component: Typescript SDK
   - [ ] Component: IO connector
   - [ ] Component: Beam examples
   - [ ] Component: Beam playground
   - [ ] Component: Beam katas
   - [ ] Component: Website
   - [ ] Component: Spark Runner
   - [ ] Component: Flink Runner
   - [ ] Component: Samza Runner
   - [ ] Component: Twister2 Runner
   - [ ] Component: Hazelcast Jet Runner
   - [X] Component: Google Cloud Dataflow Runner

