kunaljubce commented on issue #40232:
URL: https://github.com/apache/airflow/issues/40232#issuecomment-2227391308

   One more thing @githubwua. Regarding the original requirement here to add 
`tags` to `DataprocCreateBatchOperator`: can you try passing your network tags 
to the batch config using the syntax below?
   
   ```
   BATCH_CONFIG = {
       "environment_config": {
           "execution_config": {
               "subnetwork_uri": "<your-subnet>",
               # network_tags is a repeated string field, so pass a list
               "network_tags": ["<your-network-tags>"],
           }
       },
       "spark_batch": {
           "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
           "main_class": "org.apache.spark.examples.SparkPi",
       },
   }
   ```
   and then pass this `BATCH_CONFIG` when instantiating the operator?
   
   ```
   create_batch = DataprocCreateBatchOperator(
       task_id="create_batch",
       project_id=PROJECT_ID,
       region=REGION,
       batch=BATCH_CONFIG,
       batch_id=BATCH_ID,
   )
   ```
   
   I dug into the [GCP 
docs](https://cloud.google.com/python/docs/reference/dataproc/latest/google.cloud.dataproc_v1.types.ExecutionConfig)
 for `BatchControllerClient()` and found this. It's better to check whether 
this works before making a code change on the Airflow side.
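
   If it helps while testing, here's a small sanity-check helper (purely hypothetical, not part of Airflow or the Dataproc client) that catches the most likely mistake before the batch is submitted: `ExecutionConfig.network_tags` is a repeated string field, so a bare string instead of a list of strings will not behave as expected.
   
   ```python
   # Hypothetical helper: sanity-check the shape of a Dataproc batch config
   # dict before passing it to DataprocCreateBatchOperator.
   def validate_batch_config(batch: dict) -> list:
       """Return a list of problems found; an empty list means the shape looks OK."""
       problems = []
       exec_cfg = batch.get("environment_config", {}).get("execution_config", {})
       tags = exec_cfg.get("network_tags")
       # network_tags is a repeated string field in ExecutionConfig, so a
       # plain string (rather than a list/tuple of strings) is a likely bug.
       if tags is not None and not isinstance(tags, (list, tuple)):
           problems.append(
               "network_tags should be a list of strings, got %s" % type(tags).__name__
           )
       elif tags is not None and not all(isinstance(t, str) for t in tags):
           problems.append("network_tags entries must all be strings")
       return problems
   ```
   
   Running it against a config with `"network_tags": "my-tag"` would flag the problem, while `"network_tags": ["my-tag"]` passes cleanly.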

