akakakakakaa opened a new issue, #27160:
URL: https://github.com/apache/airflow/issues/27160

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   ## Airflow Version
   airflow 2.3.2
   
   ## Problem
   `WorkflowsCreateExecutionOperator` triggers a Google Cloud Workflows execution, and its `execution` param takes an argument of the form `{"argument": {"key": "val", ...}}`.
   
   But when I passed the argument as a dict using `render_template_as_native_obj=True`, a protobuf error occurred: `TypeError: {'projectId': 'project-id', 'location': 'us-east1'} has type dict, but expected one of: bytes, unicode`.
   
   When I passed the argument as bytes, `{"argument": b'{\n  "projectId": "project-id",\n  "location": "us-east1"\n}'}`, it worked.
   
   ### What you think should happen instead
   
   The `execution` argument should accept a `dict` instead of requiring `bytes`.
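   One way the provider could support this (a sketch only; `_coerce_argument` is a hypothetical helper, not existing provider code) is to JSON-encode a dict `argument` before building the protobuf `Execution`:
   
   ```python
   import json
   
   def _coerce_argument(execution: dict) -> dict:
       # Hypothetical helper: if "argument" is a dict, JSON-encode it so the
       # protobuf Execution field receives bytes; bytes/str pass through as-is.
       arg = execution.get("argument")
       if isinstance(arg, dict):
           return {**execution, "argument": json.dumps(arg).encode("utf-8")}
       return execution
   ```
   
   This would keep the current bytes-based usage working while letting DAG authors pass a plain dict.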
   
   ### How to reproduce
   
   ```python
   from airflow import DAG
   from airflow.models.param import Param
   from airflow.operators.dummy_operator import DummyOperator
    from airflow.providers.google.cloud.operators.workflows import WorkflowsCreateExecutionOperator
   
   with DAG(
       dag_id="continual_learning_deid_norm_h2h_test",
       params={
           "location": Param(type="string", default="us-east1"),
           "project_id": Param(type="string", default="project-id"),
           "workflow_id": Param(type="string", default="orkflow"),
           "workflow_execution_info": {
               "argument": {
                   "projectId": "project-id",
                   "location": "us-east1"
               }
           }
       },
       render_template_as_native_obj=True
   ) as dag:
       execution = "{{ params.workflow_execution_info }}"
       create_execution = WorkflowsCreateExecutionOperator(
           task_id="create_execution",
           location="{{ params.location }}",
           project_id="{{ params.project_id }}",
           workflow_id="{{ params.workflow_id }}",
           execution="{{ params.workflow_execution_info }}"
       )
   
       start_operator = DummyOperator(task_id='test_task')
   
       start_operator >> create_execution
   ```
   
   ### Operating System
   
   Ubuntu 20.04.5 LTS (Focal Fossa)
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon          | 3.4.0
   apache-airflow-providers-apache-beam     | 3.4.0
   apache-airflow-providers-celery          | 2.1.4
   apache-airflow-providers-cncf-kubernetes | 4.0.2
   apache-airflow-providers-docker          | 2.7.0
   apache-airflow-providers-elasticsearch   | 3.0.3
   apache-airflow-providers-ftp             | 2.1.2
   apache-airflow-providers-google          | 7.0.0
   apache-airflow-providers-grpc            | 2.0.4
   apache-airflow-providers-hashicorp       | 2.2.0
   apache-airflow-providers-http            | 2.1.2
   apache-airflow-providers-imap            | 2.2.3
   apache-airflow-providers-microsoft-azure | 3.9.0
   apache-airflow-providers-mysql           | 2.2.3
   apache-airflow-providers-odbc            | 2.0.4
   apache-airflow-providers-postgres        | 4.1.0
   apache-airflow-providers-redis           | 2.0.4
   apache-airflow-providers-sendgrid        | 2.0.4
   apache-airflow-providers-sftp            | 2.6.0
   apache-airflow-providers-slack           | 4.2.3
   apache-airflow-providers-sqlite          | 2.1.3
   apache-airflow-providers-ssh             | 2.4.4
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

