gustavohwulee opened a new issue, #35197:
URL: https://github.com/apache/airflow/issues/35197

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
We upgraded from Airflow 2.5.1 to Airflow 2.6.3 and noticed that tuples 
returned by operators are no longer rendered as native objects as they were 
before: they are rendered as strings even when the DAG parameter 
`render_template_as_native_obj` is set to `True`.
   We noticed this behaviour when using the `DmsDescribeTasksOperator` from 
`airflow.providers.amazon.aws.operators.dms`, which returns a tuple as shown 
below:
   
   
![image](https://github.com/apache/airflow/assets/38598129/7328bbab-0182-468a-99af-7ae9ac5cd1db)
   
   
   ### What you think should happen instead
   
   The response should be rendered as a native object instead of a string.
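
   For context, `render_template_as_native_obj` switches Airflow's templating to Jinja's native mode, which roughly amounts to literal-evaluating the rendered text back into a Python object. A minimal stdlib-only sketch of that round trip (an illustration of the principle, not Airflow's actual code path):

   ```python
   # Sketch (stdlib only): what native rendering should recover from the
   # stringified tuple, versus what we currently get on 2.6.3.
   from ast import literal_eval

   value = (None, {"key1": "value1", "key2": "value2"})

   rendered = str(value)            # plain rendering yields text
   print(type(rendered))            # <class 'str'>

   native = literal_eval(rendered)  # native rendering should recover the object
   print(type(native))              # <class 'tuple'>
   print(native[1]["key1"])         # value1
   ```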
   
   ### How to reproduce
   
   I've created a script to test this case:
   
   Result on Airflow 2.5.1 (parsed as list, as expected):
   
![image](https://github.com/apache/airflow/assets/38598129/99682be9-283b-4b0a-9ec1-5ff813c0fde2)
   
   Result on Airflow 2.6.3 (parsed as string):
   
![image](https://github.com/apache/airflow/assets/38598129/b9a9562b-6d38-48f2-82a6-bb976e016336)
   
   
   Operator
   ```python
   """Debug Operators"""
   from airflow.models import BaseOperator
   
   
   class TestTupleReturnOperator(BaseOperator):
       """Test Tuple Return Operator"""
   
       def __init__(self, *args, **kwargs):
           super().__init__(*args, **kwargs)
   
       def execute(self, context):
           return None, {"key1": "value1", "key2": "value2"}
   
   
   class TestTupleReturnOperatorV2(BaseOperator):
       """Test Tuple Return Operator"""
   
       def __init__(self, *args, **kwargs):
           super().__init__(*args, **kwargs)
   
       def execute(self, context):
           return 1, 2, 3
   
   
   class TestTupleReturnOperatorV3(BaseOperator):
       """Test Tuple Return Operator"""
   
       def __init__(self, *args, **kwargs):
           super().__init__(*args, **kwargs)
   
       def execute(self, context):
           return 1, 2, 3, None
   
   
   class TestDictReturnOperator(BaseOperator):
       """Test Dict Return Operator"""
   
       def __init__(self, *args, **kwargs):
           super().__init__(*args, **kwargs)
   
       def execute(self, context):
           return {"key1": "value1", "key2": "value2"}
   
   ```
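   
   One caveat worth noting when reading the 2.5.1 result above: by default XCom values are JSON-serialized, and JSON has no tuple type, so a tuple return value round-trips as a list at best — which matches the 2.5.1 screenshot. A stdlib-only sketch of that round trip (assuming the default JSON XCom backend):

   ```python
   # Sketch (stdlib only): a tuple pushed through JSON serialization comes back
   # as a list, never a tuple — so "list" is the expected best case for XCom.
   import json

   value = (None, {"key1": "value1", "key2": "value2"})
   stored = json.dumps(value)    # tuple is encoded as a JSON array
   restored = json.loads(stored)
   print(type(restored))         # <class 'list'>
   ```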
   
   Dag
   ```python
   """
   # Dag to test the case
   """
   import logging
   import os
   
   from airflow import DAG
   from airflow.utils.dates import days_ago
   from airflow_data.operators.debug_operator import (
       TestTupleReturnOperator,
       TestTupleReturnOperatorV2,
       TestTupleReturnOperatorV3,
       TestDictReturnOperator
   )
   from airflow.operators.python import PythonOperator
   
   DAG_ID = os.path.basename(__file__).replace(".py", "")
   logger = logging.getLogger(__name__)
   
   default_args = {
       "owner": "Gustavo Lee",
       "depends_on_past": False,
       "retries": 2,
       "email_on_retry": False,
       "start_date": days_ago(1),
   }
   
   init_dag = {
       "dag_id": DAG_ID,
       "default_args": default_args,
       "description": "Test Operator Return",
       "schedule_interval": None,
       "max_active_runs": 1,
       "tags": ["dbricks", "data"],
       "catchup": False,
       "render_template_as_native_obj": True,
   }
   
   
   def print_xcom_debug(**kwargs):
       """print logs"""
       task_instance = kwargs['ti']
       tuple_test = task_instance.xcom_pull(task_ids='test_tuple_return')
       tuple_v2_test = task_instance.xcom_pull(task_ids='tuple_v2_return_test')
       tuple_v3_test = task_instance.xcom_pull(task_ids='tuple_v3_return_test')
       dict_test = task_instance.xcom_pull(task_ids='test_dict_return')
       print("Dict Type: " + str(type(dict_test)))
       print("Tuple Type: " + str(type(tuple_test)))
       print("Tuple v2 Type: " + str(type(tuple_v2_test)))
       print("Tuple v3 Type: " + str(type(tuple_v3_test)))
       print("="*20)
       print("Dict Value: " + str(dict_test))
       print("Tuple Value: " + str(tuple_test))
    print("Tuple v2 Value: " + str(tuple_v2_test))
    print("Tuple v3 Value: " + str(tuple_v3_test))
   
   with DAG(**init_dag) as dag:
       dag.doc_md = __doc__
   
       tuple_return_test = TestTupleReturnOperator(
           task_id='test_tuple_return'
       )
       tuple_v2_return_test = TestTupleReturnOperatorV2(
           task_id='tuple_v2_return_test'
       )
       tuple_v3_return_test = TestTupleReturnOperatorV3(
           task_id='tuple_v3_return_test'
       )
   
       dict_return_test = TestDictReturnOperator(
           task_id='test_dict_return'
       )
   
    print_debug = PythonOperator(
        task_id='print_debug',
        python_callable=print_xcom_debug,
    )
   
       tuple_return_test >> print_debug
       dict_return_test >> print_debug
       tuple_v2_return_test >> print_debug
       tuple_v3_return_test >> print_debug
   
   ```
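   
   As a stopgap while the regression stands, the pull in `print_xcom_debug` could fall back to `ast.literal_eval` when the value arrives as a string. This is a hypothetical helper of my own, not an Airflow API:

   ```python
   # Hypothetical workaround helper (assumption, not part of Airflow):
   # coerce a stringified XCom value back into a Python object.
   from ast import literal_eval

   def pull_native(task_instance, task_id):
       """xcom_pull, literal-evaluating values that arrive as strings."""
       value = task_instance.xcom_pull(task_ids=task_id)
       if isinstance(value, str):
           try:
               return literal_eval(value)
           except (ValueError, SyntaxError):
               return value  # genuinely a plain string, keep it
       return value
   ```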
   
   ### Operating System
   
   MWAA
   
   ### Versions of Apache Airflow Providers
   
   On Airflow 2.5.1:
   
   ```
   apache-airflow-providers-amazon==7.1.0
   apache-airflow-providers-celery==3.1.0
   apache-airflow-providers-common-sql==1.3.3
   apache-airflow-providers-ftp==3.3.0
   apache-airflow-providers-http==4.1.1
   apache-airflow-providers-imap==3.1.1
   apache-airflow-providers-postgres==5.4.0
   apache-airflow-providers-sqlite==3.3.1
   apache-airflow-providers-databricks==4.0.0
   apache-airflow-providers-slack==7.2.0
   apache-airflow-providers-mongo==3.1.1
   apache-airflow-providers-mysql==4.0.0
   apache-airflow-providers-airbyte==3.2.0
   apache-airflow-providers-google==8.8.0
   databricks-api==0.9.0
   glom==23.3.0
   virtualenv==20.17.1
   ```
   
   On Airflow 2.6.3:
   
   ```
   apache-airflow-providers-amazon==8.2.0
   apache-airflow-providers-celery==3.2.1
   apache-airflow-providers-common-sql==1.5.2
   apache-airflow-providers-ftp==3.4.2
   apache-airflow-providers-http==4.4.2
   apache-airflow-providers-imap==3.2.2
   apache-airflow-providers-postgres==5.5.1
   apache-airflow-providers-sqlite==3.4.2
   apache-airflow-providers-databricks==4.3.0
   apache-airflow-providers-slack==7.3.1
   apache-airflow-providers-mongo==3.2.1
   apache-airflow-providers-mysql==5.1.1
   apache-airflow-providers-airbyte==3.3.1
   apache-airflow-providers-google==10.2.0
   virtualenv==20.23.1
   databricks-api==0.9.0
   glom==23.3.0
   ```
   
   ### Deployment
   
   Amazon (AWS) MWAA
   
   ### Deployment details
   
   MWAA
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

