odykstra commented on issue #36839:
URL: https://github.com/apache/airflow/issues/36839#issuecomment-1922153837

   Hi there. I'm running into this same issue. I'm running a MERGE statement 
that returns only one row with four columns. 
   
   Appreciate your attention to this.
   
   Old versions (no error):
   apache-airflow-providers-databricks -- 5.0.1
   Airflow -- v2.5.0
   
   New versions (getting error):
   apache-airflow-providers-databricks -- 6.1.0
   Airflow -- v2.7.3
   
   ```
   [2024-02-01, 15:28:06 UTC] {client.py:258} INFO - Closing session 01eec116-6bad-1ab1-9d48-941cb79ab654
   [2024-02-01, 15:28:07 UTC] {xcom.py:661} ERROR - Object of type tuple is not JSON serializable. If you are using pickle instead of JSON for XCom, then you need to enable pickle support for XCom in your airflow config or make sure to decorate your object with attr.
   [2024-02-01, 15:28:07 UTC] {base.py:73} INFO - Using connection ID 'databricks' for task execution.
   [2024-02-01, 15:28:07 UTC] {taskinstance.py:1937} ERROR - Task failed with exception
   Traceback (most recent call last):
     File "/usr/local/lib/python3.10/site-packages/airflow/utils/json.py", line 91, in default
       return serialize(o)
     File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 145, in serialize
       return encode(classname, version, serialize(data, depth + 1))
     File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in serialize
       return [serialize(d, depth + 1) for d in o]
     File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in <listcomp>
       return [serialize(d, depth + 1) for d in o]
     File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in serialize
       return [serialize(d, depth + 1) for d in o]
     File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in <listcomp>
       return [serialize(d, depth + 1) for d in o]
     File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 178, in serialize
       raise TypeError(f"cannot serialize object of type {cls}")
   TypeError: cannot serialize object of type <class 'airflow.providers.databricks.hooks.databricks_sql.Row'>
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/local/lib/python3.10/site-packages/airflow/utils/session.py", line 76, in wrapper
       return func(*args, **kwargs)
     File "/usr/local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 2479, in xcom_push
       XCom.set(
     File "/usr/local/lib/python3.10/site-packages/airflow/utils/session.py", line 76, in wrapper
       return func(*args, **kwargs)
     File "/usr/local/lib/python3.10/site-packages/airflow/models/xcom.py", line 244, in set
       value = cls.serialize_value(
     File "/usr/local/lib/python3.10/site-packages/airflow/models/xcom.py", line 659, in serialize_value
       return json.dumps(value, cls=XComEncoder).encode("UTF-8")
     File "/usr/local/lib/python3.10/json/__init__.py", line 238, in dumps
       **kw).encode(obj)
     File "/usr/local/lib/python3.10/site-packages/airflow/utils/json.py", line 102, in encode
       o = self.default(o)
     File "/usr/local/lib/python3.10/site-packages/airflow/utils/json.py", line 93, in default
       return super().default(o)
     File "/usr/local/lib/python3.10/json/encoder.py", line 179, in default
       raise TypeError(f'Object of type {o.__class__.__name__} '
   TypeError: Object of type tuple is not JSON serializable
   [2024-02-01, 15:28:07 UTC] {taskinstance.py:1400} INFO - Marking task as FAILED. dag_id=Aggregates_Hit, task_id=update_tm_hit_perc, execution_date=20240201T152717, start_date=20240201T152732, end_date=20240201T152807
   [2024-02-01, 15:28:07 UTC] {standard_task_runner.py:104} ERROR - Failed to execute job 2054213 for task update_tm_hit_perc (Object of type tuple is not JSON serializable; 31)
   [2024-02-01, 15:28:07 UTC] {local_task_job_runner.py:228} INFO - Task exited with return code 1
   [2024-02-01, 15:28:07 UTC] {taskinstance.py:2778} INFO - 0 downstream tasks scheduled from follow-on schedule check
   ```
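
   In case it helps others while this is open: the traceback shows Airflow's JSON XCom encoder rejecting the provider's `Row` results, so one possible interim workaround is to convert the rows to plain dicts or lists yourself (e.g. in a Python task that queries via the hook) before anything is pushed to XCom. The sketch below is only illustrative and makes assumptions: `Row` here is a hypothetical namedtuple stand-in for the provider's tuple subclass, and `rows_to_serializable` is a helper I made up, not a provider API.

   ```python
   import json
   from collections import namedtuple

   # Hypothetical stand-in for the provider's Row type (a tuple subclass
   # that Airflow's strict JSON XCom encoder refuses to serialize).
   Row = namedtuple("Row", ["id", "name", "hits", "perc"])

   def rows_to_serializable(rows):
       """Convert Row/tuple query results into plain JSON-friendly values.

       Namedtuple-like rows become dicts; bare tuples become lists, so the
       result survives json.dumps (and therefore a default XCom push).
       """
       return [
           dict(row._asdict()) if hasattr(row, "_asdict") else list(row)
           for row in rows
       ]

   rows = [Row(1, "tm", 42, 0.37)]
   payload = rows_to_serializable(rows)
   print(json.dumps(payload))  # now serializes without a TypeError
   ```

   The same idea should work inside a `@task`-decorated function that calls `DatabricksSqlHook` directly and returns the converted payload, sidestepping the operator's automatic push of raw `Row` objects.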

