atul-astronomer opened a new issue, #47348:
URL: https://github.com/apache/airflow/issues/47348

   ### Apache Airflow version
   
   3.0.0
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   User-defined macros are not working: the task fails when its return value is pushed to XCom (logs below).
   
   **FastAPI logs:**
   
    ```text
    File "/opt/airflow/airflow/api_fastapi/execution_api/routes/xcoms.py", line 228, in set_xcom
        BaseXCom.set(
      File "/opt/airflow/airflow/utils/session.py", line 98, in wrapper
        return func(*args, **kwargs)
      File "/opt/airflow/airflow/models/xcom.py", line 185, in set
        value = cls.serialize_value(
      File "/opt/airflow/airflow/models/xcom.py", line 452, in serialize_value
        return json.dumps(value, cls=XComEncoder)
      File "/usr/local/lib/python3.9/json/__init__.py", line 234, in dumps
        return cls(
      File "/opt/airflow/airflow/utils/json.py", line 99, in encode
        raise AttributeError(f"reserved key {CLASSNAME} found in dict to serialize")
    AttributeError: reserved key __classname__ found in dict to serialize
    ```
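For context on the `AttributeError`: the guard that fires in `airflow/utils/json.py` rejects any dict that already contains the reserved marker key `__classname__`. A minimal sketch of that behavior (the `ReservedKeyEncoder` name is made up for illustration; only the reserved key and the error message come from the traceback above):

```python
import json

CLASSNAME = "__classname__"  # reserved marker key named in the traceback


class ReservedKeyEncoder(json.JSONEncoder):
    """Illustrative stand-in (not Airflow's actual XComEncoder) for the
    guard that refuses to serialize a dict already containing the
    reserved key used to round-trip class information."""

    def encode(self, o):
        if isinstance(o, dict) and CLASSNAME in o:
            raise AttributeError(f"reserved key {CLASSNAME} found in dict to serialize")
        return super().encode(o)


print(json.dumps({"a": 1}, cls=ReservedKeyEncoder))  # serializes normally
try:
    json.dumps({CLASSNAME: "x"}, cls=ReservedKeyEncoder)
except AttributeError as e:
    print(e)  # reserved key __classname__ found in dict to serialize
```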
   
   **Celery worker logs:**
   
    ```text
    HTTPStatusError: Server error '500 Internal Server Error' for url 'http://localhost:9091/execution/xcoms/user_defined_macros/manual__2025-03-04T13:57:20.179009+00:00_usc2Jsmo/check_user_defined_macros/return_value'
    ```
   
   ### What you think should happen instead?
   
   User-defined macros should work as they did in Airflow 2.
   
   ### How to reproduce
   
   Run the DAG below on Airflow 3 beta1 (`log_checker` is imported from a local plugin helper):
   
   ```python
   from airflow.providers.standard.operators.python import PythonOperator
   from pendulum import today
   
   from airflow.models import DAG
   from dags.plugins.airflow_dag_introspection import log_checker
   
    docs = """
    #### Purpose
    This dag tests that the user_defined_macros, which allow the user to define their own custom macros, work properly.
    #### Expected Behavior
    This dag has 2 tasks that are both expected to succeed. If one or both tasks fail then there is a problem with 'user_defined_macros'.\n
    The first task returns the 'macro1' and 'macro2' functions.\n
    The second task checks the logs of the first task to ensure the custom macro functions printed to the logs.
    """
   
   
   def user_macro1(num: int) -> str:
       squares = []
       for i in range(1, num):
           if i * i == num:
               squares.append(i)
               return f"{num} is a square number {i} and {i} are its roots"
       # return f"The squares between 1 and {num} are: {squares}"
   
   
   def user_macro2(num):
       primes = []
       for i in range(2, num):
           if i % 2 != 0 and i % 3 != 0 and i % 5 != 0 and i % 7 != 0:
               primes.append(i)
       return f"The prime numbers between 1 and {num} are {primes}"
   
   
   def check_macros(func1, func2):
       return func1, func2
   
   
   with DAG(
       dag_id="user_defined_macros",
       schedule=None,
       start_date=today('UTC').add(days=-2),
        user_defined_macros={"macro1": user_macro1(225), "macro2": user_macro2(100)},
       doc_md=docs,
       tags=["core"],
   ) as dag:
   
       py0 = PythonOperator(
           task_id="check_user_defined_macros",
           python_callable=check_macros,
           op_args=["{{ macro1 }}", "{{ macro2 }}"],
       )
   
       py1 = PythonOperator(
           task_id="check_the_logs",
           python_callable=log_checker,
           op_args=[
               "check_user_defined_macros",
            "Returned value was: ('225 is a square number 15 and 15 are its roots', 'The prime numbers between 1 and 100 are [11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97]')",
               "Done. Returned value was: None",
           ],
       )
   
        py0 >> py1
   
   ``` 
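One detail that may help triage: in this DAG the macro functions are *called* at parse time, so `user_defined_macros` maps the names to plain strings, `{{ macro1 }}` / `{{ macro2 }}` render to those strings, and `check_macros` returns a 2-tuple of strings as the XCom value. A quick check of what the template context holds (copying `user_macro1` from the DAG above):

```python
def user_macro1(num: int) -> str:
    # Same function as in the DAG above.
    squares = []
    for i in range(1, num):
        if i * i == num:
            squares.append(i)
            return f"{num} is a square number {i} and {i} are its roots"


# Called at parse time, exactly as in the DAG's user_defined_macros dict:
macro1_value = user_macro1(225)
print(type(macro1_value), macro1_value)
# "{{ macro1 }}" renders this plain str, and a tuple of two such strings
# is the XCom value that fails to serialize.
```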
   
   ### Operating System
   
   Linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]