vatsrahul1001 opened a new issue, #46645:
URL: https://github.com/apache/airflow/issues/46645

   ### Apache Airflow version
   
   3.0.0a1
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   We have a DAG that works fine with Airflow 2 but now raises the import error below.
   
   ```
   
   Traceback (most recent call last):
     File "/opt/airflow/airflow/dag_processing/collection.py", line 193, in 
_serialize_dag_capturing_errors
       dag_was_updated = SerializedDagModel.write_dag(
     File "/opt/airflow/airflow/utils/session.py", line 98, in wrapper
       return func(*args, **kwargs)
     File "/opt/airflow/airflow/models/serialized_dag.py", line 198, in 
write_dag
       new_serialized_dag = cls(dag)
     File "<string>", line 4, in __init__
     File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/state.py", 
line 482, in _initialize_instance
       manager.dispatch.init_failure(self, args, kwargs)
     File 
"/usr/local/lib/python3.9/site-packages/sqlalchemy/util/langhelpers.py", line 
70, in __exit__
       compat.raise_(
     File "/usr/local/lib/python3.9/site-packages/sqlalchemy/util/compat.py", 
line 211, in raise_
       raise exception
     File "/usr/local/lib/python3.9/site-packages/sqlalchemy/orm/state.py", 
line 479, in _initialize_instance
       return manager.original_init(*mixed[1:], **kwargs)
     File "/opt/airflow/airflow/models/serialized_dag.py", line 120, in __init__
       self.dag_hash = SerializedDagModel.hash(dag_data)
     File "/opt/airflow/airflow/models/serialized_dag.py", line 142, in hash
       dag_data = cls._sort_serialized_dag_dict(dag_data)
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
_sort_serialized_dag_dict
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
<dictcomp>
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
_sort_serialized_dag_dict
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
<dictcomp>
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 155, in 
_sort_serialized_dag_dict
       [cls._sort_serialized_dag_dict(i) for i in serialized_dag],
     File "/opt/airflow/airflow/models/serialized_dag.py", line 155, in 
<listcomp>
       [cls._sort_serialized_dag_dict(i) for i in serialized_dag],
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
_sort_serialized_dag_dict
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
<dictcomp>
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
_sort_serialized_dag_dict
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
<dictcomp>
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
_sort_serialized_dag_dict
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
<dictcomp>
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
_sort_serialized_dag_dict
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
<dictcomp>
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
_sort_serialized_dag_dict
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 150, in 
<dictcomp>
       return {k: cls._sort_serialized_dag_dict(v) for k, v in 
sorted(serialized_dag.items())}
     File "/opt/airflow/airflow/models/serialized_dag.py", line 153, in 
_sort_serialized_dag_dict
       if all("task_id" in i.get("__var", {}) for i in serialized_dag):
     File "/opt/airflow/airflow/models/serialized_dag.py", line 153, in 
<genexpr>
       if all("task_id" in i.get("__var", {}) for i in serialized_dag):
   TypeError: argument of type 'float' is not iterable
   ```
   
   
   
   
   ### What you think should happen instead?
   
   The DAG should not raise an import error.
   
   I think the line below is what fails:
   
   `delta=list(map(lambda x: timedelta(seconds=x), [30, 60, 90]))`
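   
   For context, a minimal sketch (not Airflow code) of why the check in the traceback trips over that list. Assuming the serialized DAG encodes each timedelta from the expand() list roughly as `{"__type": "timedelta", "__var": 30.0}` (so `"__var"` ends up as a plain float), the membership test from `_sort_serialized_dag_dict` raises exactly this TypeError:
   
   ```
   # Hypothetical stand-in for the serialized expand() list; the exact encoding
   # is an assumption, but "__var" ending up as a plain float is the key point.
   serialized_list = [
       {"__type": "timedelta", "__var": 30.0},
       {"__type": "timedelta", "__var": 60.0},
       {"__type": "timedelta", "__var": 90.0},
   ]
   
   # Mirrors the line quoted in the traceback:
   #     if all("task_id" in i.get("__var", {}) for i in serialized_dag):
   all("task_id" in i.get("__var", {}) for i in serialized_list)
   # TypeError: argument of type 'float' is not iterable,
   # because i.get("__var", {}) returns 30.0 rather than a dict.
   ```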
   
   ### How to reproduce
   
   Try running an Airflow instance with the DAG below.
   
   **DAG CODE**
   ```
   
   from datetime import datetime, timedelta
   from time import sleep
   
   from airflow import DAG
   from airflow.decorators import task
   from airflow.models.taskinstance import TaskInstance
   from airflow.providers.standard.operators.python import PythonOperator
   from airflow.providers.standard.sensors.date_time import DateTimeSensor, DateTimeSensorAsync
   from airflow.providers.standard.sensors.time_delta import TimeDeltaSensor, TimeDeltaSensorAsync
   
   delays = [30, 60, 90]
   
   
   @task
   def get_delays():
       return delays
   
   
   @task
   def get_wakes(delay, **context):
       "Wake {delay} seconds after the task starts"
       ti: TaskInstance = context["ti"]
       return (ti.start_date + timedelta(seconds=delay)).isoformat()
   
   
   with DAG(
       dag_id="datetime_mapped",
       start_date=datetime(1970, 1, 1),
       schedule=None,
       tags=["taskmap"]
   ) as dag:
   
       wake_times = get_wakes.expand(delay=get_delays())
   
       DateTimeSensor.partial(task_id="expanded_datetime").expand(target_time=wake_times)
       TimeDeltaSensor.partial(task_id="expanded_timedelta").expand(
           delta=list(map(lambda x: timedelta(seconds=x), [30, 60, 90]))
       )
   
       DateTimeSensorAsync.partial(task_id="expanded_datetime_async").expand(
           target_time=wake_times
       )
       TimeDeltaSensorAsync.partial(task_id="expanded_timedelta_async").expand(
           delta=list(map(lambda x: timedelta(seconds=x), [30, 60, 90]))
       )
   
       TimeDeltaSensor(task_id="static_timedelta", delta=timedelta(seconds=90))
       DateTimeSensor(
           task_id="static_datetime",
           target_time="{{macros.datetime.now() + macros.timedelta(seconds=90)}}",
       )
   
       PythonOperator(task_id="op_sleep_90", python_callable=lambda: sleep(90))
   
   ```
   
   ### Operating System
   
   Linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

