nikhilcss97 opened a new issue, #56529:
URL: https://github.com/apache/airflow/issues/56529
### Apache Airflow version
3.1.0
### If "Other Airflow 2/3 version" selected, which one?
_No response_
### What happened?
After upgrading from Airflow 3.0.6 to Airflow 3.1.0, we started seeing this
error in the Dag processor, which keeps restarting on its own.
Logs:
```
2025-10-09T14:12:04.364297Z [error   ] Failed to deserialize DAG [airflow.serialization.serialized_objects] loc=serialized_objects.py:3890
    return cls._deserialize_datetime(value) if value is not None else None
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/_shared/timezones/timezone.py", line 302, in from_timestamp
    result = coerce_datetime(dt.datetime.fromtimestamp(timestamp, tz=utc))
TypeError: 'str' object cannot be interpreted as an integer

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/jobs/dag_processor_job_runner.py", line 61, in _execute
    self.processor.run()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 272, in run
    return self._run_parsing_loop()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 361, in _run_parsing_loop
    self._collect_results()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/utils/session.py", line 100, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 827, in _collect_results
    self._file_stats[file] = process_parse_results(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 1155, in process_parse_results
    update_dag_parsing_results_in_db(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/collection.py", line 372, in update_dag_parsing_results_in_db
    for attempt in run_with_db_retries(logger=log):
  File "/home/airflow/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 443, in __iter__
    do = self.iter(retry_state=retry_state)
  File "/home/airflow/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
  File "/home/airflow/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
  File "/usr/python/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/python/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/collection.py", line 382, in update_dag_parsing_results_in_db
    SerializedDAG.bulk_write_to_db(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/utils/session.py", line 98, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2868, in bulk_write_to_db
    dag_op.update_dags(orm_dags, parse_duration, session=session)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/collection.py", line 534, in update_dags
    dm.calculate_dagrun_date_fields(dag, last_automated_data_interval)  # type: ignore[arg-type]
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/models/dag.py", line 680, in calculate_dagrun_date_fields
    next_dagrun_info = dag.next_dagrun_info(last_automated_data_interval)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 3879, in next_dagrun_info
    return self._real_dag.next_dagrun_info(*args, **kwargs)
  File "/usr/python/lib/python3.10/functools.py", line 981, in __get__
    val = self.func(instance)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 3888, in _real_dag
    return SerializedDAG.from_dict(self.data)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2837, in from_dict
    return cls.deserialize_dag(serialized_obj["dag"], client_defaults)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2528, in deserialize_dag
    raise DeserializationError(dag_id) from err
airflow.exceptions.DeserializationError: An unexpected error occurred while trying to deserialize Dag 'eFundamentals_mla_availability_report'
2025-10-09T14:12:04.407428Z [info    ] Process exited [supervisor] exit_code=<Negsignal.SIGTERM: -15> loc=supervisor.py:709 pid=2802 signal_sent=SIGTERM
2025-10-09T14:12:04.409223Z [info    ] Waiting up to 5 seconds for processes to exit... [airflow.utils.process_utils] loc=process_utils.py:285
Traceback (most recent call last):
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2522, in deserialize_dag
    return cls._deserialize_dag_internal(encoded_dag, client_defaults)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2548, in _deserialize_dag_internal
    deser = SerializedBaseOperator.deserialize_operator(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 1768, in deserialize_operator
    cls.populate_operator(op, encoded_op, client_defaults)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 1610, in populate_operator
    v = cls._deserialize_partial_kwargs(v, client_defaults)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2133, in _deserialize_partial_kwargs
    deserialized[k] = cls._deserialize_field_value(k, v)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2101, in _deserialize_field_value
    return cls._deserialize_datetime(value) if value is not None else None
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/_shared/timezones/timezone.py", line 302, in from_timestamp
    result = coerce_datetime(dt.datetime.fromtimestamp(timestamp, tz=utc))
TypeError: 'str' object cannot be interpreted as an integer

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.venv/bin/airflow", line 10, in <module>
    sys.exit(main())
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/__main__.py", line 55, in main
    args.func(args)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/cli/cli_config.py", line 49, in command
    return func(*args, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/utils/cli.py", line 114, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/utils/providers_configuration_loader.py", line 54, in wrapped_function
    return func(*args, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/cli/commands/dag_processor_command.py", line 53, in dag_processor
    run_command_with_daemon_option(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/cli/commands/daemon_utils.py", line 86, in run_command_with_daemon_option
    callback()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/cli/commands/dag_processor_command.py", line 56, in <lambda>
    callback=lambda: run_job(job=job_runner.job, execute_callable=job_runner._execute),
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/utils/session.py", line 100, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/jobs/job.py", line 368, in run_job
    return execute_job(job, execute_callable=execute_callable)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/jobs/job.py", line 397, in execute_job
    ret = execute_callable()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/jobs/dag_processor_job_runner.py", line 61, in _execute
    self.processor.run()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 272, in run
    return self._run_parsing_loop()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 361, in _run_parsing_loop
    self._collect_results()
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/utils/session.py", line 100, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 827, in _collect_results
    self._file_stats[file] = process_parse_results(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 1155, in process_parse_results
    update_dag_parsing_results_in_db(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/collection.py", line 372, in update_dag_parsing_results_in_db
    for attempt in run_with_db_retries(logger=log):
  File "/home/airflow/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 443, in __iter__
    do = self.iter(retry_state=retry_state)
  File "/home/airflow/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
  File "/home/airflow/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
  File "/usr/python/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/python/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/collection.py", line 382, in update_dag_parsing_results_in_db
    SerializedDAG.bulk_write_to_db(
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/utils/session.py", line 98, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2868, in bulk_write_to_db
    dag_op.update_dags(orm_dags, parse_duration, session=session)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/dag_processing/collection.py", line 534, in update_dags
    dm.calculate_dagrun_date_fields(dag, last_automated_data_interval)  # type: ignore[arg-type]
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/models/dag.py", line 680, in calculate_dagrun_date_fields
    next_dagrun_info = dag.next_dagrun_info(last_automated_data_interval)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 3879, in next_dagrun_info
    return self._real_dag.next_dagrun_info(*args, **kwargs)
  File "/usr/python/lib/python3.10/functools.py", line 981, in __get__
    val = self.func(instance)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 3888, in _real_dag
    return SerializedDAG.from_dict(self.data)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2837, in from_dict
    return cls.deserialize_dag(serialized_obj["dag"], client_defaults)
  File "/home/airflow/.venv/lib/python3.10/site-packages/airflow/serialization/serialized_objects.py", line 2528, in deserialize_dag
    raise DeserializationError(dag_id) from err
airflow.exceptions.DeserializationError: An unexpected error occurred while trying to deserialize Dag 'eFundamentals_mla_availability_report'
```
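For anyone triaging: the root `TypeError` can be reproduced in isolation. This is a simplified sketch, not the real implementation (the actual `from_timestamp` lives in `airflow/_shared/timezones/timezone.py`, and the string value `"1760018724"` is a made-up stand-in for whatever the serialized DAG actually stores in the partial-kwargs datetime field). It shows how a POSIX timestamp that arrives as a string, rather than a number, triggers exactly the error in the logs above:

```python
import datetime as dt

UTC = dt.timezone.utc

# Simplified stand-in for from_timestamp() at timezone.py:302 in the
# traceback: it hands the raw serialized value to datetime.fromtimestamp().
def from_timestamp(timestamp):
    return dt.datetime.fromtimestamp(timestamp, tz=UTC)

# A numeric timestamp deserializes fine:
print(from_timestamp(1760018724))

# But the same value serialized as a string fails with the exact
# TypeError from the Dag processor logs:
try:
    from_timestamp("1760018724")
except TypeError as exc:
    print(exc)  # 'str' object cannot be interpreted as an integer
```

So the question for the fix is likely why, after the 3.0.6 → 3.1.0 upgrade, `_deserialize_partial_kwargs` passes this datetime field through as a string instead of a number (e.g. a stale pre-upgrade row in the serialized DAG table, or a serialization-format change).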
### What you think should happen instead?
_No response_
### How to reproduce
We are not exactly sure why this started happening.
### Operating System
Ubuntu
### Versions of Apache Airflow Providers
We are pretty much on the latest versions of most of the providers.
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
We are on K8s
### Anything else?
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)