amoghrajesh commented on issue #57792:
URL: https://github.com/apache/airflow/issues/57792#issuecomment-3490015628
Thanks @tirkarthi, good investigation.
I reached a similar conclusion in my own investigation and was suspecting the same issue:
```python
import os

os.environ['AIRFLOW__CORE__DEFAULT_TASK_RETRIES'] = '3'

def return_deserialized_task_after_serializng(retries):
    from datetime import datetime

    from airflow.sdk import DAG
    from airflow.providers.standard.operators.bash import BashOperator
    from airflow.serialization.serialized_objects import SerializedDAG
    from airflow.sdk.bases.operator import OPERATOR_DEFAULTS

    dag_0 = DAG(f'test_{retries}', default_args={'retries': retries},
                start_date=datetime(2024, 1, 1))
    task_0 = BashOperator(task_id=f't{retries}', bash_command='echo',
                          dag=dag_0)

    # Round-trip through serialization.
    serialized_0 = SerializedDAG.to_dict(dag_0)

    # Inspected while debugging.
    task_data_0 = serialized_0["dag"]["tasks"][0]["__var"]
    client_defaults = serialized_0.get("client_defaults", {}).get("tasks", {})

    deserialized_dag_0 = SerializedDAG.from_dict(serialized_0)
    deserialized_task_0 = deserialized_dag_0.get_task(f't{retries}')
    return deserialized_task_0
```
When I call this with `retries=0` and `retries=1`, I get:
```python
>>> ds_task_0 = return_deserialized_task_after_serializng(0)
>>> ds_task_0.retries  # explicit retries=0 is lost
3
>>> ds_task_1 = return_deserialized_task_after_serializng(1)
>>> ds_task_1.retries  # non-zero value survives
1
```
So an explicit `retries=0` is overridden by the `AIRFLOW__CORE__DEFAULT_TASK_RETRIES` config default after the serialization round trip, while a non-zero value survives. Let me try a fix.
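For illustration only: the symptom (0 lost, 1 kept) is the classic signature of a truthiness check being used where a "value is missing" check was intended. This is a minimal standalone sketch of that pattern, not Airflow's actual code; the function names and `DEFAULT_RETRIES` constant are hypothetical stand-ins for wherever the client/config default gets applied during deserialization.

```python
DEFAULT_RETRIES = 3  # hypothetical stand-in for AIRFLOW__CORE__DEFAULT_TASK_RETRIES

def resolve_retries_buggy(explicit):
    # `or` treats every falsy value as "unset", so an explicit 0
    # silently falls back to the default.
    return explicit or DEFAULT_RETRIES

def resolve_retries_fixed(explicit):
    # Only a genuinely missing value (None) should fall back.
    return explicit if explicit is not None else DEFAULT_RETRIES

print(resolve_retries_buggy(0))   # 3 -- explicit 0 is lost
print(resolve_retries_fixed(0))   # 0 -- explicit 0 survives
print(resolve_retries_fixed(None))  # 3 -- missing value still gets the default
```

If the real resolution path does something equivalent to the `or` variant, the fix would be to distinguish "value equals the falsy default" from "value was never set".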