gianniskod opened a new issue, #34889:
URL: https://github.com/apache/airflow/issues/34889
### Apache Airflow version
2.7.1
### What happened
Trying to test **EcsRunTaskOperator** in deferrable mode resulted in an
unexpected error at the `_start_task()` step of the operator's `execute`
method. The logged error was
`{standard_task_runner.py:104} ERROR - Failed to execute job 28 for task
hello-world-defer (date value out of range; 77)`
After a lot of research into the specific `date value out of range`
error, I found [this PR](https://github.com/apache/airflow/pull/33712), whose [change
log](https://github.com/apache/airflow/pull/33712/files#diff-4dba25d07d7d8c4cb47ef85e814f123c9171072b240d605fffd59b29ee3b31eb)
shows that the default `waiter_max_attempts` was switched to `1000000 * 365 * 24 * 60 * 10`
(which amounts to roughly 1M years of polling). A value this large breaks an internal
Airflow date calculation related to the waiter's retries.
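The failure can be reproduced outside Airflow. Below is a minimal sketch; the 6-second delay and the `now + delay * max_attempts` timeout computation are assumptions about how the trigger's timeout is derived, not code copied from the provider:

```python
from datetime import datetime, timedelta, timezone

# Default introduced by PR #33712
waiter_max_attempts = 1000000 * 365 * 24 * 60 * 10
waiter_delay = 6  # assumed seconds between polls

# If a trigger timeout is computed as "now + delay * attempts", the target
# datetime lands ~1M years in the future, past datetime.max (year 9999),
# raising the same OverflowError seen in the task log.
try:
    timeout = datetime.now(timezone.utc) + timedelta(
        seconds=waiter_delay * waiter_max_attempts
    )
except OverflowError as exc:
    print(f"OverflowError: {exc}")
```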
### What you think should happen instead
Unfortunately, I haven't been able to trace the error any further, but
lowering `waiter_max_attempts` to 100000 made the operator work as expected.
My suggestion would be to decrease the default value of
**waiter_max_attempts**; perhaps 1000000 (1M) is a reasonable number of
retries. With the default 6-second delay, this would set the expected
maximum running time to 1000000 * 6 seconds, roughly 70 days.
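For reference, the ~70-day figure for the suggested default checks out (assuming the default 6-second waiter delay):

```python
from datetime import timedelta

# Suggested default: 1,000,000 attempts at an assumed 6-second delay
suggested_timeout = timedelta(seconds=1_000_000 * 6)
print(suggested_timeout.days)  # -> 69, i.e. roughly 70 days
```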
### How to reproduce
Use **EcsRunTaskOperator** in deferrable mode while keeping its default
values.
### Operating System
Debian
### Versions of Apache Airflow Providers
apache-airflow-providers-airbyte==3.3.2
apache-airflow-providers-amazon==8.7.1
apache-airflow-providers-celery==3.3.4
apache-airflow-providers-common-sql==1.7.2
apache-airflow-providers-docker==3.7.5
apache-airflow-providers-ftp==3.1.0
apache-airflow-providers-http==4.5.2
apache-airflow-providers-imap==3.3.0
apache-airflow-providers-postgres==5.6.1
apache-airflow-providers-redis==3.3.2
apache-airflow-providers-snowflake==4.4.2
apache-airflow-providers-sqlite==3.2.1
### Deployment
Other Docker-based deployment
### Deployment details
- Custom deployment using ECS task definition services on EC2 to run
Airflow services.
- Extending the base Airflow image for each container service
(_apache/airflow:latest-python3.11_)
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes, I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]