jedcunningham commented on a change in pull request #21501:
URL: https://github.com/apache/airflow/pull/21501#discussion_r804345052
##########
File path: airflow/settings.py
##########
@@ -209,6 +209,18 @@ def get_airflow_context_vars(context):
return {}
+def get_dagbag_import_timeout(dag_file_path: str) -> float:
+ """
+ This setting allows to dynamically control the dag file parsing timeout.
Review comment:
```suggestion
    This setting allows for dynamic control of the dag file parsing timeout
    based on the DAG file path.
```
##########
File path: docs/apache-airflow/faq.rst
##########
@@ -119,6 +119,30 @@ How do I trigger tasks based on another task's failure?
You can achieve this with :ref:`concepts:trigger-rules`.
+How to control DAG file parsing timeout for different DAG files?
+---------------------------------------------------------------------------------
+(only valid for Airflow >= 2.3.0)
+
+You can add a function ``get_dagbag_import_timeout`` in the ``airflow_local_settings.py``. This function gets
+called right before a DAG file is parsed. You can return different timeout value based on the DAG file.
Review comment:
```suggestion
called right before a DAG file is parsed. You can return a different timeout value based on the DAG file.
```
##########
File path: docs/apache-airflow/faq.rst
##########
@@ -119,6 +119,30 @@ How do I trigger tasks based on another task's failure?
You can achieve this with :ref:`concepts:trigger-rules`.
+How to control DAG file parsing timeout for different DAG files?
+---------------------------------------------------------------------------------
+(only valid for Airflow >= 2.3.0)
Review comment:
```suggestion
How to control DAG file parsing timeout for different DAG files?
----------------------------------------------------------------
(only valid for Airflow >= 2.3.0)
```
nit
##########
File path: airflow/settings.py
##########
@@ -209,6 +209,18 @@ def get_airflow_context_vars(context):
return {}
+def get_dagbag_import_timeout(dag_file_path: str) -> float:
+ """
+ This setting allows to dynamically control the dag file parsing timeout.
Review comment:
Was this missed?
##########
File path: docs/apache-airflow/faq.rst
##########
@@ -119,6 +119,30 @@ How do I trigger tasks based on another task's failure?
You can achieve this with :ref:`concepts:trigger-rules`.
+How to control DAG file parsing timeout for different DAG files?
+---------------------------------------------------------------------------------
+(only valid for Airflow >= 2.3.0)
+
+You can add a function ``get_dagbag_import_timeout`` in the ``airflow_local_settings.py``. This function gets
+called right before a DAG file is parsed. You can return different timeout value based on the DAG file.
+When the return value is less than or equal to 0, it means no timeout during the DAG parsing.
+
+.. code-block:: python
+    :caption: airflow_local_settings.py
+    :name: airflow_local_settings.py
+
+    def get_dagbag_import_timeout(dag_file_path: str) -> Union[int, float]:
+        """
+        This setting allows to dynamically control the DAG file parsing timeout.
+
+        It is useful when there are a few DAG files requiring longer parsing times, while others do not.
+        You can control them separately instead of having one value for all DAG files.
+
+        If the return value is less than or equal to 0, it means no timeout during the DAG parsing.
+        """
Review comment:
Maybe have a simple example instead?
```suggestion
if "slow" in dag_file_path:
return 90
return conf.getfloat('core', 'DAGBAG_IMPORT_TIMEOUT')
```
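For reference, combining the suggested body above with the hook signature from the diff, a complete ``airflow_local_settings.py`` might look roughly like the sketch below. The ``conf`` import from ``airflow.configuration`` is an assumption here, not part of the suggestion; the ``"slow"`` path check comes from the suggestion itself.
```python
# Rough sketch of airflow_local_settings.py combining the suggestion above
# with the hook signature from the diff. Assumes `conf` is importable from
# airflow.configuration; returning a value <= 0 disables the parsing timeout.
from airflow.configuration import conf


def get_dagbag_import_timeout(dag_file_path: str) -> float:
    """Control the DAG file parsing timeout based on the DAG file path."""
    # Give known-slow DAG files more time to parse.
    if "slow" in dag_file_path:
        return 90.0
    # Fall back to the configured default for all other DAG files.
    return conf.getfloat('core', 'DAGBAG_IMPORT_TIMEOUT')
```
Local settings modules are typically picked up from ``$AIRFLOW_HOME/config`` or anywhere else on the scheduler's ``PYTHONPATH``.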
##########
File path: docs/apache-airflow/faq.rst
##########
@@ -119,6 +119,30 @@ How do I trigger tasks based on another task's failure?
You can achieve this with :ref:`concepts:trigger-rules`.
+How to control DAG file parsing timeout for different DAG files?
+---------------------------------------------------------------------------------
+(only valid for Airflow >= 2.3.0)
+
+You can add a function ``get_dagbag_import_timeout`` in the ``airflow_local_settings.py``. This function gets
+called right before a DAG file is parsed. You can return different timeout value based on the DAG file.
+When the return value is less than or equal to 0, it means no timeout during the DAG parsing.
+
+.. code-block:: python
+    :caption: airflow_local_settings.py
+    :name: airflow_local_settings.py
+
+    def get_dagbag_import_timeout(dag_file_path: str) -> Union[int, float]:
+        """
+        This setting allows to dynamically control the DAG file parsing timeout.
+
+        It is useful when there are a few DAG files requiring longer parsing times, while others do not.
+        You can control them separately instead of having one value for all DAG files.
+
+        If the return value is less than or equal to 0, it means no timeout during the DAG parsing.
+        """
+
+
+
Review comment:
```suggestion
```
nit: one less empty line to be consistent
##########
File path: docs/apache-airflow/faq.rst
##########
@@ -119,6 +119,30 @@ How do I trigger tasks based on another task's failure?
You can achieve this with :ref:`concepts:trigger-rules`.
+How to control DAG file parsing timeout for different DAG files?
+---------------------------------------------------------------------------------
+(only valid for Airflow >= 2.3.0)
+
+You can add a function ``get_dagbag_import_timeout`` in the ``airflow_local_settings.py``. This function gets
Review comment:
```suggestion
You can add a ``get_dagbag_import_timeout`` function in your ``airflow_local_settings.py`` which gets
```