This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
     new 0eb21e6  Add docs on using DAGRun.conf (#9578)
0eb21e6 is described below

commit 0eb21e63f32a272d058ee039b62dc7b73deb3e76
Author: Kaxil Naik <[email protected]>
AuthorDate: Tue Jun 30 00:59:40 2020 +0100

    Add docs on using DAGRun.conf (#9578)
    
    closes https://github.com/apache/airflow/issues/8900
    
    (cherry picked from commit ce4c2297c6c30ccae4222633b923e1a1cda98ec6)
---
 airflow/models/dag.py             |   2 ++
 docs/dag-run.rst                  |  37 +++++++++++++++++++++++++++++++++++++
 docs/img/example_passing_conf.png | Bin 0 -> 97482 bytes
 3 files changed, 39 insertions(+)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index 54bc87e..e24c164 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1463,6 +1463,8 @@ class DAG(BaseDag, LoggingMixin):
         :type start_date: datetime
         :param external_trigger: whether this dag run is externally triggered
         :type external_trigger: bool
+        :param conf: Dict containing configuration/parameters to pass to the DAG
+        :type conf: dict
         :param session: database session
         :type session: sqlalchemy.orm.session.Session
         """
diff --git a/docs/dag-run.rst b/docs/dag-run.rst
index 2477d6a..9ea79ea 100644
--- a/docs/dag-run.rst
+++ b/docs/dag-run.rst
@@ -189,6 +189,43 @@ The default is the current date in the UTC timezone.
 
 In addition, you can also manually trigger a DAG Run using the web UI (tab **DAGs** -> column **Links** -> button **Trigger Dag**)
 
+Passing Parameters when triggering DAGs
+------------------------------------------
+
+When triggering a DAG from the CLI, the REST API or the UI, it is possible to pass configuration for a DAGRun as
+a JSON blob.
+
+Example of a parameterized DAG:
+
+.. code-block:: python
+
+    from airflow import DAG
+    from airflow.operators.bash_operator import BashOperator
+    from airflow.utils.dates import days_ago
+
+    dag = DAG("example_parametrized_dag", schedule_interval=None, start_date=days_ago(2))
+
+    parameterized_task = BashOperator(
+        task_id='parameterized_task',
+        bash_command="echo value: {{ dag_run.conf['conf1'] }}",
+        dag=dag,
+    )
+
+
+**Note**: The parameters from ``dag_run.conf`` can only be used in a template field of an operator.
+
+Using CLI
+^^^^^^^^^^^
+
+.. code-block:: bash
+
+    airflow dags trigger --conf '{"conf1": "value1"}' example_parametrized_dag
+
+Using UI
+^^^^^^^^^^
+
+.. image:: img/example_passing_conf.png
+
 To Keep in Mind
 ''''''''''''''''
 * Marking task instances as failed can be done through the UI. This can be 
used to stop running task instances.
diff --git a/docs/img/example_passing_conf.png b/docs/img/example_passing_conf.png
new file mode 100644
index 0000000..411cae7
Binary files /dev/null and b/docs/img/example_passing_conf.png differ
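The note in the added docs says ``dag_run.conf`` is only available in template fields. As a complementary sketch (not part of this commit), the same values can be read inside a ``PythonOperator`` callable via the task context, since on Airflow 1.10 ``provide_context=True`` passes ``dag_run`` to the callable. The ``FakeDagRun`` class and the ``conf1`` key below are illustrative stand-ins, not Airflow APIs:

```python
# Sketch: reading DAGRun conf defensively from a callable's context,
# mirroring PythonOperator(..., provide_context=True) on Airflow 1.10.
def read_conf(**context):
    # dag_run can be absent or None (e.g. for some test invocations),
    # so guard before touching .conf
    dag_run = context.get("dag_run")
    conf = getattr(dag_run, "conf", None) or {}
    return conf.get("conf1", "no value passed")


class FakeDagRun:
    """Minimal stand-in for airflow.models.DagRun, used only to exercise
    the sketch without an Airflow installation."""

    def __init__(self, conf):
        self.conf = conf


# Triggered with --conf '{"conf1": "value1"}' -> "value1"
print(read_conf(dag_run=FakeDagRun({"conf1": "value1"})))
# No conf passed -> fallback default
print(read_conf())
```

The ``getattr``/``or {}`` guard matters because scheduled runs with no ``--conf`` payload carry an empty or missing conf, and templated fields would render an error for a missing key in the same situation.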
