Hi All,

 

Could anyone please help me with an Airflow DAG dependency? Attached are my 2 DAGs. It seems the dependency is working, but the parameter below is making the task reschedule without ever executing the second task in Dag2.

 

Can anyone help with solving this issue?

 

Thanks

Sai Prasad Golla

 


From: Kaxil Naik
Sent: January 18, 2020 9:33 AM
Subject: Re: Airflow webserver can't turn on DAG?

 

I will add some docs on using airflow_local_settings.py when I am back from holidays.

 

On Sat, Jan 18, 2020, 18:19 Ash Berlin-Taylor <[email protected]> wrote:

Setting that JSON setting is optional (and it defaults to the built-in JSON lib).

But yeah, that feature isn't well documented. If that file is found on your Python import path, Airflow will load it; it can override a few more advanced settings that aren't easily controllable from a config file.

The `config/` folder under AIRFLOW_HOME is added to the search path and is a good place for this file.
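
To make that concrete, here is a minimal sketch of such a file. The policy hook (cluster policy) and the json override are the two extension points this thread touches on; the specific choices below (ujson, capping retries at 3) are illustrative assumptions, not recommendations:

# $AIRFLOW_HOME/config/airflow_local_settings.py
# Loaded automatically at startup when it is importable.

# Optional: override the JSON module Airflow uses (defaults to the
# built-in json lib, as noted above). ujson is an arbitrary example.
import ujson as json


def policy(task):
    # Cluster policy hook: Airflow calls this for each task when its DAG
    # is loaded, so task attributes can be adjusted centrally. The retry
    # cap here is a hypothetical example.
    if task.retries is not None and task.retries > 3:
        task.retries = 3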

Ash

On 17 January 2020 02:57:21 GMT+03:00, Daniel Standish <[email protected]> wrote:

I think it's generally not advisable to mess with your site-packages directory; it would get overwritten on an upgrade.

 

The configurable stuff usually gets deposited in your AIRFLOW_HOME, or at least it ought to be, I would think?

like these:

airflow.cfg
unittests.cfg
webserver_config.py

 

I suspect this is probably supposed to go in AIRFLOW_HOME, but it's just not getting auto-created. And maybe there is some documentation missing: I can find a few references to this file in the docs, but nothing explicitly discussing it the way airflow.cfg is discussed, IIRC.

 


On Thu, Jan 16, 2020 at 3:41 PM Reed Villanueva <[email protected]> wrote:

@daniel

 

That file is part of the airflow package (not the AIRFLOW_HOME dir, as I, and I'm guessing you, assumed). You can find it on your machine with something like...

 

[airflow@airflowetl]$ find ~/ -name airflow_local_settings.py
/home/airflow/.local/lib/python3.6/site-packages/airflow/config_templates/airflow_local_settings.py

 

I think this is what the docs are referring to.

 

On Thu, Jan 16, 2020 at 1:17 PM Daniel Standish <[email protected]> wrote:

@ash reading that page I notice a reference to airflow_local_settings.py

 

you need to define a json variable in local Airflow settings (airflow_local_settings.py)

 

Is that a thing? Do we have settings configurability with a Python file? Or does this mean to point us to airflow.cfg?

 

I tried searching the docs but found nothing indicating how to use airflow_local_settings.py to configure Airflow.



 

# Dag1: Hello1 -- runs T1 every 5 minutes.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2020, 1, 18, 1, 15),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    # 'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

dag = DAG(
    'Hello1',
    catchup=False,
    schedule_interval=timedelta(minutes=5),
    default_args=default_args,  # start_date comes from default_args
)

t1 = BashOperator(
    task_id='T1',
    # The trailing space stops Jinja from treating the .sh path as a
    # template file to load.
    bash_command='/home/radmin/AIStratBuilder/script/Test_Script.sh ',
    dag=dag,
)

# Dag2: Hello2 -- waits on Hello1's T1 via a sensor, then runs T2.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.sensors import ExternalTaskSensor

dag = DAG(
    'Hello2',
    description='DAG with sensor',
    schedule_interval=timedelta(minutes=5),
    start_date=datetime(2020, 1, 18, 1, 15),
)

sensor = ExternalTaskSensor(
    task_id='Wait_Task',
    external_dag_id='Hello1',
    external_task_id='T1',
    # With identical schedules on both DAGs, this offset makes the sensor
    # look at the Hello1 run from the previous interval, not the same one.
    execution_delta=timedelta(minutes=5),
    mode='reschedule',
    # NOTE: allowed_states expects a list of states (the default is
    # ['success']); the string 'None' likely never matches any task state,
    # which would keep the sensor rescheduling forever -- see the sketch
    # below.
    allowed_states='None',
    dag=dag,
)

task = BashOperator(
    task_id='T2',
    bash_command='echo Hi ',
    dag=dag,
)

task.set_upstream(sensor)
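
For reference, here is a sketch of how this sensor is usually wired up when both DAGs share the same schedule_interval and start_date, assuming the intent is to wait on the Hello1 run for the matching interval (an assumption on my part, not a verified fix):

from airflow.operators.sensors import ExternalTaskSensor
from airflow.utils.state import State

# Reuses the `dag` object defined in Dag2 above.
sensor = ExternalTaskSensor(
    task_id='Wait_Task',
    external_dag_id='Hello1',
    external_task_id='T1',
    # No execution_delta: with identical schedules, the sensor then waits
    # on the Hello1 run with the same execution_date.
    mode='reschedule',
    # Must be a list of state names, not a string.
    allowed_states=[State.SUCCESS],
    dag=dag,
)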
