Hi Aaron,

It looks like you hit a bug: https://issues.apache.org/jira/browse/AIRFLOW-1977

Currently I don't have a solution for this at hand; I'll have to dig into the code. If the DAG is a scheduled one, you might want to try setting schedule_interval to None instead of '@once', as in the sketch below: https://airflow.apache.org/scheduler.html#dag-runs
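For what it's worth, here is a minimal sketch of that workaround based on the DAG from your question below. The only changes I've made are schedule_interval=None and renaming the module-level variable to dag so it no longer shadows the imported DAG class:

    from datetime import datetime
    from airflow.models import DAG
    from airflow.operators.python_operator import PythonOperator

    # schedule_interval=None means the DAG has no schedule at all and is only
    # run when triggered externally, which should sidestep the '@once' code
    # path the scheduler is choking on.
    dag = DAG(
        dag_id='scheduler_test_dag',
        start_date=datetime(2017, 9, 9, 4, 0, 0, 0),
        max_active_runs=1,
        schedule_interval=None,
    )

    def ticker_function():
        # Append the current timestamp to a file so each run is visible.
        with open('/tmp/ticker', 'a') as outfile:
            outfile.write('{}\n'.format(datetime.now()))

    time_ticker = PythonOperator(
        task_id='time_ticker',
        python_callable=ticker_function,
        dag=dag)

You can then kick it off by hand with 'airflow trigger_dag scheduler_test_dag' or from the web UI.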
Cheers, Fokko

2018-02-12 18:30 GMT+01:00 Aaron Polhamus <[email protected]>:

> Question on StackOverflow. Has anyone else dealt with this?
> https://stackoverflow.com/questions/48752087/odd-typeerror-from-the-airflow-scheduler-has-usage-of-once-for-scheduler-int
>
> I have a super simple test DAG that looks like this:
>
>     from datetime import datetime
>     from airflow.models import DAG
>     from airflow.operators.python_operator import PythonOperator
>
>     DAG = DAG(
>         dag_id='scheduler_test_dag',
>         start_date=datetime(2017, 9, 9, 4, 0, 0, 0),  # EC2 time. Equal to 11pm Mexico time
>         max_active_runs=1,
>         schedule_interval='@once'  # externally triggered
>     )
>
>     def ticker_function():
>         with open('/tmp/ticker', 'a') as outfile:
>             outfile.write('{}\n'.format(datetime.now()))
>
>     time_ticker = PythonOperator(
>         task_id='time_ticker',
>         python_callable=ticker_function,
>         dag=DAG)
>
> Since upgrading to apache-airflow v1.9 this DAG is hung and won't run.
> Digging into the scheduler logs I found the error trace:
>
>     [2018-02-12 17:03:06,259] {jobs.py:1754} INFO - DAG(s) dict_keys(['scheduler_test_dag']) retrieved from /home/ubuntu/airflow/dags/scheduler_test_dag.py
>     [2018-02-12 17:03:06,315] {jobs.py:1386} INFO - Processing scheduler_test_dag
>     [2018-02-12 17:03:06,320] {jobs.py:379} ERROR - Got an exception! Propagating...
>     Traceback (most recent call last):
>       File "/usr/local/lib/python3.5/dist-packages/airflow/jobs.py", line 371, in helper
>         pickle_dags)
>       File "/usr/local/lib/python3.5/dist-packages/airflow/utils/db.py", line 50, in wrapper
>         result = func(*args, **kwargs)
>       File "/usr/local/lib/python3.5/dist-packages/airflow/jobs.py", line 1792, in process_file
>         self._process_dags(dagbag, dags, ti_keys_to_schedule)
>       File "/usr/local/lib/python3.5/dist-packages/airflow/jobs.py", line 1388, in _process_dags
>         dag_run = self.create_dag_run(dag)
>       File "/usr/local/lib/python3.5/dist-packages/airflow/utils/db.py", line 50, in wrapper
>         result = func(*args, **kwargs)
>       File "/usr/local/lib/python3.5/dist-packages/airflow/jobs.py", line 807, in create_dag_run
>         if next_start <= now:
>     TypeError: unorderable types: NoneType() <= datetime.datetime()
>
> Where is this error coming from? The only thing that I can think of is
> that the usage of schedule_interval='@once' has changed, which is the
> one thing that this DAG has in common with one other broken DAG on my
> server since the v1.9 upgrade. Otherwise it's the most basic DAG ever;
> it doesn't seem like there should be a problem. Previously I was using
> the basic pip install before switching to the apache-airflow repo.
>
> Here's a screenshot of the Web UI. Everything seems to be working alright,
> except the top and bottom DAGs, which have their schedule interval set to
> @once and are indefinitely hung:
>
> Any thoughts?
>
> --
>
> Aaron Polhamus
> Chief Technology Officer
>
> Cel (México): +52 (55) 1951-5612
> Cell (USA): +1 (206) 380-3948
> Tel: +52 (55) 1168 9757 - Ext. 181
>
> Please refer to our web page <https://www.credijusto.com/aviso-de-privacidad/>
> for more information about our privacy policies.
