[ https://issues.apache.org/jira/browse/AIRFLOW-217?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15321180#comment-15321180 ]
Navjot commented on AIRFLOW-217:
--------------------------------

We already resolved this problem. We were running different versions of Airflow on the master and the worker. Now we are using Ubuntu instances with the same version of Airflow on the master and the workers. I don't know whether that was the problem, but it's working now: the worker is picking up jobs.

> Getting AttributeError: 'DAG' object has no attribute 'task_dict' while using
> Celery Executor
> ---------------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-217
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-217
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: aws, celery, executor
>    Affects Versions: Airflow 1.7.1.2
>         Environment: Airflow running on Amazon Linux AMI
>                      Worker running on Ubuntu
>            Reporter: Navjot
>            Priority: Minor
>         Attachments: errorCelery.PNG
>
>
> We are trying to implement Airflow to run some data-intensive processes.
> I am trying to run a job on a worker using the Celery executor. I have configured
> all the settings, but it is still not running the job. I am getting a strange
> error, and I couldn't find information about it on forums.
> I am attaching the trace of that error.
> {code}
> [2016-06-07 19:46:50,857] {__init__.py:36} INFO - Using executor CeleryExecutor
> [2016-06-07 19:46:50,939] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
> [2016-06-07 19:46:50,963] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
> Traceback (most recent call last):
>   File "/usr/local/bin/airflow", line 15, in <module>
>     args.func(args)
>   File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 215, in run
>     task = dag.get_task(task_id=args.task_id)
>   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 2896, in get_task
>     if task_id in self.task_dict:
> AttributeError: 'DAG' object has no attribute 'task_dict'
> [2016-06-07 19:46:51,206: ERROR/Worker-14] Command 'airflow run montanaprod pulldatamontana 2016-06-03T00:00:00 --pickle 26 --local -s 2016-06-03T00:00:00 ' returned non-zero exit status 1
> [2016-06-07 19:46:51,544: ERROR/MainProcess] Task airflow.executors.celery_executor.execute_command[71a26a48-4fbb-4a61-ad1c-2d4a50df4c58] raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 240, in trace_task
>     R = retval = fun(*args, **kwargs)
>   File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 438, in __protected_call__
>     return self.run(*args, **kwargs)
>   File "/usr/local/lib/python2.7/dist-packages/airflow/executors/celery_executor.py", line 45, in execute_command
>     raise AirflowException('Celery command failed')
> AirflowException: Celery command failed
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
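The resolution described in the comment (keeping the master and all workers on identical Airflow versions) can be sanity-checked with a small helper like the sketch below. This is a hypothetical illustration, not part of the issue: the hostnames and version strings are placeholders, and you would gather the real values by running `airflow version` on each machine.

```python
def find_version_mismatches(host_versions):
    """Return hosts whose reported Airflow version differs from the master's.

    host_versions: dict mapping hostname -> version string, with the
    scheduler/master under the key "master" (placeholder convention).
    """
    reference = host_versions.get("master")
    return {host: ver for host, ver in host_versions.items()
            if ver != reference}

# Placeholder data; in practice collect these via `airflow version` per host.
versions = {
    "master": "1.7.1.2",
    "worker-1": "1.7.1.2",
    "worker-2": "1.7.0",  # a mismatched worker like this can fail to unpickle DAGs
}
print(find_version_mismatches(versions))  # -> {'worker-2': '1.7.0'}
```

Any host listed in the result should be upgraded or downgraded to match the master before retrying, since a worker unpickling a DAG produced by a different Airflow release can hit attribute errors like the one in this report.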