[jira] [Work started] (AIRFLOW-2845) Remove asserts from the contrib code (change to legal exceptions with speaking names)

2018-08-03 Thread Yuliya Volkova (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2845?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-2845 started by Yuliya Volkova.
---
> Remove asserts from the contrib code (change to legal exceptions with 
> speaking names) 
> --
>
> Key: AIRFLOW-2845
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2845
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Affects Versions: 1.10.1
>Reporter: Yuliya Volkova
>Assignee: Yuliya Volkova
>Priority: Minor
>  Labels: easyfix
> Fix For: 1.9.0
>
>
> Hi guys. A lot of `assert` statements are used in the Airflow code, and 
> given the purpose asserts are really meant for, that is not correct usage.
> The documentation describes the assert statement as a debugging tool: 
> [https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
> and assertions can also be disabled globally (they are stripped when Python 
> runs with the -O flag). 
> If you do not mind, I will be happy to prepare a PR that removes asserts 
> from the contrib module, replacing them with raised errors that have correct 
> exception types and messages, not just a bare "AssertionError".
> I am talking only about src (not about asserts in tests). 
>  
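The change proposed above can be sketched as follows. This is a minimal
illustration, not Airflow code; the function and exception names
(attach_volume, AirflowBadRequest) are hypothetical:

```python
# Before: a bare assert. Asserts are stripped when Python runs with -O,
# so this check silently disappears in optimized mode, and on failure it
# only raises a generic AssertionError.
def attach_volume_unsafe(volume_id):
    assert volume_id is not None

# After: an explicit check raising an exception with a speaking name and
# message. The check always runs, regardless of interpreter flags.
class AirflowBadRequest(Exception):
    """Raised when a caller passes invalid arguments (hypothetical name)."""

def attach_volume(volume_id):
    if volume_id is None:
        raise AirflowBadRequest("volume_id must not be None")
    return volume_id
```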



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2845) Remove asserts from the contrib code (change to legal exceptions with speaking names)

2018-08-03 Thread Yuliya Volkova (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2845?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuliya Volkova updated AIRFLOW-2845:

Summary: Remove asserts from the contrib code (change to legal exceptions 
with speaking names)   (was: Remove asserts from the code (change to legal 
exceptions with speaking names) )

> Remove asserts from the contrib code (change to legal exceptions with 
> speaking names) 
> --
>
> Key: AIRFLOW-2845
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2845
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Affects Versions: 1.10.1
>Reporter: Yuliya Volkova
>Assignee: Yuliya Volkova
>Priority: Minor
>  Labels: easyfix
> Fix For: 1.9.0
>
>
> Hi guys. A lot of `assert` statements are used in the Airflow code, and 
> given the purpose asserts are really meant for, that is not correct usage.
> The documentation describes the assert statement as a debugging tool: 
> [https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
> and assertions can also be disabled globally (they are stripped when Python 
> runs with the -O flag). 
> If you do not mind, I will be happy to prepare a PR that removes asserts 
> from the contrib module, replacing them with raised errors that have correct 
> exception types and messages, not just a bare "AssertionError".
> I am talking only about src (not about asserts in tests). 
>  





[jira] [Updated] (AIRFLOW-2845) Remove asserts from the code (change to legal exceptions with speaking names)

2018-08-03 Thread Yuliya Volkova (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2845?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuliya Volkova updated AIRFLOW-2845:

Description: 
Hi guys. A lot of `assert` statements are used in the Airflow code, and given 
the purpose asserts are really meant for, that is not correct usage.

The documentation describes the assert statement as a debugging tool: 
[https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
and assertions can also be disabled globally (they are stripped when Python 
runs with the -O flag). 

If you do not mind, I will be happy to prepare a PR that removes asserts from 
the contrib module, replacing them with raised errors that have correct 
exception types and messages, not just a bare "AssertionError".

I am talking only about src (not about asserts in tests). 

 

  was:
Hi guys. A lot of `assert` statements are used in Airflow's code, and given 
the purpose asserts are really meant for, that is not correct usage.

The documentation describes the assert statement as a debugging tool: 
[https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
and assertions can also be disabled globally (they are stripped when Python 
runs with the -O flag). 

If you do not mind, I will be happy to prepare a PR that removes asserts from 
the contrib module, replacing them with raised errors that have correct 
exception types and messages, not just a bare "AssertionError".

I am talking only about src (not about asserts in tests). 

 


> Remove asserts from the code (change to legal exceptions with speaking names) 
> --
>
> Key: AIRFLOW-2845
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2845
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Affects Versions: 1.10.1
>Reporter: Yuliya Volkova
>Assignee: Yuliya Volkova
>Priority: Minor
>  Labels: easyfix
> Fix For: 1.9.0
>
>
> Hi guys. A lot of `assert` statements are used in the Airflow code, and 
> given the purpose asserts are really meant for, that is not correct usage.
> The documentation describes the assert statement as a debugging tool: 
> [https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
> and assertions can also be disabled globally (they are stripped when Python 
> runs with the -O flag). 
> If you do not mind, I will be happy to prepare a PR that removes asserts 
> from the contrib module, replacing them with raised errors that have correct 
> exception types and messages, not just a bare "AssertionError".
> I am talking only about src (not about asserts in tests). 
>  





[jira] [Updated] (AIRFLOW-2845) Remove asserts from the code (change to legal exceptions with speaking names)

2018-08-03 Thread Yuliya Volkova (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2845?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuliya Volkova updated AIRFLOW-2845:

Description: 
Hi guys. A lot of `assert` statements are used in Airflow's code, and given 
the purpose asserts are really meant for, that is not correct usage.

The documentation describes the assert statement as a debugging tool: 
[https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
and assertions can also be disabled globally (they are stripped when Python 
runs with the -O flag). 

If you do not mind, I will be happy to prepare a PR that removes asserts from 
the contrib module, replacing them with raised errors that have correct 
exception types and messages, not just a bare "AssertionError".

I am talking only about src (not about asserts in tests). 

 

  was:
Hi guys. In the Airflow code base a lot of asserts are used, and given the 
purpose asserts are really meant for, that is not correct usage.

The documentation describes the assert statement as a debugging tool: 
[https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
and assertions can also be disabled globally (they are stripped when Python 
runs with the -O flag). 

If you do not mind, I will be happy to prepare a PR that removes asserts from 
the contrib module, replacing them with raised errors that have correct 
exception types and messages, not just a bare "AssertionError".

I am talking only about src (not about asserts in tests). 

 


> Remove asserts from the code (change to legal exceptions with speaking names) 
> --
>
> Key: AIRFLOW-2845
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2845
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Affects Versions: 1.10.1
>Reporter: Yuliya Volkova
>Assignee: Yuliya Volkova
>Priority: Minor
>  Labels: easyfix
> Fix For: 1.9.0
>
>
> Hi guys. A lot of `assert` statements are used in Airflow's code, and 
> given the purpose asserts are really meant for, that is not correct usage.
> The documentation describes the assert statement as a debugging tool: 
> [https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
> and assertions can also be disabled globally (they are stripped when Python 
> runs with the -O flag). 
> If you do not mind, I will be happy to prepare a PR that removes asserts 
> from the contrib module, replacing them with raised errors that have correct 
> exception types and messages, not just a bare "AssertionError".
> I am talking only about src (not about asserts in tests). 
>  





[jira] [Created] (AIRFLOW-2845) Remove asserts from the code (change to legal exceptions with speaking names)

2018-08-03 Thread Yuliya Volkova (JIRA)
Yuliya Volkova created AIRFLOW-2845:
---

 Summary: Remove asserts from the code (change to legal exceptions 
with speaking names) 
 Key: AIRFLOW-2845
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2845
 Project: Apache Airflow
  Issue Type: Improvement
  Components: contrib
Affects Versions: 1.10.1
Reporter: Yuliya Volkova
Assignee: Yuliya Volkova
 Fix For: 1.9.0


Hi guys. In the Airflow code base a lot of asserts are used, and given the 
purpose asserts are really meant for, that is not correct usage.

The documentation describes the assert statement as a debugging tool: 
[https://docs.python.org/3/reference/simple_stmts.html#the-assert-statement] 
and assertions can also be disabled globally (they are stripped when Python 
runs with the -O flag). 

If you do not mind, I will be happy to prepare a PR that removes asserts from 
the contrib module, replacing them with raised errors that have correct 
exception types and messages, not just a bare "AssertionError".

I am talking only about src (not about asserts in tests). 

 





[jira] [Comment Edited] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova edited comment on AIRFLOW-2119 at 2/20/18 6:54 AM:
--

[~paymahn], the standard Python import mechanism does not support loading 
modules with whitespace in their names; to do that you would need a 
construction with __import__(""), which is terrible anyway. I don't think we 
need to support spaces inside file names. Why do you need it?


was (Author: xnuinside):
[~paymahn], the standard Python import mechanism did not support loading 
modules with whitespace in their names; to do that you would need a 
construction with __import__(""), which is terrible anyway. I don't think we 
need to support spaces inside file names. Why do you need it?
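For reference, the workaround hinted at above can be done more cleanly with
importlib than with a raw __import__ call; a minimal sketch (the file name
and module alias are made up):

```python
import importlib.util
import os
import tempfile

# Create a tiny module whose file name contains a space.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "hello world.py")
with open(path, "w") as f:
    f.write("ANSWER = 42\n")

# `import hello world` is a SyntaxError, and __import__("hello world")
# fails because no importable name may contain a space. Loading by file
# path sidesteps the module-name restriction entirely:
spec = importlib.util.spec_from_file_location("hello_world", path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
print(module.ANSWER)  # 42
```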

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run 

[jira] [Comment Edited] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova edited comment on AIRFLOW-2119 at 2/20/18 6:54 AM:
--

[~paymahn], the standard Python import mechanism does not support loading 
modules with whitespace in their names; to do that you would need a 
construction with __import__(""), which is terrible anyway. I don't think we 
need to support spaces inside file names. Why do you need it?


was (Author: xnuinside):
[~paymahn], the standard Python import mechanism does not support loading 
modules with whitespace in their names; to do that you would need a 
construction with __import__(""), which is terrible anyway. I don't think we 
need to support spaces inside file names. Why do you need it?

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow 

[jira] [Commented] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova commented on AIRFLOW-2119:
-

[~paymahn], the standard Python import mechanism does not support loading 
modules with whitespace in their names; to do that you would need a 
construction with {{__import__("")}}, which is terrible anyway. I don't think 
we need to support spaces inside file names. Why do you need it?
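The traceback in the report shows the command being run through the shell as
a single string (subprocess.check_call(command, shell=True)), so the unquoted
space in the DAG path splits it into two arguments; quoting the path keeps it
whole. A minimal sketch (the command string below is a simplified
reconstruction, not the exact string Airflow builds):

```python
import shlex

# Simplified reconstruction of the worker command with an unquoted path.
dag_path = "/home/user/dags/hello world.py"
unsafe = ("airflow run some_dag some_task 2018-02-15T12:00:00 --local -sd "
          + dag_path)

# shlex.split mimics the shell's word splitting: the path breaks in two,
# which is exactly why argparse rejects the stray "world.py" argument.
print(shlex.split(unsafe)[-2:])   # ['/home/user/dags/hello', 'world.py']

# shlex.quote keeps the path a single argument:
safe = unsafe[: -len(dag_path)] + shlex.quote(dag_path)
print(shlex.split(safe)[-1])      # '/home/user/dags/hello world.py'
```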

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-16T22:59:26.096432 
> --local -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:27,840] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:27,877] {driver.py:120} INFO - Generating grammar tables 
> 

[jira] [Comment Edited] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova edited comment on AIRFLOW-2119 at 2/20/18 6:52 AM:
--

[~paymahn], the standard Python import mechanism does not support loading 
modules with whitespace in their names; to do that you would need a 
construction with __import__(""), which is terrible anyway. I don't think we 
need to support spaces inside file names. Why do you need it?


was (Author: xnuinside):
[~paymahn], the standard Python import mechanism does not support loading 
modules with whitespace in their names; to do that you would need a 
construction with {{__import__("")}}, which is terrible anyway. I don't think 
we need to support spaces inside file names. Why do you need it?

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow 

[jira] [Commented] (AIRFLOW-1945) Pass --autoscale to celery workers

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1945?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369063#comment-16369063
 ] 

Yuliya Volkova commented on AIRFLOW-1945:
-

[~mosthege], you can do this via an Airflow plugin; for example: 
[https://github.com/xnuinside/airflow_plugin_custom_cli/blob/master/custom_cli_plugin.py]
 

> Pass --autoscale to celery workers
> --
>
> Key: AIRFLOW-1945
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1945
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: celery, cli
>Reporter: Michael O.
>Priority: Trivial
>  Labels: easyfix
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> Celery supports autoscaling of the worker pool size (number of tasks that can 
> parallelize within one worker node).  I'd like to propose to support passing 
> the --autoscale parameter to {{airflow worker}}.
> Since this is a trivial change, I am not sure whether there is any reason it 
> is not supported already.
> For example
> {{airflow worker --concurrency=4}} will set a fixed pool size of 4.
> With minimal changes in 
> [https://github.com/apache/incubator-airflow/blob/4ce4faaeae7a76d97defcf9a9d3304ac9d78b9bd/airflow/bin/cli.py#L855]
>  it could support
> {{airflow worker --autoscale=2,10}} to set an autoscaled pool size of 2 to 10
> Some references:
> * 
> http://docs.celeryproject.org/en/latest/internals/reference/celery.worker.autoscale.html
> * 
> https://github.com/apache/incubator-airflow/blob/4ce4faaeae7a76d97defcf9a9d3304ac9d78b9bd/airflow/bin/cli.py#L855
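A minimal sketch of the kind of change described above (not the actual
Airflow cli.py; the option wiring and names are assumptions): parse
--autoscale alongside --concurrency and forward it to the Celery worker
options.

```python
import argparse

parser = argparse.ArgumentParser(prog="airflow-worker-sketch")
parser.add_argument("--concurrency", type=int, default=None)
parser.add_argument("--autoscale", default=None,
                    help="autoscaled pool size, e.g. 10,2")

def celery_options(args):
    """Translate parsed CLI args into Celery worker options (sketch)."""
    options = {}
    if args.autoscale:
        # Celery parses the comma-separated string itself via --autoscale.
        options["autoscale"] = args.autoscale
    elif args.concurrency:
        options["concurrency"] = args.concurrency
    return options

args = parser.parse_args(["--autoscale", "10,2"])
print(celery_options(args))  # {'autoscale': '10,2'}
```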





[jira] [Commented] (AIRFLOW-2105) Exception on known event creation

2018-02-15 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366681#comment-16366681
 ] 

Yuliya Volkova commented on AIRFLOW-2105:
-

[~paymahn], try running airflow upgradedb. Does it come up correctly after 
that? 

Is this the first start after migrating to 1.9.0, or not? 

> Exception on known event creation
> -
>
> Key: AIRFLOW-2105
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2105
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> I tried to create a known event through the UI and was shown the following 
> error:
> {noformat}
> ---
> Node: PaymahnSolvvy.local
> ---
> Traceback (most recent call last):
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/app.py",
>  line 1988, in wsgi_app
> response = self.full_dispatch_request()
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/app.py",
>  line 1641, in full_dispatch_request
> rv = self.handle_user_exception(e)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/app.py",
>  line 1544, in handle_user_exception
> reraise(exc_type, exc_value, tb)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/_compat.py",
>  line 33, in reraise
> raise value
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/app.py",
>  line 1639, in full_dispatch_request
> rv = self.dispatch_request()
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/app.py",
>  line 1625, in dispatch_request
> return self.view_functions[rule.endpoint](**req.view_args)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask_admin/base.py",
>  line 69, in inner
> return self._run_view(f, *args, **kwargs)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask_admin/base.py",
>  line 368, in _run_view
> return fn(self, *args, **kwargs)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask_admin/model/base.py",
>  line 1947, in create_view
> return_url=return_url)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask_admin/base.py",
>  line 308, in render
> return render_template(template, **kwargs)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/templating.py",
>  line 134, in render_template
> context, ctx.app)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask/templating.py",
>  line 116, in _render
> rv = template.render(context)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/jinja2/environment.py",
>  line 989, in render
> return self.environment.handle_exception(exc_info, True)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/jinja2/environment.py",
>  line 754, in handle_exception
> reraise(exc_type, exc_value, tb)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/jinja2/_compat.py",
>  line 37, in reraise
> raise value.with_traceback(tb)
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/airflow/www/templates/airflow/model_create.html",
>  line 18, in top-level template code
> {% extends 'admin/model/create.html' %}
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
>  line 3, in top-level template code
> {% from 'admin/lib.html' import extra with context %} {# backward 
> compatible #}
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/airflow/www/templates/admin/master.html",
>  line 18, in top-level template code
> {% extends 'admin/base.html' %}
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/base.html",
>  line 30, in top-level template code
> {% block page_body %}
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/airflow/www/templates/admin/master.html",
>  line 104, in block "page_body"
> {% block body %}
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/airflow/www/templates/airflow/model_create.html",
>  line 28, in block "body"
> {{ super() }}
>   File 
> "/Users/paymahn/solvvy/scheduler/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
>  line 22, in block "body"
> {% block create_form %}
>   File 
> 

[jira] [Commented] (AIRFLOW-215) Airflow worker (CeleryExecutor) needs to be restarted to pick up tasks

2018-02-15 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366680#comment-16366680
 ] 

Yuliya Volkova commented on AIRFLOW-215:


[~bolke], is this issue fixed already? If so, maybe close the task?

I didn't catch this behavior on 1.9.0.

> Airflow worker (CeleryExecutor) needs to be restarted to pick up tasks
> --
>
> Key: AIRFLOW-215
> URL: https://issues.apache.org/jira/browse/AIRFLOW-215
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, subdag
>Affects Versions: Airflow 1.7.1.2
>Reporter: Cyril Scetbon
>Priority: Major
>
> We have a main dag that dynamically creates subdags containing tasks using 
> BashOperator. Using CeleryExecutor we see Celery tasks been created with 
> *STARTED* status but they are not picked up by our worker. However, if we 
> restart our worker, then tasks are picked up. 
> Here you can find code if you want to try to reproduce it 
> https://www.dropbox.com/s/8u7xf8jt55v8zio/dags.zip.
> We also tested using LocalExecutor and everything worked fine.





[jira] [Comment Edited] (AIRFLOW-336) extra variable in celery_executor.py

2018-02-15 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366670#comment-16366670
 ] 

Yuliya Volkova edited comment on AIRFLOW-336 at 2/16/18 7:23 AM:
-

I think the question is why this:

!Screen Shot 2018-02-16 at 101648 AM.png!

is still in celery_executor.py on master (after 1.9.0).

This variable seems redundant and is not used anywhere inside CeleryExecutor.

Should we remove it?


was (Author: xnuinside):
I think, question is, why this: 

!Screen Shot 2018-02-16 at 101648 AM.png!

in celery_executor.py

This variable seems like redundant and not using inside CeleryExecutor

To remove this?

> extra variable in celery_executor.py
> 
>
> Key: AIRFLOW-336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-336
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Armando Fandango
>Priority: Major
> Attachments: Screen Shot 2018-02-16 at 101648 AM.png
>
>
> PARALLELISM = configuration.get('core', 'PARALLELISM')
> Why is this extra variable in celery_executor.py ? 
> Is there a plan to use it in celeryConfig object ?





[jira] [Commented] (AIRFLOW-336) extra variable in celery_executor.py

2018-02-15 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366670#comment-16366670
 ] 

Yuliya Volkova commented on AIRFLOW-336:


I think the question is why this:

!Screen Shot 2018-02-16 at 101648 AM.png!

is in celery_executor.py.

This variable seems redundant and is not used anywhere inside CeleryExecutor.

Should we remove it?

> extra variable in celery_executor.py
> 
>
> Key: AIRFLOW-336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-336
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Armando Fandango
>Priority: Major
> Attachments: Screen Shot 2018-02-16 at 101648 AM.png
>
>
> PARALLELISM = configuration.get('core', 'PARALLELISM')
> Why is this extra variable in celery_executor.py ? 
> Is there a plan to use it in celeryConfig object ?





[jira] [Updated] (AIRFLOW-336) extra variable in celery_executor.py

2018-02-15 Thread Yuliya Volkova (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-336?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuliya Volkova updated AIRFLOW-336:
---
Attachment: Screen Shot 2018-02-16 at 101648 AM.png

> extra variable in celery_executor.py
> 
>
> Key: AIRFLOW-336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-336
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Armando Fandango
>Priority: Major
> Attachments: Screen Shot 2018-02-16 at 101648 AM.png
>
>
> PARALLELISM = configuration.get('core', 'PARALLELISM')
> Why is this extra variable in celery_executor.py ? 
> Is there a plan to use it in celeryConfig object ?





[jira] [Commented] (AIRFLOW-1405) Airflow v 1.8.1 unable to properly initialize with MySQL

2018-02-14 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16363809#comment-16363809
 ] 

Yuliya Volkova commented on AIRFLOW-1405:
-

Maybe this task can be closed? It's not a bug, and the documentation issue was 
solved here - https://issues.apache.org/jira/browse/AIRFLOW-1405 

> Airflow v 1.8.1 unable to properly initialize with MySQL
> 
>
> Key: AIRFLOW-1405
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1405
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: db
>Affects Versions: 1.8.1
> Environment: CentOS7
>Reporter: Aakash Bhardwaj
>Priority: Major
> Fix For: 1.8.1
>
> Attachments: error_log.txt
>
>
> While working on a CentOS7 system, I was trying to configure Airflow version 
> 1.8.1 to run with MySql in the backend.
> I have installed Airflow in a Virtual Environment, and the MySQL has a 
> database named airflow (default).
> But on running the command -
> {code:none}
> airflow initdb
> {code}
> the following error is reported
> {noformat}
> [2017-07-12 13:22:36,558] {__init__.py:57} INFO - Using executor LocalExecutor
> DB: mysql://airflow:***@localhost/airflow
> [2017-07-12 13:22:37,218] {db.py:287} INFO - Creating tables
> INFO  [alembic.runtime.migration] Context impl MySQLImpl.
> INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
> INFO  [alembic.runtime.migration] Running upgrade f2ca10b85618 -> 
> 4addfa1236f1, Add fractional seconds to mysql tables
> Traceback (most recent call last):
>   File "/opt/airflow_virtual_environment/airflow_venv/bin/airflow", line 28, 
> in 
> args.func(args)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/bin/cli.py",
>  line 951, in initdb
> db_utils.initdb()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 106, in initdb
> upgradedb()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 294, in upgradedb
> command.upgrade(config, 'heads')
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/command.py",
>  line 174, in upgrade
> script.run_env()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/script/base.py",
>  line 416, in run_env
> util.load_python_file(self.dir, 'env.py')
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/util/pyfiles.py",
>  line 93, in load_python_file
> module = load_module_py(module_id, path)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/util/compat.py",
>  line 79, in load_module_py
> mod = imp.load_source(module_id, path, fp)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/migrations/env.py",
>  line 86, in 
> run_migrations_online()
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/migrations/env.py",
>  line 81, in run_migrations_online
> context.run_migrations()
>   File "", line 8, in run_migrations
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/runtime/environment.py",
>  line 807, in run_migrations
> self.get_context().run_migrations(**kw)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/runtime/migration.py",
>  line 321, in run_migrations
> step.migration_fn(**kw)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/airflow/migrations/versions/4addfa1236f1_add_fractional_seconds_to_mysql_tables.py",
>  line 36, in upgrade
> op.alter_column(table_name='dag', column_name='last_scheduler_run', 
> type_=mysql.DATETIME(fsp=6))
>   File "", line 8, in alter_column
>   File "", line 3, in alter_column
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/operations/ops.py",
>  line 1420, in alter_column
> return operations.invoke(alt)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/operations/base.py",
>  line 318, in invoke
> return fn(self, operation)
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/operations/toimpl.py",
>  line 53, in alter_column
> **operation.kw
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/ddl/mysql.py",
>  line 67, in alter_column
> else existing_autoincrement
>   File 
> "/opt/airflow_virtual_environment/airflow_venv/lib/python2.7/site-packages/alembic/ddl/impl.py",
>  line 118, in 

[jira] [Commented] (AIRFLOW-908) Airflow run should print worker name at top of log

2018-02-13 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-908?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16363592#comment-16363592
 ] 

Yuliya Volkova commented on AIRFLOW-908:


[~aoen], please close this task if it's already done - there are too many open issues in Jira :)

> Airflow run should print worker name at top of log
> --
>
> Key: AIRFLOW-908
> URL: https://issues.apache.org/jira/browse/AIRFLOW-908
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Dan Davydov
>Assignee: Dan Davydov
>Priority: Major
>  Labels: beginner, starter
>
> Airflow run should log the worker hostname at top of log.





[jira] [Comment Edited] (AIRFLOW-1979) Redis celery backend not work on 1.9.0 (configuration is ignored)

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360664#comment-16360664
 ] 

Yuliya Volkova edited comment on AIRFLOW-1979 at 2/12/18 12:42 PM:
---

[~redtree1112], hello, is the version you used from the GitHub master branch or 
from the official release on PyPI?

It seems the configuration variables are not visible to Airflow when the airflow 
commands start.


was (Author: xnuinside):
[~redtree1112], hello, version you used is form github/master branch? or from 
official release prom pypi?

seems like configuration variables are not visible for airflow worker

> Redis celery backend not work on 1.9.0 (configuration is ignored)
> -
>
> Key: AIRFLOW-1979
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1979
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Norio Akagi
>Priority: Major
>
> Worker tries to connect to RabbitMQ based on a default setting and shows an 
> error as below:
> {noformat}
> [2018-01-09 16:45:42,778] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/Grammar.txt
> [2018-01-09 16:45:42,802] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
> [2018-01-09 16:45:43,051] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-01-09 16:45:43,051] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-01-09 16:45:43,052] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> [2018-01-09 16:45:43,140: WARNING/MainProcess] 
> /usr/local/lib/python2.7/dist-packages/celery/apps/worker.py:161: 
> CDeprecationWarning:
> Starting from version 3.2 Celery will refuse to accept pickle by default.
> The pickle serializer is a security concern as it may give attackers
> the ability to execute any command.  It's important to secure
> your broker from unauthorized access when using pickle, so we think
> that enabling pickle should require a deliberate action and not be
> the default choice.
> If you depend on pickle then you should set a setting to disable this
> warning and to be sure that everything will continue working
> when you upgrade to Celery 3.2::
> CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
> You must only enable the serializers that you will actually use.
>   warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
> [2018-01-09 16:45:43,240: ERROR/MainProcess] consumer: Cannot connect to 
> amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
> Trying again in 2.00 seconds...
> {noformat}
> I deploy Airflow on kubernetes so each component (web, scheduler, worker, and 
> flower) is containerized and distributed among nodes. I set 
> {{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}}
>  and {{AIRFLOW__CELERY__BROKER_URL}} in environment variables and it can be 
> seen when I run {{printenv}} in a container, but it looks completely ignored.
> Moving these values to {{airflow.cfg}} doesn't work either.
> It worked just perfectly 1.8 and suddenly stopped working when I upgraded 
> Airflow to 1.9.
> Do you have any idea what may cause this configuration issue?





[jira] [Comment Edited] (AIRFLOW-1979) Redis celery backend not work on 1.9.0 (configuration is ignored)

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360664#comment-16360664
 ] 

Yuliya Volkova edited comment on AIRFLOW-1979 at 2/12/18 12:40 PM:
---

[~redtree1112], hello, version you used is form github/master branch? or from 
official release prom pypi?

seems like configuration variables are not visible for airflow worker


was (Author: xnuinside):
[~redtree1112], hello, did you see 
[https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#celery-config]
 ? 

 {{AIRFLOW__CELERY__CELERY_RESULT_BACKEND could not be in 1.9.0, it was changed 
from CELERY_RESULT_BACKEND to {{result_backend 

> Redis celery backend not work on 1.9.0 (configuration is ignored)
> -
>
> Key: AIRFLOW-1979
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1979
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Norio Akagi
>Priority: Major
>
> Worker tries to connect to RabbitMQ based on a default setting and shows an 
> error as below:
> {noformat}
> [2018-01-09 16:45:42,778] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/Grammar.txt
> [2018-01-09 16:45:42,802] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
> [2018-01-09 16:45:43,051] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-01-09 16:45:43,051] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-01-09 16:45:43,052] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> [2018-01-09 16:45:43,140: WARNING/MainProcess] 
> /usr/local/lib/python2.7/dist-packages/celery/apps/worker.py:161: 
> CDeprecationWarning:
> Starting from version 3.2 Celery will refuse to accept pickle by default.
> The pickle serializer is a security concern as it may give attackers
> the ability to execute any command.  It's important to secure
> your broker from unauthorized access when using pickle, so we think
> that enabling pickle should require a deliberate action and not be
> the default choice.
> If you depend on pickle then you should set a setting to disable this
> warning and to be sure that everything will continue working
> when you upgrade to Celery 3.2::
> CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
> You must only enable the serializers that you will actually use.
>   warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
> [2018-01-09 16:45:43,240: ERROR/MainProcess] consumer: Cannot connect to 
> amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
> Trying again in 2.00 seconds...
> {noformat}
> I deploy Airflow on kubernetes so each component (web, scheduler, worker, and 
> flower) is containerized and distributed among nodes. I set 
> {{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}}
>  and {{AIRFLOW__CELERY__BROKER_URL}} in environment variables and it can be 
> seen when I run {{printenv}} in a container, but it looks completely ignored.
> Moving these values to {{airflow.cfg}} doesn't work either.
> It worked just perfectly 1.8 and suddenly stopped working when I upgraded 
> Airflow to 1.9.
> Do you have any idea what may cause this configuration issue?





[jira] [Commented] (AIRFLOW-1650) Celery custom config is broken

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360667#comment-16360667
 ] 

Yuliya Volkova commented on AIRFLOW-1650:
-

Didn't [https://github.com/apache/incubator-airflow/pull/2639] fix this? 

> Celery custom config is broken
> --
>
> Key: AIRFLOW-1650
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1650
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, configuration
>Reporter: Bolke de Bruin
>Priority: Major
> Fix For: 1.10.0
>
>
> Celery custom config loading is broken as is just loads a string instead of 
> loading a config





[jira] [Commented] (AIRFLOW-1979) Redis celery backend not work on 1.9.0 (configuration is ignored)

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360664#comment-16360664
 ] 

Yuliya Volkova commented on AIRFLOW-1979:
-

[~redtree1112], hello, did you see 
[https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#celery-config]
 ? 

{{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}} does not work in 1.9.0; the option was 
renamed from {{CELERY_RESULT_BACKEND}} to {{result_backend}}.
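
As background, Airflow builds environment-variable overrides from the section and key of each config option. The sketch below is not Airflow's implementation, just an illustration of the {{AIRFLOW__\{SECTION\}__\{KEY\}}} convention and why the 1.9.0 rename matters:

```python
# Illustrative sketch (not Airflow's code) of the AIRFLOW__{SECTION}__{KEY}
# environment-variable naming convention. After the 1.9.0 rename, the
# option is section "celery", key "result_backend", so the variable is
# AIRFLOW__CELERY__RESULT_BACKEND (not AIRFLOW__CELERY__CELERY_RESULT_BACKEND).
def env_var_name(section, key):
    return "AIRFLOW__{}__{}".format(section.upper(), key.upper())
```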

> Redis celery backend not work on 1.9.0 (configuration is ignored)
> -
>
> Key: AIRFLOW-1979
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1979
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Norio Akagi
>Priority: Major
>
> Worker tries to connect to RabbitMQ based on a default setting and shows an 
> error as below:
> {noformat}
> [2018-01-09 16:45:42,778] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/Grammar.txt
> [2018-01-09 16:45:42,802] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
> [2018-01-09 16:45:43,051] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-01-09 16:45:43,051] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-01-09 16:45:43,052] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> [2018-01-09 16:45:43,140: WARNING/MainProcess] 
> /usr/local/lib/python2.7/dist-packages/celery/apps/worker.py:161: 
> CDeprecationWarning:
> Starting from version 3.2 Celery will refuse to accept pickle by default.
> The pickle serializer is a security concern as it may give attackers
> the ability to execute any command.  It's important to secure
> your broker from unauthorized access when using pickle, so we think
> that enabling pickle should require a deliberate action and not be
> the default choice.
> If you depend on pickle then you should set a setting to disable this
> warning and to be sure that everything will continue working
> when you upgrade to Celery 3.2::
> CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
> You must only enable the serializers that you will actually use.
>   warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
> [2018-01-09 16:45:43,240: ERROR/MainProcess] consumer: Cannot connect to 
> amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
> Trying again in 2.00 seconds...
> {noformat}
> I deploy Airflow on kubernetes so each component (web, scheduler, worker, and 
> flower) is containerized and distributed among nodes. I set 
> {{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}}
>  and {{AIRFLOW__CELERY__BROKER_URL}} in environment variables and it can be 
> seen when I run {{printenv}} in a container, but it looks completely ignored.
> Moving these values to {{airflow.cfg}} doesn't work either.
> It worked just perfectly 1.8 and suddenly stopped working when I upgraded 
> Airflow to 1.9.
> Do you have any idea what may cause this configuration issue?





[jira] [Comment Edited] (AIRFLOW-2072) Calling task from different DAG

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360412#comment-16360412
 ] 

Yuliya Volkova edited comment on AIRFLOW-2072 at 2/12/18 8:15 AM:
--

[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any other Python code - move the task into a separate Python 
function in its own module and import it from the new DAG file.

Or do you want the task to get some results from its parent DAG?

If it's the second case, please describe it in more detail; Airflow has XCom for 
sharing results between tasks.

Anyway, if you have more questions, join Gitter - 
[https://gitter.im/apache/incubator-airflow] - you can get answers faster there.

 


was (Author: xnuinside):
[~AnilKumar2], hello!

What result do you want from this? Are you want just reuse code? If just reuse 
code - you can do it like with any python code - move task to separate python 
function inside DAG .py and import it from any place

or you want to get with task some results from it parent DAG?

If second variant - need to describe more information, but I think in Airflow 
there are things like Xcom, if you need to share results.

Anyway if you will have more questions, join gitter - 
[https://gitter.im/apache/incubator-airflow] there you can get answers more 
faster 

 

> Calling task from different DAG
> ---
>
> Key: AIRFLOW-2072
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2072
> Project: Apache Airflow
>  Issue Type: Task
>  Components: Dataflow
>Affects Versions: 1.9.0
>Reporter: Anil Kumar
>Priority: Major
>
> Hello Again,
> I am new for Airflow, started POC for ETL operation Orchestration activity 
> using Airflow. On creating new DAG, do we have any option to call existing 
> task from other DAG to new one. Like I want to call DAG_DataPrep.CopyEBCIDIC 
> to new DAG.





[jira] [Comment Edited] (AIRFLOW-2072) Calling task from different DAG

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360412#comment-16360412
 ] 

Yuliya Volkova edited comment on AIRFLOW-2072 at 2/12/18 8:14 AM:
--

[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any other Python code - move the task into a separate Python 
function in its own module and import it from the new DAG file.

Or do you want the task to get some results from its parent DAG?

If it's the second case, please describe it in more detail; Airflow has XCom for 
sharing results between tasks.

Anyway, if you have more questions, join Gitter - 
[https://gitter.im/apache/incubator-airflow] - you can get answers faster there.

 


was (Author: xnuinside):
[~AnilKumar2], hello!

What result do you want from this? Are you want just reuse code? If just reuse 
code - you can do it like with any python code - move task to separate python 
function inside DAG .py and import it from any place

or you want to get with task some results from it parent DAG?

If second variant - need to describe more information, but I think in Airflow 
there are things like Xcom, if you need to share results.

Anyway if you will have more questions, join gitter - 
https://gitter.im/apache/incubator-airflow there you can get answers more 
faster \{#emotions_dlg.smile}

 

> Calling task from different DAG
> ---
>
> Key: AIRFLOW-2072
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2072
> Project: Apache Airflow
>  Issue Type: Task
>  Components: Dataflow
>Affects Versions: 1.9.0
>Reporter: Anil Kumar
>Priority: Major
>
> Hello Again,
> I am new for Airflow, started POC for ETL operation Orchestration activity 
> using Airflow. On creating new DAG, do we have any option to call existing 
> task from other DAG to new one. Like I want to call DAG_DataPrep.CopyEBCIDIC 
> to new DAG.





[jira] [Commented] (AIRFLOW-2072) Calling task from different DAG

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360412#comment-16360412
 ] 

Yuliya Volkova commented on AIRFLOW-2072:
-

[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any other Python code - move the task into a separate Python 
function in its own module and import it from the new DAG file.

Or do you want the task to get some results from its parent DAG?

If it's the second case, please describe it in more detail; Airflow has XCom for 
sharing results between tasks.

Anyway, if you have more questions, join Gitter - 
https://gitter.im/apache/incubator-airflow - you can get answers faster there :)
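
The code-reuse option can be sketched as below. This is a schematic example with a stand-in Task class and a hypothetical factory name, not real Airflow operators:

```python
# Schematic sketch of the advice above: define the task in one module as
# a factory function and call it from every DAG that needs it.
# Task is a stand-in; in real Airflow code this would be an operator
# such as BashOperator, and dag would be a DAG object.
class Task:
    def __init__(self, task_id, dag):
        self.task_id = task_id
        self.dag = dag

def make_copy_ebcidic_task(dag):
    """Reusable factory; import this from any DAG file."""
    return Task(task_id="copy_ebcidic", dag=dag)

# Two different DAGs reuse the same task definition.
dag_a, dag_b = object(), object()
t1 = make_copy_ebcidic_task(dag_a)
t2 = make_copy_ebcidic_task(dag_b)
```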

 

> Calling task from different DAG
> ---
>
> Key: AIRFLOW-2072
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2072
> Project: Apache Airflow
>  Issue Type: Task
>  Components: Dataflow
>Affects Versions: 1.9.0
>Reporter: Anil Kumar
>Priority: Major
>
> Hello Again,
> I am new for Airflow, started POC for ETL operation Orchestration activity 
> using Airflow. On creating new DAG, do we have any option to call existing 
> task from other DAG to new one. Like I want to call DAG_DataPrep.CopyEBCIDIC 
> to new DAG.





[jira] [Comment Edited] (AIRFLOW-923) airflow webserver -D flag doesn't daemonize anymore

2018-02-11 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360386#comment-16360386
 ] 

Yuliya Volkova edited comment on AIRFLOW-923 at 2/12/18 7:34 AM:
-

Checked on version 1.9.0.

Daemonization seems to work correctly: airflow webserver -D 

Screenshots attached.

 


was (Author: xnuinside):
!Screen Shot 2018-02-12 at 10.32.23 AM.png!!Screen Shot 2018-02-12 at 10.32.33 
AM.png!

> airflow webserver -D flag doesn't daemonize anymore
> ---
>
> Key: AIRFLOW-923
> URL: https://issues.apache.org/jira/browse/AIRFLOW-923
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: joyce chan
>Priority: Trivial
> Fix For: Airflow 1.8
>
> Attachments: Screen Shot 2018-02-12 at 10.32.23 AM.png, Screen Shot 
> 2018-02-12 at 10.32.33 AM.png
>
>
> Airflow 1.8 rc4
> airflow webserver -D flag doesn't daemonize anymore





[jira] [Commented] (AIRFLOW-923) airflow webserver -D flag doesn't daemonize anymore

2018-02-11 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360384#comment-16360384
 ] 

Yuliya Volkova commented on AIRFLOW-923:


[~joyceschan], please check whether this still happens on version 1.9.0

> airflow webserver -D flag doesn't daemonize anymore
> ---
>
> Key: AIRFLOW-923
> URL: https://issues.apache.org/jira/browse/AIRFLOW-923
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: joyce chan
>Priority: Trivial
> Fix For: Airflow 1.8
>
> Attachments: Screen Shot 2018-02-12 at 10.32.23 AM.png, Screen Shot 
> 2018-02-12 at 10.32.33 AM.png
>
>
> Airflow 1.8 rc4
> airflow webserver -D flag doesn't daemonize anymore





[jira] [Updated] (AIRFLOW-923) airflow webserver -D flag doesn't daemonize anymore

2018-02-11 Thread Yuliya Volkova (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-923?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuliya Volkova updated AIRFLOW-923:
---
Attachment: Screen Shot 2018-02-12 at 10.32.33 AM.png
Screen Shot 2018-02-12 at 10.32.23 AM.png

> airflow webserver -D flag doesn't daemonize anymore
> ---
>
> Key: AIRFLOW-923
> URL: https://issues.apache.org/jira/browse/AIRFLOW-923
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: joyce chan
>Priority: Trivial
> Fix For: Airflow 1.8
>
> Attachments: Screen Shot 2018-02-12 at 10.32.23 AM.png, Screen Shot 
> 2018-02-12 at 10.32.33 AM.png
>
>
> Airflow 1.8 rc4
> airflow webserver -D flag doesn't daemonize anymore





[jira] [Commented] (AIRFLOW-923) airflow webserver -D flag doesn't daemonize anymore

2018-02-11 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360386#comment-16360386
 ] 

Yuliya Volkova commented on AIRFLOW-923:


!Screen Shot 2018-02-12 at 10.32.23 AM.png!!Screen Shot 2018-02-12 at 10.32.33 
AM.png!

> airflow webserver -D flag doesn't daemonize anymore
> ---
>
> Key: AIRFLOW-923
> URL: https://issues.apache.org/jira/browse/AIRFLOW-923
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: joyce chan
>Priority: Trivial
> Fix For: Airflow 1.8
>
> Attachments: Screen Shot 2018-02-12 at 10.32.23 AM.png, Screen Shot 
> 2018-02-12 at 10.32.33 AM.png
>
>
> Airflow 1.8 rc4
> airflow webserver -D flag doesn't daemonize anymore





[jira] [Commented] (AIRFLOW-2013) initdb problem on airflow - python 2.7 - centos

2018-02-08 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16357599#comment-16357599
 ] 

Yuliya Volkova commented on AIRFLOW-2013:
-

[~venergiac], hello! Why are you using the old package 
[https://pypi.python.org/pypi/airflow] (installed with pip install airflow) 
instead of the current apache-airflow package - 
[https://pypi.python.org/pypi/apache-airflow]? 
If you don't want to use 1.9.0 from 
[apache-airflow|https://pypi.python.org/pypi/apache-airflow], you can install 
the stable 1.8.2 release from it instead.
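As a sketch, the suggested install commands would look like the following (the version pins are illustrative of the releases mentioned above, not a recommendation for current setups):

```shell
# Install under the current PyPI name rather than the stale "airflow" package
pip install apache-airflow

# Or pin the earlier stable series mentioned above:
pip install 'apache-airflow==1.8.2'
```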

> initdb problem on airflow - python 2.7 - centos
> ---
>
> Key: AIRFLOW-2013
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2013
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: giacomo veneri
>Priority: Major
>
> {{export AIRFLOW_HOME=/opt/airflow/}}
> {{pip install --ignore-installed setuptools }}
> {{pip install --ignore-installed celery}}
> {{pip install --ignore-installed fernet}}
> {{pip install --ignore-installed airflow}}
> {{pip install --ignore-installed airflow[hive]}}
> {{pip install --ignore-installed airflow[celery]}}
>  
> I get
>  
> {{[2018-01-19 08:00:36,703] \{__init__.py:57} INFO - Using executor 
> SequentialExecutor}}
> {{DB: sqlite:opt/airflow//airflow.db}}
> {{This will drop existing tables if they exist. Proceed? (y/n)y}}
> {{[2018-01-19 08:00:53,097] \{db.py:303} INFO - Dropping tables that exist}}
> {{[2018-01-19 08:00:53,169] \{migration.py:116} INFO - Context impl 
> SQLiteImpl.}}
> {{[2018-01-19 08:00:53,169] \{migration.py:121} INFO - Will assume 
> non-transactional DDL.}}
> {{[2018-01-19 08:00:53,173] \{db.py:287} INFO - Creating tables}}
> {{INFO [alembic.runtime.migration] Context impl SQLiteImpl.}}
> {{INFO [alembic.runtime.migration] Will assume non-transactional DDL.}}
> {{INFO [alembic.runtime.migration] Running upgrade -> e3a246e0dc1, current 
> schema}}
> {{INFO [alembic.runtime.migration] Running upgrade e3a246e0dc1 -> 
> 1507a7289a2f, create is_encrypted}}
> {{/usr/lib/python2.7/site-packages/alembic/util/messaging.py:69: UserWarning: 
> Skipping unsupported ALTER for creation of implicit constraint}}
> {{ warnings.warn(msg)}}
> {{INFO [alembic.runtime.migration] Running upgrade 1507a7289a2f -> 
> 13eb55f81627, maintain history for compatibility with earlier migrations}}
> {{INFO [alembic.runtime.migration] Running upgrade 13eb55f81627 -> 
> 338e90f54d61, More logging into task_isntance}}
> {{INFO [alembic.runtime.migration] Running upgrade 338e90f54d61 -> 
> 52d714495f0, job_id indices}}
> {{INFO [alembic.runtime.migration] Running upgrade 52d714495f0 -> 
> 502898887f84, Adding extra to Log}}
> {{INFO [alembic.runtime.migration] Running upgrade 502898887f84 -> 
> 1b38cef5b76e, add dagrun}}
> {{INFO [alembic.runtime.migration] Running upgrade 1b38cef5b76e -> 
> 2e541a1dcfed, task_duration}}
> {{INFO [alembic.runtime.migration] Running upgrade 2e541a1dcfed -> 
> 40e67319e3a9, dagrun_config}}
> {{INFO [alembic.runtime.migration] Running upgrade 40e67319e3a9 -> 
> 561833c1c74b, add password column to user}}
> {{INFO [alembic.runtime.migration] Running upgrade 561833c1c74b -> 
> 4446e08588, dagrun start end}}
> {{INFO [alembic.runtime.migration] Running upgrade 4446e08588 -> 
> bbc73705a13e, Add notification_sent column to sla_miss}}
> {{INFO [alembic.runtime.migration] Running upgrade bbc73705a13e -> 
> bba5a7cfc896, Add a column to track the encryption state of the 'Extra' field 
> in connection}}
> {{INFO [alembic.runtime.migration] Running upgrade bba5a7cfc896 -> 
> 1968acfc09e3, add is_encrypted column to variable table}}
> {{INFO [alembic.runtime.migration] Running upgrade 1968acfc09e3 -> 
> 2e82aab8ef20, rename user table}}
> {{INFO [alembic.runtime.migration] Running upgrade 2e82aab8ef20 -> 
> 211e584da130, add TI state index}}
> {{INFO [alembic.runtime.migration] Running upgrade 211e584da130 -> 
> 64de9cddf6c9, add task fails journal table}}
> {{INFO [alembic.runtime.migration] Running upgrade 64de9cddf6c9 -> 
> f2ca10b85618, add dag_stats table}}
> {{INFO [alembic.runtime.migration] Running upgrade f2ca10b85618 -> 
> 4addfa1236f1, Add fractional seconds to mysql tables}}
> {{INFO [alembic.runtime.migration] Running upgrade 4addfa1236f1 -> 
> 8504051e801b, xcom dag task indices}}
> {{INFO [alembic.runtime.migration] Running upgrade 8504051e801b -> 
> 5e7d17757c7a, add pid field to TaskInstance}}
> {{INFO [alembic.runtime.migration] Running upgrade 5e7d17757c7a -> 
> 127d2bf2dfa7, Add dag_id/state index on dag_run table}}
> {{INFO [alembic.runtime.migration] Running upgrade 127d2bf2dfa7 -> 
> cc1e65623dc7, add max tries column to task instance}}
> {{Traceback (most recent call last):}}
> {{ File "/usr/bin/airflow", line 28, in }}
> {{ args.func(args)}}
> {{ File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 919, in 
> resetdb}}

[jira] [Commented] (AIRFLOW-2073) FileSensor always return True

2018-02-08 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16356890#comment-16356890
 ] 

Yuliya Volkova commented on AIRFLOW-2073:
-

You're right. I've deleted my comment; I understand that I was wrong.

> FileSensor always return True
> -
>
> Key: AIRFLOW-2073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2073
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.9.0
> Environment: Ubuntu 16.04
>Reporter: Pierre Payet
>Priority: Trivial
>
> When using a FileSensor, the path is tested with os.walk. However, this 
> function never raises an error if the path does not exist, so the poke will 
> always return True.
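The behavior described above can be reproduced with a short sketch (the poke function below is a hypothetical simplification for illustration, not the actual FileSensor API):

```python
import os

# os.walk() on a missing path simply yields nothing instead of raising,
# so a poke based on "os.walk ran without error" is always truthy.
missing = "/no/such/directory"
walked = list(os.walk(missing))
print(walked)  # -> [] (no exception raised)

# A poke that actually checks the path exists (sketch of a possible fix):
def poke(path):
    return os.path.exists(path)

print(poke(missing))  # -> False
```

By default os.walk swallows errors from os.scandir via its onerror=None parameter, which is why the missing path goes unnoticed.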





[jira] [Issue Comment Deleted] (AIRFLOW-2073) FileSensor always return True

2018-02-08 Thread Yuliya Volkova (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuliya Volkova updated AIRFLOW-2073:

Comment: was deleted

(was: [~tikok], hello, could you clarify which version of Airflow you are using? 

It seems you are talking about the version from the master branch, because the 
stable 1.9.0 release has no FileSensor 
[https://github.com/apache/incubator-airflow/tree/d760d63e1a141a43a4a43daee9abd54cf11c894b/airflow/contrib/sensors]
 , but it exists on the master branch)

> FileSensor always return True
> -
>
> Key: AIRFLOW-2073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2073
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.9.0
> Environment: Ubuntu 16.04
>Reporter: Pierre Payet
>Priority: Trivial
>
> When using a FileSensor, the path is tested with os.walk. However, this 
> function never raises an error if the path does not exist, so the poke will 
> always return True.





[jira] [Commented] (AIRFLOW-2073) FileSensor always return True

2018-02-08 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16356845#comment-16356845
 ] 

Yuliya Volkova commented on AIRFLOW-2073:
-

[~tikok], hello, could you clarify which version of Airflow you are using? 

It seems you are talking about the version from the master branch, because the 
stable 1.9.0 release has no FileSensor 
[https://github.com/apache/incubator-airflow/tree/d760d63e1a141a43a4a43daee9abd54cf11c894b/airflow/contrib/sensors]
 , but it exists on the master branch

> FileSensor always return True
> -
>
> Key: AIRFLOW-2073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2073
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.9.0
> Environment: Ubuntu 16.04
>Reporter: Pierre Payet
>Priority: Trivial
>
> When using a FileSensor, the path is tested with os.walk. However, this 
> function never raises an error if the path does not exist, so the poke will 
> always return True.


