[jira] [Commented] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova commented on AIRFLOW-2119:
-

[~paymahn], Python's standard import does not support loading modules with 
whitespace in their names; you would have to resort to a construction like 
{{__import__("")}}, which is ugly anyway. I don't think we need to support 
spaces in file names. Why do you need it?
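
For illustration only (this is not Airflow code), a minimal sketch showing that a plain {{import}} statement cannot name a module containing a space, while {{importlib}} can still load such a file by path (using the reporter's path just as an example):
{code:python}
# Minimal sketch: loading a DAG file whose name contains a space.
# A statement like `import hello world` is a SyntaxError, but importlib
# can load the file explicitly by its path.
import importlib.util

spec = importlib.util.spec_from_file_location(
    "hello_world", "/home/paymahn/scheduler/airflow-home/dags/hello world.py")
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
{code}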

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-16T22:59:26.096432 
> --local -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:27,840] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:27,877] {driver.py:120} INFO - Generating grammar tables 
> 

[jira] [Comment Edited] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova edited comment on AIRFLOW-2119 at 2/20/18 6:52 AM:
--

[~paymahn], Python's standard import does not support loading modules with 
whitespace in their names; you would have to resort to a construction like 
{{__import__("")}}, which is ugly anyway. I don't think we need to support 
spaces in file names. Why do you need it?
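
Separately, the quoted traceback below shows the worker building the {{airflow run}} command through a shell with the DAG path unquoted, so {{hello world.py}} is split into two arguments. A rough sketch of that failure mode and one possible fix (quoting the interpolated path):
{code:python}
# Hedged sketch, not the actual Airflow patch: the worker runs the command
# through a shell, so an unquoted path containing a space is split into two
# tokens ("hello" and "world.py"). Quoting the path keeps it intact.
import shlex

dag_path = "/home/paymahn/scheduler/airflow-home/dags/hello world.py"

broken = ("airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 "
          "--local -sd " + dag_path)
fixed = ("airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 "
         "--local -sd " + shlex.quote(dag_path))

print(shlex.split(broken)[-2:])  # ['.../dags/hello', 'world.py'] -> two arguments
print(shlex.split(fixed)[-1])    # '.../dags/hello world.py'      -> one argument
{code}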


was (Author: xnuinside):
[~paymahn], Python's standard import does not support loading modules with 
whitespace in their names; you would have to resort to a construction like 
{{__import__("")}}, which is ugly anyway. I don't think we need to support 
spaces in file names. Why do you need it?

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow 

[jira] [Comment Edited] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova edited comment on AIRFLOW-2119 at 2/20/18 6:54 AM:
--

[~paymahn], Python's standard import does not support loading modules with 
whitespace in their names; you would have to resort to a construction like 
{{__import__("")}}, which is ugly anyway. I don't think we need to support 
spaces in file names. Why do you need it?


was (Author: xnuinside):
[~paymahn], Python's standard import does not support loading modules with 
whitespace in their names; you would have to resort to a construction like 
{{__import__("")}}, which is ugly anyway. I don't think we need to support 
spaces in file names. Why do you need it?

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow 

[jira] [Comment Edited] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369766#comment-16369766
 ] 

Yuliya Volkova edited comment on AIRFLOW-2119 at 2/20/18 6:54 AM:
--

[~paymahn], Python's standard import does not support loading modules with 
whitespace in their names; you would have to resort to a construction like 
{{__import__("")}}, which is ugly anyway. I don't think we need to support 
spaces in file names. Why do you need it?


was (Author: xnuinside):
[~paymahn], Python's standard import does not support loading modules with 
whitespace in their names; you would have to resort to a construction like 
{{__import__("")}}, which is ugly anyway. I don't think we need to support 
spaces in file names. Why do you need it?

> Celery worker fails when dag has space in filename
> --
>
> Key: AIRFLOW-2119
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
>
> A dag whose filename has a space will cause celery workers to fail as follows:
>  
> {noformat}
> [2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> Starting flask
> [2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
> http://0.0.0.0:8793/ (Press CTRL+C to quit)
> [2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
> -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
> [2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/Grammar.txt
> [2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
> [2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> usage: airflow [-h]
>
> {flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
>...
> airflow: error: unrecognized arguments: world.py
> [2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow 
> run broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> [2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
> airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
>  raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 52, in execute_command
> subprocess.check_call(command, shell=True)
>   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
> raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
> dummy_task 2018-02-15T12:00:00 --local -sd 
> /home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
> exit status 2
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 374, in trace_task
> R = retval = fun(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py",
>  line 629, in __protected_call__
> return self.run(*args, **kwargs)
>   File 
> "/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
>  line 55, in execute_command
> raise AirflowException('Celery command failed')
> airflow.exceptions.AirflowException: Celery command failed
> [2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
> Celery: airflow run 

[jira] [Commented] (AIRFLOW-2105) Exception on known event creation

2018-02-19 Thread Simon Dubois (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16368961#comment-16368961
 ] 

Simon Dubois commented on AIRFLOW-2105:
---

Hi,
Same issue here when I try to create a user.
[~xnuinside], yes, in my case I ran {{airflow initdb}} after installing 1.9.0.

Here is my traceback:
{code:java}
Traceback (most recent call last):
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1988, in wsgi_app
response = self.full_dispatch_request()
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1641, in full_dispatch_request
rv = self.handle_user_exception(e)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1544, in handle_user_exception
reraise(exc_type, exc_value, tb)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/_compat.py", 
line 33, in reraise
raise value
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1639, in full_dispatch_request
rv = self.dispatch_request()
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1625, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/base.py",
 line 69, in inner
return self._run_view(f, *args, **kwargs)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/base.py",
 line 368, in _run_view
return fn(self, *args, **kwargs)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/model/base.py",
 line 1947, in create_view
return_url=return_url)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/base.py",
 line 308, in render
return render_template(template, **kwargs)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/templating.py",
 line 134, in render_template
context, ctx.app)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/templating.py",
 line 116, in _render
rv = template.render(context)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/jinja2/environment.py",
 line 989, in render
return self.environment.handle_exception(exc_info, True)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/jinja2/environment.py",
 line 754, in handle_exception
reraise(exc_type, exc_value, tb)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/jinja2/_compat.py",
 line 37, in reraise
raise value.with_traceback(tb)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/airflow/model_create.html",
 line 18, in top-level template code
{% extends 'admin/model/create.html' %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
 line 3, in top-level template code
{% from 'admin/lib.html' import extra with context %} {# backward 
compatible #}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/admin/master.html",
 line 18, in top-level template code
{% extends 'admin/base.html' %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/base.html",
 line 30, in top-level template code
{% block page_body %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/admin/master.html",
 line 104, in block "page_body"
{% block body %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/airflow/model_create.html",
 line 28, in block "body"
{{ super() }}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
 line 22, in block "body"
{% block create_form %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
 line 23, in block "create_form"
{{ lib.render_form(form, return_url, extra(), form_opts) }}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/lib.html",
 line 202, in template
{% call form_tag(action=action) %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/lib.html",
 line 182, in template
{{ caller() }}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/lib.html",
 line 203, in template
{{ render_form_fields(form, form_opts=form_opts) }}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/lib.html",
 line 175, in template
  

[jira] [Commented] (AIRFLOW-2122) SSHOperator throws an error

2018-02-19 Thread sam sen (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369146#comment-16369146
 ] 

sam sen commented on AIRFLOW-2122:
--

That did it!!! Thanks

> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code: 
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args, schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                    task_id="check_ftp_for_new_files", 
>                    command="echo 'hello world'", 
>                    do_xcom_push=True, dag=dag,)
> {code}
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/bin/airflow", line 27, in 
> [2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
> args.func(args)
> [2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
> pool=args.pool,
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = func(*args, **kwargs)
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
> _run_raw_task
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = task_copy.execute(context=context)
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
> line 146, in execute
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: 
> raise AirflowException("SSH operator error: {0}".format(str(e)))
> [2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
> airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
> attribute 'lower'
> {code}
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-1618] Add feature to create GCS bucket

2018-02-19 Thread fokko
Repository: incubator-airflow
Updated Branches:
  refs/heads/master faaf0b8b4 -> 3fe06e9ff


[AIRFLOW-1618] Add feature to create GCS bucket

- Added `create_bucket` method to `gcs_hook` and
created corresponding operator
`GoogleCloudStorageCreateBucket`
- Added tests
- Added documentation

Closes #3044 from kaxil/AIRFLOW-1618
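
A hedged usage sketch of the new hook method (method name and defaults as in the diff below; the connection id and bucket name are placeholders, not part of this change):
{code:python}
# Sketch only: calling the create_bucket method added by this commit.
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook

hook = GoogleCloudStorageHook(google_cloud_storage_conn_id='google_cloud_default')
bucket_id = hook.create_bucket(
    bucket_name='my-example-bucket',   # placeholder bucket name
    storage_class='REGIONAL',
    location='EUROPE-WEST1',
    labels={'env': 'dev'},
)
print(bucket_id)  # on success, the id of the new bucket
{code}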


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/3fe06e9f
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/3fe06e9f
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/3fe06e9f

Branch: refs/heads/master
Commit: 3fe06e9fff795a5f33035d32098074e6303907ab
Parents: faaf0b8
Author: Kaxil Naik 
Authored: Mon Feb 19 15:21:04 2018 +0100
Committer: Fokko Driesprong 
Committed: Mon Feb 19 15:21:04 2018 +0100

--
 airflow/contrib/hooks/gcs_hook.py|  85 
 airflow/contrib/operators/gcs_operator.py| 117 ++
 docs/code.rst|   1 +
 docs/integration.rst |   8 ++
 tests/contrib/operators/test_gcs_operator.py |  50 +
 5 files changed, 261 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/3fe06e9f/airflow/contrib/hooks/gcs_hook.py
--
diff --git a/airflow/contrib/hooks/gcs_hook.py 
b/airflow/contrib/hooks/gcs_hook.py
index 5312daa..8c1e7bb 100644
--- a/airflow/contrib/hooks/gcs_hook.py
+++ b/airflow/contrib/hooks/gcs_hook.py
@@ -354,6 +354,91 @@ class GoogleCloudStorageHook(GoogleCloudBaseHook):
 if ex.resp['status'] == '404':
 raise ValueError('Object Not Found')
 
+def create_bucket(self,
+  bucket_name,
+  storage_class='MULTI_REGIONAL',
+  location='US',
+  project_id=None,
+  labels=None
+  ):
+"""
+Creates a new bucket. Google Cloud Storage uses a flat namespace, so
+you can't create a bucket with a name that is already in use.
+
+.. seealso::
+For more information, see Bucket Naming Guidelines:
+
https://cloud.google.com/storage/docs/bucketnaming.html#requirements
+
+:param bucket_name: The name of the bucket.
+:type bucket_name: string
+:param storage_class: This defines how objects in the bucket are stored
+and determines the SLA and the cost of storage. Values include
+
+- ``MULTI_REGIONAL``
+- ``REGIONAL``
+- ``STANDARD``
+- ``NEARLINE``
+- ``COLDLINE``.
+If this value is not specified when the bucket is
+created, it will default to STANDARD.
+:type storage_class: string
+:param location: The location of the bucket.
+Object data for objects in the bucket resides in physical storage
+within this region. Defaults to US.
+
+.. seealso::
+https://developers.google.com/storage/docs/bucket-locations
+
+:type location: string
+:param project_id: The ID of the GCP Project.
+:type project_id: string
+:param labels: User-provided labels, in key/value pairs.
+:type labels: dict
+:return: If successful, it returns the ``id`` of the bucket.
+"""
+
+project_id = project_id if project_id is not None else self.project_id
+storage_classes = [
+'MULTI_REGIONAL',
+'REGIONAL',
+'NEARLINE',
+'COLDLINE',
+'STANDARD',  # alias for MULTI_REGIONAL/REGIONAL, based on location
+]
+
+self.log.info('Creating Bucket: %s; Location: %s; Storage Class: %s',
+  bucket_name, location, storage_class)
+assert storage_class in storage_classes, \
+'Invalid value ({}) passed to storage_class. Value should be ' \
+'one of {}'.format(storage_class, storage_classes)
+
+service = self.get_conn()
+bucket_resource = {
+'name': bucket_name,
+'location': location,
+'storageClass': storage_class
+}
+
+self.log.info('The Default Project ID is %s', self.project_id)
+
+if labels is not None:
+bucket_resource['labels'] = labels
+
+try:
+response = service.buckets().insert(
+project=project_id,
+body=bucket_resource
+).execute()
+
+self.log.info('Bucket: %s created successfully.', bucket_name)
+
+return response['id']
+
+except errors.HttpError as ex:
+   

[jira] [Commented] (AIRFLOW-2124) Allow local mainPythonFileUri

2018-02-19 Thread Fokko Driesprong (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2124?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369156#comment-16369156
 ] 

Fokko Driesprong commented on AIRFLOW-2124:
---

[~kaxilnaik] [~fenglu]

We're moving from the bash operators to the very nice DataProc* operators, but 
we're running into this. What would be the best practice to solve this? Maybe 
upload the file to a temporary bucket? Previously gcloud handled this for us: 
`gcloud dataproc jobs submit pyspark ../submit_job.py`

Any thoughts on this?
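
One direction (a rough sketch only; the bucket name, object path, and upload step are assumptions, not an agreed approach) would be to push the local driver to GCS first and hand the resulting {{gs://}} URI to the operator:
{code:python}
# Hedged sketch of the "upload to a temporary bucket first" idea.
# Bucket name and object path are placeholders; the default GCP connection
# is assumed for the hook.
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook

def upload_driver_to_gcs(local_path, bucket='my-temp-dataproc-bucket'):
    hook = GoogleCloudStorageHook()
    object_name = 'drivers/main_python_driver.py'
    hook.upload(bucket=bucket, object=object_name, filename=local_path)
    return 'gs://{}/{}'.format(bucket, object_name)

main_uri = upload_driver_to_gcs(
    '/usr/local/airflow/git/airflow-dags/jobs/main_python_driver.py')
# main_uri could then be passed as `main=` to DataProcPySparkOperator.
{code}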

 

> Allow local mainPythonFileUri
> -
>
> Key: AIRFLOW-2124
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2124
> Project: Apache Airflow
>  Issue Type: Wish
>Reporter: robbert van waardhuizen
>Priority: Major
>
> For our workflow, we are currently transitioning from the BashOperator to the 
> DataProcPySparkOperators. While rewriting the DAG we came to the conclusion 
> that it is not possible to submit a (local) path as our main Python file; a 
> Hadoop Compatible Filesystem (HCFS) is required.
> Our main Python drivers are located in a Git repository. Putting our main 
> Python files in a GS bucket would require manual updating/overwriting these 
> files.
> In terms of code, this works using the BashOperator:
>  
> {code:java}
> gcloud dataproc jobs submit pyspark \
>  /usr/local/airflow/git/airflow-dags/jobs/main_python_driver.py \
>  --cluster {cluster_name}{code}
>  
>  
> But this cannot be replicated using the DataProcPySparkOperator:
> {code:java}
> DataProcPySparkOperator(main="/usr/local/airflow/git/airflow-dags/jobs/main_python_driver.py",
> cluster_name=cluster_name)
> {code}
> Error:
> {code:java}
> === Cloud Dataproc Agent Error ===
> java.lang.NullPointerException
> at sun.nio.fs.UnixPath.normalizeAndCheck(UnixPath.java:77)
> at sun.nio.fs.UnixPath.(UnixPath.java:71)
> at sun.nio.fs.UnixFileSystem.getPath(UnixFileSystem.java:281)
> at 
> com.google.cloud.hadoop.services.agent.job.AbstractJobHandler.registerResourceForDownload(AbstractJobHandler.java:442)
> at 
> com.google.cloud.hadoop.services.agent.job.PySparkJobHandler.buildCommand(PySparkJobHandler.java:93)
> at 
> com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:538)
> at 
> com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:532)
> at 
> com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:127)
> at 
> com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
> at 
> com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:80)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:748)
>  End of Cloud Dataproc Agent Error 
> {code}
> What would be the best practice in this case?
> Is it possible to add the ability to submit local paths as main Python file?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (AIRFLOW-2122) SSHOperator throws an error

2018-02-19 Thread sam sen (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2122?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sam sen closed AIRFLOW-2122.

Resolution: Not A Bug

> SSHOperator throws an error
> ---
>
> Key: AIRFLOW-2122
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2122
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: sam sen
>Priority: Major
>
> Here's my code: 
> {code:java}
> dag = DAG('transfer_ftp_s3', 
> default_args=default_args, schedule_interval=None)
> task = SSHOperator(ssh_conn_id='ssh_node', 
>                    task_id="check_ftp_for_new_files", 
>                    command="echo 'hello world'", 
>                    do_xcom_push=True, dag=dag,)
> {code}
>  
> Here's the error
> {code:java}
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask: 
> Traceback (most recent call last):
> [2018-02-19 06:48:02,691] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/bin/airflow", line 27, in 
> [2018-02-19 06:48:02,692] {{base_task_runner.py:98}} INFO - Subtask: 
> args.func(args)
> [2018-02-19 06:48:02,693] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask: 
> pool=args.pool,
> [2018-02-19 06:48:02,695] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = func(*args, **kwargs)
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/models.py", line 1496, in 
> _run_raw_task
> [2018-02-19 06:48:02,696] {{base_task_runner.py:98}} INFO - Subtask: 
> result = task_copy.execute(context=context)
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask:   File 
> "/usr/lib/python2.7/site-packages/airflow/contrib/operators/ssh_operator.py", 
> line 146, in execute
> [2018-02-19 06:48:02,697] {{base_task_runner.py:98}} INFO - Subtask: 
> raise AirflowException("SSH operator error: {0}".format(str(e)))
> [2018-02-19 06:48:02,698] {{base_task_runner.py:98}} INFO - Subtask: 
> airflow.exceptions.AirflowException: SSH operator error: 'bool' object has no 
> attribute 'lower'
> {code}
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2124) Allow local mainPythonFileUri

2018-02-19 Thread robbert van waardhuizen (JIRA)
robbert van waardhuizen created AIRFLOW-2124:


 Summary: Allow local mainPythonFileUri
 Key: AIRFLOW-2124
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2124
 Project: Apache Airflow
  Issue Type: Wish
Reporter: robbert van waardhuizen


For our workflow, we are currently transitioning from the BashOperator to the 
DataProcPySparkOperators. While rewriting the DAG we came to the conclusion that 
it is not possible to submit a (local) path as our main Python file; a Hadoop 
Compatible Filesystem (HCFS) is required.

Our main Python drivers are located in a Git repository. Putting our main 
Python files in a GS bucket would require manual updating/overwriting these 
files.

In terms of code, this works using the BashOperator:

 
{code:java}
gcloud dataproc jobs submit pyspark \
 /usr/local/airflow/git/airflow-dags/jobs/main_python_driver.py \
 --cluster {cluster_name}{code}
 

 

But this cannot be replicated using the DataProcPySparkOperator:
{code:java}
DataProcPySparkOperator(main="/usr/local/airflow/git/airflow-dags/jobs/main_python_driver.py",
cluster_name=cluster_name)
{code}
Error:
{code:java}
=== Cloud Dataproc Agent Error ===
java.lang.NullPointerException
at sun.nio.fs.UnixPath.normalizeAndCheck(UnixPath.java:77)
at sun.nio.fs.UnixPath.(UnixPath.java:71)
at sun.nio.fs.UnixFileSystem.getPath(UnixFileSystem.java:281)
at 
com.google.cloud.hadoop.services.agent.job.AbstractJobHandler.registerResourceForDownload(AbstractJobHandler.java:442)
at 
com.google.cloud.hadoop.services.agent.job.PySparkJobHandler.buildCommand(PySparkJobHandler.java:93)
at 
com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:538)
at 
com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:532)
at 
com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:127)
at 
com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at 
com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:80)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
 End of Cloud Dataproc Agent Error 
{code}
What would be the best practice in this case?

Is it possible to add the ability to submit local paths as main Python file?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-1618) Allow creating of Storage buckets through Google Cloud Storage Hook

2018-02-19 Thread Fokko Driesprong (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-1618.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

Issue resolved by pull request #3044
[https://github.com/apache/incubator-airflow/pull/3044]

> Allow creating of Storage buckets through Google Cloud Storage Hook
> ---
>
> Key: AIRFLOW-1618
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1618
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp, hooks
>Reporter: Daniel
>Assignee: Kaxil Naik
>Priority: Minor
> Fix For: 2.0.0
>
>   Original Estimate: 96h
>  Remaining Estimate: 96h
>
> The current way that gcs_hook.py is written does not allow for the addition 
> of a new Storage bucket.
> It is possible to create a bucket with config from the storage api, and this 
> is what is being used in the hook. However it does require a small rethink or 
> addition to the current way the service is returned and used.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Comment Edited] (AIRFLOW-2105) Exception on known event creation

2018-02-19 Thread Simon Dubois (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16368961#comment-16368961
 ] 

Simon Dubois edited comment on AIRFLOW-2105 at 2/19/18 10:26 AM:
-

Hi,
Same issue here when I try to create a user.
[~xnuinside], yes, in my case I ran {{airflow initdb}} after installing 1.9.0.

Here is my traceback:
{code:java}
Traceback (most recent call last):
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1988, in wsgi_app
response = self.full_dispatch_request()
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1641, in full_dispatch_request
rv = self.handle_user_exception(e)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1544, in handle_user_exception
reraise(exc_type, exc_value, tb)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/_compat.py", 
line 33, in reraise
raise value
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1639, in full_dispatch_request
rv = self.dispatch_request()
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/app.py", 
line 1625, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/base.py",
 line 69, in inner
return self._run_view(f, *args, **kwargs)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/base.py",
 line 368, in _run_view
return fn(self, *args, **kwargs)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/model/base.py",
 line 1947, in create_view
return_url=return_url)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/base.py",
 line 308, in render
return render_template(template, **kwargs)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/templating.py",
 line 134, in render_template
context, ctx.app)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask/templating.py",
 line 116, in _render
rv = template.render(context)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/jinja2/environment.py",
 line 989, in render
return self.environment.handle_exception(exc_info, True)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/jinja2/environment.py",
 line 754, in handle_exception
reraise(exc_type, exc_value, tb)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/jinja2/_compat.py",
 line 37, in reraise
raise value.with_traceback(tb)
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/airflow/model_create.html",
 line 18, in top-level template code
{% extends 'admin/model/create.html' %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
 line 3, in top-level template code
{% from 'admin/lib.html' import extra with context %} {# backward 
compatible #}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/admin/master.html",
 line 18, in top-level template code
{% extends 'admin/base.html' %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/base.html",
 line 30, in top-level template code
{% block page_body %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/admin/master.html",
 line 104, in block "page_body"
{% block body %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/airflow/www/templates/airflow/model_create.html",
 line 28, in block "body"
{{ super() }}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
 line 22, in block "body"
{% block create_form %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/model/create.html",
 line 23, in block "create_form"
{{ lib.render_form(form, return_url, extra(), form_opts) }}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/lib.html",
 line 202, in template
{% call form_tag(action=action) %}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/lib.html",
 line 182, in template
{{ caller() }}
  File 
"/home/sdubois/work/airflow/venv/lib/python3.6/site-packages/flask_admin/templates/bootstrap3/admin/lib.html",
 line 203, in template
{{ render_form_fields(form, form_opts=form_opts) }}
  File 

[jira] [Commented] (AIRFLOW-1945) Pass --autoscale to celery workers

2018-02-19 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1945?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16369063#comment-16369063
 ] 

Yuliya Volkova commented on AIRFLOW-1945:
-

[~mosthege], you can do this via an Airflow plugin, for example: 
[https://github.com/xnuinside/airflow_plugin_custom_cli/blob/master/custom_cli_plugin.py]
 

> Pass --autoscale to celery workers
> --
>
> Key: AIRFLOW-1945
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1945
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: celery, cli
>Reporter: Michael O.
>Priority: Trivial
>  Labels: easyfix
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> Celery supports autoscaling of the worker pool size (the number of tasks that 
> can run in parallel within one worker node). I'd like to propose supporting 
> passing the {{--autoscale}} parameter to {{airflow worker}}.
> Since this is a trivial change, I am not sure whether there is any reason it 
> is not supported already.
> For example
> {{airflow worker --concurrency=4}} will set a fixed pool size of 4.
> With minimal changes in 
> [https://github.com/apache/incubator-airflow/blob/4ce4faaeae7a76d97defcf9a9d3304ac9d78b9bd/airflow/bin/cli.py#L855]
>  it could support
> {{airflow worker --autoscale=2,10}} to set an autoscaled pool size of 2 to 10
> Some references:
> * 
> http://docs.celeryproject.org/en/latest/internals/reference/celery.worker.autoscale.html
> * 
> https://github.com/apache/incubator-airflow/blob/4ce4faaeae7a76d97defcf9a9d3304ac9d78b9bd/airflow/bin/cli.py#L855
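
As a rough illustration of the proposal above (a sketch only, not an actual patch; the argument plumbing and option names are assumptions), the worker command could forward an autoscale value into the Celery options alongside concurrency:
{code:python}
# Hedged sketch: forwarding a hypothetical --autoscale argument from
# airflow's cli.py worker() into the Celery worker options.
from celery.bin import worker as celery_worker
from airflow.executors.celery_executor import app as celery_app

def run_worker(args):
    worker = celery_worker.worker(app=celery_app)
    options = {
        'queues': args.queues,
        'concurrency': args.concurrency,
    }
    if getattr(args, 'autoscale', None):
        # Passed straight through to Celery's --autoscale option.
        options['autoscale'] = args.autoscale
    worker.run(**options)
{code}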



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2123) Install CI Dependencies from setup.py

2018-02-19 Thread Fokko Driesprong (JIRA)
Fokko Driesprong created AIRFLOW-2123:
-

 Summary: Install CI Dependencies from setup.py
 Key: AIRFLOW-2123
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2123
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Fokko Driesprong


Right now we keep our dependencies in two places: setup.py for installation and 
requirements.txt for the CI. These files drift badly out of sync, so I think it 
is a good idea to install the CI's dependencies from setup.py as well, so that 
everything lives in a single place.
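
For illustration (a sketch only; the extra name {{devel_ci}} and the package lists are assumptions, not Airflow's real ones), the idea is to expose the CI dependencies as a setup.py extra that the CI installs with {{pip install -e .[devel_ci]}}:
{code:python}
# Hedged sketch of keeping CI dependencies in setup.py as an extra.
from setuptools import setup, find_packages

devel = ['mock', 'nose', 'flake8']        # illustrative lists only
devel_ci = devel + ['coveralls', 'tox']   # illustrative lists only

setup(
    name='example-package',               # placeholder, not Airflow's setup.py
    version='0.0.1',
    packages=find_packages(),
    extras_require={'devel': devel, 'devel_ci': devel_ci},
)
{code}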



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2125) psycopg2 version 2.7.4 was released and the wheel package was renamed to psycopg2-binary

2018-02-19 Thread Marcos Bernardelli (JIRA)
Marcos Bernardelli created AIRFLOW-2125:
---

 Summary: psycopg2 version 2.7.4 was released and the wheel package 
was renamed to psycopg2-binary
 Key: AIRFLOW-2125
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2125
 Project: Apache Airflow
  Issue Type: Improvement
  Components: db
Reporter: Marcos Bernardelli


A new version of *psycopg2* was released, and the wheel package was renamed to 
*psycopg2-binary*. More details:

[http://initd.org/psycopg/docs/news.html]

[https://pypi.python.org/pypi/psycopg2-binary]

 

When running Airflow with the old package installed, the following warning is printed:

 
{code}
UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in 
order to keep installing from binary please use "pip install psycopg2-binary" 
instead. For details see: 
.
{code}
 

Should we continue using the binary package, even though the recommendation is 
to build it from source?

"The binary package is a practical choice for development and testing but in 
production it is advised to use the package built from sources."

 

Source: https://pypi.python.org/pypi/psycopg2-binary
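
Note that both distributions install the same importable module, so application code does not change either way (a minimal check; the DSN below is a placeholder):
{code:python}
# Whether 'psycopg2' or 'psycopg2-binary' is installed, the import is the same.
import psycopg2

conn = psycopg2.connect("dbname=airflow user=airflow")  # placeholder DSN
print(conn.server_version)
conn.close()
{code}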

 

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)