[jira] [Created] (AIRFLOW-2119) Celery worker fails when dag has space in filename

2018-02-16 Thread Paymahn Moghadasian (JIRA)
Paymahn Moghadasian created AIRFLOW-2119:


 Summary: Celery worker fails when dag has space in filename
 Key: AIRFLOW-2119
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2119
 Project: Apache Airflow
  Issue Type: Bug
  Components: celery, worker
Affects Versions: 1.9.0
Reporter: Paymahn Moghadasian


A DAG whose filename contains a space causes Celery workers to fail as follows:

{noformat}
[2018-02-16 22:58:55,976] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.5/lib2to3/Grammar.txt
[2018-02-16 22:58:56,021] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.5/lib2to3/PatternGrammar.txt
[2018-02-16 22:58:56,322] {configuration.py:206} WARNING - section/key 
[celery/celery_ssl_active] not found in config
[2018-02-16 22:58:56,322] {default_celery.py:41} WARNING - Celery Executor will 
run without SSL
[2018-02-16 22:58:56,323] {__init__.py:45} INFO - Using executor CeleryExecutor
Starting flask
[2018-02-16 22:58:56,403] {_internal.py:88} INFO -  * Running on 
http://0.0.0.0:8793/ (Press CTRL+C to quit)
[2018-02-16 22:59:25,181] {celery_executor.py:50} INFO - Executing command in 
Celery: airflow run broken_hello_world dummy_task 2018-02-15T12:00:00 --local 
-sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
[2018-02-16 22:59:25,569] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.5/lib2to3/Grammar.txt
[2018-02-16 22:59:25,610] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.5/lib2to3/PatternGrammar.txt
[2018-02-16 22:59:25,885] {configuration.py:206} WARNING - section/key 
[celery/celery_ssl_active] not found in config
[2018-02-16 22:59:25,885] {default_celery.py:41} WARNING - Celery Executor will 
run without SSL
[2018-02-16 22:59:25,886] {__init__.py:45} INFO - Using executor CeleryExecutor
usage: airflow [-h]
   
{flower,kerberos,upgradedb,worker,render,serve_logs,backfill,task_state,dag_state,test,connections,pause,unpause,list_tasks,scheduler,run,list_dags,webserver,trigger_dag,version,pool,resetdb,clear,variables,initdb,task_failed_deps}
   ...
airflow: error: unrecognized arguments: world.py
[2018-02-16 22:59:26,055] {celery_executor.py:54} ERROR - Command 'airflow run 
broken_hello_world dummy_task 2018-02-15T12:00:00 --local -sd 
/home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
exit status 2
[2018-02-16 22:59:26,117: ERROR/ForkPoolWorker-16] Task 
airflow.executors.celery_executor.execute_command[114e9ff2-380e-4ba6-84e4-c913ea85189c]
 raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
  File 
"/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
 line 52, in execute_command
subprocess.check_call(command, shell=True)
  File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'airflow run broken_hello_world 
dummy_task 2018-02-15T12:00:00 --local -sd 
/home/paymahn/scheduler/airflow-home/dags/hello world.py' returned non-zero 
exit status 2

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File 
"/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py", 
line 374, in trace_task
R = retval = fun(*args, **kwargs)
  File 
"/home/paymahn/scheduler/venv/lib/python3.5/site-packages/celery/app/trace.py", 
line 629, in __protected_call__
return self.run(*args, **kwargs)
  File 
"/home/paymahn/scheduler/venv/lib/python3.5/site-packages/airflow/executors/celery_executor.py",
 line 55, in execute_command
raise AirflowException('Celery command failed')
airflow.exceptions.AirflowException: Celery command failed
[2018-02-16 22:59:27,420] {celery_executor.py:50} INFO - Executing command in 
Celery: airflow run broken_hello_world dummy_task 2018-02-16T22:59:26.096432 
--local -sd /home/paymahn/scheduler/airflow-home/dags/hello world.py
[2018-02-16 22:59:27,840] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.5/lib2to3/Grammar.txt
[2018-02-16 22:59:27,877] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.5/lib2to3/PatternGrammar.txt
[2018-02-16 22:59:28,148] {configuration.py:206} WARNING - section/key 
[celery/celery_ssl_active] not found in config
[2018-02-16 22:59:28,148] {default_celery.py:41} WARNING - Celery Executor will 
run without SSL
[2018-02-16 22:59:28,149] {__init__.py:45} INFO - Using executor CeleryExecutor
usage: airflow [-h]
   
{resetdb,version,dag_state,trigger_dag,connections,task_state,variables,upgradedb,webserver,kerberos,pool,serve_logs,run,list_dags,scheduler,render,flower,task_failed_deps,worker,unpause,backfill,test,initdb,list_tasks,pause,clear}
   ...
airflow:
{noformat}

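The root cause is visible in the traceback above: the executor hands the whole command string to subprocess.check_call(command, shell=True), so the space in the -sd path splits it into two shell arguments and argparse rejects the trailing "world.py". A minimal standalone sketch (not Airflow's code) of the splitting behaviour, and of how quoting the path would keep it intact:

{code:python}
# Sketch only, not Airflow's implementation: shows why an unquoted path with a
# space breaks the generated "airflow run ... -sd <path>" command under shell=True.
import shlex

path = "/home/paymahn/scheduler/airflow-home/dags/hello world.py"
command = ("airflow run broken_hello_world dummy_task 2018-02-15T12:00:00"
           " --local -sd " + path)

# The shell tokenizes on the space, so argparse sees an extra positional argument:
print(shlex.split(command)[-2:])
# ['/home/paymahn/scheduler/airflow-home/dags/hello', 'world.py']

# Quoting the path (e.g. with shlex.quote) keeps it as a single argument:
quoted = ("airflow run broken_hello_world dummy_task 2018-02-15T12:00:00"
          " --local -sd " + shlex.quote(path))
print(shlex.split(quoted)[-1])
# /home/paymahn/scheduler/airflow-home/dags/hello world.py
{code}
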
[jira] [Created] (AIRFLOW-2118) get_pandas_df does not always pass a list of rows to be parsed

2018-02-16 Thread Diane Ivy (JIRA)
Diane Ivy created AIRFLOW-2118:
--

 Summary: get_pandas_df does not always pass a list of rows to be parsed
 Key: AIRFLOW-2118
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2118
 Project: Apache Airflow
  Issue Type: Bug
  Components: contrib, hooks
Affects Versions: 1.9.0
 Environment: pandas-gbq 0.3.1

Reporter: Diane Ivy


While parsing the pages in get_pandas_df, if only one page is returned the loop 
starts popping off individual rows instead of whole pages, and gbq_parse_data 
then works incorrectly.


{{while len(pages) > 0:}}
{{    page = pages.pop()}}
{{    dataframe_list.append(gbq_parse_data(schema, page))}}

Possible solution:

{{from google.cloud import bigquery}}

{{if isinstance(pages[0], bigquery.table.Row):}}
{{    pages = [pages]}}
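
A standalone sketch of that normalization idea (using stand-in page/row types rather than the google-cloud-bigquery classes): if the loop is handed a single page of rows instead of a list of pages, wrapping it first keeps the pop-and-parse loop working on whole pages.

{code:python}
# Sketch only: hypothetical stand-in types, not BigQuery's Row/page objects.
def parse_pages(pages, parse_page):
    # If we received one page (a plain list of rows) rather than a list of pages,
    # wrap it so the loop below always pops whole pages, never individual rows.
    if pages and not isinstance(pages[0], list):
        pages = [pages]

    dataframe_list = []
    while len(pages) > 0:
        page = pages.pop()
        dataframe_list.append(parse_page(page))
    return dataframe_list

rows = [("alice", 1), ("bob", 2)]          # a single "page" of rows
print(parse_pages(rows, len))              # [2]     -- parsed as one page
print(parse_pages([rows, rows], len))      # [2, 2]  -- two pages, parsed page by page
{code}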





[jira] [Created] (AIRFLOW-2117) Unable to login to webserver

2018-02-16 Thread Brian Hoover (JIRA)
Brian Hoover created AIRFLOW-2117:
-

 Summary: Unable to login to webserver
 Key: AIRFLOW-2117
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2117
 Project: Apache Airflow
  Issue Type: Bug
  Components: webserver
Affects Versions: 1.9.0
Reporter: Brian Hoover


Unable to log in to the webserver after upgrading to 1.9.

Worth noting is that if I modify 
'/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py' and change the 
calls to the following:

current_user.is_anonymous,
current_user.is_authenticated,
current_user.is_superuser,
current_user.data_profiling

and remove the parentheses, I am able to log in. However, I am then unable to 
see many of the menu items at the top of the UI.

Once logged in, I can revert those changes and am able to see the menu items 
again.
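
The workaround described above is consistent with the Flask-Login change in which is_authenticated and is_anonymous became properties instead of methods, so code that calls them with parentheses breaks on newer Flask-Login while attribute-style access breaks on older releases. A minimal standalone sketch (not Airflow's actual fix; the helper and classes are illustrative) of tolerating both forms:

{code:python}
# Sketch only: a compatibility helper, not the code in airflow/www/utils.py.
def _bool_attr(user, name):
    value = getattr(user, name, False)
    # Older Flask-Login exposes these as methods, newer versions as properties.
    return value() if callable(value) else value

class OldStyleUser:            # pre-0.3 Flask-Login style
    def is_authenticated(self):
        return True

class NewStyleUser:            # 0.3+ Flask-Login style
    is_authenticated = True

print(_bool_attr(OldStyleUser(), "is_authenticated"))   # True
print(_bool_attr(NewStyleUser(), "is_authenticated"))   # True
{code}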

Here is the reported error:
---
Node: localhost
---
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1982, in 
wsgi_app
response = self.full_dispatch_request()
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1614, in 
full_dispatch_request
rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1517, in 
handle_user_exception
reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line 33, in 
reraise
raise value
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1612, in 
full_dispatch_request
rv = self.dispatch_request()
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1598, in 
dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line 69, 
in inner
return self._run_view(f, *args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line 368, 
in _run_view
return fn(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py", line 645, 
in login
return airflow.login.login(self, request)
  File 
"/usr/local/lib/python3.5/dist-packages/airflow/contrib/auth/backends/password_auth.py",
 line 124, in login
form=form)
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line 308, 
in render
return render_template(template, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/flask/templating.py", line 134, 
in render_template
context, ctx.app)
  File "/usr/local/lib/python3.5/dist-packages/flask/templating.py", line 116, 
in _render
rv = template.render(context)
  File "/usr/local/lib/python3.5/dist-packages/jinja2/environment.py", line 
1008, in render
return self.environment.handle_exception(exc_info, True)
  File "/usr/local/lib/python3.5/dist-packages/jinja2/environment.py", line 
780, in handle_exception
reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.5/dist-packages/jinja2/_compat.py", line 37, in 
reraise
raise value.with_traceback(tb)
  File 
"/usr/local/lib/python3.5/dist-packages/airflow/www/templates/airflow/login.html",
 line 18, in top-level template code
{% extends "airflow/master.html" %}
  File 
"/usr/local/lib/python3.5/dist-packages/airflow/www/templates/airflow/master.html",
 line 18, in top-level template code
{% extends "admin/master.html" %}
  File 
"/usr/local/lib/python3.5/dist-packages/airflow/www/templates/admin/master.html",
 line 18, in top-level template code
{% extends 'admin/base.html' %}
  File 
"/usr/local/lib/python3.5/dist-packages/flask_admin/templates/bootstrap3/admin/base.html",
 line 37, in top-level template code
{% block page_body %}
  File 
"/usr/local/lib/python3.5/dist-packages/airflow/www/templates/admin/master.html",
 line 74, in block "page_body"
{% block main_menu %}
  File 
"/usr/local/lib/python3.5/dist-packages/airflow/www/templates/admin/master.html",
 line 76, in block "main_menu"
{{ layout.menu() }}
  File "/usr/local/lib/python3.5/dist-packages/jinja2/runtime.py", line 579, in 
_invoke
rv = self._func(*arguments)
  File 
"/usr/local/lib/python3.5/dist-packages/flask_admin/templates/bootstrap3/admin/layout.html",
 line 21, in template
{% set children = item.get_children() %}
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/menu.py", line 52, 
in get_children
return [c for c in self._children if c.is_accessible() and c.is_visible()]
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/menu.py", line 52, 
in 
return [c for c in self._children if c.is_accessible() and c.is_visible()]
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/menu.py", line 126, 
in is_accessible
return self._view.is_accessible()
  File 

[jira] [Commented] (AIRFLOW-2115) Default Airflow.cfg & Airflow Web UI contains links to pythonhosted

2018-02-16 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366883#comment-16366883
 ] 

ASF subversion and git services commented on AIRFLOW-2115:
--

Commit 1c76e1b63a8e4861e9404f2e86f2b447ae796955 in incubator-airflow's branch 
refs/heads/master from [~kaxilnaik]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=1c76e1b ]

[AIRFLOW-2115] Fix doc links to PythonHosted

Replaced `http://pythonhosted.org/airflow/` links
to `https://airflow.incubator.apache.org/` in:
- Airflow Web UI
- `default_airflow.cfg` file
- Tutorial
- `CONTRIBUTING.md`

Closes #3050 from kaxil/AIRFLOW-2115


> Default Airflow.cfg & Airflow Web UI contains links to pythonhosted
> ---
>
> Key: AIRFLOW-2115
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2115
> Project: Apache Airflow
>  Issue Type: Task
>  Components: Documentation
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Critical
> Fix For: 2.0.0
>
>
> The default `airflow.cfg` file points to pythonhosted docs which should be 
> fixed
> https://github.com/apache/incubator-airflow/blob/1e36b37b68ab354d1d7d1d1d3abd151ce2a7cac7/airflow/config_templates/default_airflow.cfg#L227
> Also, the Airflow Web UI contains links to old documentation located at 
> PythonHosted.





[jira] [Resolved] (AIRFLOW-2115) Default Airflow.cfg & Airflow Web UI contains links to pythonhosted

2018-02-16 Thread Fokko Driesprong (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2115?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-2115.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

Issue resolved by pull request #3050
[https://github.com/apache/incubator-airflow/pull/3050]

> Default Airflow.cfg & Airflow Web UI contains links to pythonhosted
> ---
>
> Key: AIRFLOW-2115
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2115
> Project: Apache Airflow
>  Issue Type: Task
>  Components: Documentation
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Critical
> Fix For: 2.0.0
>
>
> The default `airflow.cfg` file points to pythonhosted docs which should be 
> fixed
> https://github.com/apache/incubator-airflow/blob/1e36b37b68ab354d1d7d1d1d3abd151ce2a7cac7/airflow/config_templates/default_airflow.cfg#L227
> Also, the Airflow Web UI contains links to old documentation located at 
> PythonHosted.





[jira] [Commented] (AIRFLOW-2115) Default Airflow.cfg & Airflow Web UI contains links to pythonhosted

2018-02-16 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366882#comment-16366882
 ] 

ASF subversion and git services commented on AIRFLOW-2115:
--

Commit 1c76e1b63a8e4861e9404f2e86f2b447ae796955 in incubator-airflow's branch 
refs/heads/master from [~kaxilnaik]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=1c76e1b ]

[AIRFLOW-2115] Fix doc links to PythonHosted

Replaced `http://pythonhosted.org/airflow/` links
to `https://airflow.incubator.apache.org/` in:
- Airflow Web UI
- `default_airflow.cfg` file
- Tutorial
- `CONTRIBUTING.md`

Closes #3050 from kaxil/AIRFLOW-2115


> Default Airflow.cfg & Airflow Web UI contains links to pythonhosted
> ---
>
> Key: AIRFLOW-2115
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2115
> Project: Apache Airflow
>  Issue Type: Task
>  Components: Documentation
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Critical
> Fix For: 2.0.0
>
>
> The default `airflow.cfg` file points to pythonhosted docs which should be 
> fixed
> https://github.com/apache/incubator-airflow/blob/1e36b37b68ab354d1d7d1d1d3abd151ce2a7cac7/airflow/config_templates/default_airflow.cfg#L227
> Also, the Airflow Web UI contains links to old documentation located at 
> PythonHosted.





incubator-airflow git commit: [AIRFLOW-2115] Fix doc links to PythonHosted

2018-02-16 Thread fokko
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 60abb6054 -> 1c76e1b63


[AIRFLOW-2115] Fix doc links to PythonHosted

Replaced `http://pythonhosted.org/airflow/` links
to `https://airflow.incubator.apache.org/` in:
- Airflow Web UI
- `default_airflow.cfg` file
- Tutorial
- `CONTRIBUTING.md`

Closes #3050 from kaxil/AIRFLOW-2115


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/1c76e1b6
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/1c76e1b6
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/1c76e1b6

Branch: refs/heads/master
Commit: 1c76e1b63a8e4861e9404f2e86f2b447ae796955
Parents: 60abb60
Author: Kaxil Naik 
Authored: Fri Feb 16 12:38:54 2018 +0100
Committer: Fokko Driesprong 
Committed: Fri Feb 16 12:38:54 2018 +0100

--
 CONTRIBUTING.md  | 2 +-
 airflow/config_templates/default_airflow.cfg | 2 +-
 airflow/example_dags/tutorial.py | 2 +-
 airflow/www/app.py   | 5 +++--
 docs/plugins.rst | 2 +-
 tests/plugins/test_plugin.py | 3 ++-
 6 files changed, 9 insertions(+), 7 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/1c76e1b6/CONTRIBUTING.md
--
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index da82805..6ac8c43 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -65,7 +65,7 @@ If you are proposing a feature:
 
 ## Documentation
 
-The latest API documentation is usually available 
[here](http://pythonhosted.org/airflow).
+The latest API documentation is usually available 
[here](https://airflow.incubator.apache.org/).
 To generate a local version, you need to have installed airflow with
 the `doc` extra. In that case you can generate the doc by running:
 

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/1c76e1b6/airflow/config_templates/default_airflow.cfg
--
diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 59ff740..0f6d56f 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -224,7 +224,7 @@ error_logfile = -
 expose_config = False
 
 # Set to true to turn on authentication:
-# http://pythonhosted.org/airflow/security.html#web-authentication
+# https://airflow.incubator.apache.org/security.html#web-authentication
 authenticate = False
 
 # Filter the list of dags by owner name (requires authentication to be enabled)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/1c76e1b6/airflow/example_dags/tutorial.py
--
diff --git a/airflow/example_dags/tutorial.py b/airflow/example_dags/tutorial.py
index 5229b45..1275638 100644
--- a/airflow/example_dags/tutorial.py
+++ b/airflow/example_dags/tutorial.py
@@ -15,7 +15,7 @@
 """
 ### Tutorial Documentation
 Documentation that goes along with the Airflow tutorial located
-[here](http://pythonhosted.org/airflow/tutorial.html)
+[here](https://airflow.incubator.apache.org/tutorial.html)
 """
 import airflow
 from airflow import DAG

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/1c76e1b6/airflow/www/app.py
--
diff --git a/airflow/www/app.py b/airflow/www/app.py
index d46935b..e7b4ca6 100644
--- a/airflow/www/app.py
+++ b/airflow/www/app.py
@@ -103,10 +103,11 @@ def create_app(config=None, testing=False):
 
 admin.add_link(base.MenuLink(
 category='Docs', name='Documentation',
-url='http://pythonhosted.org/airflow/'))
+url='https://airflow.incubator.apache.org/'))
 admin.add_link(
 base.MenuLink(category='Docs',
-
name='Github',url='https://github.com/apache/incubator-airflow'))
+  name='Github',
+  url='https://github.com/apache/incubator-airflow'))
 
 av(vs.VersionView(name='Version', category="About"))
 

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/1c76e1b6/docs/plugins.rst
--
diff --git a/docs/plugins.rst b/docs/plugins.rst
index 9fb9c0e..feccb5b 100644
--- a/docs/plugins.rst
+++ b/docs/plugins.rst
@@ -130,7 +130,7 @@ definitions in Airflow.
 ml = MenuLink(
 category='Test Plugin',
 name='Test Menu Link',
-url='http://pythonhosted.org/airflow/')
+url='https://airflow.incubator.apache.org/')
 
 # 

incubator-airflow git commit: [AIRFLOW-XXX] Add contributor from Easy company

2018-02-16 Thread fokko
Repository: incubator-airflow
Updated Branches:
  refs/heads/master c739adc62 -> 60abb6054


[AIRFLOW-XXX] Add contributor from Easy company

Closes #3052 from diraol/add-contributor


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/60abb605
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/60abb605
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/60abb605

Branch: refs/heads/master
Commit: 60abb605477a7a8223a690ca773d2dfb2fcb8ec9
Parents: c739adc
Author: Diego Rabatone Oliveira 
Authored: Fri Feb 16 12:37:41 2018 +0100
Committer: Fokko Driesprong 
Committed: Fri Feb 16 12:37:45 2018 +0100

--
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/60abb605/README.md
--
diff --git a/README.md b/README.md
index ef2e3ed..602b5f0 100644
--- a/README.md
+++ b/README.md
@@ -118,7 +118,7 @@ Currently **officially** using Airflow:
 1. [Data Reply](https://www.datareply.co.uk/) 
[[@kaxil](https://github.com/kaxil)]
 1. [Digital First Media](http://www.digitalfirstmedia.com/) 
[[@duffn](https://github.com/duffn) & [@mschmo](https://github.com/mschmo) & 
[@seanmuth](https://github.com/seanmuth)]
 1. [Drivy](https://www.drivy.com) 
[[@AntoineAugusti](https://github.com/AntoineAugusti)]
-1. [Easy Taxi](http://www.easytaxi.com/) 
[[@caique-lima](https://github.com/caique-lima) & 
[@WesleyBatista](https://github.com/WesleyBatista)]
+1. [Easy Taxi](http://www.easytaxi.com/) 
[[@caique-lima](https://github.com/caique-lima) & 
[@WesleyBatista](https://github.com/WesleyBatista) & 
[@diraol](https://github.com/diraol)]
 1. [eRevalue](https://www.datamaran.com) 
[[@hamedhsn](https://github.com/hamedhsn)]
 1. [evo.company](https://evo.company/) 
[[@orhideous](https://github.com/orhideous)]
 1. [FreshBooks](https://github.com/freshbooks) 
[[@DinoCow](https://github.com/DinoCow)]



[jira] [Commented] (AIRFLOW-1882) Add ignoreUnknownValues option to gcs_to_bq operator

2018-02-16 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366877#comment-16366877
 ] 

ASF subversion and git services commented on AIRFLOW-1882:
--

Commit c739adc623818287e8e7de7017aa3a2af085912e in incubator-airflow's branch 
refs/heads/master from [~kaxilnaik]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=c739adc ]

[AIRFLOW-1882] Add ignoreUnknownValues option to gcs_to_bq operator

- Added `ignore_unknown_values` to
`run_load` method in `BigQuery Hook`
- Added `ignore_unknown_values` to
`GoogleCloudStorageToBigQueryOperator`

Closes #3042 from kaxil/AIRFLOW-1882


> Add ignoreUnknownValues option to gcs_to_bq operator
> 
>
> Key: AIRFLOW-1882
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1882
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, gcp
>Affects Versions: 1.8.2
>Reporter: Yannick Einsweiler
>Assignee: Kaxil Naik
>Priority: Major
>  Labels: gcp
>
> Would allow loading CSVs that have columns not defined in the schema, for 
> instance when lines end with a dummy/extra separator. BigQuery considers it an 
> extra column and won't load the file if the option is not passed.
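
A hedged usage sketch of the new flag on the operator; the bucket, object, table names, and scheduling details are illustrative, not taken from the issue:

{code:python}
# Sketch only: shows where ignore_unknown_values would be passed to the operator.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

dag = DAG("gcs_to_bq_example", start_date=datetime(2018, 2, 16),
          schedule_interval=None)

load_csv = GoogleCloudStorageToBigQueryOperator(
    task_id="load_csv_with_trailing_separator",
    bucket="my-bucket",                                         # hypothetical
    source_objects=["data/export.csv"],                         # hypothetical
    destination_project_dataset_table="my_project.my_dataset.my_table",
    source_format="CSV",
    # Extra, unschematised trailing columns are ignored instead of failing the load.
    ignore_unknown_values=True,
    dag=dag,
)
{code}
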





incubator-airflow git commit: [AIRFLOW-1882] Add ignoreUnknownValues option to gcs_to_bq operator

2018-02-16 Thread fokko
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 074295129 -> c739adc62


[AIRFLOW-1882] Add ignoreUnknownValues option to gcs_to_bq operator

- Added `ignore_unknown_values` to
`run_load` method in `BigQuery Hook`
- Added `ignore_unknown_values` to
`GoogleCloudStorageToBigQueryOperator`

Closes #3042 from kaxil/AIRFLOW-1882


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/c739adc6
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/c739adc6
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/c739adc6

Branch: refs/heads/master
Commit: c739adc623818287e8e7de7017aa3a2af085912e
Parents: 0742951
Author: Kaxil Naik 
Authored: Fri Feb 16 12:36:42 2018 +0100
Committer: Fokko Driesprong 
Committed: Fri Feb 16 12:36:42 2018 +0100

--
 airflow/contrib/hooks/bigquery_hook.py | 10 ++
 airflow/contrib/operators/gcs_to_bq.py | 10 ++
 2 files changed, 20 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/c739adc6/airflow/contrib/hooks/bigquery_hook.py
--
diff --git a/airflow/contrib/hooks/bigquery_hook.py 
b/airflow/contrib/hooks/bigquery_hook.py
index ce7e2c3..cd0318f 100644
--- a/airflow/contrib/hooks/bigquery_hook.py
+++ b/airflow/contrib/hooks/bigquery_hook.py
@@ -733,6 +733,7 @@ class BigQueryBaseCursor(LoggingMixin):
  field_delimiter=',',
  max_bad_records=0,
  quote_character=None,
+ ignore_unknown_values=False,
  allow_quoted_newlines=False,
  allow_jagged_rows=False,
  schema_update_options=(),
@@ -776,6 +777,12 @@ class BigQueryBaseCursor(LoggingMixin):
 :param quote_character: The value that is used to quote data sections 
in a CSV
 file.
 :type quote_character: string
+:param ignore_unknown_values: [Optional] Indicates if BigQuery should 
allow
+extra values that are not represented in the table schema.
+If true, the extra values are ignored. If false, records with 
extra columns
+are treated as bad records, and if there are too many bad records, 
an
+invalid error is returned in the job result.
+:type ignore_unknown_values: bool
 :param allow_quoted_newlines: Whether to allow quoted newlines (true) 
or not
 (false).
 :type allow_quoted_newlines: boolean
@@ -842,6 +849,7 @@ class BigQueryBaseCursor(LoggingMixin):
 'sourceFormat': source_format,
 'sourceUris': source_uris,
 'writeDisposition': write_disposition,
+'ignoreUnknownValues': ignore_unknown_values
 }
 }
 
@@ -885,6 +893,8 @@ class BigQueryBaseCursor(LoggingMixin):
 src_fmt_configs['skipLeadingRows'] = skip_leading_rows
 if 'fieldDelimiter' not in src_fmt_configs:
 src_fmt_configs['fieldDelimiter'] = field_delimiter
+if 'ignoreUnknownValues' not in src_fmt_configs:
+src_fmt_configs['ignoreUnknownValues'] = ignore_unknown_values
 if quote_character is not None:
 src_fmt_configs['quote'] = quote_character
 if allow_quoted_newlines:

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/c739adc6/airflow/contrib/operators/gcs_to_bq.py
--
diff --git a/airflow/contrib/operators/gcs_to_bq.py 
b/airflow/contrib/operators/gcs_to_bq.py
index 97b1ef0..be15e52 100644
--- a/airflow/contrib/operators/gcs_to_bq.py
+++ b/airflow/contrib/operators/gcs_to_bq.py
@@ -66,6 +66,12 @@ class GoogleCloudStorageToBigQueryOperator(BaseOperator):
 :type max_bad_records: int
 :param quote_character: The value that is used to quote data sections in a 
CSV file.
 :type quote_character: string
+:param ignore_unknown_values: [Optional] Indicates if BigQuery should allow
+extra values that are not represented in the table schema.
+If true, the extra values are ignored. If false, records with extra 
columns
+are treated as bad records, and if there are too many bad records, an
+invalid error is returned in the job result.
+:type ignore_unknown_values: bool
 :param allow_quoted_newlines: Whether to allow quoted newlines (true) or 
not (false).
 :type allow_quoted_newlines: boolean
 :param allow_jagged_rows: Accept rows that are missing trailing optional 
columns.
@@ -124,6 +130,7 @@ class GoogleCloudStorageToBigQueryOperator(BaseOperator):
  field_delimiter=',',
  max_bad_records=0,
 

[jira] [Resolved] (AIRFLOW-2089) Add on kill for SparkSubmit in Standalone Cluster

2018-02-16 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2089?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2089.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3023
[https://github.com/apache/incubator-airflow/pull/3023]

> Add on kill for SparkSubmit in Standalone Cluster
> -
>
> Key: AIRFLOW-2089
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2089
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Milan van der Meer
>Assignee: Milan van der Meer
>Priority: Major
> Fix For: 1.10.0
>
>






incubator-airflow git commit: [AIRFLOW-2089] Add on kill for SparkSubmit in Standalone Cluster

2018-02-16 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 4745e4e58 -> 074295129


[AIRFLOW-2089] Add on kill for SparkSubmit in Standalone Cluster

adds a kill command in case of running a Standalone Cluster for Spark

Closes #3023 from milanvdm/milanvdm/improve-spark-
on-kill


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/07429512
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/07429512
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/07429512

Branch: refs/heads/master
Commit: 0742951299b58a06cde9befff199ef05a7533a9c
Parents: 4745e4e
Author: milanvdm 
Authored: Fri Feb 16 10:59:06 2018 +0100
Committer: Bolke de Bruin 
Committed: Fri Feb 16 10:59:08 2018 +0100

--
 airflow/contrib/hooks/spark_submit_hook.py| 41 ++
 tests/contrib/hooks/test_spark_submit_hook.py | 22 
 2 files changed, 63 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/07429512/airflow/contrib/hooks/spark_submit_hook.py
--
diff --git a/airflow/contrib/hooks/spark_submit_hook.py 
b/airflow/contrib/hooks/spark_submit_hook.py
index 0c89a9c..a7a083a 100644
--- a/airflow/contrib/hooks/spark_submit_hook.py
+++ b/airflow/contrib/hooks/spark_submit_hook.py
@@ -410,8 +410,49 @@ class SparkSubmitHook(BaseHook, LoggingMixin):
 .format(returncode)
 )
 
+def _build_spark_driver_kill_command(self):
+"""
+Construct the spark-submit command to kill a driver.
+:return: full command to kill a driver
+"""
+
+# If the spark_home is passed then build the spark-submit executable 
path using
+# the spark_home; otherwise assume that spark-submit is present in the 
path to
+# the executing user
+if self._connection['spark_home']:
+connection_cmd = [os.path.join(self._connection['spark_home'],
+   'bin',
+   self._connection['spark_binary'])]
+else:
+connection_cmd = [self._connection['spark_binary']]
+
+# The url to the spark master
+connection_cmd += ["--master", self._connection['master']]
+
+# The actual kill command
+connection_cmd += ["--kill", self._driver_id]
+
+self.log.debug("Spark-Kill cmd: %s", connection_cmd)
+
+return connection_cmd
+
 def on_kill(self):
 
+self.log.debug("Kill Command is being called")
+
+if self._should_track_driver_status:
+if self._driver_id:
+self.log.info('Killing driver {} on cluster'
+  .format(self._driver_id))
+
+kill_cmd = self._build_spark_driver_kill_command()
+driver_kill = subprocess.Popen(kill_cmd,
+   stdout=subprocess.PIPE,
+   stderr=subprocess.PIPE)
+
+self.log.info("Spark driver {} killed with return code: {}"
+  .format(self._driver_id, driver_kill.wait()))
+
 if self._submit_sp and self._submit_sp.poll() is None:
 self.log.info('Sending kill signal to %s', 
self._connection['spark_binary'])
 self._submit_sp.kill()

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/07429512/tests/contrib/hooks/test_spark_submit_hook.py
--
diff --git a/tests/contrib/hooks/test_spark_submit_hook.py 
b/tests/contrib/hooks/test_spark_submit_hook.py
index 6c55ce2..7821d02 100644
--- a/tests/contrib/hooks/test_spark_submit_hook.py
+++ b/tests/contrib/hooks/test_spark_submit_hook.py
@@ -456,6 +456,28 @@ class TestSparkSubmitHook(unittest.TestCase):
stderr=-1, stdout=-1),
   mock_popen.mock_calls)
 
+def test_standalone_cluster_process_on_kill(self):
+# Given
+log_lines = [
+'Running Spark using the REST application submission protocol.',
+'17/11/28 11:14:15 INFO RestSubmissionClient: Submitting a request 
' +
+'to launch an application in spark://spark-standalone-master:6066',
+'17/11/28 11:14:15 INFO RestSubmissionClient: Submission 
successfully ' +
+'created as driver-20171128111415-0001. Polling submission 
state...'
+]
+hook = SparkSubmitHook(conn_id='spark_standalone_cluster')
+hook._process_spark_submit_log(log_lines)
+
+# When
+kill_cmd = hook._build_spark_driver_kill_command()
+
+# Then
+

[jira] [Commented] (AIRFLOW-2089) Add on kill for SparkSubmit in Standalone Cluster

2018-02-16 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366792#comment-16366792
 ] 

ASF subversion and git services commented on AIRFLOW-2089:
--

Commit 0742951299b58a06cde9befff199ef05a7533a9c in incubator-airflow's branch 
refs/heads/master from milanvdm
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=0742951 ]

[AIRFLOW-2089] Add on kill for SparkSubmit in Standalone Cluster

adds a kill command in case of running a Standalone Cluster for Spark

Closes #3023 from milanvdm/milanvdm/improve-spark-
on-kill
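
In effect the hook remembers the driver id reported by the REST submission log and later issues a `spark-submit --master <master> --kill <driver-id>` call. A standalone sketch of that flow (not the hook's implementation; the master URL and log lines mirror the fixture in the new test, while SparkSubmitHook itself does this via _process_spark_submit_log() and _build_spark_driver_kill_command()):

{code:python}
# Sketch only: parse the driver id from submission output and build the kill command.
import re

log_lines = [
    "Running Spark using the REST application submission protocol.",
    "INFO RestSubmissionClient: Submission successfully created "
    "as driver-20171128111415-0001. Polling submission state...",
]

driver_id = None
for line in log_lines:
    match = re.search(r"(driver-\d+-\d+)", line)
    if match:
        driver_id = match.group(1)

kill_cmd = ["spark-submit",
            "--master", "spark://spark-standalone-master:6066",
            "--kill", driver_id]
print(kill_cmd)
# ['spark-submit', '--master', 'spark://spark-standalone-master:6066',
#  '--kill', 'driver-20171128111415-0001']
{code}
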


> Add on kill for SparkSubmit in Standalone Cluster
> -
>
> Key: AIRFLOW-2089
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2089
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Milan van der Meer
>Assignee: Milan van der Meer
>Priority: Major
>






[jira] [Commented] (AIRFLOW-2113) Address missing DagRun callbacks

2018-02-16 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366784#comment-16366784
 ] 

ASF subversion and git services commented on AIRFLOW-2113:
--

Commit 4745e4e58d8caf1676b3e24e7567d4d579a23735 in incubator-airflow's branch 
refs/heads/master from Alan Ma
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=4745e4e ]

[AIRFLOW-2113] Address missing DagRun callbacks
Given that the handle_callback method belongs to
the DAG object, we are able to get the list of
task directly with get_task and reduce the
communication with the database, making airflow
more lightweight.

Closes #3038 from wolfier/master


> Address missing DagRun callbacks
> 
>
> Key: AIRFLOW-2113
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2113
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Alan Ma
>Assignee: Alan Ma
>Priority: Critical
> Fix For: 1.10.0
>
>
> This originally arose from the missing notification from the on_failure and 
> on_success callback at the dag level. The stack trace is as follows:
> {code:java}
> [2018-02-07 07:00:08,145] \{models.py:2984} DagFileProcessor172 INFO - 
> Executing dag callback function: 
>  .GeneralNotifyFailed instance at 0x7fec9d8ad368>
> [2018-02-07 07:00:08,161] \{models.py:168} DagFileProcessor172 INFO - Filling 
> up the DagBag from /home/charon/.virtualenvs/airflow/airflow_home/dags
> Dag: , paused: False
> Dag: , paused: False
> Dag: , paused: False
> Dag: , paused: False
> Dag: , paused: False
> [2018-02-07 07:00:12,103] \{jobs.py:354} DagFileProcessor172 ERROR - Got an 
> exception! Propagating...
> Traceback (most recent call last):
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 346, in helper
> pickle_dags)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 1586, in process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 1175, in _process_dags
> dag_run = self.create_dag_run(dag)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 747, in create_dag_run
> dag.handle_callback(dr, success=False, reason='dagrun_timeout', 
> session=session)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/models.py",
>  line 2990, in handle_callback
> d = dagrun.dag or DagBag().get_dag(dag_id=dagrun.dag_id)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/attributes.py",
>  line 237, in __get__
> return self.impl.get(instance_state(instance), dict_)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/attributes.py",
>  line 579, in get
> value = state._load_expired(state, passive)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/state.py",
>  line 592, in _load_expired
> self.manager.deferred_scalar_loader(self, toload)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/loading.py",
>  line 644, in load_scalar_attributes
> (state_str(state)))
> DetachedInstanceError: Instance  is not bound to a 
> Session; attribute refresh operation cannot proceed
> [2018-02-07 07:00:31,003] \{jobs.py:343} DagFileProcessor208 INFO - Started 
> process (PID=7813) to work on 
> /home/charon/.virtualenvs/airflow/airflow_home/dags/c
> haron-airflow/dags/inapp_vendor_sku_breakdown.py\
> {code}
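
The DetachedInstanceError at the bottom of that trace is a generic SQLAlchemy failure mode: an expired attribute is touched on an instance whose owning session is gone, which is what happens to the DagRun passed into handle_callback. A minimal standalone SQLAlchemy sketch (a throwaway in-memory model, not Airflow's DagRun) that reproduces the same class of error:

{code:python}
# Sketch only: reproduces the error category from the traceback with a toy model.
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Run(Base):
    __tablename__ = "run"
    id = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
session.add(Run(id=1))
session.commit()

run = session.query(Run).one()
session.expire(run)   # attributes now need a refresh from the database
session.close()       # ...but the owning session is gone, so the instance is detached

run.id                # raises sqlalchemy.orm.exc.DetachedInstanceError:
                      # "Instance <Run ...> is not bound to a Session; attribute
                      #  refresh operation cannot proceed"
{code}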





[jira] [Resolved] (AIRFLOW-2113) Address missing DagRun callbacks

2018-02-16 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2113.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3038
[https://github.com/apache/incubator-airflow/pull/3038]

> Address missing DagRun callbacks
> 
>
> Key: AIRFLOW-2113
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2113
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Alan Ma
>Assignee: Alan Ma
>Priority: Critical
> Fix For: 1.10.0
>
>
> This originally arose from the missing notification from the on_failure and 
> on_success callback at the dag level. The stack trace is as follows:
> {code:java}
> [2018-02-07 07:00:08,145] \{models.py:2984} DagFileProcessor172 INFO - 
> Executing dag callback function: 
>  .GeneralNotifyFailed instance at 0x7fec9d8ad368>
> [2018-02-07 07:00:08,161] \{models.py:168} DagFileProcessor172 INFO - Filling 
> up the DagBag from /home/charon/.virtualenvs/airflow/airflow_home/dags
> Dag: , paused: False
> Dag: , paused: False
> Dag: , paused: False
> Dag: , paused: False
> Dag: , paused: False
> [2018-02-07 07:00:12,103] \{jobs.py:354} DagFileProcessor172 ERROR - Got an 
> exception! Propagating...
> Traceback (most recent call last):
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 346, in helper
> pickle_dags)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 1586, in process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 1175, in _process_dags
> dag_run = self.create_dag_run(dag)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/jobs.py",
>  line 747, in create_dag_run
> dag.handle_callback(dr, success=False, reason='dagrun_timeout', 
> session=session)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/airflow/models.py",
>  line 2990, in handle_callback
> d = dagrun.dag or DagBag().get_dag(dag_id=dagrun.dag_id)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/attributes.py",
>  line 237, in __get__
> return self.impl.get(instance_state(instance), dict_)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/attributes.py",
>  line 579, in get
> value = state._load_expired(state, passive)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/state.py",
>  line 592, in _load_expired
> self.manager.deferred_scalar_loader(self, toload)
> File 
> "/home/charon/.virtualenvs/airflow/local/lib/python2.7/site-packages/sqlalchemy/orm/loading.py",
>  line 644, in load_scalar_attributes
> (state_str(state)))
> DetachedInstanceError: Instance  is not bound to a 
> Session; attribute refresh operation cannot proceed
> [2018-02-07 07:00:31,003] \{jobs.py:343} DagFileProcessor208 INFO - Started 
> process (PID=7813) to work on 
> /home/charon/.virtualenvs/airflow/airflow_home/dags/c
> haron-airflow/dags/inapp_vendor_sku_breakdown.py\
> {code}





[jira] [Commented] (AIRFLOW-2112) Recent Tasks UI cell does not show all items

2018-02-16 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2112?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366778#comment-16366778
 ] 

ASF subversion and git services commented on AIRFLOW-2112:
--

Commit d56dcbdc615f39f99aab2d5941cc71cccac90116 in incubator-airflow's branch 
refs/heads/master from [~diraol]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=d56dcbd ]

[AIRFLOW-2112] Fix svg width for Recent Tasks on UI.

As we expect to have 8 elements with width between
25px and 30px, 30 px
were used.

Closes #3047 from diraol/fix2112-svg-width


> Recent Tasks UI cell does not show all items
> 
>
> Key: AIRFLOW-2112
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2112
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.9.1
>Reporter: Diego Rabatone Oliveira
>Assignee: Diego Rabatone Oliveira
>Priority: Minor
> Fix For: 1.10.0
>
> Attachments: 2018-02-15-182303_401x103_scrot.png
>
>
> Recent Tasks cell on UI shows only some of the items from the SVG file within 
> it.
> !2018-02-15-182303_401x103_scrot.png!
> This is because the SVG has its width hardcoded at 180px.
> As we expect to have 8 items, with around 25-27px each, changing the width to 
> 240px would solve the problem.





incubator-airflow git commit: [AIRFLOW-2116] Set CI Cloudant version to <2.0

2018-02-16 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 0ca6f92e2 -> 283b8d1b9


[AIRFLOW-2116] Set CI Cloudant version to <2.0

The python-cloudant release 2.8 is broken and
causes our CI to fail.
In the setup.py we install cloudant version <2.0
and in our CI pipeline
we install the latest version.

Closes #3051 from Fokko/fd-fix-cloudant


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/283b8d1b
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/283b8d1b
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/283b8d1b

Branch: refs/heads/master
Commit: 283b8d1b9754b97debd6de72eada65f63380f9b6
Parents: 0ca6f92
Author: Fokko Driesprong 
Authored: Fri Feb 16 10:45:38 2018 +0100
Committer: Bolke de Bruin 
Committed: Fri Feb 16 10:45:38 2018 +0100

--
 scripts/ci/requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/283b8d1b/scripts/ci/requirements.txt
--
diff --git a/scripts/ci/requirements.txt b/scripts/ci/requirements.txt
index bba5d29..4bcc453 100644
--- a/scripts/ci/requirements.txt
+++ b/scripts/ci/requirements.txt
@@ -20,7 +20,7 @@ boto3
 celery
 cgroupspy
 chartkick
-cloudant
+cloudant<2.0
 coverage
 coveralls
 croniter>=0.3.17



[jira] [Commented] (AIRFLOW-2116) Python Cloudant 2.8 causes CI to fail

2018-02-16 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2116?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16366775#comment-16366775
 ] 

ASF subversion and git services commented on AIRFLOW-2116:
--

Commit 283b8d1b9754b97debd6de72eada65f63380f9b6 in incubator-airflow's branch 
refs/heads/master from [~Fokko]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=283b8d1 ]

[AIRFLOW-2116] Set CI Cloudant version to <2.0

The python-cloudant release 2.8 is broken and
causes our CI to fail.
In the setup.py we install cloudant version <2.0
and in our CI pipeline
we install the latest version.

Closes #3051 from Fokko/fd-fix-cloudant


> Python Cloudant 2.8 causes CI to fail
> -
>
> Key: AIRFLOW-2116
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2116
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Fokko Driesprong
>Priority: Major
>
> Python Cloudant 2.8 causes CI to fail:
> [https://github.com/cloudant/python-cloudant/issues/360]
>  





[jira] [Created] (AIRFLOW-2116) Python Cloudant 2.8 causes CI to fail

2018-02-16 Thread Fokko Driesprong (JIRA)
Fokko Driesprong created AIRFLOW-2116:
-

 Summary: Python Cloudant 2.8 causes CI to fail
 Key: AIRFLOW-2116
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2116
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Fokko Driesprong


Python Cloudant 2.8 causes CI to fail:

[https://github.com/cloudant/python-cloudant/issues/360]

 


