[jira] [Reopened] (AIRFLOW-2716) Replace new Python 3.7 keywords

2018-07-10 Thread Jacob Hayes (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2716?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jacob Hayes reopened AIRFLOW-2716:
--

The PR attached to this issue covers more than the linked duplicate and is not 
merged/resolved yet. The linked duplicate didn't fix all 3.7 async/await issues.

> Replace new Python 3.7 keywords
> ---
>
> Key: AIRFLOW-2716
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2716
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: Airflow 2.0
>Reporter: Jacob Hayes
>Assignee: Jacob Hayes
>Priority: Major
>
> Python 3.7 added `async` and `await` as reserved keywords, so they need to be 
> replaced with alternative names.
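For illustration, a minimal sketch of the kind of rename involved (hypothetical code, not the actual Airflow diff attached to this issue):

```python
# Hypothetical illustration: before Python 3.7, "async" was usable as an
# identifier, e.g. `def run(cmd, async=False)`. In 3.7, `async` and `await`
# are reserved words and such code is a SyntaxError, so the names must
# change -- a trailing underscore is the PEP 8 convention for avoiding
# keyword clashes.
import subprocess

def run_command(cmd, async_=False):
    """Run a shell command; with async_=True, return without waiting."""
    proc = subprocess.Popen(cmd, shell=True)
    if async_:
        return proc          # caller must wait on / poll the process
    return proc.wait()       # returns the exit code
```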



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2730) Airflow 1.9.0+ Web UI Fails to Load in IE11

2018-07-10 Thread Bolke de Bruin (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2730?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin updated AIRFLOW-2730:

Affects Version/s: 1.9.0

> Airflow 1.9.0+ Web UI Fails to Load in IE11
> ---
>
> Key: AIRFLOW-2730
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2730
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.9.0
>Reporter: Cameron Yick
>Priority: Minor
>
> As a developer, I would like to use Airflow in enterprise environments where 
> IE11 is the only browser available.
> Presently, the admin view doesn't load because some of the inlined JavaScript 
> on the page uses ES6 features, like the array spread operator. 
> Fixing this will become a lot easier after AIRFLOW-2691 goes through, 
> because transpilation could happen as part of the build process.
>  





[jira] [Updated] (AIRFLOW-2730) Airflow 1.9.0+ Web UI Fails to Load in IE11

2018-07-10 Thread Bolke de Bruin (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2730?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin updated AIRFLOW-2730:

Priority: Critical  (was: Blocker)

> Airflow 1.9.0+ Web UI Fails to Load in IE11
> ---
>
> Key: AIRFLOW-2730
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2730
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Reporter: Cameron Yick
>Priority: Critical
>
> As a developer, I would like to use Airflow in enterprise environments where 
> IE11 is the only browser available.
> Presently, the admin view doesn't load because some of the inlined JavaScript 
> on the page uses ES6 features, like the array spread operator. 
> Fixing this will become a lot easier after AIRFLOW-2691 goes through, 
> because transpilation could happen as part of the build process.
>  





[jira] [Updated] (AIRFLOW-2730) Airflow 1.9.0+ Web UI Fails to Load in IE11

2018-07-10 Thread Bolke de Bruin (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2730?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin updated AIRFLOW-2730:

Priority: Minor  (was: Critical)

> Airflow 1.9.0+ Web UI Fails to Load in IE11
> ---
>
> Key: AIRFLOW-2730
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2730
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Reporter: Cameron Yick
>Priority: Minor
>
> As a developer, I would like to use Airflow in enterprise environments where 
> IE11 is the only browser available.
> Presently, the admin view doesn't load because some of the inlined JavaScript 
> on the page uses ES6 features, like the array spread operator. 
> Fixing this will become a lot easier after AIRFLOW-2691 goes through, 
> because transpilation could happen as part of the build process.
>  





[jira] [Updated] (AIRFLOW-2745) Use k8s service account for Kube Pod Operator if in Cluster and using LocalExecutor

2018-07-10 Thread Eamon Keane (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2745?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eamon Keane updated AIRFLOW-2745:
-
Description: 
When deploying airflow on kubernetes and using LocalExecutor, currently the 
Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
scheduler and has no awareness of being inside a cluster.

There is no option to mount a kubeconfig file on a KubernetesExecutor worker as 
the KubernetesExecutor instead launches Kubernetes Pod Operator pods using the 
mounted RBAC account.

For users switching between the KubernetesExecutor and LocalExecutor in a helm 
chart (for example by using --set core.executor=LocalExecutor), an additional 
kubeconfig secret has to be managed and mounted on the scheduler if they want 
to debug a dag which uses the Kubernetes Pod Operator, while it could instead 
use the RBAC account.

An example where switching between local and kubernetes executor was useful was 
to discover that the reason a dag worked on Local but not on Kubernetes 
Executor was because the fernet key was not specified as an environment 
variable in the worker definition.

The suggested improvement would be to use the mounted RBAC account on the 
scheduler pod to launch pods if in a kubernetes environment, removing the need 
for a kubeconfig.

 

  was:
When deploying airflow on kubernetes and using LocalExecutor, currently the 
Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
scheduler and has no awareness of being inside a cluster.

There is no option to mount a kubeconfig file on a KubernetesExecutor worker as 
the KubernetesExecutor instead launches Kubernetes Pod Operator pods using the 
mounted RBAC account.

For users switching between the KubernetesExecutor and LocalExecutor in a helm 
chart (for example by using --set core.executor=LocalExecutor), an additional 
kubeconfig secret has to be managed and mounted on scheduler if they want to 
debug a dag which uses the Kubernetes Pod Operator, while it could instead use 
the RBAC account.

An example where switching between local and kubernetes executor was useful was 
to discover that the reason a dag worked on Local but not on Kubernetes 
Executor was because the fernet key was not specified as an environment 
variable in the worker definition.

The suggested improvement would be to use the mounted RBAC account on the 
scheduler pod to launch pods if in a kubernetes environment, removing the need 
for a kubeconfig.

 


> Use k8s service account for Kube Pod Operator if in Cluster and using 
> LocalExecutor
> ---
>
> Key: AIRFLOW-2745
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2745
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 2.0.0
>Reporter: Eamon Keane
>Assignee: Daniel Imberman
>Priority: Minor
>  Labels: features
>
> When deploying airflow on kubernetes and using LocalExecutor, currently the 
> Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
> scheduler and has no awareness of being inside a cluster.
> There is no option to mount a kubeconfig file on a KubernetesExecutor worker 
> as the KubernetesExecutor instead launches Kubernetes Pod Operator pods using 
> the mounted RBAC account.
> For users switching between the KubernetesExecutor and LocalExecutor in a 
> helm chart (for example by using --set core.executor=LocalExecutor), an 
> additional kubeconfig secret has to be managed and mounted on the scheduler 
> if they want to debug a dag which uses the Kubernetes Pod Operator, while it 
> could instead use the RBAC account.
> An example where switching between local and kubernetes executor was useful 
> was to discover that the reason a dag worked on Local but not on Kubernetes 
> Executor was because the fernet key was not specified as an environment 
> variable in the worker definition.
> The suggested improvement would be to use the mounted RBAC account on the 
> scheduler pod to launch pods if in a kubernetes environment, removing the 
> need for a kubeconfig.
>  
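The in-cluster detection the report asks for can be sketched as follows (a hypothetical helper, not the eventual implementation; the env var and token path are the standard Kubernetes pod conventions):

```python
import os

# Standard location where Kubernetes mounts a pod's service account token.
SA_TOKEN = "/var/run/secrets/kubernetes.io/serviceaccount/token"

def in_cluster(environ=None, token_path=SA_TOKEN):
    """Return True when running inside a Kubernetes pod.

    When this is True, the scheduler could call
    kubernetes.config.load_incluster_config() and use the mounted RBAC
    service account, instead of requiring a mounted kubeconfig secret
    (kubernetes.config.load_kube_config()).
    """
    environ = os.environ if environ is None else environ
    return "KUBERNETES_SERVICE_HOST" in environ and os.path.exists(token_path)
```

The arguments are injectable only so the check is testable outside a cluster; real code would use the defaults.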





[jira] [Commented] (AIRFLOW-2190) base_url with a subpath generates TypeError

2018-07-10 Thread John Arnold (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538977#comment-16538977
 ] 

John Arnold commented on AIRFLOW-2190:
--

[~elgalu] Glad you got it working.  I think avoiding the proxy-pass issue 
basically avoids the TypeError, but the TypeError bug still exists.  Anyway, my 
airflow project got cancelled, so it's not likely I'll have time to update 
the docs or the code here.  Maybe you can pick it up?

> base_url with a subpath generates TypeError
> ---
>
> Key: AIRFLOW-2190
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2190
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.9.0
>Reporter: John Arnold
>Priority: Major
>
> I'm running into what looks like a bug in airflow webserver. Running against 
> master:
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: 
> [2018-03-07 18:20:13 +] [102] [ERROR] Error handling request /
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: Traceback 
> (most recent call last):
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: File 
> "/usr/local/lib/python3.6/site-packages/gunicorn/workers/sync.py", line 135, 
> in handle
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: 
> self.handle_request(listener, req, client, addr)
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: File 
> "/usr/local/lib/python3.6/site-packages/gunicorn/workers/sync.py", line 176, 
> in handle_request
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: respiter = 
> self.wsgi(environ, resp.start_response)
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: File 
> "/usr/local/lib/python3.6/site-packages/werkzeug/wsgi.py", line 826, in call
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: return 
> app(environ, start_response)
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: File 
> "/usr/local/lib/python3.6/site-packages/airflow/www/app.py", line 166, in 
> root_app
> Mar 7 18:20:13 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: resp(b'404 
> Not Found', [(b'Content-Type', b'text/plain')])
> Mar 7 18:20:14 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: File 
> "/usr/local/lib/python3.6/site-packages/gunicorn/http/wsgi.py", line 261, in 
> start_response
> Mar 7 18:20:14 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: 
> self.process_headers(headers)
> Mar 7 18:20:14 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: File 
> "/usr/local/lib/python3.6/site-packages/gunicorn/http/wsgi.py", line 268, in 
> process_headers
> Mar 7 18:20:14 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: raise 
> TypeError('%r is not a string' % name)
> Mar 7 18:20:14 netdocker1-eastus2 daemon INFO ca5ce9db3af6[92630]: TypeError: 
> b'Content-Type' is not a string
>  
> I just started using the base_url to put the webserver behind nginx proxy 
> under a sub-path, e.g. [http://domain.com/airflow]
> I've tried following the docs for nginx proxy, i.e.
> [webserver]
> base_url = [http://localhost/airflow|http://airflow-web/airflow]
>  
> I've also tried setting the base_url to the fully-qualified endpoint:
> base_url = [https://example.com/airflow|https://domain.com/airflow]
>  
> Neither work, both give the TypeError exception.
>  
> If I remove the sub-path:
> base_url = [https://example.com|https://domain.com/]
> then the app starts and runs ok and i can access it on the host but not 
> through the proxy.
>  
>  
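The traceback narrows the failure to the fallback `root_app` passing byte-string headers to gunicorn's `start_response`, which under Python 3 requires native `str`. A minimal sketch of the failure mode and the obvious correction (the actual fix in airflow/www/app.py may differ):

```python
# Broken form, matching the traceback: byte strings for the status line and
# header names/values make gunicorn's process_headers() raise
# "TypeError: b'Content-Type' is not a string".
def broken_root_app(environ, start_response):
    start_response(b'404 Not Found', [(b'Content-Type', b'text/plain')])
    return [b'Apache Airflow is not at this location']

# Corrected form: per WSGI (PEP 3333), the status and headers are native
# str; only the response body iterable is bytes.
def root_app(environ, start_response):
    start_response('404 Not Found', [('Content-Type', 'text/plain')])
    return [b'Apache Airflow is not at this location']
```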





[jira] [Commented] (AIRFLOW-2745) Use k8s service account for Kube Pod Operator if in Cluster and using LocalExecutor

2018-07-10 Thread Eamon Keane (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538978#comment-16538978
 ] 

Eamon Keane commented on AIRFLOW-2745:
--

Hope I've got the above correct!

> Use k8s service account for Kube Pod Operator if in Cluster and using 
> LocalExecutor
> ---
>
> Key: AIRFLOW-2745
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2745
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 2.0.0
>Reporter: Eamon Keane
>Assignee: Daniel Imberman
>Priority: Minor
>  Labels: features
>
> When deploying airflow on kubernetes and using LocalExecutor, currently the 
> Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
> scheduler and has no awareness of being inside a cluster.
> There is no option to mount a kubeconfig file on a KubernetesExecutor worker 
> as the KubernetesExecutor instead launches Kubernetes Pod Operator pods using 
> the mounted RBAC account.
> For users switching between the KubernetesExecutor and LocalExecutor in a 
> helm chart (for example by using --set core.executor=LocalExecutor), an 
> additional kubeconfig secret has to be managed and mounted on scheduler if 
> they want to debug a dag which uses the Kubernetes Pod Operator, while it 
> could instead use the RBAC account.
> An example where switching between local and kubernetes executor was useful 
> was to discover that the reason a dag worked on Local but not on Kubernetes 
> Executor was because the fernet key was not specified as an environment 
> variable in the worker definition.
> The suggested improvement would be to use the mounted RBAC account on the 
> scheduler pod to launch pods if in a kubernetes environment, removing the 
> need for a kubeconfig.
>  





[jira] [Updated] (AIRFLOW-2745) Use k8s service account for Kube Pod Operator if in Cluster and using LocalExecutor

2018-07-10 Thread Eamon Keane (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2745?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eamon Keane updated AIRFLOW-2745:
-
Description: 
When deploying airflow on kubernetes and using LocalExecutor, currently the 
Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
scheduler and has no awareness of being inside a cluster.

There is no option to mount a kubeconfig file on a KubernetesExecutor worker as 
the KubernetesExecutor instead launches Kubernetes Pod Operator pods using the 
mounted RBAC account.

For users switching between the KubernetesExecutor and LocalExecutor in a helm 
chart (for example by using --set core.executor=LocalExecutor), an additional 
kubeconfig secret has to be managed and mounted on scheduler if they want to 
debug a dag which uses the Kubernetes Pod Operator, while it could instead use 
the RBAC account.

An example where switching between local and kubernetes executor was useful was 
to discover that the reason a dag worked on Local but not on Kubernetes 
Executor was because the fernet key was not specified as an environment 
variable in the worker definition.

The suggested improvement would be to use the mounted RBAC account on the 
scheduler pod to launch pods if in a kubernetes environment, removing the need 
for a kubeconfig.

 

  was:
When deploying airflow on kubernetes and using LocalExecutor, currently the 
Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
scheduler and has no awareness of being inside a cluster.

There is no option to mount a kubeconfig file on a KubernetesExecutor worker as 
the KubernetesExecutor instead launches Kubernetes Pod Operator pods using the 
mounted RBAC account.

For users switching between the KubernetesExecutor and LocalExecutor in a helm 
chart (for example by using --set core.executor=LocalExecutor), an additional 
kubeconfig secret has to be managed and mounted on scheduler if they want to 
debug a dag which uses the Kubernetes Pod Operator, which it could instead use 
the RBAC account.

An example where switching between local and kubernetes executor was useful was 
to discover that the reason a dag worked on Local but not on Kubernetes 
Executor was because the fernet key was not specified as an environment 
variable in the worker definition.

The suggested improvement would be to use the mounted RBAC account on the 
scheduler pod to launch pods if in a kubernetes environment, removing the need 
for a kubeconfig.

 


> Use k8s service account for Kube Pod Operator if in Cluster and using 
> LocalExecutor
> ---
>
> Key: AIRFLOW-2745
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2745
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 2.0.0
>Reporter: Eamon Keane
>Assignee: Daniel Imberman
>Priority: Minor
>  Labels: features
>
> When deploying airflow on kubernetes and using LocalExecutor, currently the 
> Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
> scheduler and has no awareness of being inside a cluster.
> There is no option to mount a kubeconfig file on a KubernetesExecutor worker 
> as the KubernetesExecutor instead launches Kubernetes Pod Operator pods using 
> the mounted RBAC account.
> For users switching between the KubernetesExecutor and LocalExecutor in a 
> helm chart (for example by using --set core.executor=LocalExecutor), an 
> additional kubeconfig secret has to be managed and mounted on scheduler if 
> they want to debug a dag which uses the Kubernetes Pod Operator, while it 
> could instead use the RBAC account.
> An example where switching between local and kubernetes executor was useful 
> was to discover that the reason a dag worked on Local but not on Kubernetes 
> Executor was because the fernet key was not specified as an environment 
> variable in the worker definition.
> The suggested improvement would be to use the mounted RBAC account on the 
> scheduler pod to launch pods if in a kubernetes environment, removing the 
> need for a kubeconfig.
>  





[jira] [Created] (AIRFLOW-2745) Use k8s service account for Kube Pod Operator if in Cluster

2018-07-10 Thread Eamon Keane (JIRA)
Eamon Keane created AIRFLOW-2745:


 Summary: Use k8s service account for Kube Pod Operator if in 
Cluster
 Key: AIRFLOW-2745
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2745
 Project: Apache Airflow
  Issue Type: Improvement
  Components: operators
Affects Versions: 2.0.0
Reporter: Eamon Keane
Assignee: Daniel Imberman


When deploying airflow on kubernetes and using LocalExecutor, currently the 
Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
scheduler and has no awareness of being inside a cluster.

There is no option to mount a kubeconfig file on a KubernetesExecutor worker as 
the KubernetesExecutor instead launches Kubernetes Pod Operator pods using the 
mounted RBAC account.

For users switching between the KubernetesExecutor and LocalExecutor in a helm 
chart (for example by using --set core.executor=LocalExecutor), an additional 
kubeconfig secret has to be managed and mounted on scheduler if they want to 
debug a dag which uses the Kubernetes Pod Operator, which it could instead use 
the RBAC account.

An example where switching between local and kubernetes executor was useful was 
to discover that the reason a dag worked on Local but not on Kubernetes 
Executor was because the fernet key was not specified as an environment 
variable in the worker definition.

The suggested improvement would be to use the mounted RBAC account on the 
scheduler pod to launch pods if in a kubernetes environment, removing the need 
for a kubeconfig.

 





[jira] [Updated] (AIRFLOW-2745) Use k8s service account for Kube Pod Operator if in Cluster and using LocalExecutor

2018-07-10 Thread Eamon Keane (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2745?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eamon Keane updated AIRFLOW-2745:
-
Summary: Use k8s service account for Kube Pod Operator if in Cluster and 
using LocalExecutor  (was: Use k8s service account for Kube Pod Operator if in 
Cluster)

> Use k8s service account for Kube Pod Operator if in Cluster and using 
> LocalExecutor
> ---
>
> Key: AIRFLOW-2745
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2745
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 2.0.0
>Reporter: Eamon Keane
>Assignee: Daniel Imberman
>Priority: Minor
>  Labels: features
>
> When deploying airflow on kubernetes and using LocalExecutor, currently the 
> Kubernetes Pod Operator relies on a kubeconfig file being mounted on the 
> scheduler and has no awareness of being inside a cluster.
> There is no option to mount a kubeconfig file on a KubernetesExecutor worker 
> as the KubernetesExecutor instead launches Kubernetes Pod Operator pods using 
> the mounted RBAC account.
> For users switching between the KubernetesExecutor and LocalExecutor in a 
> helm chart (for example by using --set core.executor=LocalExecutor), an 
> additional kubeconfig secret has to be managed and mounted on scheduler if 
> they want to debug a dag which uses the Kubernetes Pod Operator, which it 
> could instead use the RBAC account.
> An example where switching between local and kubernetes executor was useful 
> was to discover that the reason a dag worked on Local but not on Kubernetes 
> Executor was because the fernet key was not specified as an environment 
> variable in the worker definition.
> The suggested improvement would be to use the mounted RBAC account on the 
> scheduler pod to launch pods if in a kubernetes environment, removing the 
> need for a kubeconfig.
>  





[jira] [Commented] (AIRFLOW-774) dagbag_size/collect_dags/dagbag_import_errors stats incorrect

2018-07-10 Thread Austin Hsu (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538973#comment-16538973
 ] 

Austin Hsu commented on AIRFLOW-774:


Hello!  Any update on the progress in fixing this issue or pointers on how to 
go about fixing it?  

We are currently sending metrics to Datadog and experienced the same issue 
detailed here with the incorrect numbers.  Thanks!

> dagbag_size/collect_dags/dagbag_import_errors stats incorrect
> -
>
> Key: AIRFLOW-774
> URL: https://issues.apache.org/jira/browse/AIRFLOW-774
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Reporter: Dan Davydov
>Priority: Major
>
> After the multiprocessor change was made (dag folders are processed in 
> parallel), the number of dags reported by airflow is for each of these 
> subprocesses which is inaccurate, and potentially orders of magnitude less 
> than the actual number of dags. These individual processes stats should be 
> aggregated. The collect_dags/dagbag_import_errors stats should also be fixed 
> (time it takes to parse the dags).
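The aggregation being asked for can be sketched like this (names are illustrative, not Airflow internals): each parsing subprocess reports its own counts, and the parent must sum them before emitting a single stat rather than emitting one per process.

```python
# Sketch of per-subprocess stat aggregation. Each subprocess returns a dict
# such as {"dagbag_size": 3, "dagbag_import_errors": 1}; the parent merges
# them so the emitted metric reflects the whole dag folder.
from collections import Counter

def aggregate_stats(per_process_stats):
    """Sum stat dicts produced by each dag-parsing subprocess."""
    total = Counter()
    for stats in per_process_stats:
        total.update(stats)  # Counter.update adds counts key by key
    return dict(total)
```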





[jira] [Created] (AIRFLOW-2744) RBAC app doesn't integrate plugins (blueprints etc)

2018-07-10 Thread David Dossett (JIRA)
David Dossett created AIRFLOW-2744:
--

 Summary: RBAC app doesn't integrate plugins (blueprints etc)
 Key: AIRFLOW-2744
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2744
 Project: Apache Airflow
  Issue Type: Bug
  Components: webapp, webserver
Affects Versions: 1.10
Reporter: David Dossett


In the current 1.10.0rc tag, the new RBAC app doesn't integrate any plugins 
created by a user extending Airflow. In the old www/app.py you had the 
[integrate_plugins|https://github.com/apache/incubator-airflow/blob/f1083cbada337731ed0b7e27b09eee7a26c8189a/airflow/www/app.py#L126]
 function. But currently the 
[www_rbac/app.py|https://github.com/apache/incubator-airflow/blob/f1083cbada337731ed0b7e27b09eee7a26c8189a/airflow/www_rbac/app.py]
 doesn't pull in any plugins from the plugin_manager. So nothing you do to 
extend Airflow's webapp will work.

I think adding the code for registering the blueprints and menu links is a 
pretty simple fix. I'm not sure how the FAB system is handling the same 
functionality as Flask-Admin views though.
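The blueprint half of that fix could look roughly like the old integrate_plugins loop, applied to the FAB app's underlying Flask instance (a sketch under that assumption; the plugin list is a stand-in for what airflow.plugins_manager would supply):

```python
# Hypothetical sketch of the missing plugin integration: register each
# plugin-provided Blueprint on the Flask app, the way www/app.py's
# integrate_plugins() did for the non-RBAC webserver.
from flask import Flask, Blueprint

def integrate_plugin_blueprints(app, blueprints):
    for bp in blueprints:
        app.register_blueprint(bp)

# Usage sketch with one fake plugin blueprint:
app = Flask(__name__)
example_bp = Blueprint("my_plugin", __name__, url_prefix="/my_plugin")

@example_bp.route("/ping")
def ping():
    return "pong"

integrate_plugin_blueprints(app, [example_bp])
```

Menu links and the Flask-Admin-style views would still need their own FAB-specific handling.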

 





[jira] [Closed] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maciej Bryński closed AIRFLOW-2743.
---
Resolution: Not A Problem

> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor





[jira] [Commented] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread JIRA


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538749#comment-16538749
 ] 

Maciej Bryński commented on AIRFLOW-2743:
-

You're right.
It was file.task.
Closing
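For anyone hitting the same symptom: the resolution was a stale 1.9-style `task_log_reader` value. Assuming the stock 1.10 logging config, the task handler is named `task`, not `file.task`, so the cfg should read:

```ini
[core]
# 1.9-era configs used "file.task"; the bundled 1.10 logging config
# names the task log handler simply "task".
task_log_reader = task
```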

> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor





[jira] [Comment Edited] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread JIRA


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538749#comment-16538749
 ] 

Maciej Bryński edited comment on AIRFLOW-2743 at 7/10/18 3:11 PM:
--

You're right.
It was file.task.
Closing. Thank you for the help.


was (Author: maver1ck):
You're right.
It was file.task.
Closing

> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor





[jira] [Commented] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538736#comment-16538736
 ] 

Ash Berlin-Taylor commented on AIRFLOW-2743:


Are you using the tip of that commit? Do you have an airflow.cfg other than the 
one in that repo? Because 
https://github.com/gsemet/docker-airflow/commit/b01303f308c594de7ae03f4fdc8836dfd7683044#diff-500764aeca0d87e4244864271cffd099L101
 shows the fix that should work. But judging from the error, you are using a 
different value.

> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor





[jira] [Commented] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538716#comment-16538716
 ] 

Ash Berlin-Taylor commented on AIRFLOW-2743:


Do you have an airflow_local_settings.py file?

> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor





[jira] [Commented] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread JIRA


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538706#comment-16538706
 ] 

Maciej Bryński commented on AIRFLOW-2743:
-

Fresh install of 1.10rc1.
Run in Docker from 
https://github.com/gsemet/docker-airflow/tree/2.0dev
(only the version changed)

Logs are created in the worker containers (there are log files).

From the webserver log I'm getting only this (no exception):
{code}
100.117.96.198 - - [10/Jul/2018:14:42:11 +] "GET 
/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
 HTTP/1.1" 200 192 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) 
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"
{code}



> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor



--


[jira] [Commented] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538692#comment-16538692
 ] 

Ash Berlin-Taylor commented on AIRFLOW-2743:


Have you changed anything about Airflow's task logging? Is this a fresh install 
of 1.10rc1 or an upgrade from a previous version?

> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor



--


[jira] [Updated] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maciej Bryński updated AIRFLOW-2743:

Description: 
When I'm trying to check logs from web UI nothing is showing up.

Using developer console I can see that there is request to:
https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null

With response
{code}
{"error": true, 
  "message": ["Task log handler file.task does not support read 
logs.\n'NoneType' object has no attribute 'read'\n"], 
  "metadata": {"end_of_log": true}
}
{code}

PS. I'm using CeleryExecutor

  was:
When I'm trying to check logs from web UI nothing is showing up.

Using console I can see that there is request to:
https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null

With response
{code}
{"error": true, 
  "message": ["Task log handler file.task does not support read 
logs.\n'NoneType' object has no attribute 'read'\n"], 
  "metadata": {"end_of_log": true}
}
{code}

PS. I'm using CeleryExecutor


> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using developer console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor



--


[jira] [Created] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread JIRA
Maciej Bryński created AIRFLOW-2743:
---

 Summary: Task log handler file.task does not support read logs
 Key: AIRFLOW-2743
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
 Project: Apache Airflow
  Issue Type: Bug
  Components: logging
Affects Versions: 1.10.0
Reporter: Maciej Bryński


When I'm trying to check logs from web UI nothing is showing up.

Using console I can see that there is request to:
https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null

With response
{code}
{"error": true, 
  "message": ["Task log handler file.task does not support read 
logs.\n'NoneType' object has no attribute 'read'\n"], 
  "metadata": {"end_of_log": true}
}
{code}



--


[jira] [Updated] (AIRFLOW-2743) Task log handler file.task does not support read logs

2018-07-10 Thread JIRA


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2743?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maciej Bryński updated AIRFLOW-2743:

Description: 
When I'm trying to check logs from web UI nothing is showing up.

Using console I can see that there is request to:
https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null

With response
{code}
{"error": true, 
  "message": ["Task log handler file.task does not support read 
logs.\n'NoneType' object has no attribute 'read'\n"], 
  "metadata": {"end_of_log": true}
}
{code}

PS. I'm using CeleryExecutor

  was:
When I'm trying to check logs from web UI nothing is showing up.

Using console I can see that there is request to:
https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null

With response
{code}
{"error": true, 
  "message": ["Task log handler file.task does not support read 
logs.\n'NoneType' object has no attribute 'read'\n"], 
  "metadata": {"end_of_log": true}
}
{code}


> Task log handler file.task does not support read logs
> -
>
> Key: AIRFLOW-2743
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2743
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.0
>Reporter: Maciej Bryński
>Priority: Major
>
> When I'm trying to check logs from web UI nothing is showing up.
> Using console I can see that there is request to:
> https://airflow_server/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_this_last&execution_date=2018-07-09T00%3A00%3A00%2B00%3A00&try_number=1&metadata=null
> With response
> {code}
> {"error": true, 
>   "message": ["Task log handler file.task does not support read 
> logs.\n'NoneType' object has no attribute 'read'\n"], 
>   "metadata": {"end_of_log": true}
> }
> {code}
> PS. I'm using CeleryExecutor



--


[jira] [Created] (AIRFLOW-2742) Fix invalid tez.job.queue.name key

2018-07-10 Thread Jinho Kim (JIRA)
Jinho Kim created AIRFLOW-2742:
--

 Summary: Fix invalid tez.job.queue.name key
 Key: AIRFLOW-2742
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2742
 Project: Apache Airflow
  Issue Type: Bug
  Components: hooks
Affects Versions: 1.10
Reporter: Jinho Kim


The property key should be {{tez.queue.name}} instead of {{tez.job.queue.name}} 
in HiveCliHook.
The property can also be added via the extra field, so this is not a major issue.
https://github.com/apache/tez/blob/master/tez-api/src/main/java/org/apache/tez/dag/api/TezConfiguration.java
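As a hedged illustration of the fix (the helper below is hypothetical, not HiveCliHook's actual code), the hook only needs to emit the key Tez recognises when building the hive CLI settings:

```python
def queue_conf_args(queue_name, engine="tez"):
    """Hypothetical helper: build -hiveconf flags for the job queue."""
    # "tez.job.queue.name" is not a key Tez recognises; "tez.queue.name" is.
    key = "tez.queue.name" if engine == "tez" else "mapreduce.job.queuename"
    return ["-hiveconf", "%s=%s" % (key, queue_name)]

print(queue_conf_args("etl"))  # ['-hiveconf', 'tez.queue.name=etl']
```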



--


[jira] [Updated] (AIRFLOW-2739) Airflow crashes on startup if LC_ALL env isnt set to utf-8

2018-07-10 Thread Carl Johan Gustavsson (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2739?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Carl Johan Gustavsson updated AIRFLOW-2739:
---
Description: 
When running Airflow 1.10.0 RC1 without LC_ALL environment variable set Airflow 
crashes on start with the following trace

 
{code:java}
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01 Traceback (most 
recent call last):
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/bin/airflow", line 21, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     from airflow 
import configuration
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/site-packages/airflow/__init__.py",
 line 35, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     from airflow 
import configuration as conf
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/site-packages/airflow/configuration.py",
 line 106, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     DEFAULT_CONFIG = 
f.read()
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/encodings/ascii.py", line 26, 
in decode
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     return 
codecs.ascii_decode(input, self.errors)[0]
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01 UnicodeDecodeError: 
'ascii' codec can't decode byte 0xe2 in position 20770: ordinal not in 
range(128)
{code}
This is because `config_templates/default_airflow.cfg` contains a non-ASCII 
character, and if LC_ALL isn't set to `en_US.UTF-8` or similar, Python will 
assume the file is ASCII.

The solution would be to always open the config files as UTF-8, regardless of 
the LC_ALL environment variable.

 

This worked up until 
[https://github.com/apache/incubator-airflow/commit/16bae5634df24132b37eb752fe816f51bf7e83ca]
 it seems.

 

Python versions affected: 3.4.0, 3.5.5, 3.6.0.
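A minimal sketch of the proposed fix (the paths are illustrative): open the template with an explicit encoding so the read no longer depends on LC_ALL.

```python
import os
import tempfile

# Write a config template containing a non-ASCII character (0xe2 is the first
# byte of many UTF-8 punctuation marks, e.g. the right single quote below).
path = os.path.join(tempfile.mkdtemp(), "default_airflow.cfg")
with open(path, "w", encoding="utf-8") as f:
    f.write("# Airflow\u2019s default config\n")

# The fix: pass encoding explicitly instead of relying on the locale default
# (locale.getpreferredencoding()), so the read succeeds even when LC_ALL is
# unset or set to C/POSIX.
with open(path, encoding="utf-8") as f:
    DEFAULT_CONFIG = f.read()

assert "\u2019" in DEFAULT_CONFIG
```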

  was:
When running Airflow 1.10.0 RC1 without LC_ALL environment variable set Airflow 
crashes on start with the following trace

 
{code:java}
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01 Traceback (most 
recent call last):
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/bin/airflow", line 21, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     from airflow 
import configuration
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/site-packages/airflow/__init__.py",
 line 35, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     from airflow 
import configuration as conf
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/site-packages/airflow/configuration.py",
 line 106, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     DEFAULT_CONFIG = 
f.read()
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/encodings/ascii.py", line 26, 
in decode
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     return 
codecs.ascii_decode(input, self.errors)[0]
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01 UnicodeDecodeError: 
'ascii' codec can't decode byte 0xe2 in position 20770: ordinal not in 
range(128)
{code}
This is because `config_templates/default_airflow.cfg` contains a non-ASCII 
character, and if LC_ALL isn't set to `en_US.UTF-8` or similar, Python will 
assume the file is ASCII.

The solution would be to always open the config files as UTF-8, regardless of 
the LC_ALL environment variable.

 

This worked up until 
[https://github.com/apache/incubator-airflow/commit/16bae5634df24132b37eb752fe816f51bf7e83ca]
 it seems.


> Airflow crashes on startup if LC_ALL env isnt set to utf-8
> --
>
> Key: AIRFLOW-2739
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2739
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.10.0
> Environment: Python 3.6.0, Ubuntu 14.04.5 LTS 
>Reporter: Carl Johan Gustavsson
>Assignee: Carl Johan Gustavsson
>Priority: Major
>
> When running Airflow 1.10.0 RC1 without LC_ALL environment variable set 
> Airflow crashes on start with the following trace
>  
> {code:java}
> Jul 10 08:50:33 hostname supervisord: airflow-webserver-01 Traceback (most 
> recent call last):
> Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
> "/opt/virtualenv/tictail/pipeline/bin/airflow", line 21, in <module>
> Jul 10 08:50:33 

[jira] [Created] (AIRFLOW-2741) Example Kubernetes pod operator doesn't work

2018-07-10 Thread Jon Davies (JIRA)
Jon Davies created AIRFLOW-2741:
---

 Summary: Example Kubernetes pod operator doesn't work
 Key: AIRFLOW-2741
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2741
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Jon Davies


I'm trying to make the DAG from this blog post work:
 
https://www.techatbloomberg.com/blog/airflow-on-kubernetes/

But it fails with:

{code}
airflow-test/airflow-5cc9cf9f99-v6rtd[scheduler]: [2018-07-10 13:00:41,375] 
{kubernetes_executor.py:579} INFO - self.running: {(u'kubernetes_sample', 
u'run_this_first', datetime.datetime(2018, 7, 10, 13, 0, 17, 332338, 
tzinfo=psycopg2.tz.FixedOffsetTimezone(offset=0, name=None))): u'airflow run 
kubernetes_sample run_this_first 2018-07-10T13:00:17.332338+00:00 --local -sd 
/root/airflow/dags'}
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 + airflow run kubernetes_sample run_this_first 
2018-07-10T13:00:17.332338+00:00 --local -sd /root/airflow/dags
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 [2018-07-10 13:00:57,667] {settings.py:174} INFO - setting.configure_orm(): 
Using pool settings. pool_size=5, pool_recycle=1800
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 [2018-07-10 13:00:57,919] {__init__.py:51} INFO - Using executor LocalExecutor
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 [2018-07-10 13:00:58,023] {models.py:257} INFO - Filling up the DagBag from 
/root/airflow/dags
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 [2018-07-10 13:00:58,029] {models.py:348} INFO - File 
/usr/local/lib/python2.7/dist-packages/airflow/example_dags/__init__.py assumed 
to contain no DAGs. Skipping.
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 Traceback (most recent call last):
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
   File "/usr/local/bin/airflow", line 32, in <module>
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 args.func(args)
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
   File "/usr/local/lib/python2.7/dist-packages/airflow/utils/cli.py", line 74, 
in wrapper
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 return f(*args, **kwargs)
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
   File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 475, 
in run
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 dag = get_dag(args)
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
   File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 146, 
in get_dag
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 'parse.'.format(args.dag_id))
airflow-test/kubernetessamplerunthisfirst-9be9b8e6d3b84b949fdd9cc7ba3263c3[base]:
 airflow.exceptions.AirflowException: dag_id could not be found: 
kubernetes_sample. Either the dag did not exist or it failed to parse.
{code}

...note that the DAG isn't in my Docker image, I just cp'ed it to the 
webserver/scheduler as the blog post did.



--


[jira] [Updated] (AIRFLOW-2740) Kubernetes RBAC policy required

2018-07-10 Thread Jon Davies (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2740?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jon Davies updated AIRFLOW-2740:

Description: 
The Airflow Executor needs to ship with an example policy, something like:
{code:java}
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: pod-reader
rules:
- apiGroups: [""] # "" indicates the core API group
  resources: ["pods"]
  verbs: ["create", "delete", "get", "watch", "list"]
---
# This role binding allows "default" to read pods in the "testing-airflow" 
namespace.
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: read-pods
subjects:
- kind: ServiceAccount
  name: default # Name is case sensitive
roleRef:
  kind: Role #this must be Role or ClusterRole
  name: pod-reader # this must match the name of the Role or ClusterRole you 
wish to bind to
  apiGroup: rbac.authorization.k8s.io
{code}

  was:
The Airflow Executor needs to ship with an example policy, something like:

{code}
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  namespace: production-airflow
  name: pod-reader
rules:
- apiGroups: [""] # "" indicates the core API group
  resources: ["pods"]
  verbs: ["create", "delete", "get", "watch", "list"]
---
# This role binding allows "default" to read pods in the "testing-airflow" 
namespace.
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: read-pods
subjects:
- kind: ServiceAccount
  name: default # Name is case sensitive
roleRef:
  kind: Role #this must be Role or ClusterRole
  name: pod-reader # this must match the name of the Role or ClusterRole you 
wish to bind to
  apiGroup: rbac.authorization.k8s.io
{code}


> Kubernetes RBAC policy required
> ---
>
> Key: AIRFLOW-2740
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2740
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jon Davies
>Priority: Major
>
> The Airflow Executor needs to ship with an example policy, something like:
> {code:java}
> kind: Role
> apiVersion: rbac.authorization.k8s.io/v1
> metadata:
>   name: pod-reader
> rules:
> - apiGroups: [""] # "" indicates the core API group
>   resources: ["pods"]
>   verbs: ["create", "delete", "get", "watch", "list"]
> ---
> # This role binding allows "default" to read pods in the "testing-airflow" 
> namespace.
> kind: RoleBinding
> apiVersion: rbac.authorization.k8s.io/v1
> metadata:
>   name: read-pods
> subjects:
> - kind: ServiceAccount
>   name: default # Name is case sensitive
> roleRef:
>   kind: Role #this must be Role or ClusterRole
>   name: pod-reader # this must match the name of the Role or ClusterRole you 
> wish to bind to
>   apiGroup: rbac.authorization.k8s.io
> {code}



--


[jira] [Updated] (AIRFLOW-2740) Kubernetes RBAC policy required

2018-07-10 Thread Jon Davies (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2740?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jon Davies updated AIRFLOW-2740:

Description: 
The Airflow Executor needs to ship with an example policy, something like:
{code:java}
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: pod-reader
rules:
- apiGroups: [""] # "" indicates the core API group
  resources: ["pods"]
  verbs: ["create", "delete", "get", "watch", "list"]
---
# This role binding allows "default" to read pods
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: read-pods
subjects:
- kind: ServiceAccount
  name: default # Name is case sensitive
roleRef:
  kind: Role #this must be Role or ClusterRole
  name: pod-reader # this must match the name of the Role or ClusterRole you 
wish to bind to
  apiGroup: rbac.authorization.k8s.io
{code}

  was:
The Airflow Executor needs to ship with an example policy, something like:
{code:java}
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: pod-reader
rules:
- apiGroups: [""] # "" indicates the core API group
  resources: ["pods"]
  verbs: ["create", "delete", "get", "watch", "list"]
---
# This role binding allows "default" to read pods in the "testing-airflow" 
namespace.
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: read-pods
subjects:
- kind: ServiceAccount
  name: default # Name is case sensitive
roleRef:
  kind: Role #this must be Role or ClusterRole
  name: pod-reader # this must match the name of the Role or ClusterRole you 
wish to bind to
  apiGroup: rbac.authorization.k8s.io
{code}


> Kubernetes RBAC policy required
> ---
>
> Key: AIRFLOW-2740
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2740
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jon Davies
>Priority: Major
>
> The Airflow Executor needs to ship with an example policy, something like:
> {code:java}
> kind: Role
> apiVersion: rbac.authorization.k8s.io/v1
> metadata:
>   name: pod-reader
> rules:
> - apiGroups: [""] # "" indicates the core API group
>   resources: ["pods"]
>   verbs: ["create", "delete", "get", "watch", "list"]
> ---
> # This role binding allows "default" to read pods
> kind: RoleBinding
> apiVersion: rbac.authorization.k8s.io/v1
> metadata:
>   name: read-pods
> subjects:
> - kind: ServiceAccount
>   name: default # Name is case sensitive
> roleRef:
>   kind: Role #this must be Role or ClusterRole
>   name: pod-reader # this must match the name of the Role or ClusterRole you 
> wish to bind to
>   apiGroup: rbac.authorization.k8s.io
> {code}



--


[jira] [Created] (AIRFLOW-2740) Kubernetes RBAC policy required

2018-07-10 Thread Jon Davies (JIRA)
Jon Davies created AIRFLOW-2740:
---

 Summary: Kubernetes RBAC policy required
 Key: AIRFLOW-2740
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2740
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Jon Davies


The Airflow Executor needs to ship with an example policy, something like:

{code}
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  namespace: production-airflow
  name: pod-reader
rules:
- apiGroups: [""] # "" indicates the core API group
  resources: ["pods"]
  verbs: ["create", "delete", "get", "watch", "list"]
---
# This role binding allows "default" to read pods in the "testing-airflow" 
namespace.
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: read-pods
subjects:
- kind: ServiceAccount
  name: default # Name is case sensitive
roleRef:
  kind: Role #this must be Role or ClusterRole
  name: pod-reader # this must match the name of the Role or ClusterRole you 
wish to bind to
  apiGroup: rbac.authorization.k8s.io
{code}



--


[jira] [Created] (AIRFLOW-2739) Airflow crashes on startup if LC_ALL env isnt set to utf-8

2018-07-10 Thread Carl Johan Gustavsson (JIRA)
Carl Johan Gustavsson created AIRFLOW-2739:
--

 Summary: Airflow crashes on startup if LC_ALL env isnt set to utf-8
 Key: AIRFLOW-2739
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2739
 Project: Apache Airflow
  Issue Type: Bug
  Components: configuration
Affects Versions: 1.10.0
 Environment: Python 3.6.0, Ubuntu 14.04.5 LTS 
Reporter: Carl Johan Gustavsson
Assignee: Carl Johan Gustavsson


When running Airflow 1.10.0 RC1 without LC_ALL environment variable set Airflow 
crashes on start with the following trace

 
{code:java}
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01 Traceback (most 
recent call last):
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/bin/airflow", line 21, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     from airflow 
import configuration
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/site-packages/airflow/__init__.py",
 line 35, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     from airflow 
import configuration as conf
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/site-packages/airflow/configuration.py",
 line 106, in <module>
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     DEFAULT_CONFIG = 
f.read()
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01   File 
"/opt/virtualenv/tictail/pipeline/lib/python3.6/encodings/ascii.py", line 26, 
in decode
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01     return 
codecs.ascii_decode(input, self.errors)[0]
Jul 10 08:50:33 hostname supervisord: airflow-webserver-01 UnicodeDecodeError: 
'ascii' codec can't decode byte 0xe2 in position 20770: ordinal not in 
range(128)
{code}
This is because `config_templates/default_airflow.cfg` contains a non-ASCII 
character, and if LC_ALL isn't set to `en_US.UTF-8` or similar, Python will 
assume the file is ASCII.

The solution would be to always open the config files as UTF-8, regardless of 
the LC_ALL environment variable.

 

This worked up until 
[https://github.com/apache/incubator-airflow/commit/16bae5634df24132b37eb752fe816f51bf7e83ca]
 it seems.



--


[jira] [Commented] (AIRFLOW-2614) Airflow trigger_run API is very slow

2018-07-10 Thread Mishika Singh (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2614?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538486#comment-16538486
 ] 

Mishika Singh commented on AIRFLOW-2614:


Raised a PR for this: [https://github.com/apache/incubator-airflow/pull/3590]

 

> Airflow trigger_run API is very slow
> 
>
> Key: AIRFLOW-2614
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2614
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DagRun
>Affects Versions: Airflow 2.0, 1.9.0
>Reporter: raman
>Priority: Major
>
> The current implementation of trigger_dag processes all local DAG files 
> sequentially before creating a DAG run. This is done inside the trigger_dag 
> function in trigger_dag.py:
> "def trigger_dag(dag_id, run_id=None, conf=None, execution_date=None):
>  dagbag = DagBag()
> ."
> Processing all the files to build the DagBag slows down the trigger_dag API 
> and increases latency. We have observed that it starts taking tens of seconds 
> as the number of DAG files increases.
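The difference can be sketched with plain dictionaries (all names below are hypothetical stand-ins, not Airflow code; in Airflow the dag_id-to-file mapping could come from the metadata DB): parse only the file known to define the requested dag_id instead of the whole folder.

```python
def parse_file(path):
    # Stand-in for DagBag's per-file import: return dag_ids defined in the file.
    return {"dag_%s" % path: path}

def trigger_dag_slow(dag_id, dag_files):
    dagbag = {}
    for f in dag_files:            # current behaviour: every file parsed first
        dagbag.update(parse_file(f))
    return dagbag[dag_id]

def trigger_dag_fast(dag_id, file_for_dag):
    # parse only the single file known to contain dag_id
    return parse_file(file_for_dag[dag_id])[dag_id]

files = ["%d.py" % i for i in range(1000)]
index = {"dag_%s" % f: f for f in files}   # dag_id -> defining file
assert trigger_dag_slow("dag_7.py", files) == trigger_dag_fast("dag_7.py", index)
```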



--


[jira] [Assigned] (AIRFLOW-2614) Airflow trigger_run API is very slow

2018-07-10 Thread Mishika Singh (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2614?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mishika Singh reassigned AIRFLOW-2614:
--

Assignee: Mishika Singh

> Airflow trigger_run API is very slow
> 
>
> Key: AIRFLOW-2614
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2614
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DagRun
>Affects Versions: Airflow 2.0, 1.9.0
>Reporter: raman
>Assignee: Mishika Singh
>Priority: Major
>
> Current implementation of trigger_dag processes all Local dag files 
> sequentially before creating a DAG run. Its done inside trigger_dag function 
> in trigger_dag.py
> "def trigger_dag(dag_id, run_id=None, conf=None, execution_date=None):
>  dagbag = DagBag()
> ."
> Processing all the files to get the dagBag slows down the trigger_dag api and 
> increases latency. We have observed that it starts taking 10(s) of seconds as 
> number of Dag Files increase. 



--


[jira] [Created] (AIRFLOW-2738) XCOM's don't work with PythonVirtualenvOperator

2018-07-10 Thread Cezary (JIRA)
Cezary created AIRFLOW-2738:
---

 Summary: XCOM's don't work with PythonVirtualenvOperator
 Key: AIRFLOW-2738
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2738
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: Airflow 1.8, Airflow 1.9.0
Reporter: Cezary


When you set _provide_context_ to True in PythonVirtualenvOperator and try to 
use xcom_push, Airflow tries to pickle the context and throws an error:
{code:java}
Traceback (most recent call last):
  File 
"/home/czarek/.virtualenvs/airflow/lib/python3.6/site-packages/airflow/models.py",
 line 1493, in _run_raw_task
result = task_copy.execute(context=context)
  File 
"/home/czarek/.virtualenvs/airflow/lib/python3.6/site-packages/airflow/operators/python_operator.py",
 line 93, in execute
return_value = self.execute_callable()
  File 
"/home/czarek/.virtualenvs/airflow/lib/python3.6/site-packages/airflow/operators/python_operator.py",
 line 249, in execute_callable
self._write_args(input_filename)
  File 
"/home/czarek/.virtualenvs/airflow/lib/python3.6/site-packages/airflow/operators/python_operator.py",
 line 305, in _write_args
pickle.dump(arg_dict, f)
TypeError: can't pickle module objects
{code}
I tried to debug, and it seems that _macros_, _task_instance_, _ti_ and _conf_ 
in the context are the culprits here.
Using dill doesn't fix this problem.
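A small illustration of the failure and of one possible workaround (the key names come from the report above; the filtering is hypothetical, not Airflow code): module-backed values like conf and macros can't be pickled, but dropping those keys lets the rest of the context through.

```python
import pickle

# Modules stand in for the unpicklable context values (conf, macros, ...).
context = {"ds": "2018-07-10", "macros": pickle, "conf": pickle}

try:
    pickle.dumps(context)          # reproduces the TypeError from the report
except TypeError as e:
    print(e)

# Hypothetical workaround: strip the keys known to hold unpicklable objects.
UNPICKLABLE = {"macros", "task_instance", "ti", "conf", "var"}
safe = {k: v for k, v in context.items() if k not in UNPICKLABLE}
payload = pickle.dumps(safe)       # succeeds once module-backed keys are gone
```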



--


[jira] [Commented] (AIRFLOW-2737) Restore original license header

2018-07-10 Thread Stefan Seelmann (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2737?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538405#comment-16538405
 ] 

Stefan Seelmann commented on AIRFLOW-2737:
--

PR: https://github.com/apache/incubator-airflow/pull/3591

> Restore original license header
> ---
>
> Key: AIRFLOW-2737
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2737
> Project: Apache Airflow
>  Issue Type: Improvement
>Affects Versions: 1.9.0
>Reporter: Stefan Seelmann
>Assignee: Stefan Seelmann
>Priority: Major
> Fix For: 2.0.0
>
>
> The original license header in airflow/api/auth/backend/kerberos_auth.py was 
> replaced with the AL. It should be restored.



--


[jira] [Resolved] (AIRFLOW-2735) Equality vs identity check error with AIRFLOW-2706 patch

2018-07-10 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2735?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-2735.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

Issue resolved by pull request #3589
[https://github.com/apache/incubator-airflow/pull/3589]

> Equality vs identity check error with AIRFLOW-2706 patch
> 
>
> Key: AIRFLOW-2735
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2735
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Craig Forster
>Assignee: Craig Forster
>Priority: Minor
> Fix For: 2.0.0
>
>
> AIRFLOW-2706 uses "job_status is 'FAILED'" whereas it should be job_status == 
> 'FAILED'.
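The bug class is easy to demonstrate in pure Python (unrelated to Airflow internals): `is` compares object identity, so it only appears to work when CPython happens to reuse the string object.

```python
status = "".join(["FAIL", "ED"])   # built at runtime, so not interned
literal = "FAILED"

print(status == literal)   # True: value comparison, always correct here
print(status is literal)   # False: different objects despite equal values
```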



--


[jira] [Commented] (AIRFLOW-2735) Equality vs identity check error with AIRFLOW-2706 patch

2018-07-10 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2735?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16538226#comment-16538226
 ] 

ASF subversion and git services commented on AIRFLOW-2735:
--

Commit 9ebb04acbd07c208c30160bd28c342c569246e8f in incubator-airflow's branch 
refs/heads/master from [~cforster]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=9ebb04a ]

[AIRFLOW-2735] Use equality, not identity, check for detecting AWS Batch 
failures

Closes #3589 from craigforster/master


> Equality vs identity check error with AIRFLOW-2706 patch
> 
>
> Key: AIRFLOW-2735
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2735
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Craig Forster
>Assignee: Craig Forster
>Priority: Minor
> Fix For: 2.0.0
>
>
> AIRFLOW-2706 uses "job_status is 'FAILED'" whereas it should be
> "job_status == 'FAILED'".



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-2735] Use equality, not identity, check for detecting AWS Batch failures

2018-07-10 Thread fokko
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 5060d90db -> 9ebb04acb


[AIRFLOW-2735] Use equality, not identity, check for detecting AWS Batch 
failures

Closes #3589 from craigforster/master


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/9ebb04ac
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/9ebb04ac
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/9ebb04ac

Branch: refs/heads/master
Commit: 9ebb04acbd07c208c30160bd28c342c569246e8f
Parents: 5060d90
Author: Craig Forster 
Authored: Tue Jul 10 10:03:09 2018 +0200
Committer: Fokko Driesprong 
Committed: Tue Jul 10 10:03:09 2018 +0200

--
 airflow/contrib/operators/awsbatch_operator.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/9ebb04ac/airflow/contrib/operators/awsbatch_operator.py
--
diff --git a/airflow/contrib/operators/awsbatch_operator.py 
b/airflow/contrib/operators/awsbatch_operator.py
index 75706aa..a5c86af 100644
--- a/airflow/contrib/operators/awsbatch_operator.py
+++ b/airflow/contrib/operators/awsbatch_operator.py
@@ -153,7 +153,7 @@ class AWSBatchOperator(BaseOperator):
 
 for job in response['jobs']:
 job_status = job['status']
-if job_status is 'FAILED':
+if job_status == 'FAILED':
 reason = job['statusReason']
 raise AirflowException('Job failed with status {}'.format(reason))
 elif job_status in [
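The one-character diff above hinges on Python's distinction between identity (`is`) and value equality (`==`). A minimal sketch of the failure mode — using a hypothetical JSON payload as a stand-in for the real boto3 `describe_jobs()` response, not the actual AWS API — shows why the original check could silently skip a failed job:

```python
import json

# Hypothetical stand-in for a boto3 describe_jobs() response payload.
# Strings decoded from JSON are freshly created objects, so `is`
# (object identity) is not a reliable substitute for `==` (value
# equality), even when the characters match.
response = json.loads('{"jobs": [{"status": "FAILED"}]}')

job_status = response["jobs"][0]["status"]
sentinel = "FAILED"

print(job_status == sentinel)  # True: the values compare equal
# `job_status is sentinel` depends on CPython string interning and is
# typically False here, which is exactly why the original `is 'FAILED'`
# check never raised on failed jobs.
```

Whether `is` happens to return True for equal strings is an implementation detail of interning, so it must never be relied on for value comparison; CPython 3.8+ even emits a `SyntaxWarning` for `is` against a literal.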



[jira] [Created] (AIRFLOW-2737) Restore original license header

2018-07-10 Thread Stefan Seelmann (JIRA)
Stefan Seelmann created AIRFLOW-2737:


 Summary: Restore original license header
 Key: AIRFLOW-2737
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2737
 Project: Apache Airflow
  Issue Type: Improvement
Affects Versions: 1.9.0
Reporter: Stefan Seelmann
Assignee: Stefan Seelmann
 Fix For: 2.0.0


The original license header in airflow/api/auth/backend/kerberos_auth.py was 
replaced with the AL. It should be restored.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)