[GitHub] [airflow] dhuang commented on issue #6794: [AIRFLOW-6231] Display DAG run conf in the list view

2020-04-17 Thread GitBox
dhuang commented on issue #6794: [AIRFLOW-6231] Display DAG run conf in the 
list view
URL: https://github.com/apache/airflow/pull/6794#issuecomment-615572227
 
 
   Apologies for abandoning this; moved over to the list view as suggested! The PR 
description is updated with screenshots. I made it searchable as well, since I think 
it can be useful and it also keeps it consistent with both the list/add views.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5548) REST API: get DAGs

2020-04-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5548?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17086306#comment-17086306
 ] 

ASF GitHub Bot commented on AIRFLOW-5548:
-

stale[bot] commented on pull request #6652: [AIRFLOW-5548] [AIRFLOW-5550] REST 
API enhancement - dag info, task …
URL: https://github.com/apache/airflow/pull/6652
 
 
   
 



> REST API: get DAGs
> --
>
> Key: AIRFLOW-5548
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5548
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: api
>Affects Versions: 2.0.0
>Reporter: Norbert Biczo
>Assignee: Matt Buell
>Priority: Minor
>
> Like the already implemented [get 
> pools|https://airflow.apache.org/api.html#get--api-experimental-pools] but 
> with the DAGs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] stale[bot] closed pull request #6652: [AIRFLOW-5548] [AIRFLOW-5550] REST API enhancement - dag info, task …

2020-04-17 Thread GitBox
stale[bot] closed pull request #6652: [AIRFLOW-5548] [AIRFLOW-5550] REST API 
enhancement - dag info, task …
URL: https://github.com/apache/airflow/pull/6652
 
 
   




[GitHub] [airflow] mik-laj edited a comment on issue #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
mik-laj edited a comment on issue #8432: Provide GCP credentials in Bash/Python 
operators
URL: https://github.com/apache/airflow/pull/8432#issuecomment-615567459
 
 
   I've been thinking about this for a while: we could create a 
`get_subprocess_context_manager` method in the hook and also use the 
`get_hook` method here. I'm afraid it might be over-engineering. However, if 
you agree with me that we should use composition, I can try to do it.
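   A minimal sketch of what such a method could look like (the class name and 
behavior below are hypothetical, not the real Airflow hook API): the hook exposes 
a context manager that prepares an environment for a subprocess and cleans up 
afterwards.

   ```python
   import os
   import tempfile
   from contextlib import contextmanager

   class GoogleBaseHookSketch:
       """Hypothetical sketch of the idea above - not the real Airflow hook API."""
       def __init__(self, key_json='{"type": "service_account"}'):
           self.key_json = key_json

       @contextmanager
       def get_subprocess_context_manager(self):
           # Write the key to a temporary file and expose it the way the
           # Cloud SDK expects, cleaning up once the subprocess is done.
           with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
               f.write(self.key_json)
               path = f.name
           env = dict(os.environ, GOOGLE_APPLICATION_CREDENTIALS=path)
           try:
               yield env
           finally:
               os.unlink(path)

   hook = GoogleBaseHookSketch()
   with hook.get_subprocess_context_manager() as env:
       creds_path = env["GOOGLE_APPLICATION_CREDENTIALS"]
   ```

   An operator obtained via `get_hook` could then wrap its subprocess call in this 
context manager instead of inheriting provider-specific behavior.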




[GitHub] [airflow] mik-laj edited a comment on issue #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
mik-laj edited a comment on issue #8432: Provide GCP credentials in Bash/Python 
operators
URL: https://github.com/apache/airflow/pull/8432#issuecomment-615567459
 
 
   As I started thinking about it for a long time, we can create a  
`get_subprocess_context` method and also use the ``get_hook`` method here. I'm 
afraid it might be overenginnering.  However, if you agree with me that we 
should use the composition, I can try to do it.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj edited a comment on issue #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
mik-laj edited a comment on issue #8432: Provide GCP credentials in Bash/Python 
operators
URL: https://github.com/apache/airflow/pull/8432#issuecomment-615560449
 
 
   @potiuk This contradicts the whole idea and the need for this operator. 
BashOperator and PythonOperator are very useful because they are universal. Bash 
and Python are also built by composition: new applications are installed on 
the system and can be used by any tool. If we inherit and make the customization 
GCP-specific, we will limit its functionality. It will no longer be a universal 
operator; you will only be able to use it with one provider. I think this is a 
similar problem to ``Connection.get_hook``.
   
https://github.com/apache/airflow/blob/master/airflow/models/connection.py#L301
   This method is useful because it is universal and can be used regardless of 
the provider.
   In the future, if we need to separate the core and providers, we can extend 
this class with a plugin that adds new parameters to the class. 
Like the ``get_hook`` method, it should use the plugin mechanism.
   
   I hope that in the future new parameters will be added for other cloud 
providers, e.g. AWS.
   ```python
   cross_platform_task = BashOperator(
       task_id='gcloud',
       bash_command=(
           'gsutil cp gs://bucket/a.txt a.txt && '
           'aws s3 cp test.txt s3://mybucket/test2.txt'
       ),
       gcp_conn_id=GCP_PROJECT_ID,
       aws_conn_id=AWS_PROJECT_ID,
   )
   ```
   Then it will still be a universal operator and we will not build 
vendor lock-in for a single provider.
   
   From an architectural point of view, inheritance would be a poor fit here; we 
should use composition instead. Inheritance will limit these operators too much.
   I invite you to read this article: 
https://en.wikipedia.org/wiki/Composition_over_inheritance
   
   I will only cite one fragment.
   >Note that multiple inheritance is dangerous if not implemented carefully, 
as it can lead to the diamond problem. One solution to avoid this is to create 
classes such as **VisibleAndSolid**, **VisibleAndMovable**, 
**VisibleAndSolidAndMovable**, etc. for every needed combination, though this 
leads to a large amount of repetitive code.
   
   If we replace some words, we have our problem.
   
   >Note that multiple inheritance is dangerous if not implemented carefully, 
as it can lead to the diamond problem. One solution to avoid this is to create 
classes such as **GoogleAndAws**, **GoogleAndAzure**, **AwsAndAzureAndGoogle**, 
etc. for every needed combination, though this leads to a large amount of 
repetitive code.
   
   However, this is one of the parts that should be resolved by 
[AIP-8](https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=100827303).
 We do not have enough use cases yet. It will be very difficult to build 
abstractions if we only support GCP.
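   The composition idea above can be sketched roughly like this (all names here 
are hypothetical, not Airflow API): each provider contributes an independent 
context manager, and the operator composes any number of them, so no 
GoogleAndAws-style subclass is ever needed.

   ```python
   from contextlib import ExitStack, contextmanager

   @contextmanager
   def gcp_credentials(conn_id):
       # A real provider would materialize a key file and set
       # GOOGLE_APPLICATION_CREDENTIALS; here we just yield env entries.
       yield {"GCP_CONN": conn_id}

   @contextmanager
   def aws_credentials(conn_id):
       yield {"AWS_CONN": conn_id}

   def env_with_credentials(providers):
       """Compose any number of providers - no per-combination subclass."""
       env = {}
       with ExitStack() as stack:
           for provider in providers:
               env.update(stack.enter_context(provider))
           return env  # a BashOperator would launch its subprocess here

   env = env_with_credentials([gcp_credentials("google_cloud_default"),
                               aws_credentials("aws_default")])
   ```

   Adding a third cloud is then just a third context manager, not a new class for 
every combination.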




[GitHub] [airflow] potiuk commented on issue #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
potiuk commented on issue #8432: Provide GCP credentials in Bash/Python 
operators
URL: https://github.com/apache/airflow/pull/8432#issuecomment-615544056
 
 
   I think it should be a `gcp_bash_operator.py` deriving from BashOperator, and 
it should live in providers/google.
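   A rough sketch of that suggestion (names are hypothetical, and a stand-in base 
class is used so the example runs without Airflow installed; in providers/google 
the base would be `airflow.operators.bash.BashOperator`):

   ```python
   # Stand-in base so the sketch is self-contained.
   class BashOperator:
       def __init__(self, task_id, bash_command):
           self.task_id = task_id
           self.bash_command = bash_command

       def execute(self, context):
           return f"run: {self.bash_command}"

   class GCPBashOperator(BashOperator):
       """Hypothetical gcp_bash_operator.py sketch - not real Airflow API."""
       def __init__(self, gcp_conn_id="google_cloud_default", **kwargs):
           super().__init__(**kwargs)
           self.gcp_conn_id = gcp_conn_id

       def execute(self, context):
           # A real implementation would fetch the connection and export
           # GOOGLE_APPLICATION_CREDENTIALS before delegating to the base.
           context = dict(context, gcp_conn_id=self.gcp_conn_id)
           return super().execute(context)

   task = GCPBashOperator(task_id="t", bash_command="gsutil ls")
   result = task.execute({})
   ```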




[jira] [Commented] (AIRFLOW-5156) Add other authentication mechanisms to HttpHook

2020-04-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17086231#comment-17086231
 ] 

ASF GitHub Bot commented on AIRFLOW-5156:
-

potiuk commented on pull request #8429: [AIRFLOW-5156] Added auth type to 
HttpHook
URL: https://github.com/apache/airflow/pull/8429
 
 
   
 



> Add other authentication mechanisms to HttpHook
> ---
>
> Key: AIRFLOW-5156
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5156
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.4
>Reporter: Joshua Kornblum
>Assignee: Rohit S S
>Priority: Minor
>
> It looks like the only supported authentication for HttpHooks is basic auth.
> The hook code shows 
> {quote}_if conn.login:_
>   _session.auth = (conn.login, conn.password)_
> {quote}
> The requests library supports any auth that inherits AuthBase – in my scenario 
> we need NTLM auth for an API on an IIS server. 
> [https://2.python-requests.org/en/master/user/advanced/#custom-authentication]
> I would suggest an option to pass an auth object in the constructor and then 
> add it to the if/else control flow, like
> {quote}_if self.auth is not None:_
>   _session.auth = self.auth_
> _elif conn.login:_
>   _session.auth = (conn.login, conn.password)_
> {quote}
> One would have to fetch the connection themselves, fill out the auth object, 
> and then pass it to the hook, which is flexible although a little awkward.
> {quote}api_conn = BaseHook().get_connection('my_api')
> auth = HttpNtlmAuth(api_conn.login, api_conn.password)
> HttpSensor(task_id='sensing', auth=auth, )
> {quote}
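The suggested control flow can be sketched as follows (`Session` and `AuthBase` 
are minimal stand-ins for the requests classes so the example is self-contained, 
and `NtlmLikeAuth` is a hypothetical custom auth, not a real library class):

```python
# Stand-ins for requests.Session and requests.auth.AuthBase.
class AuthBase:
    def __call__(self, request):
        raise NotImplementedError

class NtlmLikeAuth(AuthBase):
    """Hypothetical custom auth implementing the AuthBase protocol."""
    def __init__(self, login, password):
        self.login, self.password = login, password

    def __call__(self, request):
        request["Authorization"] = f"NTLM {self.login}"
        return request

class Session:
    def __init__(self):
        self.auth = None

def configure_auth(session, auth=None, login=None, password=None):
    # The if/elif suggested in the issue: an explicit auth object wins;
    # otherwise fall back to basic auth from the connection fields.
    if auth is not None:
        session.auth = auth
    elif login:
        session.auth = (login, password)
    return session
```

With requests itself, any object inheriting `requests.auth.AuthBase` can be 
assigned to `session.auth` in exactly the same way.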





[jira] [Commented] (AIRFLOW-5156) Add other authentication mechanisms to HttpHook

2020-04-17 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17086232#comment-17086232
 ] 

ASF subversion and git services commented on AIRFLOW-5156:
--

Commit d61a476da3a649bf2c1d347b9cb3abc62eae3ce9 in airflow's branch 
refs/heads/master from S S Rohit
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=d61a476 ]

[AIRFLOW-5156] Added auth type to HttpHook (#8429)





[GitHub] [airflow] potiuk merged pull request #8429: [AIRFLOW-5156] Added auth type to HttpHook

2020-04-17 Thread GitBox
potiuk merged pull request #8429: [AIRFLOW-5156] Added auth type to HttpHook
URL: https://github.com/apache/airflow/pull/8429
 
 
   




[GitHub] [airflow] codecov-io edited a comment on issue #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
codecov-io edited a comment on issue #8432: Provide GCP credentials in 
Bash/Python operators
URL: https://github.com/apache/airflow/pull/8432#issuecomment-615537382
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=h1) 
Report
   > Merging 
[#8432](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/96df427e07601e331afd6990ce7613b2026acfe0=desc)
 will **decrease** coverage by `0.00%`.
   > The diff coverage is `0.00%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/8432/graphs/tree.svg?width=650=150=pr=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #8432      +/-   ##
   ==========================================
   - Coverage    6.23%    6.22%   -0.01%
   ==========================================
     Files         946      950       +4
     Lines       45661    45723      +62
   ==========================================
     Hits         2846     2846
   - Misses      42815    42877      +62
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[...rflow/example\_dags/example\_google\_bash\_operator.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfYmFzaF9vcGVyYXRvci5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[...dags/example\_google\_bash\_operator\_custom\_script.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfYmFzaF9vcGVyYXRvcl9jdXN0b21fc2NyaXB0LnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[...low/example\_dags/example\_google\_python\_operator.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfcHl0aG9uX29wZXJhdG9yLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/operators/bash.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvYmFzaC5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/operators/python.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/utils/documentation.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kb2N1bWVudGF0aW9uLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/www/app.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvYXBwLnB5)
 | `0.00% <0.00%> (ø)` | |
   | ... and [1 
more](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=footer). 
Last update 
[96df427...235382f](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
codecov-io edited a comment on issue #8432: Provide GCP credentials in 
Bash/Python operators
URL: https://github.com/apache/airflow/pull/8432#issuecomment-615537382
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=h1) 
Report
   > Merging 
[#8432](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/96df427e07601e331afd6990ce7613b2026acfe0=desc)
 will **decrease** coverage by `0.00%`.
   > The diff coverage is `0.00%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/8432/graphs/tree.svg?width=650=150=pr=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ##   master   #8432  +/-   ##
   =
   - Coverage6.23%   6.22%   -0.01% 
   =
 Files 946 950   +4 
 Lines   45661   45723  +62 
   =
 Hits 28462846  
   - Misses  42815   42877  +62 
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[...rflow/example\_dags/example\_google\_bash\_operator.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfYmFzaF9vcGVyYXRvci5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[...dags/example\_google\_bash\_operator\_custom\_script.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfYmFzaF9vcGVyYXRvcl9jdXN0b21fc2NyaXB0LnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[...low/example\_dags/example\_google\_python\_operator.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfcHl0aG9uX29wZXJhdG9yLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/operators/bash.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvYmFzaC5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/operators/python.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/utils/documentation.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kb2N1bWVudGF0aW9uLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/www/app.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvYXBwLnB5)
 | `0.00% <0.00%> (ø)` | |
   | ... and [1 
more](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=footer). 
Last update 
[96df427...235382f](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
codecov-io edited a comment on issue #8432: Provide GCP credentials in 
Bash/Python operators
URL: https://github.com/apache/airflow/pull/8432#issuecomment-615537382
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=h1) 
Report
   > Merging 
[#8432](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/96df427e07601e331afd6990ce7613b2026acfe0=desc)
 will **decrease** coverage by `0.00%`.
   > The diff coverage is `0.00%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/8432/graphs/tree.svg?width=650=150=pr=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=tree)
   
   ```diff
    @@            Coverage Diff            @@
    ##           master    #8432      +/-   ##
    ==========================================
    - Coverage    6.23%    6.22%   -0.01%
    ==========================================
      Files         946      950       +4
      Lines       45661    45723      +62
    ==========================================
      Hits         2846     2846
    - Misses      42815    42877      +62
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[...rflow/example\_dags/example\_google\_bash\_operator.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfYmFzaF9vcGVyYXRvci5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[...dags/example\_google\_bash\_operator\_custom\_script.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfYmFzaF9vcGVyYXRvcl9jdXN0b21fc2NyaXB0LnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[...low/example\_dags/example\_google\_python\_operator.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9nb29nbGVfcHl0aG9uX29wZXJhdG9yLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/operators/bash.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvYmFzaC5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/operators/python.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/utils/documentation.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kb2N1bWVudGF0aW9uLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/www/app.py](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvYXBwLnB5)
 | `0.00% <0.00%> (ø)` | |
   | ... and [1 
more](https://codecov.io/gh/apache/airflow/pull/8432/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=footer). 
Last update 
[96df427...235382f](https://codecov.io/gh/apache/airflow/pull/8432?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj opened a new pull request #8432: Provide GCP credentials in Bash/Python operators

2020-04-17 Thread GitBox
mik-laj opened a new pull request #8432: Provide GCP credentials in Bash/Python 
operators
URL: https://github.com/apache/airflow/pull/8432
 
 
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [X] Description above provides context of the change
   - [X] Unit tests coverage for changes (not needed for documentation changes)
   - [X] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [X] Relevant documentation is updated including usage instructions.
   - [X] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] casassg commented on issue #8052: [AIP-31] Create XComArg model

2020-04-17 Thread GitBox
casassg commented on issue #8052: [AIP-31] Create XComArg model
URL: https://github.com/apache/airflow/issues/8052#issuecomment-615531173
 
 
   I've been a bit overwhelmed with the work-from-home situation. I will try to 
get to this over the weekend or next week. Otherwise, please feel free to take 
it from me.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] dedunumax commented on issue #8418: Update docker operator network documentation

2020-04-17 Thread GitBox
dedunumax commented on issue #8418: Update docker operator network documentation
URL: https://github.com/apache/airflow/issues/8418#issuecomment-615487650
 
 
   I added the link to the Docker documentation. `network_method` is used to 
define Docker network drivers, as I understand it. Correct me if I am wrong 
@pablosjv. Thank you.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow-site] dedunumax opened a new pull request #266: Improve documentation on DockerOperator

2020-04-17 Thread GitBox
dedunumax opened a new pull request #266: Improve documentation on 
DockerOperator
URL: https://github.com/apache/airflow-site/pull/266
 
 
   https://github.com/apache/airflow/issues/8418


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] alexsstock commented on a change in pull request #8256: updated _write_args on PythonVirtualenvOperator

2020-04-17 Thread GitBox
alexsstock commented on a change in pull request #8256: updated _write_args on 
PythonVirtualenvOperator
URL: https://github.com/apache/airflow/pull/8256#discussion_r410483125
 
 

 ##
 File path: airflow/operators/python_operator.py
 ##
 @@ -330,13 +330,28 @@ def _write_string_args(self, filename):
 
 def _write_args(self, input_filename):
 # serialize args to file
+if self.use_dill:
+serializer = dill
+else:
+serializer = pickle
+# some args from context can't be loaded in virtual env
+invalid_args = set(['dag', 'task', 'ti'])
 
 Review comment:
   This helped me thanks @maganaluis 
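   The idea in the diff above (drop context keys that cannot be deserialized 
inside the virtualenv, then serialize the rest) can be sketched as follows. 
This is a simplified, hedged sketch using `pickle` only; the function name and 
shape are assumptions for illustration, not the actual operator code.
   ```python
   import pickle

   # Context keys that can't be loaded inside the virtual env
   # (per the diff under review above).
   INVALID_ARGS = {"dag", "task", "ti"}

   def write_args(kwargs: dict, filename: str) -> None:
       """Serialize kwargs to a file, dropping non-serializable context keys."""
       serializable = {k: v for k, v in kwargs.items() if k not in INVALID_ARGS}
       with open(filename, "wb") as f:
           pickle.dump(serializable, f)
   ```
   The real operator additionally switches the serializer to `dill` when 
`use_dill` is set, as shown in the diff.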


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3347) Unable to configure Kubernetes secrets through environment

2020-04-17 Thread Kaxil Naik (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3347?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17086061#comment-17086061
 ] 

Kaxil Naik commented on AIRFLOW-3347:
-

Duplicate of https://issues.apache.org/jira/browse/AIRFLOW-5030 . Solved by 
https://github.com/apache/airflow/pull/5650

> Unable to configure Kubernetes secrets through environment
> --
>
> Key: AIRFLOW-3347
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3347
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration, executors
>Affects Versions: 1.10.0
>Reporter: Chris Bandy
>Priority: Major
>  Labels: kubernetes
>
> We configure Airflow through environment variables. While setting up the 
> Kubernetes Executor, we wanted to pass the SQL Alchemy connection string to 
> workers by including it in the {{kubernetes_secrets}} section of config.
> Unfortunately, even with 
> {{AIRFLOW__KUBERNETES_SECRETS__AIRFLOW__CORE__SQL_ALCHEMY_CONN}} set in 
> the scheduler environment, the worker gets no secret environment 
> variables.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
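The `AIRFLOW__{SECTION}__{KEY}` environment-variable convention the reporter 
relies on can be sketched as below. The helper function is hypothetical and for 
illustration only; it is not part of Airflow.

```python
# Hypothetical helper illustrating Airflow's documented convention for
# overriding config options via environment variables:
# AIRFLOW__{SECTION}__{KEY}, with double underscores as separators.
def airflow_env_var(section: str, key: str) -> str:
    """Build the environment-variable name for a config option."""
    return f"AIRFLOW__{section.upper()}__{key.upper()}"

# The variable from the issue: section "kubernetes_secrets", where the key
# is itself the env var name to inject into worker pods.
name = airflow_env_var("kubernetes_secrets", "AIRFLOW__CORE__SQL_ALCHEMY_CONN")
```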


[jira] [Closed] (AIRFLOW-3347) Unable to configure Kubernetes secrets through environment

2020-04-17 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3347?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik closed AIRFLOW-3347.
---
Resolution: Duplicate

> Unable to configure Kubernetes secrets through environment
> --
>
> Key: AIRFLOW-3347
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3347
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration, executors
>Affects Versions: 1.10.0
>Reporter: Chris Bandy
>Priority: Major
>  Labels: kubernetes
>
> We configure Airflow through environment variables. While setting up the 
> Kubernetes Executor, we wanted to pass the SQL Alchemy connection string to 
> workers by including it in the {{kubernetes_secrets}} section of config.
> Unfortunately, even with 
> {{AIRFLOW__KUBERNETES_SECRETS__AIRFLOW__CORE__SQL_ALCHEMY_CONN}} set in 
> the scheduler environment, the worker gets no secret environment 
> variables.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] khyurri commented on a change in pull request #8430: Improve idempodency in CloudDataTransferServiceCreateJobOperator

2020-04-17 Thread GitBox
khyurri commented on a change in pull request #8430: Improve idempodency in 
CloudDataTransferServiceCreateJobOperator
URL: https://github.com/apache/airflow/pull/8430#discussion_r410444548
 
 

 ##
 File path: 
airflow/providers/google/cloud/hooks/cloud_storage_transfer_service.py
 ##
 @@ -98,6 +105,22 @@ class GcpTransferOperationStatus:
 NEGATIVE_STATUSES = {GcpTransferOperationStatus.FAILED, 
GcpTransferOperationStatus.ABORTED}
 
 
+def gen_job_name(job_name: str) -> str:
+"""
+Adds unique suffix to job name. If suffix already exists, updates it.
+Suffix — current timestamp
+:param job_name:
+:rtype job_name: str
+:return:
+"""
+split = job_name.split("_")
+uniq = str(int(time.time()))
+if len(split) > 1 and re.compile("^[0-9]{10}$").match(split[-1]):
+split[-1] = uniq
+return "_".join(split)
+return "_".join([job_name, uniq])
 
 Review comment:
   Sure we can, i've coded it before your advise. I'll fix it


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ephraimbuddy commented on issue #8272: Cloud Life Sciences operator and hook

2020-04-17 Thread GitBox
ephraimbuddy commented on issue #8272: Cloud Life Sciences operator and hook
URL: https://github.com/apache/airflow/issues/8272#issuecomment-615427650
 
 
   Thank you so much. These are more than enough. I really appreciate!


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj edited a comment on issue #8272: Cloud Life Sciences operator and hook

2020-04-17 Thread GitBox
mik-laj edited a comment on issue #8272: Cloud Life Sciences operator and hook
URL: https://github.com/apache/airflow/issues/8272#issuecomment-615426469
 
 
   @ephraimbuddy Google Cloud has two types of libraries.
   * Native Python libraries - https://github.com/googleapis/google-cloud-python 
They exist for most, but not all, services. These are the recommended libraries. 
Most often they use Protobuf for communication.
   * Discovery-based - https://github.com/googleapis/google-api-python-client 
These libraries are automatically generated from the API specification (called 
the discovery document) at the time of use. They exist for every Google service 
and cover all of its options, so they are always fresh. They use HTTP only for 
communication.
   
   We don't have a native library for this service, so we need to use 
[google-api-python-client](https://github.com/googleapis/google-api-python-client). 
In order to initialize the library, you should use the following code:
   ```python
   from googleapiclient.discovery import build
   service = build('lifesciences', 'v2beta', ...)
   ```
   Unfortunately, there is no documentation for this library, but you can build 
a client and check what methods exist in this API using ipdb.
   Documentation for other services is available here:
   
https://github.com/googleapis/google-api-python-client/blob/master/docs/dyn/index.md
   
   Here is an example of how to check documentation for dataflow.
   ```python
   from googleapiclient.discovery import build
   dataflow_service = build('dataflow', 'v1b3')
   projects_resource = dataflow_service.projects()
   locations_resource = projects_resource.locations()
   flex_templates_resource = locations_resource.flexTemplates()
   
   print(flex_templates_resource.launch.__doc__)
   ```
   These APIs are automatically generated based on the REST API, so you can 
check the general idea and required arguments in the REST API documentation for 
the Life Science service.
   https://cloud.google.com/life-sciences/docs/reference/rest
   
   If you're looking for an example hook, you should look at Cloud Build:
   
https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/hooks/cloud_build.py
   It still uses a discovery-based client.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services



[GitHub] [airflow] ephraimbuddy commented on issue #8272: Cloud Life Sciences operator and hook

2020-04-17 Thread GitBox
ephraimbuddy commented on issue #8272: Cloud Life Sciences operator and hook
URL: https://github.com/apache/airflow/issues/8272#issuecomment-615416283
 
 
   Hi @mik-laj , please can you point me to the Python library for this Cloud 
Life Sciences service? I have been looking for it and can't find it. Sorry for 
any inconvenience this may cause.
   Thanks


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] dhuang commented on a change in pull request #8423: Fix Snowflake hook conn id

2020-04-17 Thread GitBox
dhuang commented on a change in pull request #8423: Fix Snowflake hook conn id
URL: https://github.com/apache/airflow/pull/8423#discussion_r410395250
 
 

 ##
 File path: tests/providers/snowflake/hooks/test_snowflake.py
 ##
 @@ -102,6 +102,7 @@ def test_get_conn_params(self):
 'warehouse': 'af_wh',
 'region': 'af_region',
 'role': 'af_role'}
+self.assertEqual(self.db_hook.snowflake_conn_id, 'snowflake_default')
 
 Review comment:
   Hmm, yeah, that could've been the intent. However, with the way it was 
implemented, I believe it would also have overwritten an explicit 
`snowflake_conn_id` passed into the hook/operator on initialization. But good 
call, I think it does make sense to note this in `UPDATING.md`; added.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] dhuang commented on issue #8422: Add Snowflake system test

2020-04-17 Thread GitBox
dhuang commented on issue #8422: Add Snowflake system test
URL: https://github.com/apache/airflow/pull/8422#issuecomment-615381527
 
 
   > Ech. some static check failures :(
   
    Sorry, fixed and all static checks passed locally.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] turbaszek commented on a change in pull request #8430: Improve idempodency in CloudDataTransferServiceCreateJobOperator

2020-04-17 Thread GitBox
turbaszek commented on a change in pull request #8430: Improve idempodency in 
CloudDataTransferServiceCreateJobOperator
URL: https://github.com/apache/airflow/pull/8430#discussion_r410337665
 
 

 ##
 File path: 
airflow/providers/google/cloud/hooks/cloud_storage_transfer_service.py
 ##
 @@ -98,6 +105,22 @@ class GcpTransferOperationStatus:
 NEGATIVE_STATUSES = {GcpTransferOperationStatus.FAILED, 
GcpTransferOperationStatus.ABORTED}
 
 
+def gen_job_name(job_name: str) -> str:
+"""
+Adds unique suffix to job name. If suffix already exists, updates it.
+Suffix — current timestamp
+:param job_name:
+:rtype job_name: str
+:return:
+"""
+split = job_name.split("_")
+uniq = str(int(time.time()))
+if len(split) > 1 and re.compile("^[0-9]{10}$").match(split[-1]):
+split[-1] = uniq
+return "_".join(split)
+return "_".join([job_name, uniq])
 
 Review comment:
   Can't we just return `return f"{job_name}_{uniq}`?
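   For reference, the suffix logic under discussion, including the f-string 
simplification suggested here, could look like the sketch below. It is based 
only on the diff above, not on the merged code.
   ```python
   import re
   import time

   TIMESTAMP_RE = re.compile(r"^[0-9]{10}$")  # a 10-digit Unix timestamp

   def gen_job_name(job_name: str) -> str:
       """Append a unique timestamp suffix, replacing an existing one."""
       parts = job_name.split("_")
       uniq = str(int(time.time()))
       if len(parts) > 1 and TIMESTAMP_RE.match(parts[-1]):
           parts[-1] = uniq           # refresh the existing suffix
           return "_".join(parts)
       return f"{job_name}_{uniq}"    # the simplification suggested in review
   ```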


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk commented on issue #8417: Increase max password length in Airflow Connections

2020-04-17 Thread GitBox
potiuk commented on issue #8417: Increase max password length in Airflow 
Connections
URL: https://github.com/apache/airflow/issues/8417#issuecomment-615328426
 
 
   Indeed :). If you are looking for the password length change, you won't find 
it :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] KevinKobi opened a new issue #8431: Mark Success/Failure from the UI uses wrong settings

2020-04-17 Thread GitBox
KevinKobi opened a new issue #8431: Mark Success/Failure from the UI uses wrong 
settings
URL: https://github.com/apache/airflow/issues/8431
 
 
   
   **What happened**:
   In the UI the default behavior is Downstream and Recursive set for Clear:
   ![Screen Shot 2020-04-17 at 18 41 
39](https://user-images.githubusercontent.com/63675983/79587565-65109480-80db-11ea-9d27-d9ff32a5d5a9.png)
   
   these preset choices should affect only the Clear button, but they affect all 
the others.
   Choosing `Mark Success` or `Mark Failed` will take the `Downstream` and 
`Recursive` choices from `Clear` unless you remove them explicitly. 
   
   
   **What you expected to happen**:
   
   Each row has its own settings. As the UI shows, Mark Success has its own 
Downstream/Upstream buttons. It should use its own settings, not another row's.
   
   As an alternative: since in the end the window performs only ONE action 
(Clear, Mark Success, Mark Failure), the block of 4 * 3 = 12 buttons is not 
needed:
   ![Screen Shot 2020-04-17 at 18 47 
26](https://user-images.githubusercontent.com/63675983/79588057-1283a800-80dc-11ea-89a5-0841e0493caa.png)
   It could have just 4 buttons for all three actions (Clear, Mark Success, 
Mark Failed), and Clear could have an extra Recursive setting.
   
   I'm not a UX expert, so there might be a better way to do this, but it feels 
like there is redundancy in the buttons (which also creates this bug)
   
   
   **How to reproduce it**:
   
   1. Add any DAG
   2. in the UI click on any task to open the menu
   3. Choose `Mark Success` or `Mark Failure`; it will show you a list of all 
tasks matched with `Recursive` and `Downstream`, even though you did not choose 
these settings.
   4. Go back and remove the preset `Recursive` and `Downstream` from the 
`Clear` row. Click `Mark Success` or `Mark Failure`; now it will show you only 
the task you chose.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj removed a comment on issue #8399: WIP: Improve idempodency in CloudDataTransferServiceCreateJobOperator

2020-04-17 Thread GitBox
mik-laj removed a comment on issue #8399: WIP: Improve idempodency in 
CloudDataTransferServiceCreateJobOperator
URL: https://github.com/apache/airflow/pull/8399#issuecomment-615319640
 
 
   @turbaszek If the job has been completed, do not perform it again. This is a 
serious problem when a user does a backfill. If the user wants to run the task 
multiple times, the user should not provide any ID, and a new ID will be 
generated each time.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk commented on issue #8420: DockerSwarmOperator always pulls docker image

2020-04-17 Thread GitBox
potiuk commented on issue #8420: DockerSwarmOperator always pulls docker image
URL: https://github.com/apache/airflow/issues/8420#issuecomment-615317677
 
 
   Feel free to make a PR :). The issue looks very VALID :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] CodingJonas commented on issue #8420: DockerSwarmOperator always pulls docker image

2020-04-17 Thread GitBox
CodingJonas commented on issue #8420: DockerSwarmOperator always pulls docker 
image
URL: https://github.com/apache/airflow/issues/8420#issuecomment-615317105
 
 
   The invalid tag was set automatically, and I didn't find a way to change it. 
I must have done something wrong when I created the issue.




[GitHub] [airflow] kaxil commented on issue #8417: Increase max password length in Airflow Connections

2020-04-17 Thread GitBox
kaxil commented on issue #8417: Increase max password length in Airflow 
Connections
URL: https://github.com/apache/airflow/issues/8417#issuecomment-615313567
 
 
   > Thanks! I was working with 1.10.4 version.
   > 
   > However, I couldn't find that issue in the [changelog for 
1.10.7](https://airflow.apache.org/docs/stable/changelog.html#airflow-1-10-7-2019-12-24).
 Is the changelog updated? Or am I not looking in the correct place?
   
   It is not nicely worded but it is at 
https://airflow.apache.org/docs/stable/changelog.html#id9
   
   
![image](https://user-images.githubusercontent.com/8811558/79586709-65079900-80c9-11ea-8750-4b61307f53d2.png)
   
   >[AIRFLOW-6185] SQLAlchemy Connection model schema not aligned with Alembic 
schema (#6754)
   
   




[GitHub] [airflow] kaxil edited a comment on issue #8417: Increase max password length in Airflow Connections

2020-04-17 Thread GitBox
kaxil edited a comment on issue #8417: Increase max password length in Airflow 
Connections
URL: https://github.com/apache/airflow/issues/8417#issuecomment-615313567
 
 
   > Thanks! I was working with 1.10.4 version.
   > 
   > However, I couldn't find that issue in the [changelog for 
1.10.7](https://airflow.apache.org/docs/stable/changelog.html#airflow-1-10-7-2019-12-24).
 Is the changelog updated? Or am I not looking in the correct place?
   
   It is not nicely worded but it is at 
https://airflow.apache.org/docs/stable/changelog.html#id9
   
   
![image](https://user-images.githubusercontent.com/8811558/79586709-65079900-80c9-11ea-8750-4b61307f53d2.png)
   
   >[AIRFLOW-6185] SQLAlchemy Connection model schema not aligned with Alembic 
schema (#6754)
   
   




[GitHub] [airflow] mik-laj commented on issue #8414: Use repeated arguments in pytest

2020-04-17 Thread GitBox
mik-laj commented on issue #8414: Use repeated arguments in pytest
URL: https://github.com/apache/airflow/pull/8414#issuecomment-615312646
 
 
   Go ahead and do it.




[GitHub] [airflow] mik-laj closed pull request #8414: Use repeated arguments in pytest

2020-04-17 Thread GitBox
mik-laj closed pull request #8414: Use repeated arguments in pytest
URL: https://github.com/apache/airflow/pull/8414
 
 
   




[GitHub] [airflow] pablosjv commented on issue #8417: Increase max password length in Airflow Connections

2020-04-17 Thread GitBox
pablosjv commented on issue #8417: Increase max password length in Airflow 
Connections
URL: https://github.com/apache/airflow/issues/8417#issuecomment-615312159
 
 
   Thanks! I was working with 1.10.4 version.
   
   However, I couldn't find that issue in the [changelog for 
1.10.7](https://airflow.apache.org/docs/stable/changelog.html#airflow-1-10-7-2019-12-24).
 Is the changelog updated? Or am I not looking in the correct place?
   




[GitHub] [airflow] ashb commented on a change in pull request #8423: Fix Snowflake hook conn id

2020-04-17 Thread GitBox
ashb commented on a change in pull request #8423: Fix Snowflake hook conn id
URL: https://github.com/apache/airflow/pull/8423#discussion_r410298189
 
 

 ##
 File path: tests/providers/snowflake/hooks/test_snowflake.py
 ##
 @@ -102,6 +102,7 @@ def test_get_conn_params(self):
 'warehouse': 'af_wh',
 'region': 'af_region',
 'role': 'af_role'}
+self.assertEqual(self.db_hook.snowflake_conn_id, 'snowflake_default')
 
 Review comment:
   I _think_ that this is actually a change in the default, and 
`snowflake_conn_id` was the previous default.
   
   Do you think it's worth changing this default? If so it needs a note in 
UPDATING.md
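For context, the question above boils down to the hook constructor's default argument. A minimal sketch (an illustration only, not the actual `SnowflakeHook` source):

```python
class SnowflakeHook:
    """Simplified sketch of the hook constructor, not the real class."""

    def __init__(self, snowflake_conn_id: str = "snowflake_default", **kwargs):
        # If the previous implicit default was "snowflake_conn_id", switching
        # it to "snowflake_default" silently changes behaviour for users who
        # relied on the old name, hence the request for an UPDATING.md note.
        self.snowflake_conn_id = snowflake_conn_id
```

A caller that never passed `snowflake_conn_id` explicitly would start resolving a different connection after such a change, which is why it counts as backwards incompatible.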




[GitHub] [airflow] kaxil merged pull request #8413: Add back-compat modules from 1.10.10 for SecretsBackends

2020-04-17 Thread GitBox
kaxil merged pull request #8413: Add back-compat modules from 1.10.10 for 
SecretsBackends
URL: https://github.com/apache/airflow/pull/8413
 
 
   




[GitHub] [airflow] potiuk commented on issue #8418: Update docker operator network documentation

2020-04-17 Thread GitBox
potiuk commented on issue #8418: Update docker operator network documentation
URL: https://github.com/apache/airflow/issues/8418#issuecomment-615302186
 
 
   How about you add it yourself, @pablosjv? Just create a PR!




[GitHub] [airflow] khyurri opened a new pull request #8430: Improve idempodency in CloudDataTransferServiceCreateJobOperator

2020-04-17 Thread GitBox
khyurri opened a new pull request #8430: Improve idempodency in 
CloudDataTransferServiceCreateJobOperator
URL: https://github.com/apache/airflow/pull/8430
 
 
   This PR resolves #8285 
   
   1. If `body.name` is passed, `CloudDataTransferServiceCreateJobOperator` 
becomes idempotent.
   2. If the transfer `body.name` has been soft deleted, the operator becomes *not 
idempotent*: every run, the name gets a unique suffix (`name_{unix_time_stamp}`).
   DRAFT PR: https://github.com/apache/airflow/pull/8399 (sorry, I've deleted 
the branch, my mistake :( )
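   The naming rule in point 2 can be sketched roughly like this (a hypothetical helper for illustration; `resolve_job_name` and its `soft_deleted` flag are assumptions, not the actual operator code):

```python
import time


def resolve_job_name(body: dict, soft_deleted: bool = False) -> dict:
    """Hypothetical helper sketching the naming rule from the description.

    - With a caller-supplied ``body["name"]`` the same job name is reused on
      every run, so creating the transfer job is idempotent.
    - If that name was soft deleted, a unix-timestamp suffix is appended so
      each run uses a fresh, unique name (no longer idempotent).
    """
    name = body.get("name")
    if name and soft_deleted:
        return {**body, "name": f"{name}_{int(time.time())}"}
    return body
```

   In the real operator, detecting the soft-deleted state would require querying the Storage Transfer Service for the existing job's status.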
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] potiuk merged pull request #8419: fixed typo in confirm script

2020-04-17 Thread GitBox
potiuk merged pull request #8419: fixed typo in confirm script
URL: https://github.com/apache/airflow/pull/8419
 
 
   




[GitHub] [airflow] potiuk commented on issue #8420: DockerSwarmOperator always pulls docker image

2020-04-17 Thread GitBox
potiuk commented on issue #8420: DockerSwarmOperator always pulls docker image
URL: https://github.com/apache/airflow/issues/8420#issuecomment-615301042
 
 
   Why invalid? Looks like quite a valid point :). Did you change your mind?




[GitHub] [airflow] potiuk commented on a change in pull request #8393: Bring back CI optimisations

2020-04-17 Thread GitBox
potiuk commented on a change in pull request #8393: Bring back CI optimisations
URL: https://github.com/apache/airflow/pull/8393#discussion_r410288174
 
 

 ##
 File path: .github/workflows/ci.yml
 ##
 @@ -80,134 +80,152 @@ jobs:
 name: Build docs
 runs-on: ubuntu-latest
 env:
-  TRAVIS_JOB_NAME: "Build documentation"
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Documentation"
 steps:
   - uses: actions/checkout@master
-  - name: "Build documentation"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Build docs"
 run: ./scripts/ci/ci_docs.sh
 
   tests-p36-postgres-integrations:
-name: "Tests [Postgres9.6][Py3.6][integrations]"
+name: "[Pg9.6][Py3.6][integrations]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Postgres9.6][Py3.6][integrations]"
   BACKEND: postgres
   PYTHON_VERSION: 3.6
   POSTGRES_VERSION: 9.6
   ENABLED_INTEGRATIONS: "cassandra kerberos mongo openldap rabbitmq redis"
   RUN_INTEGRATION_TESTS: all
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Postgres9.6][Py3.6][integrations]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-postgres-providers:
-name: "Tests [Postgres10][Py3.6][providers]"
+name: "[Pg10][Py3.6][prov]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Postgres10][Py3.6][providers]"
   BACKEND: postgres
   POSTGRES_VERSION: 10
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Postgres10][Py3.6][providers]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh tests/providers
 
   tests-p36-postgres-core:
-name: "Tests [Postgres9.6][Py3.6][core]"
+name: "[Pg9.6][Py3.6][core]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Postgres9.6][Py3.6][core]"
   BACKEND: postgres
   POSTGRES_VERSION: 9.6
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Postgres9.6][Py3.6][core]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh --ignore=tests/providers
 
 
   tests-p37-sqlite-integrations:
-name: "Tests [Sqlite][3.7][integrations]"
+name: "[Sqlite][3.7][int]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Sqlite][3.7][integrations]"
   BACKEND: sqlite
   PYTHON_VERSION: 3.7
   ENABLED_INTEGRATIONS: "cassandra kerberos mongo openldap rabbitmq redis"
   RUN_INTEGRATION_TESTS: all
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Sqlite][3.7][integrations]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-sqlite:
-name: "Tests [Sqlite][Py3.6]"
+name: "[Sqlite][Py3.6]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Sqlite][Py3.6]"
   BACKEND: sqlite
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Sqlite][Py3.6]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-mysql-integrations:
-name: "Tests [MySQL][Py3.6][integrations]"
+name: "[MySQL5.7][Py3.6][int]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [MySQL][Py3.6][integrations]"
   BACKEND: sqlite
   PYTHON_VERSION: 3.6
   MYSQL_VERSION: 5.7
   ENABLED_INTEGRATIONS: "cassandra kerberos mongo openldap rabbitmq redis"
   RUN_INTEGRATION_TESTS: all
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [MySQL][Py3.6][integrations]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-mysql-providers:
-name: "Tests [MySQL5.7][Py3.7][providers][kerberos]"
+name: "[MySQL5.7][Py3.7][prov][kerb]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [MySQL5.7][Py3.7][providers][kerberos]"
   BACKEND: mysql
  

[GitHub] [airflow] potiuk commented on issue #8422: Add Snowflake system test

2020-04-17 Thread GitBox
potiuk commented on issue #8422: Add Snowflake system test
URL: https://github.com/apache/airflow/pull/8422#issuecomment-615299109
 
 
   Ech, some static check failures :(




[GitHub] [airflow] kaxil merged pull request #8412: Remove duplicate dependency ('curl') from Dockerfile

2020-04-17 Thread GitBox
kaxil merged pull request #8412: Remove duplicate dependency ('curl') from 
Dockerfile
URL: https://github.com/apache/airflow/pull/8412
 
 
   




[GitHub] [airflow] potiuk closed issue #8424: Exceptions inconsistent with UI and logfile

2020-04-17 Thread GitBox
potiuk closed issue #8424: Exceptions inconsistent with UI and logfile
URL: https://github.com/apache/airflow/issues/8424
 
 
   




[GitHub] [airflow] potiuk commented on issue #8424: Exceptions inconsistent with UI and logfile

2020-04-17 Thread GitBox
potiuk commented on issue #8424: Exceptions inconsistent with UI and logfile
URL: https://github.com/apache/airflow/issues/8424#issuecomment-615295670
 
 
   I think this does not have enough details. I am not even sure whether those are 
related, and there is no information on how to reproduce it. Please open it again 
with all the details if you want anyone to take a look at it.




[GitHub] [airflow] khyurri closed pull request #8399: WIP: Improve idempodency in CloudDataTransferServiceCreateJobOperator

2020-04-17 Thread GitBox
khyurri closed pull request #8399: WIP: Improve idempodency in 
CloudDataTransferServiceCreateJobOperator
URL: https://github.com/apache/airflow/pull/8399
 
 
   




[GitHub] [airflow] potiuk commented on issue #8428: Create isacko

2020-04-17 Thread GitBox
potiuk commented on issue #8428: Create isacko
URL: https://github.com/apache/airflow/pull/8428#issuecomment-615291957
 
 
   ?




[GitHub] [airflow] potiuk closed pull request #8428: Create isacko

2020-04-17 Thread GitBox
potiuk closed pull request #8428: Create isacko
URL: https://github.com/apache/airflow/pull/8428
 
 
   




[jira] [Commented] (AIRFLOW-5577) Dags Filter_by_owner is missing in RBAC

2020-04-17 Thread Kaxil Naik (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5577?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17085840#comment-17085840
 ] 

Kaxil Naik commented on AIRFLOW-5577:
-

Well, you can also use the `access_control` parameter on each DAG:
https://github.com/apache/airflow/blob/4b25cb9d08565502172cb847c79d81559775d504/airflow/models/dag.py#L174

You can restrict DAGs so they are accessible only by certain roles. After doing this 
you should run `airflow sync_perm` from the CLI to update the permissions
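A hedged sketch of what the `access_control` mapping can look like (the role name "analytics" and the permission strings are examples; the exact permission names match 1.10.x RBAC and were renamed in later releases):

```python
# access_control maps a role name to the set of DAG-level permissions
# granted to that role. Users outside these roles will not see the DAG
# in the RBAC UI.
ANALYTICS_ACCESS = {
    "analytics": {"can_dag_read", "can_dag_edit"},
}

# In a real DAG file (requires Airflow to be installed) this would be used as:
#
#     from airflow import DAG
#
#     dag = DAG(
#         dag_id="analytics_pipeline",
#         access_control=ANALYTICS_ACCESS,
#         ...
#     )
#
# After deploying the DAG, `airflow sync_perm` writes these permissions to
# the RBAC tables.
```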

> Dags Filter_by_owner is missing in RBAC
> ---
>
> Key: AIRFLOW-5577
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5577
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.3, 1.10.4, 1.10.5
>Reporter: Hari
>Assignee: Hari
>Priority: Major
>  Labels: easyfix
>
> After enabling the RBAC, the dags filter by owner option is missing. All the 
> Dags will be visible to all the users.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (AIRFLOW-5156) Add other authentication mechanisms to HttpHook

2020-04-17 Thread Rohit S S (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rohit S S reassigned AIRFLOW-5156:
--

Assignee: Rohit S S

> Add other authentication mechanisms to HttpHook
> ---
>
> Key: AIRFLOW-5156
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5156
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.4
>Reporter: Joshua Kornblum
>Assignee: Rohit S S
>Priority: Minor
>
> It looks like the only supported authentication for HttpHooks is basic auth.
> The hook code shows 
> {quote}_if conn.login:_
>   _session.auth = (conn.login, conn.password)_
> {quote}
> requests library supports any auth that inherits AuthBase – in my scenario we 
> need ntlmauth for API on IIS server. 
> [https://2.python-requests.org/en/master/user/advanced/#custom-authentication]
> I would suggest option to pass auth object in constructor then add to if/else 
> control flow like
> {quote}_if self.auth is not None:_
>   _session.auth = self.auth_
> _elif conn.login:_
>   _session.auth = (conn.login, conn.password)_
> {quote}
> One would have to fetch the connection themselves, fill out the auth, and 
> then pass it to the hook, which is flexible although a little awkward.
> {quote}api_conn = BaseHook().get_connection('my_api')
> auth = HttpNtlmAuth(api_conn.login, api_conn.password)
> HttpSensor(task_id='sensing', auth=auth, )
> {quote}
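The suggested if/elif flow could be sketched like this (a runnable illustration, not the real `HttpHook.get_conn`; `build_session` and the `SimpleNamespace` stand-ins are assumptions so the example needs no `requests` install; with `requests`, `session.auth` accepts any `AuthBase` subclass the same way):

```python
from types import SimpleNamespace


def build_session(conn, auth=None):
    """Sketch of the proposed auth handling for an HTTP hook.

    An explicitly supplied auth object (e.g. requests_ntlm.HttpNtlmAuth)
    takes precedence; otherwise fall back to basic auth built from the
    stored connection's login/password.
    """
    session = SimpleNamespace(auth=None)  # stand-in for requests.Session()
    if auth is not None:
        session.auth = auth
    elif conn.login:
        session.auth = (conn.login, conn.password)
    return session
```

With this in place, a caller could fetch the connection, build an NTLM (or any custom) auth object from its credentials, and hand it to the hook or sensor as sketched in the issue.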





[jira] [Commented] (AIRFLOW-5156) Add other authentication mechanisms to HttpHook

2020-04-17 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17085811#comment-17085811
 ] 

ASF GitHub Bot commented on AIRFLOW-5156:
-

randr97 commented on pull request #8429: [AIRFLOW-5156] Added auth type to 
HttpHook
URL: https://github.com/apache/airflow/pull/8429
 
 
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 



> Add other authentication mechanisms to HttpHook
> ---
>
> Key: AIRFLOW-5156
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5156
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.4
>Reporter: Joshua Kornblum
>Assignee: Rohit S S
>Priority: Minor
>
> It looks like the only supported authentication for HttpHooks is basic auth.
> The hook code shows 
> {quote}_if conn.login:_
>   _session.auth = (conn.login, conn.password)_
> {quote}
> requests library supports any auth that inherits AuthBase – in my scenario we 
> need ntlmauth for API on IIS server. 
> [https://2.python-requests.org/en/master/user/advanced/#custom-authentication]
> I would suggest option to pass auth object in constructor then add to if/else 
> control flow like
> {quote}_if self.auth is not None:_
>   _session.auth = self.auth_
> _elif conn.login:_
>   _session.auth = (conn.login, conn.password)_
> {quote}
> One would have to fetch the connection themselves, fill out the auth, and 
> then pass it to the hook, which is flexible although a little awkward.
> {quote}api_conn = BaseHook().get_connection('my_api')
> auth = HttpNtlmAuth(api_conn.login, api_conn.password)
> HttpSensor(task_id='sensing', auth=auth, )
> {quote}





[GitHub] [airflow] randr97 opened a new pull request #8429: [AIRFLOW-5156] Added auth type to HttpHook

2020-04-17 Thread GitBox
randr97 opened a new pull request #8429: [AIRFLOW-5156] Added auth type to 
HttpHook
URL: https://github.com/apache/airflow/pull/8429
 
 
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] boring-cyborg[bot] commented on issue #8428: Create isacko

2020-04-17 Thread GitBox
boring-cyborg[bot] commented on issue #8428: Create isacko
URL: https://github.com/apache/airflow/pull/8428#issuecomment-615270313
 
 
   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally; it's a heavy Docker environment, but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better .
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://apache-airflow-slack.herokuapp.com/
   




[GitHub] [airflow] Isackosharamo opened a new pull request #8428: Create isacko

2020-04-17 Thread GitBox
Isackosharamo opened a new pull request #8428: Create isacko
URL: https://github.com/apache/airflow/pull/8428
 
 
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] potiuk merged pull request #8095: stop rendering some class docs in wrong place

2020-04-17 Thread GitBox
potiuk merged pull request #8095: stop rendering some class docs in wrong place
URL: https://github.com/apache/airflow/pull/8095
 
 
   




[GitHub] [airflow] turbaszek commented on a change in pull request #8377: Use python client in BQ hook create_empty_table method

2020-04-17 Thread GitBox
turbaszek commented on a change in pull request #8377: Use python client in BQ 
hook create_empty_table method
URL: https://github.com/apache/airflow/pull/8377#discussion_r410229210
 
 

 ##
 File path: airflow/providers/google/cloud/example_dags/example_bigquery.py
 ##
 @@ -219,10 +219,10 @@
 # [START howto_operator_bigquery_create_view]
 create_view = BigQueryCreateEmptyTableOperator(
 task_id="create_view",
-dataset_id=LOCATION_DATASET_NAME,
+dataset_id=DATASET_NAME,
 table_id="test_view",
 view={
-"query": "SELECT * FROM `{}.test_table`".format(DATASET_NAME),
+"query": f"SELECT * FROM `{PROJECT_ID}.{DATASET_NAME}.test_table`",
 
 Review comment:
   As you can see on the line below:
   ```
   "useLegacySql": False
   ```




[GitHub] [airflow] kaxil commented on a change in pull request #8393: Bring back CI optimisations

2020-04-17 Thread GitBox
kaxil commented on a change in pull request #8393: Bring back CI optimisations
URL: https://github.com/apache/airflow/pull/8393#discussion_r410119916
 
 

 ##
 File path: .github/workflows/ci.yml
 ##
 @@ -80,134 +80,152 @@ jobs:
 name: Build docs
 runs-on: ubuntu-latest
 env:
-  TRAVIS_JOB_NAME: "Build documentation"
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Documentation"
 steps:
   - uses: actions/checkout@master
-  - name: "Build documentation"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Build docs"
 run: ./scripts/ci/ci_docs.sh
 
   tests-p36-postgres-integrations:
-name: "Tests [Postgres9.6][Py3.6][integrations]"
+name: "[Pg9.6][Py3.6][integrations]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Postgres9.6][Py3.6][integrations]"
   BACKEND: postgres
   PYTHON_VERSION: 3.6
   POSTGRES_VERSION: 9.6
   ENABLED_INTEGRATIONS: "cassandra kerberos mongo openldap rabbitmq redis"
   RUN_INTEGRATION_TESTS: all
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Postgres9.6][Py3.6][integrations]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-postgres-providers:
-name: "Tests [Postgres10][Py3.6][providers]"
+name: "[Pg10][Py3.6][prov]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Postgres10][Py3.6][providers]"
   BACKEND: postgres
   POSTGRES_VERSION: 10
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Postgres10][Py3.6][providers]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh tests/providers
 
   tests-p36-postgres-core:
-name: "Tests [Postgres9.6][Py3.6][core]"
+name: "[Pg9.6][Py3.6][core]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Postgres9.6][Py3.6][core]"
   BACKEND: postgres
   POSTGRES_VERSION: 9.6
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Postgres9.6][Py3.6][core]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh --ignore=tests/providers
 
 
   tests-p37-sqlite-integrations:
-name: "Tests [Sqlite][3.7][integrations]"
+name: "[Sqlite][3.7][int]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Sqlite][3.7][integrations]"
   BACKEND: sqlite
   PYTHON_VERSION: 3.7
   ENABLED_INTEGRATIONS: "cassandra kerberos mongo openldap rabbitmq redis"
   RUN_INTEGRATION_TESTS: all
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Sqlite][3.7][integrations]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-sqlite:
-name: "Tests [Sqlite][Py3.6]"
+name: "[Sqlite][Py3.6]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [Sqlite][Py3.6]"
   BACKEND: sqlite
   PYTHON_VERSION: 3.6
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [Sqlite][Py3.6]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-mysql-integrations:
-name: "Tests [MySQL][Py3.6][integrations]"
+name: "[MySQL5.7][Py3.6][int]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [MySQL][Py3.6][integrations]"
   BACKEND: sqlite
   PYTHON_VERSION: 3.6
   MYSQL_VERSION: 5.7
   ENABLED_INTEGRATIONS: "cassandra kerberos mongo openldap rabbitmq redis"
   RUN_INTEGRATION_TESTS: all
+  CI_JOB_TYPE: "Tests"
 steps:
   - uses: actions/checkout@master
-  - name: "Tests [MySQL][Py3.6][integrations]"
+  - name: "Build CI image"
+run: ./scripts/ci/ci_prepare_image_on_ci.sh
+  - name: "Tests"
 run: ./scripts/ci/ci_run_airflow_testing.sh
 
   tests-p36-mysql-providers:
-name: "Tests [MySQL5.7][Py3.7][providers][kerberos]"
+name: "[MySQL5.7][Py3.7][prov][kerb]"
 runs-on: ubuntu-latest
 needs: [statics, statics-tests]
 env:
-  TRAVIS_JOB_NAME: "Tests [MySQL5.7][Py3.7][providers][kerberos]"
   BACKEND: mysql
   

[GitHub] [airflow] turbaszek commented on issue #8399: WIP: Improve idempodency in CloudDataTransferServiceCreateJobOperator

2020-04-17 Thread GitBox
turbaszek commented on issue #8399: WIP: Improve idempodency in 
CloudDataTransferServiceCreateJobOperator
URL: https://github.com/apache/airflow/pull/8399#issuecomment-615238370
 
 
   > But don't you think that in this case, when the job was deleted, the right 
solution is to fail fast?
   
   As a user, when I run a DAG I don't care if someone has deleted the job (for 
example by mistake); I want it to be created. So I would suggest adding a unique 
suffix in the operator / hook method:
   
   ```python
   def create_job(job_name, ...):
   job_name = f"{job_name}-{uuid.uuid4()}"
   ...
   return job_name
   ```
   
   In this way, each job name will be unique (up to UUID uniqueness). 
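The suggestion above can be sketched as a standalone function. This is a hypothetical illustration of the suffix idea, not the real hook method; a real implementation would also call the GCP Transfer Service API before returning:

```python
import uuid


def create_job(job_name: str) -> str:
    """Append a random UUID suffix so repeated calls never collide.

    Sketch of the unique-suffix suggestion; the real hook method would
    also create the job via the Transfer Service API.
    """
    unique_name = f"{job_name}-{uuid.uuid4()}"
    # ... create the job via the API here ...
    return unique_name
```

Two calls with the same base name then yield distinct job names, so re-running the operator after a deletion cannot hit a "job already exists" conflict.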




[GitHub] [airflow] potiuk merged pull request #8415: Fix subcommand error when running production image without argument

2020-04-17 Thread GitBox
potiuk merged pull request #8415: Fix subcommand error when running production 
image without argument
URL: https://github.com/apache/airflow/pull/8415
 
 
   




[GitHub] [airflow] potiuk commented on issue #8415: Fix subcommand error when running production image without argument

2020-04-17 Thread GitBox
potiuk commented on issue #8415: Fix subcommand error when running production 
image without argument
URL: https://github.com/apache/airflow/pull/8415#issuecomment-615224863
 
 
   Thanks!




[GitHub] [airflow] kaxil merged pull request #8427: Make doc clearer about Airflow Variables using Environment Variables

2020-04-17 Thread GitBox
kaxil merged pull request #8427: Make doc clearer about Airflow Variables using 
Environment Variables
URL: https://github.com/apache/airflow/pull/8427
 
 
   




[GitHub] [airflow] kaxil edited a comment on issue #8413: Add back-compat modules from 1.10.10 for SecretsBackends

2020-04-17 Thread GitBox
kaxil edited a comment on issue #8413: Add back-compat modules from 1.10.10 for 
SecretsBackends
URL: https://github.com/apache/airflow/pull/8413#issuecomment-615219456
 
 
   > But there were never any contrib secrets?
   
   Not in master, but they do exist in Airflow 1.10.10, as we don't have a 
"Providers" folder for the v1-10-* series:
   
   https://github.com/apache/airflow/tree/v1-10-test/airflow/contrib/secrets
   
   And we don't want to have the "Providers" folder in 1.10.*, as it might prevent 
implicit namespace packages from getting discovered for our Backport Packages.




[GitHub] [airflow] kaxil commented on issue #8413: Add back-compat modules from 1.10.10 for SecretsBackends

2020-04-17 Thread GitBox
kaxil commented on issue #8413: Add back-compat modules from 1.10.10 for 
SecretsBackends
URL: https://github.com/apache/airflow/pull/8413#issuecomment-615219456
 
 
   > But there were never any contrib secrets?
   
   Not in master, but they do exist in Airflow 1.10.10, as we don't have a 
"Providers" folder for the v1-10-* series:
   
   https://github.com/apache/airflow/tree/v1-10-test/airflow/contrib/secrets




[GitHub] [airflow] BasPH edited a comment on issue #8413: Add back-compat modules from 1.10.10 for SecretsBackends

2020-04-17 Thread GitBox
BasPH edited a comment on issue #8413: Add back-compat modules from 1.10.10 for 
SecretsBackends
URL: https://github.com/apache/airflow/pull/8413#issuecomment-615218599
 
 
   ~But there were never any contrib secrets?~ Never mind, they exist in 1.10.10




[GitHub] [airflow] BasPH commented on issue #8413: Add back-compat modules from 1.10.10 for SecretsBackends

2020-04-17 Thread GitBox
BasPH commented on issue #8413: Add back-compat modules from 1.10.10 for 
SecretsBackends
URL: https://github.com/apache/airflow/pull/8413#issuecomment-615218599
 
 
   But there were never any contrib secrets?




[GitHub] [airflow] kaxil merged pull request #8426: Add a dedicated "free disk space" step to fix CI

2020-04-17 Thread GitBox
kaxil merged pull request #8426: Add a dedicated "free disk space" step to fix 
CI
URL: https://github.com/apache/airflow/pull/8426
 
 
   




[GitHub] [airflow] kaxil commented on a change in pull request #8411: BugFix: DAG trigger via UI error in RBAC UI

2020-04-17 Thread GitBox
kaxil commented on a change in pull request #8411: BugFix: DAG trigger via UI 
error in RBAC UI
URL: https://github.com/apache/airflow/pull/8411#discussion_r410184054
 
 

 ##
 File path: airflow/www_rbac/views.py
 ##
 @@ -1062,6 +1062,7 @@ def trigger(self, session=None):
 conf=conf
 )
 
+dag = dagbag.get_dag(dag_id)
 
 Review comment:
   This only errors if the Webserver does not have access to DAG files (as 
mentioned in the issue, when the DAGs dir was not mounted to the Webserver).
   
   We might have tests that this still works when the config is enabled, but we 
might not have tests for the case where the Webserver has no access to the DAG file.
   
   I have kept the PR in draft so that I can add some tests.
   
   




[GitHub] [airflow] kaxil commented on issue #8413: Add back-compat modules from 1.10.10 for SecretsBackends

2020-04-17 Thread GitBox
kaxil commented on issue #8413: Add back-compat modules from 1.10.10 for 
SecretsBackends
URL: https://github.com/apache/airflow/pull/8413#issuecomment-615211180
 
 
   > Why do we need these? They never existed in thest
   
   Otherwise "contrib" Secrets won't work for people upgrading to Airflow 2.0 
(when that happens)




[GitHub] [airflow] ashb commented on a change in pull request #8411: BugFix: DAG trigger via UI error in RBAC UI

2020-04-17 Thread GitBox
ashb commented on a change in pull request #8411: BugFix: DAG trigger via UI 
error in RBAC UI
URL: https://github.com/apache/airflow/pull/8411#discussion_r410180370
 
 

 ##
 File path: airflow/www_rbac/views.py
 ##
 @@ -1062,6 +1062,7 @@ def trigger(self, session=None):
 conf=conf
 )
 
+dag = dagbag.get_dag(dag_id)
 
 Review comment:
   Lol what? Do we have _no_ tests of this?




[GitHub] [airflow] ephraimbuddy edited a comment on issue #8272: Cloud Life Sciences operator and hook

2020-04-17 Thread GitBox
ephraimbuddy edited a comment on issue #8272: Cloud Life Sciences operator and 
hook
URL: https://github.com/apache/airflow/issues/8272#issuecomment-615209256
 
 
   Hi, I'd like to work on this. I have sent a request to view the [GCP Service 
Airflow Integration 
Guide](https://docs.google.com/document/d/1_rTdJSLCt0eyrAylmmgYc3yZr-_h51fVlnvMmWqhCkY/edit).
   Thanks




[GitHub] [airflow] ephraimbuddy commented on issue #8272: Cloud Life Sciences operator and hook

2020-04-17 Thread GitBox
ephraimbuddy commented on issue #8272: Cloud Life Sciences operator and hook
URL: https://github.com/apache/airflow/issues/8272#issuecomment-615209256
 
 
   I'd like to work on this. I have sent a request to view the [GCP Service 
Airflow Integration 
Guide](https://docs.google.com/document/d/1_rTdJSLCt0eyrAylmmgYc3yZr-_h51fVlnvMmWqhCkY/edit).
   Thanks




[GitHub] [airflow] ashb commented on a change in pull request #8413: Add back-compat modules from 1.10.10 for SecretsBackends

2020-04-17 Thread GitBox
ashb commented on a change in pull request #8413: Add back-compat modules from 
1.10.10 for SecretsBackends
URL: https://github.com/apache/airflow/pull/8413#discussion_r410179769
 
 

 ##
 File path: airflow/contrib/secrets/__init__.py
 ##
 @@ -0,0 +1,26 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This package is deprecated. Please use `airflow.secrets` or 
`airflow.providers.*.secrets`."""
+
+import warnings
+
+warnings.warn(
+"This package is deprecated. Please use `airflow.secrets` or 
`airflow.providers.*.secrets`.",
+DeprecationWarning,
+stacklevel=2,
+)
 
 Review comment:
   ```suggestion
   ```
   I would probably say no deprecation warning is needed here: this will issue 
two deprecation warnings as a result, because importing any package under this 
one will have to import this too.
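The double-warning effect described above is easy to demonstrate: Python always imports a package's `__init__.py` before any of its submodules, so a warning in both places fires twice. The package and module names below are made up purely for illustration:

```python
import os
import sys
import tempfile
import warnings

# Build a throwaway package whose __init__ and submodule each warn on import,
# mimicking a deprecated shim package plus a deprecated shim module.
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "legacy_pkg")  # hypothetical package name
os.makedirs(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write(
        "import warnings\n"
        "warnings.warn('legacy_pkg is deprecated', DeprecationWarning, stacklevel=2)\n"
    )
with open(os.path.join(pkg, "backend.py"), "w") as f:
    f.write(
        "import warnings\n"
        "warnings.warn('legacy_pkg.backend is deprecated', DeprecationWarning, stacklevel=2)\n"
    )

sys.path.insert(0, tmp)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    import legacy_pkg.backend  # noqa: F401

# Importing only the submodule still triggered BOTH warnings: the package's
# __init__ runs first, then the submodule body.
print(len(caught))
```

This is why dropping the package-level warning (as the review suggestion does) leaves exactly one warning per deprecated module import.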




[GitHub] [airflow] turbaszek commented on issue #8052: [AIP-31] Create XComArg model

2020-04-17 Thread GitBox
turbaszek commented on issue #8052: [AIP-31] Create XComArg model
URL: https://github.com/apache/airflow/issues/8052#issuecomment-615206932
 
 
   Hey @casassg any update on this issue? :)




[GitHub] [airflow] ivorynoise commented on issue #7724: [AIRFLOW-1536] Inherit umask from parent process in daemon mode

2020-04-17 Thread GitBox
ivorynoise commented on issue #7724: [AIRFLOW-1536] Inherit umask from parent 
process in daemon mode
URL: https://github.com/apache/airflow/pull/7724#issuecomment-615201304
 
 
   @mik-laj I never meant to close this. I am still struggling to understand the 
git workflow, so it just happened. I'll surely update this one; please allow me 
until the end of today.




[GitHub] [airflow] kaxil opened a new pull request #8427: Make doc cleared about Airflow Variables using Environment Variables

2020-04-17 Thread GitBox
kaxil opened a new pull request #8427: Make doc cleared about Airflow Variables 
using Environment Variables
URL: https://github.com/apache/airflow/pull/8427
 
 
   One of the users had this confusion on Slack, and I feel it is better to 
mention this explicitly.
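For context, the convention the doc change describes is that Airflow resolves a Variable named `my_key` from an environment variable `AIRFLOW_VAR_MY_KEY`, checked before the metadata database. A minimal sketch of the naming rule (this helper only illustrates the lookup convention; it is not Airflow's implementation):

```python
import os


def get_variable_from_env(key: str):
    """Illustrate the AIRFLOW_VAR_<KEY> naming convention for Variables."""
    return os.environ.get(f"AIRFLOW_VAR_{key.upper()}")


# Setting the env var makes the Variable visible without touching the DB.
os.environ["AIRFLOW_VAR_MY_KEY"] = "some-value"
print(get_variable_from_env("my_key"))  # some-value
```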
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] khyurri commented on issue #8399: WIP: Improve idempodency in CloudDataTransferServiceCreateJobOperator

2020-04-17 Thread GitBox
khyurri commented on issue #8399: WIP: Improve idempodency in 
CloudDataTransferServiceCreateJobOperator
URL: https://github.com/apache/airflow/pull/8399#issuecomment-615196904
 
 
   @turbaszek 
   In this case, if the user does not change `JOB_NAME` in 
`aws_to_gcs_transfer_body` in the DAG configuration, we will not achieve 
idempotency.
   
   For example, I've got DAG configuration:
   ```
   aws_to_gcs_transfer_body = {
   ...
   JOB_NAME: "transferJobs/helloJobName",
   ...
   }
   ```
   
   Then I've deleted this transfer job using GCP UI.
   
   Every subsequent DAG run will create a new job, because `transferJobs/helloJobName` 
is deleted and we would have to add a suffix to its name.
   
   We could list all jobs and check whether a given job matches our format 
(job_name + suffix). 
   
   But don't you think that in this case, when the job was deleted, the right 
solution is to fail fast? 
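The "match job_name + suffix" check mentioned above could look like the following sketch. The base name and the UUID4-suffix format are assumptions here (matching the suffix idea proposed in this thread), not an existing hook API:

```python
import re

# Regex for a version-4 UUID in its canonical lowercase hex form.
UUID4_RE = (
    r"[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}"
    r"-[89ab][0-9a-f]{3}-[0-9a-f]{12}"
)


def matching_jobs(base_name, existing_names):
    """Return the job names that are `base_name` plus a UUID4 suffix."""
    pattern = re.compile(re.escape(base_name) + "-" + UUID4_RE + r"\Z")
    return [name for name in existing_names if pattern.match(name)]
```

With such a check, the operator could reuse a surviving job instead of always creating a new one, at the cost of an extra list call.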
   




[GitHub] [airflow] kaxil edited a comment on issue #8421: Hide sensitive data in UI

2020-04-17 Thread GitBox
kaxil edited a comment on issue #8421: Hide sensitive data in UI
URL: https://github.com/apache/airflow/issues/8421#issuecomment-615191959
 
 
   > > Airflow 1.10.10 allows getting connections from a Vault: 
https://airflow.apache.org/blog/airflow-1.10.10/#allow-retrieving-airflow-connections-variables-from-various-secrets-backend
   > > Does that help your use-case?
   > 
   > So basically, with Airflow 1.10.10, if I configure the airflow.cfg to use 
Hashicorp Vault, I can use connections and variables as usual but instead of 
getting data from the Airflow database, it will get it from Vault?
   
   Exactly. Here is a guide to test it out locally: 
https://www.astronomer.io/guides/airflow-and-hashicorp-vault/ — along with the 
following docs:
   
   - 
https://airflow.apache.org/docs/1.10.10/concepts.html#storing-variables-in-environment-variables
   - 
https://airflow.apache.org/docs/1.10.10/howto/use-alternative-secrets-backend.html#configuration
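Following those docs, enabling the Vault backend comes down to a `[secrets]` section in `airflow.cfg`. A sketch for Airflow 1.10.10 — the URL, mount point, and paths below are placeholder values, and the exact class path should be checked against the linked docs:

```ini
# Sketch: Vault as the secrets backend in airflow.cfg (Airflow 1.10.10).
# All values here are placeholders for a local test setup.
[secrets]
backend = airflow.contrib.secrets.hashicorp_vault.VaultBackend
backend_kwargs = {"url": "http://127.0.0.1:8200", "connections_path": "connections", "variables_path": "variables", "mount_point": "airflow"}
```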




[GitHub] [airflow] kaxil commented on issue #8421: Hide sensitive data in UI

2020-04-17 Thread GitBox
kaxil commented on issue #8421: Hide sensitive data in UI
URL: https://github.com/apache/airflow/issues/8421#issuecomment-615191959
 
 
   > > Airflow 1.10.10 allows getting connections from a Vault: 
https://airflow.apache.org/blog/airflow-1.10.10/#allow-retrieving-airflow-connections-variables-from-various-secrets-backend
   > > Does that help your use-case?
   > 
   > So basically, with Airflow 1.10.10, if I configure the airflow.cfg to use 
Hashicorp Vault, I can use connections and variables as usual but instead of 
getting data from the Airflow database, it will get it from Vault?
   
   Exactly. Here is a guide to test it out locally: 
https://www.astronomer.io/guides/airflow-and-hashicorp-vault/




[GitHub] [airflow] kaxil opened a new pull request #8426: Add a dedicated "free disk space" step to fix CI

2020-04-17 Thread GitBox
kaxil opened a new pull request #8426: Add a dedicated "free disk space" step 
to fix CI
URL: https://github.com/apache/airflow/pull/8426
 
 
   The tests on master and on PRs are failing with:
   ```
   failed to register layer: Error processing tar file(exit status 1): write 
/usr/share/doc/python2.7/README.gz: no space left on device
   ```
   
   This is based on the solution mentioned in 
https://github.community/t5/GitHub-Actions/BUG-Strange-quot-No-space-left-on-device-quot-IOExceptions-on/m-p/47691#M6920
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] n4rk0o commented on issue #8421: Hide sensitive data in UI

2020-04-17 Thread GitBox
n4rk0o commented on issue #8421: Hide sensitive data in UI
URL: https://github.com/apache/airflow/issues/8421#issuecomment-615190998
 
 
   > Airflow 1.10.10 allows getting connections from a Vault: 
https://airflow.apache.org/blog/airflow-1.10.10/#allow-retrieving-airflow-connections-variables-from-various-secrets-backend
   > 
   > Does that help your use-case?
   
   So basically, with Airflow 1.10.10, if I configure the airflow.cfg to use 
Hashicorp Vault, I can use connections and variables as usual but instead of 
getting data from the Airflow database, it will get it from Vault?




[GitHub] [airflow] rtstock commented on issue #8129: API Endpoints - CRUD - DAG Runs

2020-04-17 Thread GitBox
rtstock commented on issue #8129: API Endpoints - CRUD - DAG Runs
URL: https://github.com/apache/airflow/issues/8129#issuecomment-615185291
 
 
   +1, would also love to see historical output (e.g. results of 
admin/airflow/tree?root=_id=) in JSON as well
   




[jira] [Commented] (AIRFLOW-2999) Add S3DownloadOperator

2020-04-17 Thread lovk korm (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17085644#comment-17085644
 ] 

lovk korm commented on AIRFLOW-2999:


[~feluelle] 
There is a download_file function in S3Hook; probably it just needs to be wrapped 
in a new operator called S3DownloadOperator. 

> Add S3DownloadOperator
> --
>
> Key: AIRFLOW-2999
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2999
> Project: Apache Airflow
>  Issue Type: Task
>Affects Versions: 1.10.0
>Reporter: jack
>Priority: Major
>
> The [S3_hook 
> |https://github.com/apache/incubator-airflow/blob/master/airflow/hooks/S3_hook.py#L177]
>  has a get_key method that returns a boto3.s3.Object; it also has a load_file 
> method which loads a file from the local file system to S3.
>  
> What it doesn't have is a method to download a file from S3 to the local file 
> system.
> Basically it should be something very simple... an extension to the get_key 
> method with a parameter for the destination on the local file system, adding 
> code for taking the boto3.s3.Object and saving it on disk. Note that it can be 
> more than one file if the user chooses a folder in S3.
>  
> +*Update:*+
> As discussed in the comments, instead of having this in the hook it's better 
> to mirror the GoogleCloudStorageDownloadOperator and have an S3DownloadOperator
>  
>  
>  
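A hypothetical sketch of such an S3DownloadOperator: a thin operator whose `execute()` delegates to the hook's existing `download_file`. The hook is injected here so the sketch runs without boto3 or Airflow installed; real code would subclass `BaseOperator` and build an `S3Hook` from an `aws_conn_id`, and the `download_file` signature should be checked against the hook.

```python
class S3DownloadOperator:
    """Hypothetical operator wrapping S3Hook.download_file (sketch only)."""

    def __init__(self, key, bucket_name, local_path, hook):
        self.key = key
        self.bucket_name = bucket_name
        self.local_path = local_path
        self.hook = hook  # injected for testability; real code builds an S3Hook

    def execute(self, context=None):
        # Delegate the actual transfer to the hook.
        return self.hook.download_file(
            key=self.key, bucket_name=self.bucket_name, local_path=self.local_path
        )


# Stub hook so the sketch is runnable here; it only echoes the target path.
class _FakeS3Hook:
    def download_file(self, key, bucket_name, local_path):
        return f"{local_path}/{key}"


result = S3DownloadOperator("data.csv", "my-bucket", "/tmp", _FakeS3Hook()).execute()
```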



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] codecov-io commented on issue #8227: Add run_type to DagRun

2020-04-17 Thread GitBox
codecov-io commented on issue #8227: Add run_type to DagRun
URL: https://github.com/apache/airflow/pull/8227#issuecomment-615178713
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/8227?src=pr=h1) 
Report
   > Merging 
[#8227](https://codecov.io/gh/apache/airflow/pull/8227?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/79d3f33c1b65c9c7e7b1a75e25d38cab9aa4517f=desc)
 will **increase** coverage by `0.00%`.
   > The diff coverage is `14.63%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/8227/graphs/tree.svg?width=650=150=pr=WdLKlKHOAU)](https://codecov.io/gh/apache/airflow/pull/8227?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #8227     +/-   ##
   ==========================================
     Coverage    6.23%    6.24%            
   ==========================================
     Files         941      941            
     Lines       45644    45667     +23    
   ==========================================
   + Hits         2846     2851      +5    
   - Misses      42798    42816     +18    
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/8227?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/api/common/experimental/mark\_tasks.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY29tbW9uL2V4cGVyaW1lbnRhbC9tYXJrX3Rhc2tzLnB5)
 | `0.00% <ø> (ø)` | |
   | 
[airflow/api/common/experimental/trigger\_dag.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY29tbW9uL2V4cGVyaW1lbnRhbC90cmlnZ2VyX2RhZy5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/jobs/backfill\_job.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL2JhY2tmaWxsX2pvYi5weQ==)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/jobs/base\_job.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL2Jhc2Vfam9iLnB5)
 | `0.00% <ø> (ø)` | |
   | 
[airflow/jobs/scheduler\_job.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL3NjaGVkdWxlcl9qb2IucHk=)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/operators/subdag\_operator.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc3ViZGFnX29wZXJhdG9yLnB5)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/ti\_deps/deps/dagrun\_id\_dep.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvZGFncnVuX2lkX2RlcC5weQ==)
 | `69.23% <0.00%> (-2.20%)` | :arrow_down: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `0.00% <0.00%> (ø)` | |
   | 
[airflow/models/dag.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvZGFnLnB5)
 | `26.47% <11.11%> (-0.18%)` | :arrow_down: |
   | 
[airflow/models/dagrun.py](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvZGFncnVuLnB5)
 | `27.19% <33.33%> (+0.47%)` | :arrow_up: |
   | ... and [1 
more](https://codecov.io/gh/apache/airflow/pull/8227/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/8227?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/8227?src=pr=footer). 
Last update 
[79d3f33...a5b36b7](https://codecov.io/gh/apache/airflow/pull/8227?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] xiaohan2013 opened a new issue #8425: defaults_args can't take into method of class instance

2020-04-17 Thread GitBox
xiaohan2013 opened a new issue #8425: defaults_args can't take into method of 
class instance 
URL: https://github.com/apache/airflow/issues/8425
 
 
   
   
   
   
   **Apache Airflow version**:  1.10.9
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: amazon ec2
   - **OS** (e.g. from /etc/os-release):
   - **Kernel** (e.g. `uname -a`):Linux version 4.14.165-103.209.amzn1.x86_64
   - **Install tools**:pip3
   - **Others**:
   
   **What happened**: the web UI raised a pickling error, "cannot pickle _thread.RLock objects".
   
   
   
   **What you expected to happen**: 
   
   
   
   **How to reproduce it**:
   1. Create a `Notifier` instance:

          notifier = Notifier(subscribers=[Notifier.MSG], phone_receivers="",
                              email_receivers="xx...@x.com")

   2. Reference its bound method in `default_args`:

          default_args = {
              'owner': 'xxx',
              'depends_on_past': False,
              'start_date': datetime(2020, 4, 6),
              'email': ['x...@xxx.com'],
              'email_on_failure': False,
              'email_on_retry': False,
              'retries': 0,
              'retry_delay': timedelta(minutes=5),
              'on_failure_callback': notifier.failure_callback
          }

   3. Build the DAG:

          dag = DAG('xxx',
                    default_args=default_args,
                    schedule_interval='01 20 * * *')

   4. The UI throws an exception ("cannot pickle _thread.RLock objects"), while the logfile shows a different error: `TypeError: cannot serialize '_io.TextIOWrapper' object`.
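   The failure mode above can be reproduced outside Airflow: a bound method carries its instance with it, so pickling `notifier.failure_callback` tries to pickle everything the `Notifier` holds, including locks and open file handles. A minimal sketch (the `Notifier` below is a hypothetical stand-in for the reporter's class, not its real implementation):

   ```python
   import pickle
   import threading


   class Notifier:
       """Hypothetical stand-in for the reporter's Notifier class."""
       MSG = "msg"

       def __init__(self, subscribers, phone_receivers="", email_receivers=""):
           self.subscribers = subscribers
           # An RLock (or an open file handle) makes the instance unpicklable.
           self._lock = threading.RLock()

       def failure_callback(self, context=None):
           with self._lock:
               print("notifying:", self.subscribers)


   notifier = Notifier(subscribers=[Notifier.MSG])

   # A bound method pickles its instance too, so this fails with a TypeError
   # ("cannot pickle '_thread.RLock' object"; wording varies by Python version):
   try:
       pickle.dumps(notifier.failure_callback)
   except TypeError as exc:
       print("pickling bound method failed:", exc)


   # One workaround: a plain module-level function that builds the Notifier
   # lazily, so nothing unpicklable is attached to the DAG or its tasks.
   def failure_callback(context=None):
       Notifier(subscribers=[Notifier.MSG]).failure_callback(context)


   pickle.dumps(failure_callback)  # plain functions pickle by reference
   ```

   A module-level `failure_callback` like this can be passed as `'on_failure_callback'` in `default_args` without dragging the instance's lock into serialization.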
   
   
   
   
   **Anything else we need to know**:
   
   
   




[GitHub] [airflow] kaxil closed issue #8417: Increase max password length in Airflow Connections

2020-04-17 Thread GitBox
kaxil closed issue #8417: Increase max password length in Airflow Connections
URL: https://github.com/apache/airflow/issues/8417
 
 
   




[GitHub] [airflow] kaxil commented on issue #8417: Increase max password length in Airflow Connections

2020-04-17 Thread GitBox
kaxil commented on issue #8417: Increase max password length in Airflow 
Connections
URL: https://github.com/apache/airflow/issues/8417#issuecomment-615175422
 
 
   We have already increased the limit to 5000 in Airflow 1.10.7
   
   https://github.com/apache/airflow/pull/6241




[GitHub] [airflow] kaxil commented on issue #8421: Hide sensitive data in UI

2020-04-17 Thread GitBox
kaxil commented on issue #8421: Hide sensitive data in UI
URL: https://github.com/apache/airflow/issues/8421#issuecomment-615174481
 
 
   Airflow 1.10.10 allows getting connections from a Vault: 
https://airflow.apache.org/blog/airflow-1.10.10/#allow-retrieving-airflow-connections-variables-from-various-secrets-backend
   
   Does that help your use-case?
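   For reference, enabling a secrets backend is a small `airflow.cfg` change. A sketch under assumptions (the class path, mount point, connections path, and Vault URL below are illustrative placeholders; check the secrets backend documentation for your exact Airflow version):

   ```ini
   [secrets]
   backend = airflow.contrib.secrets.hashicorp_vault.VaultBackend
   backend_kwargs = {"connections_path": "connections", "mount_point": "airflow", "url": "http://127.0.0.1:8200"}
   ```

   With a backend configured, connection credentials live in Vault rather than the metadata database, so they never appear in the Connections UI at all.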




[GitHub] [airflow] boring-cyborg[bot] commented on issue #8424: Exceptions inconsistent with UI and logfile

2020-04-17 Thread GitBox
boring-cyborg[bot] commented on issue #8424: Exceptions inconsistent with UI 
and logfile
URL: https://github.com/apache/airflow/issues/8424#issuecomment-615172070
 
 
   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



