[jira] [Updated] (AIRFLOW-5211) Add pass_value to template_fields -- BigQueryValueCheckOperator

2019-08-13 Thread Damon Liao (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5211?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Damon Liao updated AIRFLOW-5211:

External issue URL: https://github.com/apache/airflow/pull/5816
   Description: There are use cases for filling *pass_value* from *XCom* when 
using *BigQueryValueCheckOperator*, so add pass_value to template_fields.  (was: 
There's use cases to pass_value from xcom when use 
*BigQueryValueCheckOperator*, so add pass_value to template_fields.)

> Add pass_value to template_fields -- BigQueryValueCheckOperator
> ---
>
> Key: AIRFLOW-5211
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5211
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Affects Versions: 1.10.4
>Reporter: Damon Liao
>Assignee: Damon Liao
>Priority: Minor
> Fix For: 1.10.4, 1.10.5
>
>
> There are use cases for filling *pass_value* from *XCom* when using 
> *BigQueryValueCheckOperator*, so add pass_value to template_fields.





[GitHub] [airflow] damon09273 opened a new pull request #5816: Add pass_value to template_fields for BigQueryValueCheckOperator

2019-08-13 Thread GitBox
damon09273 opened a new pull request #5816: Add pass_value to template_fields 
for BigQueryValueCheckOperator
URL: https://github.com/apache/airflow/pull/5816
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-5211) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5211
   
   ### Description
   
   - [ ] There are use cases for filling `pass_value` from `xcom` when using 
*BigQueryValueCheckOperator*, so add pass_value to template_fields.
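
   A hedged illustration of the use case (not part of this PR; the import path 
assumes the 1.10-era contrib module, and the task ids/SQL are made up): once 
`pass_value` is in `template_fields`, it can be rendered from XCom via Jinja.

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.bigquery_check_operator import BigQueryValueCheckOperator

with DAG("bq_value_check_example", start_date=datetime(2019, 8, 1),
         schedule_interval=None) as dag:
    check = BigQueryValueCheckOperator(
        task_id="check_row_count",
        sql="SELECT COUNT(*) FROM `project.dataset.table`",
        # With pass_value in template_fields, this Jinja expression is rendered
        # at runtime, e.g. pulling the expected value from an upstream task's XCom.
        pass_value="{{ ti.xcom_pull(task_ids='compute_expected_count') }}",
        use_legacy_sql=False,
    )
```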
   
   
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[jira] [Created] (AIRFLOW-5211) Add pass_value to template_fields -- BigQueryValueCheckOperator

2019-08-13 Thread Damon Liao (JIRA)
Damon Liao created AIRFLOW-5211:
---

 Summary: Add pass_value to template_fields -- 
BigQueryValueCheckOperator
 Key: AIRFLOW-5211
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5211
 Project: Apache Airflow
  Issue Type: Improvement
  Components: contrib
Affects Versions: 1.10.4
Reporter: Damon Liao
Assignee: Damon Liao
 Fix For: 1.10.5, 1.10.4


There are use cases for filling pass_value from XCom when using 
*BigQueryValueCheckOperator*, so add pass_value to template_fields.



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (AIRFLOW-5210) Resolving Template Files for large DAGs hurts performance

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906809#comment-16906809
 ] 

ASF GitHub Bot commented on AIRFLOW-5210:
-

danfrankj commented on pull request #5815: [AIRFLOW-5210] Make finding template 
files more efficient
URL: https://github.com/apache/airflow/pull/5815
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   https://issues.apache.org/jira/browse/AIRFLOW-5210
   
   ### Description
   
   For large DAGs (1000s of tasks), iterating over template_fields adds 
significant time to task execution and is not necessary for tasks that do not 
specify template_ext.
   
   ### Tests
   
   I can add tests if you think necessary, but this is a _very_ simple change 
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 



> Resolving Template Files for large DAGs hurts performance 
> --
>
> Key: AIRFLOW-5210
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5210
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG
>Affects Versions: 1.10.4
>Reporter: Daniel Frank
>Priority: Major
>
> During task execution, "resolve_template_files" runs for all tasks in a 
> given DAG. For large DAGs this takes a long time and is not necessary for 
> tasks that do not use the template_ext field.





[GitHub] [airflow] danfrankj opened a new pull request #5815: [AIRFLOW-5210] Make finding template files more efficient

2019-08-13 Thread GitBox
danfrankj opened a new pull request #5815: [AIRFLOW-5210] Make finding template 
files more efficient
URL: https://github.com/apache/airflow/pull/5815
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   https://issues.apache.org/jira/browse/AIRFLOW-5210
   
   ### Description
   
   For large DAGs (1000s of tasks), iterating over template_fields adds 
significant time to task execution and is not necessary for tasks that do not 
specify template_ext.
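
   A hedged sketch of the optimization being described, not the actual patch: 
short-circuit template-file resolution for operators that declare no 
`template_ext`, so large DAGs do not pay the per-task cost.

```python
class OperatorStub:
    """Stand-in for a BaseOperator subclass; the real Airflow code differs."""

    template_fields = ("sql",)
    template_ext = ()  # no file extensions, so there is nothing to resolve

    def __init__(self, sql):
        self.sql = sql

    def resolve_template_files(self):
        # Fast path the issue asks for: tasks without template_ext skip the
        # per-field scan entirely, which matters when a DAG has 1000s of tasks.
        if not self.template_ext:
            return
        for attr in self.template_fields:
            value = getattr(self, attr, None)
            if isinstance(value, str) and value.endswith(tuple(self.template_ext)):
                with open(value) as handle:
                    setattr(self, attr, handle.read())


OperatorStub("SELECT 1").resolve_template_files()  # returns immediately
```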
   
   ### Tests
   
   I can add tests if you think necessary, but this is a _very_ simple change 
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[jira] [Created] (AIRFLOW-5210) Resolving Template Files for large DAGs hurts performance

2019-08-13 Thread Daniel Frank (JIRA)
Daniel Frank created AIRFLOW-5210:
-

 Summary: Resolving Template Files for large DAGs hurts performance 
 Key: AIRFLOW-5210
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5210
 Project: Apache Airflow
  Issue Type: Bug
  Components: DAG
Affects Versions: 1.10.4
Reporter: Daniel Frank


During task execution, "resolve_template_files" runs for all tasks in a given 
DAG. For large DAGs this takes a long time and is not necessary for tasks that 
do not use the template_ext field.





[jira] [Commented] (AIRFLOW-4732) Add hook to interact with Google Drive (GoogleDriveHook)

2019-08-13 Thread Omar Al-Jadda (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4732?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906797#comment-16906797
 ] 

Omar Al-Jadda commented on AIRFLOW-4732:


I'm implementing AIRFLOW-5158 to interact specifically with the Google Sheets 
API; please comment there if you have any requests, questions, etc.

> Add hook to interact with Google Drive (GoogleDriveHook)
> 
>
> Key: AIRFLOW-4732
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4732
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: gcp, hooks
>Affects Versions: 1.10.3
>Reporter: jack
>Priority: Major
>
> Having a hook for Google Drive will make it easier for Airflow users to 
> upload/delete/change files in Drive.
>  
> Example for use case and code sample:
> [https://pacuna.io/data-engineering/etl/2018/07/01/etl-sheets-postgres.html]
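
A hedged illustration of the kind of call such a GoogleDriveHook would wrap, 
using google-api-python-client; the function name and parameters below are 
made up for this example, since no Airflow hook exists yet:

```python
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload


def upload_to_drive(credentials, local_path, drive_name):
    """Upload a local file to Google Drive and return the new file id."""
    service = build("drive", "v3", credentials=credentials, cache_discovery=False)
    media = MediaFileUpload(local_path)
    created = service.files().create(
        body={"name": drive_name}, media_body=media, fields="id").execute()
    return created["id"]
```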





[jira] [Updated] (AIRFLOW-5158) Google Sheets hook

2019-08-13 Thread Omar Al-Jadda (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5158?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Omar Al-Jadda updated AIRFLOW-5158:
---
Summary: Google Sheets hook  (was: Google Sheet hook)

> Google Sheets hook
> --
>
> Key: AIRFLOW-5158
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5158
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: hooks
>Affects Versions: 1.10.4
>Reporter: Omar Al-Jadda
>Assignee: Omar Al-Jadda
>Priority: Major
> Fix For: 1.10.5
>
>
> Need a hook that can both read from and write to a Google Sheet.
> Planning to implement the Reading & Writing Cell Values part of the API:
> [https://developers.google.com/sheets/api/guides/values]
> In particular get / update / append, for both single ranges and batches.





[jira] [Assigned] (AIRFLOW-5158) Google Sheet hook

2019-08-13 Thread Omar Al-Jadda (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5158?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Omar Al-Jadda reassigned AIRFLOW-5158:
--

 Assignee: Omar Al-Jadda
Fix Version/s: 1.10.5
  Description: 
Need a hook that can both read from and write to a Google Sheet.

Planning to implement the Reading & Writing Cell Values part of the API:

[https://developers.google.com/sheets/api/guides/values]

In particular get / update / append, for both single ranges and batches.

  was:Need a hook that can both read and write from a google sheet

  Component/s: (was: operators)
   hooks

> Google Sheet hook
> -
>
> Key: AIRFLOW-5158
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5158
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: hooks
>Affects Versions: 1.10.4
>Reporter: Omar Al-Jadda
>Assignee: Omar Al-Jadda
>Priority: Major
> Fix For: 1.10.5
>
>
> Need a hook that can both read from and write to a Google Sheet.
> Planning to implement the Reading & Writing Cell Values part of the API:
> [https://developers.google.com/sheets/api/guides/values]
> In particular get / update / append, for both single ranges and batches.
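
A hedged sketch of what a hook covering the values endpoints might look like; 
the class and method names here are illustrative, not the eventual Airflow API, 
and google-api-python-client is assumed to be installed:

```python
from googleapiclient.discovery import build


class GoogleSheetsHookSketch:
    """Minimal wrapper around spreadsheets().values() get/update/append."""

    def __init__(self, credentials, api_version="v4"):
        self.service = build("sheets", api_version, credentials=credentials,
                             cache_discovery=False)

    def get_values(self, spreadsheet_id, range_):
        response = self.service.spreadsheets().values().get(
            spreadsheetId=spreadsheet_id, range=range_).execute()
        return response.get("values", [])

    def update_values(self, spreadsheet_id, range_, values):
        return self.service.spreadsheets().values().update(
            spreadsheetId=spreadsheet_id, range=range_,
            valueInputOption="RAW", body={"values": values}).execute()

    def append_values(self, spreadsheet_id, range_, values):
        return self.service.spreadsheets().values().append(
            spreadsheetId=spreadsheet_id, range=range_,
            valueInputOption="RAW", body={"values": values}).execute()
```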





[GitHub] [airflow] houqp commented on a change in pull request #5664: [AIRFLOW-5140] fix all missing type annotation errors from dmypy

2019-08-13 Thread GitBox
houqp commented on a change in pull request #5664: [AIRFLOW-5140] fix all 
missing type annotation errors from dmypy
URL: https://github.com/apache/airflow/pull/5664#discussion_r313668084
 
 

 ##
 File path: airflow/settings.py
 ##
 @@ -63,13 +67,13 @@
 LOG_FORMAT = conf.get('core', 'log_format')
 SIMPLE_LOG_FORMAT = conf.get('core', 'simple_log_format')
 
-SQL_ALCHEMY_CONN = None
-DAGS_FOLDER = None
-PLUGINS_FOLDER = None
-LOGGING_CLASS_PATH = None
+SQL_ALCHEMY_CONN = None  # type: Optional[str]
+DAGS_FOLDER = None  # type: Optional[str]
+PLUGINS_FOLDER = None  # type: Optional[str]
 
 Review comment:
   mypy only enforces type checks statically, which means any change that happens 
at runtime is not visible to it. Given these globals (and the hook in 
`test_gcp_cloud_build_hook`) are actually populated after the Python 
interpreter starts running, the Optional is required.
   
   The Optional type hint is added precisely for this use case: the variable is 
initialized as None but is reassigned at runtime.
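
   A small standalone illustration of the point, using the same comment-style 
annotations as the diff above (an assumed example, not Airflow code):

```python
from typing import Optional

SQL_ALCHEMY_CONN = None  # type: Optional[str]


def configure_orm(conn_uri):
    # type: (str) -> None
    # Without Optional[str] above, mypy would reject the initial None value,
    # even though this runtime assignment always happens before the global is used.
    global SQL_ALCHEMY_CONN
    SQL_ALCHEMY_CONN = conn_uri
```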




[GitHub] [airflow] houqp commented on a change in pull request #5664: [AIRFLOW-5140] fix all missing type annotation errors from dmypy

2019-08-13 Thread GitBox
houqp commented on a change in pull request #5664: [AIRFLOW-5140] fix all 
missing type annotation errors from dmypy
URL: https://github.com/apache/airflow/pull/5664#discussion_r313668133
 
 

 ##
 File path: tests/contrib/hooks/test_gcp_cloud_build_hook.py
 ##
 @@ -46,7 +47,7 @@
 
 
 class TestCloudBuildHookWithPassedProjectId(unittest.TestCase):
-hook = None
+hook = None  # type: Optional[CloudBuildHook]
 
 Review comment:
   replied in the other comment :)




[GitHub] [airflow] kaxil commented on a change in pull request #5776: [AIRFLOW-XXX] Group references in one section

2019-08-13 Thread GitBox
kaxil commented on a change in pull request #5776: [AIRFLOW-XXX] Group 
references in one section
URL: https://github.com/apache/airflow/pull/5776#discussion_r313640914
 
 

 ##
 File path: docs/usage-rest-api.rst
 ##
 @@ -0,0 +1,74 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+..http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Using the REST API
+==
+
+This document is meant to give a overview of all common tasks while using an 
REST API.
 
 Review comment:
   ```suggestion
   This document is meant to give an overview of all common tasks while using 
the REST API.
   ```




[GitHub] [airflow] kaxil commented on a change in pull request #5776: [AIRFLOW-XXX] Group references in one section

2019-08-13 Thread GitBox
kaxil commented on a change in pull request #5776: [AIRFLOW-XXX] Group 
references in one section
URL: https://github.com/apache/airflow/pull/5776#discussion_r313641208
 
 

 ##
 File path: docs/usage-rest-api.rst
 ##
 @@ -0,0 +1,74 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+..http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Using the REST API
+==
+
+This document is meant to give a overview of all common tasks while using an 
REST API.
+
+.. note::
+
+API endpoints and samples are described in :doc:`rest-api-ref`.
+
+CLI
+-
+
+For some functions the cli can use the API. To configure the CLI to use the 
API when available
+configure as follows:
+
+.. code-block:: ini
+
+[cli]
+api_client = airflow.api.client.json_client
+endpoint_url = http://:
+
+
+Authentication
+--
+
+Authentication for the API is handled separately to the Web Authentication. 
The default is to not
+require any authentication on the API -- i.e. wide open by default. This is 
not recommended if your
+Airflow webserver is publicly accessible, and you should probably use the deny 
all backend:
 
 Review comment:
   ```suggestion
   Airflow webserver is publicly accessible, and you should probably use the 
``deny_all`` backend:
   ```
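
   As context for the `api_client` setting quoted above, a hedged sketch of 
calling the JSON client directly; the endpoint is a placeholder and the 
`Client` constructor and `trigger_dag` signature are assumed from Airflow 1.10:

```python
from airflow.api.client.json_client import Client

# Point the client at the webserver named by endpoint_url in the [cli] section.
client = Client(api_base_url="http://localhost:8080", auth=None)

# Trigger a DAG run through the experimental REST API.
client.trigger_dag(dag_id="example_bash_operator")
```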




[GitHub] [airflow] kaxil commented on a change in pull request #5776: [AIRFLOW-XXX] Group references in one section

2019-08-13 Thread GitBox
kaxil commented on a change in pull request #5776: [AIRFLOW-XXX] Group 
references in one section
URL: https://github.com/apache/airflow/pull/5776#discussion_r313641004
 
 

 ##
 File path: docs/usage-rest-api.rst
 ##
 @@ -0,0 +1,74 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+..http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Using the REST API
+==
+
+This document is meant to give a overview of all common tasks while using an 
REST API.
+
+.. note::
+
+API endpoints and samples are described in :doc:`rest-api-ref`.
+
+CLI
+-
 
 Review comment:
   ```suggestion
   ---
   ```




[jira] [Resolved] (AIRFLOW-5052) Add the include_deleted parameter to Salesforce Hook

2019-08-13 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5052?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-5052.
-
   Resolution: Fixed
Fix Version/s: 1.10.5

> Add the include_deleted parameter to Salesforce Hook
> 
>
> Key: AIRFLOW-5052
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5052
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.4, 2.0.0, 1.10.5
>Reporter: Wayne Morris
>Assignee: Wayne Morris
>Priority: Major
> Fix For: 1.10.5
>
>
> At present, the Salesforce hook uses the make_query() method to perform queries 
> via the Salesforce API. make_query() accepts a query and uses a wrapper 
> function which calls the query_all() method from simple-salesforce to perform 
> the query.
> The deficiency with the use of the query_all() method is that it does not 
> factor in the inclusion of deleted records, which is an optional parameter in 
> the query_all() method of simple-salesforce, as shown below:
> ```
> # query_all call in the Salesforce hook today
> query_all(query)
> # This should be changed to reflect what simple-salesforce's query_all()
> # does in the background, as follows:
> query_all(self, query, include_deleted=False, **kwargs)
> ```
> The objective of this ticket is to add these parameters to the Salesforce hook 
> so that the hook also has the option of including deleted records from 
> Salesforce in the query.
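
A hedged sketch of how the hook might be used once the parameter is exposed; 
the `include_deleted` keyword on `make_query` is the change this ticket 
proposes, and the connection id is illustrative:

```python
from airflow.contrib.hooks.salesforce_hook import SalesforceHook

hook = SalesforceHook(conn_id="salesforce_default")

# With the proposed change, soft-deleted records can be included in the result,
# mirroring simple-salesforce's query_all(query, include_deleted=True).
results = hook.make_query(
    "SELECT Id, IsDeleted FROM Account",
    include_deleted=True,
)
```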





[jira] [Commented] (AIRFLOW-5052) Add the include_deleted parameter to Salesforce Hook

2019-08-13 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906676#comment-16906676
 ] 

ASF subversion and git services commented on AIRFLOW-5052:
--

Commit 80bd5ff4f612886b9544533b765e99f0d22daaee in airflow's branch 
refs/heads/master from wmorris75
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=80bd5ff ]

[AIRFLOW-5052] Added the include_deleted params to salesforce make_query (#5717)



> Add the include_deleted parameter to Salesforce Hook
> 
>
> Key: AIRFLOW-5052
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5052
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.4, 2.0.0, 1.10.5
>Reporter: Wayne Morris
>Assignee: Wayne Morris
>Priority: Major
>
> At present, the Salesforce hook uses the make_query() method to perform queries 
> via the Salesforce API. make_query() accepts a query and uses a wrapper 
> function which calls the query_all() method from simple-salesforce to perform 
> the query.
> The deficiency with the use of the query_all() method is that it does not 
> factor in the inclusion of deleted records, which is an optional parameter in 
> the query_all() method of simple-salesforce, as shown below:
> ```
> # query_all call in the Salesforce hook today
> query_all(query)
> # This should be changed to reflect what simple-salesforce's query_all()
> # does in the background, as follows:
> query_all(self, query, include_deleted=False, **kwargs)
> ```
> The objective of this ticket is to add these parameters to the Salesforce hook 
> so that the hook also has the option of including deleted records from 
> Salesforce in the query.





[GitHub] [airflow] kaxil merged pull request #5717: [AIRFLOW-5052] Added the include_deleted params to salesforce make_query

2019-08-13 Thread GitBox
kaxil merged pull request #5717: [AIRFLOW-5052] Added the include_deleted 
params to salesforce make_query
URL: https://github.com/apache/airflow/pull/5717
 
 
   




[jira] [Commented] (AIRFLOW-5052) Add the include_deleted parameter to Salesforce Hook

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906675#comment-16906675
 ] 

ASF GitHub Bot commented on AIRFLOW-5052:
-

kaxil commented on pull request #5717: [AIRFLOW-5052] Added the include_deleted 
params to salesforce make_query
URL: https://github.com/apache/airflow/pull/5717
 
 
   
 



> Add the include_deleted parameter to Salesforce Hook
> 
>
> Key: AIRFLOW-5052
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5052
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.4, 2.0.0, 1.10.5
>Reporter: Wayne Morris
>Assignee: Wayne Morris
>Priority: Major
>
> At present, the Salesforce hook uses the make_query() method to perform queries 
> via the Salesforce API. make_query() accepts a query and uses a wrapper 
> function which calls the query_all() method from simple-salesforce to perform 
> the query.
> The deficiency with the use of the query_all() method is that it does not 
> factor in the inclusion of deleted records, which is an optional parameter in 
> the query_all() method of simple-salesforce, as shown below:
> ```
> # query_all call in the Salesforce hook today
> query_all(query)
> # This should be changed to reflect what simple-salesforce's query_all()
> # does in the background, as follows:
> query_all(self, query, include_deleted=False, **kwargs)
> ```
> The objective of this ticket is to add these parameters to the Salesforce hook 
> so that the hook also has the option of including deleted records from 
> Salesforce in the query.





[jira] [Commented] (AIRFLOW-5209) Fix Documentation build

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906670#comment-16906670
 ] 

ASF GitHub Bot commented on AIRFLOW-5209:
-

kaxil commented on pull request #5814: [AIRFLOW-5209] Bump Sphinx version to 
fix doc build
URL: https://github.com/apache/airflow/pull/5814
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following Airflow Jira issues and references them 
in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5209

   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Currently, if you try to build the docs on master or 1.10.4, the build fails 
with the following error:
   
   ```
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
   return method(match, context, next_state)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2753, in underline
   self.section(title, source, style, lineno - 1, messages)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 327, in section
   self.new_subsection(title, lineno, messages)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 395, in new_subsection
   node=section_node, match_titles=True)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 282, in nested_parse
   node=node, match_titles=match_titles)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 196, in run
   results = StateMachineWS.run(self, input_lines, input_offset)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 239, in run
   context, state, transitions)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
   return method(match, context, next_state)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2326, in explicit_markup
   nodelist, blank_finish = self.explicit_construct(match)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2338, in explicit_construct
   return method(self, expmatch)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2081, in directive
   directive_class, match, type_name, option_presets)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2130, in run_directive
   result = directive_instance.run()
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 121, in run
   documenter_options = process_documenter_options(doccls, self.config, 
self.options)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 73, in process_documenter_options
   return Options(assemble_option_dict(options.items(), 
documenter.option_spec))
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/utils/__init__.py",
 line 328, in assemble_option_dict
   options[name] = convertor(value)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
   return [x.strip() for x in arg.split(',')]
   AttributeError: 'bool' object has no attribute 'split'
   
   Exception occurred:
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
   return [x.strip() for x in arg.split(',')]
   AttributeError: 'bool' object has no attribute 'split'
   ```
   
   Our doc build on RTD also fails with the same error: 
https://readthedocs.org/projects/airflow/builds/9511663/
   
   This is caused when the Sphinx version is < 2.
   
   Using the latest Sphinx version solves this for us.

[GitHub] [airflow] kaxil opened a new pull request #5814: [AIRFLOW-5209] Bump Sphinx version to fix doc build

2019-08-13 Thread GitBox
kaxil opened a new pull request #5814: [AIRFLOW-5209] Bump Sphinx version to 
fix doc build
URL: https://github.com/apache/airflow/pull/5814
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following Airflow Jira issues and references them 
in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5209

   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Currently, if you try to build the docs on master or 1.10.4, the build fails 
with the following error:
   
   ```
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
   return method(match, context, next_state)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2753, in underline
   self.section(title, source, style, lineno - 1, messages)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 327, in section
   self.new_subsection(title, lineno, messages)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 395, in new_subsection
   node=section_node, match_titles=True)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 282, in nested_parse
   node=node, match_titles=match_titles)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 196, in run
   results = StateMachineWS.run(self, input_lines, input_offset)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 239, in run
   context, state, transitions)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
   return method(match, context, next_state)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2326, in explicit_markup
   nodelist, blank_finish = self.explicit_construct(match)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2338, in explicit_construct
   return method(self, expmatch)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2081, in directive
   directive_class, match, type_name, option_presets)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2130, in run_directive
   result = directive_instance.run()
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 121, in run
   documenter_options = process_documenter_options(doccls, self.config, 
self.options)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 73, in process_documenter_options
   return Options(assemble_option_dict(options.items(), 
documenter.option_spec))
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/utils/__init__.py",
 line 328, in assemble_option_dict
   options[name] = convertor(value)
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
   return [x.strip() for x in arg.split(',')]
   AttributeError: 'bool' object has no attribute 'split'
   
   Exception occurred:
 File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
   return [x.strip() for x in arg.split(',')]
   AttributeError: 'bool' object has no attribute 'split'
   ```
   
   Our doc build on RTD also fails with the same error: 
https://readthedocs.org/projects/airflow/builds/9511663/
   
   This is caused when the Sphinx version is < 2.
   
   Using the latest Sphinx version solves this for us.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   N/a
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue.

[jira] [Updated] (AIRFLOW-5209) Fix Documentation build

2019-08-13 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-5209:

Description: 
Currently, if you try to build the docs on master or 1.10.4, the build fails with 
the following error:


{noformat}
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
return method(match, context, next_state)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2753, in underline
self.section(title, source, style, lineno - 1, messages)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 327, in section
self.new_subsection(title, lineno, messages)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 395, in new_subsection
node=section_node, match_titles=True)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 282, in nested_parse
node=node, match_titles=match_titles)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 196, in run
results = StateMachineWS.run(self, input_lines, input_offset)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 239, in run
context, state, transitions)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
return method(match, context, next_state)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2326, in explicit_markup
nodelist, blank_finish = self.explicit_construct(match)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2338, in explicit_construct
return method(self, expmatch)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2081, in directive
directive_class, match, type_name, option_presets)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2130, in run_directive
result = directive_instance.run()
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 121, in run
documenter_options = process_documenter_options(doccls, self.config, 
self.options)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 73, in process_documenter_options
return Options(assemble_option_dict(options.items(), 
documenter.option_spec))
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/utils/__init__.py",
 line 328, in assemble_option_dict
options[name] = convertor(value)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
return [x.strip() for x in arg.split(',')]
AttributeError: 'bool' object has no attribute 'split'

Exception occurred:
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
return [x.strip() for x in arg.split(',')]
AttributeError: 'bool' object has no attribute 'split'
{noformat}

Our doc build on RTD also fails with the same error: 
https://readthedocs.org/projects/airflow/builds/9511663/

This is caused when the Sphinx version is < 2.

Using the latest Sphinx version solves this for us.

  was:
Currently, if you try to build the docs on master or 1.10.4, the build fails with 
the following error:


{noformat}
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
return method(match, context, next_state)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2753, in underline
self.section(title, source, style, lineno - 1, messages)
  File 

[jira] [Created] (AIRFLOW-5209) Fix Documentation build

2019-08-13 Thread Kaxil Naik (JIRA)
Kaxil Naik created AIRFLOW-5209:
---

 Summary: Fix Documentation build
 Key: AIRFLOW-5209
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5209
 Project: Apache Airflow
  Issue Type: Bug
  Components: dependencies
Affects Versions: 1.10.4
Reporter: Kaxil Naik
Assignee: Kaxil Naik
 Fix For: 1.10.5


Currently, if you try to build the docs on master or 1.10.4, the build fails with 
the following error:


{noformat}
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
return method(match, context, next_state)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2753, in underline
self.section(title, source, style, lineno - 1, messages)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 327, in section
self.new_subsection(title, lineno, messages)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 395, in new_subsection
node=section_node, match_titles=True)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 282, in nested_parse
node=node, match_titles=match_titles)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 196, in run
results = StateMachineWS.run(self, input_lines, input_offset)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 239, in run
context, state, transitions)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/statemachine.py",
 line 460, in check_line
return method(match, context, next_state)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2326, in explicit_markup
nodelist, blank_finish = self.explicit_construct(match)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2338, in explicit_construct
return method(self, expmatch)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2081, in directive
directive_class, match, type_name, option_presets)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/parsers/rst/states.py",
 line 2130, in run_directive
result = directive_instance.run()
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 121, in run
documenter_options = process_documenter_options(doccls, self.config, 
self.options)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/directive.py",
 line 73, in process_documenter_options
return Options(assemble_option_dict(options.items(), 
documenter.option_spec))
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/docutils/utils/__init__.py",
 line 328, in assemble_option_dict
options[name] = convertor(value)
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
return [x.strip() for x in arg.split(',')]
AttributeError: 'bool' object has no attribute 'split'

Exception occurred:
  File 
"/home/docs/checkouts/readthedocs.org/user_builds/airflow/envs/latest/lib/python3.7/site-packages/sphinx/ext/autodoc/__init__.py",
 line 82, in members_option
return [x.strip() for x in arg.split(',')]
AttributeError: 'bool' object has no attribute 'split'
{noformat}

This is caused when the Sphinx version is < 2.

Using the latest Sphinx version solves this for us.
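
A hedged sketch of the kind of pin involved; the exact file, extras name, and 
versions in the PR may differ:

```python
# In setup.py, the docs extra would require a Sphinx 2.x release, e.g.:
doc = [
    'sphinx>=2.1.2',  # Sphinx < 2 hits the autodoc members_option AttributeError above
    'sphinx-argparse',
    'sphinx-rtd-theme',
]
```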





[jira] [Commented] (AIRFLOW-4526) KubernetesPodOperator gets stuck in Running state when get_logs is set to True and there is a long gap without logs from pod

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4526?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906667#comment-16906667
 ] 

ASF GitHub Bot commented on AIRFLOW-4526:
-

thealmightygrant commented on pull request #5813: [AIRFLOW-4526] 
KubernetesPodOperator gets stuck in Running state when get_logs is set to True 
and there is a long gap without logs from pod
URL: https://github.com/apache/airflow/pull/5813
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-4526) issues and references 
them in the PR title.
 - https://issues.apache.org/jira/browse/AIRFLOW-4526
   
   ### Description
   
   - [x] I delved through the airflow/k8s python code, and it looks as if we 
can alleviate the above referenced JIRA issue by adding the ability for users 
to supply a timeout for the connection to the logs of the image running in the 
KubernetesPodOperator. From what I can glean of the codebase, the 
`pod_launcher` calls 
[`read_namespaced_pod_log`](https://github.com/apache/airflow/blob/5899cec01e0ea69a54e650a9e1abdbcd5370e120/airflow/kubernetes/pod_launcher.py#L149),
 which then calls the k8s API via a `GET` request to 
`/api/v1/namespaces/{namespace}/pods/{name}/log`. This `GET` request is 
fulfilled by the API client through a 
[`request`](https://github.com/kubernetes-client/python/blob/master/kubernetes/client/api_client.py#L354).
 Every request accepts several options, one of which is 
`_preload_content` which is used in Airflow currently. In this PR, I am adding 
the ability to use another option `_request_timeout`. This `_request_timeout` 
is fulfilled by the `RESTClientObject` which is implemented 
[here](https://github.com/kubernetes-client/python/blob/master/kubernetes/client/rest.py#L57).
 Within the REST client, the `_request_timeout` is taken care of 
[here](https://github.com/kubernetes-client/python/blob/master/kubernetes/client/rest.py#L142)
 by `urllib3` using a 
[`Timeout`](https://urllib3.readthedocs.io/en/latest/user-guide.html#using-timeouts).
 
   
   A `urllib3` Timeout allows you to "control how long (in seconds) requests 
are allowed to run before being aborted". This means that when logs are not 
received within 10 minutes, this call will properly be aborted and the retry 
logic recently implemented in 
[PR-5284](https://github.com/apache/airflow/pull/5284) will then kick in. After 
three retries, as far as I can tell, the pod operator task would then be marked 
as a failure.
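
   A hedged sketch of the underlying idea with the raw Kubernetes Python client 
(pod name, namespace, and the timeout tuple are illustrative, and the PR's 
actual parameter plumbing may differ):

```python
from kubernetes import client, config

config.load_kube_config()
core_api = client.CoreV1Api()

# (connect timeout, read timeout) in seconds: urllib3 aborts the request when no
# data arrives within the read timeout instead of hanging indefinitely, letting
# the pod operator's retry logic take over.
logs = core_api.read_namespaced_pod_log(
    name="my-pod",
    namespace="default",
    follow=True,
    _preload_content=False,
    _request_timeout=(60, 600),
)
```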
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: this is an extremely hard to reproduce failure 
case...we have been unable to reproduce it in a reliable manner during testing, 
but it does occur on long running tasks (> 1hr). Any task that runs for a long 
period of time and has connection issues is not easy to test. 
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 



> KubernetesPodOperator gets stuck in Running state when get_logs is set to 
> True and there is a long gap without logs from pod
> 
>
> Key: AIRFLOW-4526
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4526
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
> Environment: Azure Kubernetes 

[GitHub] [airflow] thealmightygrant opened a new pull request #5813: [AIRFLOW-4526] KubernetesPodOperator gets stuck in Running state when get_logs is set to True and there is a long gap without logs

2019-08-13 Thread GitBox
thealmightygrant opened a new pull request #5813: [AIRFLOW-4526] 
KubernetesPodOperator gets stuck in Running state when get_logs is set to True 
and there is a long gap without logs from pod
URL: https://github.com/apache/airflow/pull/5813
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-4526) issues and references 
them in the PR title.
 - https://issues.apache.org/jira/browse/AIRFLOW-4526
   
   ### Description
   
   - [x] I delved through the airflow/k8s python code, and it looks as if we 
can alleviate the above referenced JIRA issue by adding the ability for users 
to supply a timeout for the connection to the logs of the image running in the 
KubernetesPodOperator. From what I can glean of the codebase, the 
`pod_launcher` calls 
[`read_namespaced_pod_log`](https://github.com/apache/airflow/blob/5899cec01e0ea69a54e650a9e1abdbcd5370e120/airflow/kubernetes/pod_launcher.py#L149),
 which then calls the k8s API via a `GET` request to 
`/api/v1/namespaces/{namespace}/pods/{name}/log`. This `GET` request is 
fulfilled by the API client through a 
[`request`](https://github.com/kubernetes-client/python/blob/master/kubernetes/client/api_client.py#L354).
 Every request accepts several options, one of which is 
`_preload_content` which is used in Airflow currently. In this PR, I am adding 
the ability to use another option `_request_timeout`. This `_request_timeout` 
is fulfilled by the `RESTClientObject` which is implemented 
[here](https://github.com/kubernetes-client/python/blob/master/kubernetes/client/rest.py#L57).
 Within the REST client, the `_request_timeout` is taken care of 
[here](https://github.com/kubernetes-client/python/blob/master/kubernetes/client/rest.py#L142)
 by `urllib3` using a 
[`Timeout`](https://urllib3.readthedocs.io/en/latest/user-guide.html#using-timeouts).
 
   
   A `urllib3` Timeout allows you to "control how long (in seconds) requests 
are allowed to run before being aborted". This means that when logs are not 
received within 10 minutes, this call will properly be aborted and the retry 
logic recently implemented in 
[PR-5284](https://github.com/apache/airflow/pull/5284) will then kick in. After 
three retries, as far as I can tell, the pod operator task would then be marked 
as a failure.
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: this is an extremely hard to reproduce failure 
case...we have been unable to reproduce it in a reliable manner during testing, 
but it does occur on long running tasks (> 1hr). Any task that runs for a long 
period of time and has connection issues is not easy to test. 
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[GitHub] [airflow] dimberman commented on issue #5481: [AIRFLOW-4851] Refactor K8S codebase with k8s API models

2019-08-13 Thread GitBox
dimberman commented on issue #5481: [AIRFLOW-4851] Refactor K8S codebase with 
k8s API models
URL: https://github.com/apache/airflow/pull/5481#issuecomment-521027225
 
 
   @davlum I appreciate your being conservative about this. I'm fine to wait on 
more testing as I agree this does affect a lot of things. Let me know when 
you're ready for me to review again.




[jira] [Resolved] (AIRFLOW-4222) Add cli autocomplete

2019-08-13 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4222?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-4222.
-
   Resolution: Fixed
Fix Version/s: 1.10.5

> Add cli autocomplete
> 
>
> Key: AIRFLOW-4222
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4222
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 1.10.2
>Reporter: zhongjiajie
>Assignee: Kaxil Naik
>Priority: Major
>  Labels: patch
> Fix For: 1.10.5
>
>
> The Airflow CLI cannot autocomplete so far; this ticket will add autocompletion 
> to make the CLI more user-friendly.





[jira] [Commented] (AIRFLOW-4222) Add cli autocomplete

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4222?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906633#comment-16906633
 ] 

ASF GitHub Bot commented on AIRFLOW-4222:
-

kaxil commented on pull request #5789: [AIRFLOW-4222] Add cli autocomplete for 
bash & zsh
URL: https://github.com/apache/airflow/pull/5789
 
 
   
 



> Add cli autocomplete
> 
>
> Key: AIRFLOW-4222
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4222
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 1.10.2
>Reporter: zhongjiajie
>Assignee: Kaxil Naik
>Priority: Major
>  Labels: patch
>
> The Airflow CLI cannot autocomplete so far; this ticket will add autocompletion 
> to make the CLI more user-friendly.





[jira] [Commented] (AIRFLOW-4222) Add cli autocomplete

2019-08-13 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4222?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906635#comment-16906635
 ] 

ASF subversion and git services commented on AIRFLOW-4222:
--

Commit 44eb89d672784107c317cfe882c45d45d302621f in airflow's branch 
refs/heads/master from Kaxil Naik
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=44eb89d ]

[AIRFLOW-4222] Add cli autocomplete for bash & zsh (#5789)



> Add cli autocomplete
> 
>
> Key: AIRFLOW-4222
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4222
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 1.10.2
>Reporter: zhongjiajie
>Assignee: Kaxil Naik
>Priority: Major
>  Labels: patch
>
> The Airflow CLI cannot autocomplete so far; this ticket will add autocompletion to 
> make the CLI more user-friendly



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] kaxil commented on a change in pull request #5789: [AIRFLOW-4222] Add cli autocomplete for bash & zsh

2019-08-13 Thread GitBox
kaxil commented on a change in pull request #5789: [AIRFLOW-4222] Add cli 
autocomplete for bash & zsh
URL: https://github.com/apache/airflow/pull/5789#discussion_r313623139
 
 

 ##
 File path: docs/howto/cli-completion.rst
 ##
 @@ -0,0 +1,42 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+..http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Set Up Bash/Zsh Completion
+==========================
+
+When using bash (or ``zsh``) as your shell, ``airflow`` can use
+`argcomplete `_ for auto-completion.
+
+For global activation of all argcomplete enabled python applications run:
+
+.. code-block:: bash
+
+  sudo activate-global-python-argcomplete
+
+For permanent (but not global) airflow activation, use:
+
+.. code-block:: bash
+
+  register-python-argcomplete airflow >> ~/.bashrc
+
+For one-time activation of argcomplete for airflow only, use:
+
+.. code-block:: bash
+
+  eval "$(register-python-argcomplete airflow)"
 
 Review comment:
   Well, this isn't as trivial as I thought it would be, because just running 
`airflow` prints out the following:
   
   ```
   
/Users/kaxilnaik/Documents/GitHub/incubator-airflow/airflow/models/dagbag.py:21:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
 import imp
   [2019-08-13 22:30:04,803] {settings.py:173} INFO - settings.configure_orm(): 
Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=86597
   
/Users/kaxilnaik/.virtualenvs/airflow_cli_test/lib/python3.7/site-packages/psycopg2/__init__.py:144:
 UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in 
order to keep installing from binary please use "pip install psycopg2-binary" 
instead. For details see: 
.
 """)
   [2019-08-13 22:30:04,876] {__init__.py:51} INFO - Using executor 
LocalExecutor
   ```
   
   **Example implementation**:
   
![image](https://user-images.githubusercontent.com/8811558/62978759-eeb7e600-be19-11e9-89d5-ada08536adae.png)
   
   
   If we then run `eval $(airflow autocomplete)` it gives the following error:
   ```
   
/Users/kaxilnaik/Documents/GitHub/incubator-airflow/airflow/models/dagbag.py:21:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
 import imp
   
/Users/kaxilnaik/.virtualenvs/airflow_cli_test/lib/python3.7/site-packages/psycopg2/__init__.py:144:
 UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in 
order to keep installing from binary please use "pip install psycopg2-binary" 
instead. For details see: 
.
 """)
   zsh: bad pattern: [2019-08-13
   ```
   
   as it is trying to parse the log line `[2019-08-13 22:29:39,045] 
{settings.py:173} INFO - settings.configure_orm(): Using pool settings. 
pool_size=5, max_overflow=10, pool_recycle=1800, pid=86465`
   
   Hence, not implementing this as part of this PR. Let's revisit this later.
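
   (For context: a minimal, hypothetical sketch of how argcomplete hooks into an 
argparse-based CLI; it is not the actual Airflow entrypoint and the subcommand 
names are illustrative only. It also shows why any output produced at import 
time ends up in whatever the shell tries to eval, which is the failure 
described above.)

   ```
   # PYTHON_ARGCOMPLETE_OK  -- marker that argcomplete's global activation scans for
   # Hypothetical sketch, not the actual Airflow CLI. argcomplete.autocomplete()
   # must run before parse_args(); any log lines printed while the module imports
   # would be mixed into the output of a hypothetical `airflow autocomplete`
   # subcommand, which is why eval-ing that output chokes on the bracketed
   # timestamp shown above.
   import argparse

   import argcomplete  # third-party: pip install argcomplete


   def get_parser():
       parser = argparse.ArgumentParser(prog="airflow")
       subparsers = parser.add_subparsers(dest="subcommand")
       subparsers.add_parser("webserver")  # illustrative subcommands only
       subparsers.add_parser("scheduler")
       return parser


   if __name__ == "__main__":
       parser = get_parser()
       argcomplete.autocomplete(parser)  # emits completions and exits when called by the shell
       args = parser.parse_args()
   ```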


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil merged pull request #5789: [AIRFLOW-4222] Add cli autocomplete for bash & zsh

2019-08-13 Thread GitBox
kaxil merged pull request #5789: [AIRFLOW-4222] Add cli autocomplete for bash & 
zsh
URL: https://github.com/apache/airflow/pull/5789
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] davlum commented on issue #5481: [AIRFLOW-4851] Refactor K8S codebase with k8s API models

2019-08-13 Thread GitBox
davlum commented on issue #5481: [AIRFLOW-4851] Refactor K8S codebase with k8s 
API models
URL: https://github.com/apache/airflow/pull/5481#issuecomment-521016863
 
 
   @dimberman Tests are passing, but I'd like to add some more, as there were a 
significant number of changes, all of which I believe are backwards compatible. 
I also noted that a number of the integration tests don't actually assert 
anything; for example, [this integration test that changes the security 
context](https://github.com/apache/airflow/blob/3e2a02751cf890b780bc26b40c7cee7f1f4e0bd9/airflow/contrib/example_dags/example_kubernetes_executor_config.py#L83)
 exercises a feature that isn't actually supported, so the user never gets 
changed. My changes support the argument `security_context` to the 
executor_config, and I tried a version which also supports `securityContext`, 
but this caused the DAG run to fail, as user 1000 didn't have access to the 
`/root/airflow` directory.
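
   (A hedged sketch of what the `security_context` pass-through described above 
could look like from a DAG author's side; the nesting and key names are 
assumptions based on this comment, not verified against the PR.)

   ```
   # Hedged sketch only: the "security_context" key and its nesting are assumed
   # from the discussion above and may differ from the merged change. It shows
   # the general shape of feeding a pod security context to the
   # KubernetesExecutor via executor_config.
   from datetime import datetime

   from airflow import DAG
   from airflow.operators.python_operator import PythonOperator

   with DAG("security_context_sketch",
            start_date=datetime(2019, 8, 1),
            schedule_interval=None) as dag:
       run_as_1000 = PythonOperator(
           task_id="run_as_1000",
           python_callable=lambda: None,
           executor_config={
               "KubernetesExecutor": {
                   "security_context": {"run_as_user": 1000},  # assumed key names
               }
           },
       )
   ```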


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kanetkarster commented on issue #5812: [AIRFLOW-4449] updated default permission for custom roles to 'Viewer'

2019-08-13 Thread GitBox
kanetkarster commented on issue #5812: [AIRFLOW-4449] updated default 
permission for custom roles to 'Viewer'
URL: https://github.com/apache/airflow/pull/5812#issuecomment-521007814
 
 
   cc @feng-tao - this follows up from your change in #3197


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kanetkarster commented on a change in pull request #5812: [AIRFLOW-4449] updated default permission for custom roles to 'Viewer'

2019-08-13 Thread GitBox
kanetkarster commented on a change in pull request #5812: [AIRFLOW-4449] 
updated default permission for custom roles to 'Viewer'
URL: https://github.com/apache/airflow/pull/5812#discussion_r313611238
 
 

 ##
 File path: airflow/www/security.py
 ##
 @@ -416,7 +416,7 @@ def merge_pv(perm, view_menu):
 # for all the dag-level role, add the permission of viewer
 
 Review comment:
   this comment is now up-to-date


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4449) Default permissions for custom roles

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4449?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906610#comment-16906610
 ] 

ASF GitHub Bot commented on AIRFLOW-4449:
-

kanetkarster commented on pull request #5812: [AIRFLOW-4449] updated default 
permission for custom roles to 'Viewer'
URL: https://github.com/apache/airflow/pull/5812
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
   - https://issues.apache.org/jira/browse/AIRFLOW-4449
   
   ### Description
   
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Previously, it was impossible to create a custom role with permissions
   less than viewer. This was suboptimal if, for example, we wanted to
   create a user without the `set_success` permission.
   
   This also makes the behavior of the code match the associated comment.
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: it's a pretty trivial change and is already covered 
by `test_init_role_baseview`
   
   ![manually tested that the web interface behaved as 
expected](https://user-images.githubusercontent.com/3322724/62976495-b8af3d80-bdea-11e9-836e-8ffdf548274a.png)
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Default permissions for custom roles
> 
>
> Key: AIRFLOW-4449
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4449
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: database, webserver
>Reporter: Alec Taggart
>Assignee: Tao Feng
>Priority: Minor
> Attachments: Custom role post default addition.png, Custom role pre 
> default addition.png
>
>
> By default, there are 4 core airflow user roles. These roles are well made 
> and perform nicely. However, adding new custom roles seems to (by default) 
> apply all "User" permissions to the new custom role. I attached some 
> screen-shots showing custom roles being changed by the web server to include 
> default "User" permissions. This is an issue as it prevents strict control of 
> specific pipelines. At most, default permissions applied to custom roles 
> should only include viewing privileges. This way the system admins can add 
> read/edit/pause/etc. permissions for specific dags. 
>  
> I suggest changing the default permissions that are applied to all custom 
> roles to a list of permissions similar to the "Viewer" role, OR simply not 
> applying default permissions to custom roles at all and letting admins handle 
> assigning permissions or multiple custom roles to users. The latter is 
> definitely the preferred functionality. 
> Please note I am not suggesting a removal on the four base roles that come 
> with airflow, simply different behavior when creating new roles. 
> Below is a list of changed permissions to apply to custom roles if it is 
> decided this is the best approach. (very similar to "Viewer" role) 
> [can tries on Airflow, can graph on Airflow, can task on Airflow, can code on 
> Airflow, can duration on Airflow, can landing times on Airflow, can pickle 
> info on Airflow, can tree on Airflow, can rendered on Airflow, can gantt on 
> Airflow, can blocked on Airflow, can task instances on Airflow, can 

[GitHub] [airflow] kanetkarster opened a new pull request #5812: [AIRFLOW-4449] updated default permission for custom roles to 'Viewer'

2019-08-13 Thread GitBox
kanetkarster opened a new pull request #5812: [AIRFLOW-4449] updated default 
permission for custom roles to 'Viewer'
URL: https://github.com/apache/airflow/pull/5812
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
   - https://issues.apache.org/jira/browse/AIRFLOW-4449
   
   ### Description
   
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Previously, it was impossible to create a custom role with permissions
   less than viewer. This was suboptimal if, for example, we wanted to
   create a user without the `set_success` permission.
   
   This also makes the behavior of the code match the associated comment.
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: it's a pretty trivial change and is already covered 
by `test_init_role_baseview`
   
   ![manually tested that the web interface behaved as 
expected](https://user-images.githubusercontent.com/3322724/62976495-b8af3d80-bdea-11e9-836e-8ffdf548274a.png)
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5207) Mark Success and Mark Failed views error out due to DAG reassignment

2019-08-13 Thread Marcus Levine (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcus Levine updated AIRFLOW-5207:
---
Description: 
When trying to clear a task after upgrading to 1.10.4, I get the following 
traceback:
{code:java}
File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1451, 
in failed future, past, State.FAILED) File 
"/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
_mark_task_instance_state task.dag = dag File 
"/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
airflow.exceptions.AirflowException: The DAG assigned to  can not be changed.{code}
This should be a simple fix by either dropping the offending line, or if it is 
required to keep things working, just set the private attribute instead:
{code:java}
task._dag = dag
{code}

  was:
When trying to clear a task after upgrading to `1.10.4`, I get the following 
traceback:
{code:java}
File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1451, 
in failed future, past, State.FAILED) File 
"/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
_mark_task_instance_state task.dag = dag File 
"/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
airflow.exceptions.AirflowException: The DAG assigned to  can not be changed.{code}
This should be a simple fix by either dropping the offending line, or if it is 
required to keep things working, just set the private attribute instead:
{code:java}
task._dag = dag
{code}


> Mark Success and Mark Failed views error out due to DAG reassignment
> 
>
> Key: AIRFLOW-5207
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5207
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.4
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Major
> Fix For: 1.10.5
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> When trying to clear a task after upgrading to 1.10.4, I get the following 
> traceback:
> {code:java}
> File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 
> 1451, in failed future, past, State.FAILED) File 
> "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
> _mark_task_instance_state task.dag = dag File 
> "/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
> 509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
> airflow.exceptions.AirflowException: The DAG assigned to 
>  can not be changed.{code}
> This should be a simple fix by either dropping the offending line, or if it 
> is required to keep things working, just set the private attribute instead:
> {code:java}
> task._dag = dag
> {code}
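
(To make the guard concrete: a self-contained sketch, assuming Airflow 1.10.x 
import paths, of how re-assigning a task to a different DAG object trips the 
setter quoted in the traceback, and of the private-attribute workaround 
proposed above.)

```
# Sketch of the setter guard quoted in the traceback (import paths assume
# Airflow 1.10.x). Re-assigning a task to a *different* DAG object raises;
# assigning the private attribute, as the ticket suggests, skips the check.
from datetime import datetime

from airflow import DAG
from airflow.exceptions import AirflowException
from airflow.operators.dummy_operator import DummyOperator

dag_a = DAG("example", start_date=datetime(2019, 8, 1), schedule_interval=None)
dag_b = DAG("example", start_date=datetime(2019, 8, 1), schedule_interval=None)  # new object, same dag_id

task = DummyOperator(task_id="noop", dag=dag_a)

try:
    task.dag = dag_b   # raises: "The DAG assigned to ... can not be changed."
except AirflowException:
    task._dag = dag_b  # the workaround proposed in this ticket
```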



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] ryanyuan commented on issue #5546: [AIRFLOW-4908] BigQuery Hooks/Operators for update_dataset, patch_dataset, get_dataset

2019-08-13 Thread GitBox
ryanyuan commented on issue #5546: [AIRFLOW-4908] BigQuery Hooks/Operators for 
update_dataset, patch_dataset, get_dataset
URL: https://github.com/apache/airflow/pull/5546#issuecomment-520984706
 
 
   @mik-laj Done. PTAL


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5207) Mark Success and Mark Failed views error out due to DAG reassignment

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5207?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906569#comment-16906569
 ] 

ASF GitHub Bot commented on AIRFLOW-5207:
-

marcusianlevine commented on pull request #5811: [AIRFLOW-5207] Fix Mark 
Success and Failure views
URL: https://github.com/apache/airflow/pull/5811
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5207
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   After upgrading to 1.10.4, the `Mark Success` and `Mark Failure` buttons now 
lead to the following error in the UI:
   
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: fixes a broken view
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Mark Success and Mark Failed views error out due to DAG reassignment
> 
>
> Key: AIRFLOW-5207
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5207
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.4
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Major
> Fix For: 1.10.5
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> When trying to clear a task after upgrading to `1.10.4`, I get the following 
> traceback:
> {code:java}
> File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 
> 1451, in failed future, past, State.FAILED) File 
> "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
> _mark_task_instance_state task.dag = dag File 
> "/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
> 509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
> airflow.exceptions.AirflowException: The DAG assigned to 
>  can not be changed.{code}
> This should be a simple fix by either dropping the offending line, or if it 
> is required to keep things working, just set the private attribute instead:
> {code:java}
> task._dag = dag
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] marcusianlevine opened a new pull request #5811: [AIRFLOW-5207] Fix Mark Success and Failure views

2019-08-13 Thread GitBox
marcusianlevine opened a new pull request #5811: [AIRFLOW-5207] Fix Mark 
Success and Failure views
URL: https://github.com/apache/airflow/pull/5811
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5207
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   After upgrading to 1.10.4, the `Mark Success` and `Mark Failure` buttons now 
lead to the following error in the UI:
   
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: fixes a broken view
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] Fokko commented on issue #5808: [AIRFLOW-5205] Check xml files depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204,

2019-08-13 Thread GitBox
Fokko commented on issue #5808:  [AIRFLOW-5205] Check xml files depends on  
AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180,  AIRFLOW-5204, 
URL: https://github.com/apache/airflow/pull/5808#issuecomment-520968474
 
 
   I've noticed that the PRs in the list are sequential; it starts with this 
one: https://github.com/apache/airflow/pull/5777


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] jsurloppe commented on issue #5680: [AIRFLOW-5066] allow k8s fieldref substitution

2019-08-13 Thread GitBox
jsurloppe commented on issue #5680: [AIRFLOW-5066] allow k8s fieldref 
substitution
URL: https://github.com/apache/airflow/pull/5680#issuecomment-520968366
 
 
   You're both right, it was faster than expected, test added. :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] feluelle commented on issue #5808: [AIRFLOW-5205] Check xml files depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204,

2019-08-13 Thread GitBox
feluelle commented on issue #5808:  [AIRFLOW-5205] Check xml files depends on  
AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180,  AIRFLOW-5204, 
URL: https://github.com/apache/airflow/pull/5808#issuecomment-520964271
 
 
   Agree with Fokko. I personally don't want to be responsible for reviewing 
such a huge PR and maybe miss something - but I really like reviewing PRs in 
general :(
   
   @potiuk do you think it is possible to split it into smaller ones?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aijamalnk edited a comment on issue #5763: [AIRFLOW-5148] Add Google Analytics to the Airflow doc website

2019-08-13 Thread GitBox
aijamalnk edited a comment on issue #5763: [AIRFLOW-5148] Add Google Analytics 
to the Airflow doc website
URL: https://github.com/apache/airflow/pull/5763#issuecomment-520946607
 
 
   Hi @mik-laj , 
   As Kaxil said, it was my request and I can answer all of your questions. 
   
   First of all, no one at Google (except for me, but I am wearing my Apache & 
Airflow hat here) has access to GA. And be assured I used some screenshots only 
(about # of visitors per day)  without any personally identifying info to make 
the case to get funding for the website development from Google. These 
statistics are valuable because we are investing in the website and we want to 
make sure that new UX/UI works and brings value to Airflow users. It is also 
important for improving the most valuable pages of documentation (including 
measuring the effects from Season of Docs effort) and see how user behavior 
changes throughout time. Without GA, I don't think we can have a similar amount 
of fine-grained information around pageviews, navigation of the website, and 
time spent on each page (not counting also geographical interest, which can 
help us fund meetups around the world).
   
   Besides me, Sid, Kaxil and Ash currently have access to GA; all of them are 
Project Management Committee members and "someone from Apache", as you say. But 
I am also sending an email to add everyone in the PMC. 
   
   About GDPR it seems we can comply with it if we don't track any User login 
info (which we don't) and so we don't strictly have to have any notice [1]. The 
practice in other Apache Projects is to add the following notice to our website:
   
   -
   WEBSITE USAGE PRIVACY POLICY
   Information about your use of this website is collected using server access 
logs and a tracking cookie. The collected information consists of the following:
   - The IP address from which you access the website;
   - The type of browser and operating system you use to access our site;
   - The date and time you access our site;
   - The pages you visit; and
   - The addresses of pages from where you followed a link to our site.
   Part of this information is gathered using a tracking cookie set by the 
Google Analytics service and handled by Google as described in their privacy 
policy. See your browser documentation for instructions on how to disable the 
cookie if you prefer not to share this data with Google.
   We use the gathered information to help us make our site more useful to 
visitors and to better understand how and when our site is used. We do not 
track or collect personally identifiable information or associate gathered data 
with any personally identifying information from other sources.
   By using this website, you consent to the collection of this data in the 
manner and for the purpose described above.
   The ASF welcomes your questions or comments regarding this Privacy Policy. 
Send them to d...@airflow.apache.org
   -
   And lastly, I strongly -1 the alternative that you suggested. I don't see the 
point of abandoning the 'standard' web analytics tool, which we can configure to 
be GDPR compliant, in favour of a far less popular project that I personally 
don't know how to use and have no bandwidth to study. 
   
   [1] https://www.cookiebot.com/en/google-analytics-gdpr/
   [2] https://www.apache.org/foundation/policies/privacy.html
   [3] https://activemq.apache.org/privacy-policy.html
   [4] https://mahout.apache.org/general/privacy-policy


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5208) Passed **kwargs to push_by_returning

2019-08-13 Thread Gaurav Prachchhak (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5208?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gaurav Prachchhak updated AIRFLOW-5208:
---
Description: 
Without **kwargs, push_by_returning was giving an error, so that parameter was added.

[https://github.com/apache/airflow/blob/master/airflow/example_dags/example_xcom.py]

  was:Without **kwargs push_by_returning was giving error, so added that 
parameter.


> Passed **kwargs to push_by_returning
> 
>
> Key: AIRFLOW-5208
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5208
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: examples
>Affects Versions: 1.10.4
> Environment: Linux
>Reporter: Gaurav Prachchhak
>Priority: Minor
> Attachments: Screen Shot 2019-08-13 at 11.20.14 AM.png
>
>
> Without **kwargs, push_by_returning was giving an error, so that parameter was added.
> [https://github.com/apache/airflow/blob/master/airflow/example_dags/example_xcom.py]
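
(For reference, a hedged sketch of the pattern in question, loosely following 
the linked example DAG; the DAG id and the pushed value are illustrative. With 
`provide_context=True` the task context arrives as keyword arguments, so the 
callable has to accept `**kwargs`.)

```
# Hedged sketch, loosely following the linked example_xcom.py: with
# provide_context=True the PythonOperator passes the task context as keyword
# arguments, so a callable defined without **kwargs raises a TypeError at run
# time. Returning a value is enough to push it to XCom.
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

value_2 = {"a": "b"}  # illustrative payload


def push_by_returning(**kwargs):
    """Push an XCom without a specific target, just by returning the value."""
    return value_2


with DAG("example_xcom_sketch",
         start_date=datetime(2019, 8, 1),
         schedule_interval=None) as dag:
    push2 = PythonOperator(
        task_id="push_by_returning",
        python_callable=push_by_returning,
        provide_context=True,  # the context is delivered via **kwargs
    )
```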



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Created] (AIRFLOW-5208) Passed **kwargs to push_by_returning

2019-08-13 Thread Gaurav Prachchhak (JIRA)
Gaurav Prachchhak created AIRFLOW-5208:
--

 Summary: Passed **kwargs to push_by_returning
 Key: AIRFLOW-5208
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5208
 Project: Apache Airflow
  Issue Type: Improvement
  Components: examples
Affects Versions: 1.10.4
 Environment: Linux
Reporter: Gaurav Prachchhak
 Attachments: Screen Shot 2019-08-13 at 11.20.14 AM.png

Without **kwargs, push_by_returning was giving an error, so that parameter was added.



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] gauravprachchhak opened a new pull request #5810: Passed **kwargs to push_by_returning

2019-08-13 Thread GitBox
gauravprachchhak opened a new pull request #5810: Passed **kwargs to 
push_by_returning
URL: https://github.com/apache/airflow/pull/5810
 
 
   Without **kwargs, push_by_returning was giving an error, so that parameter was added.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aijamalnk edited a comment on issue #5763: [AIRFLOW-5148] Add Google Analytics to the Airflow doc website

2019-08-13 Thread GitBox
aijamalnk edited a comment on issue #5763: [AIRFLOW-5148] Add Google Analytics 
to the Airflow doc website
URL: https://github.com/apache/airflow/pull/5763#issuecomment-520946607
 
 
   Hi @mik-laj , 
   As Kaxil said, it was my request and I can answer all of your questions. 
   
   First of all, no one at Google (except for me, but I am wearing my Apache & 
Airflow hat here) has access to GA. And be assured I used some screenshots only 
(about # of visitors per day)  without any personally identifying info to make 
the case to get funding for the website development from Google. These 
statistics are valuable because we are investing in the website and we want to 
make sure that new UX/UI works and brings value to Airflow users. It is also 
important for improving the most valuable pages of documentation (including 
measuring the effects from Season of Docs effort) and see how user behavior 
changes throughout time. Without GA, I don't think we can have a similar amount 
of fine-grained information around pageviews, navigation of the website, and 
time spent on each page (not counting also geographical interest, which can 
help us fund meetups around the world).
   
   Besides me, Sid, Kaxil and Ash currently have access to GA; all of them are 
Project Management Committee members and "someone from Apache", as you say. But 
I am also sending an email to private@ to add everyone in the PMC. 
   
   About GDPR it seems we can comply with it if we don't track any User login 
info (which we don't) and so we don't strictly have to have any notice [1]. The 
practice in other Apache Projects is to add the following notice to our website:
   
   -
   WEBSITE USAGE PRIVACY POLICY
   Information about your use of this website is collected using server access 
logs and a tracking cookie. The collected information consists of the following:
   - The IP address from which you access the website;
   - The type of browser and operating system you use to access our site;
   - The date and time you access our site;
   - The pages you visit; and
   - The addresses of pages from where you followed a link to our site.
   Part of this information is gathered using a tracking cookie set by the 
Google Analytics service and handled by Google as described in their privacy 
policy. See your browser documentation for instructions on how to disable the 
cookie if you prefer not to share this data with Google.
   We use the gathered information to help us make our site more useful to 
visitors and to better understand how and when our site is used. We do not 
track or collect personally identifiable information or associate gathered data 
with any personally identifying information from other sources.
   By using this website, you consent to the collection of this data in the 
manner and for the purpose described above.
   The ASF welcomes your questions or comments regarding this Privacy Policy. 
Send them to d...@airflow.apache.org
   -
   And lastly, I strongly -1 the alternative that you suggested. I don't see the 
point of abandoning the 'standard' web analytics tool, which we can configure to 
be GDPR compliant, in favour of a far less popular project that I personally 
don't know how to use and have no bandwidth to study. 
   
   [1] https://www.cookiebot.com/en/google-analytics-gdpr/
   [2] https://www.apache.org/foundation/policies/privacy.html
   [3] https://activemq.apache.org/privacy-policy.html
   [4] https://mahout.apache.org/general/privacy-policy


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aijamalnk commented on issue #5763: [AIRFLOW-5148] Add Google Analytics to the Airflow doc website

2019-08-13 Thread GitBox
aijamalnk commented on issue #5763: [AIRFLOW-5148] Add Google Analytics to the 
Airflow doc website
URL: https://github.com/apache/airflow/pull/5763#issuecomment-520946607
 
 
   Hi @mik-laj , 
   As Kaxil said, it was my request and I can answer all of your questions. 
   
   First of all, no one at Google (except for me, but I am wearing my Apache & 
Airflow hat here) has access to GA. And be assured I used some screenshots only 
(about # of visitors per day)  without any personally identifying info to make 
the case to get funding for the website development from Google. These 
statistics are valuable because we are investing in the website and we want to 
make sure that new UX/UI works and brings value to Airflow users. It is also 
important for improving the most valuable pages of documentation (including 
measuring the effects from Season of Docs effort) and see how user behavior 
changes throughout time. Without GA, I don't think we can have a similar amount 
of fine-grained information around pageviews, navigation of the website, and 
time spent on each page (not counting also geographical interest, which can 
help us fund meetups around the world).
   
   Besides me, Sid, Kaxil and Ash currently have access to GA; all of them are 
Project Management Committee members and "someone from Apache", as you say. But 
I am also sending an email to private@ to add everyone in the PMC. 
   
   About GDPR it seems we can comply with it if we don't track any User login 
info (which we don't) and so we don't strictly have to have any notice [1]. The 
practice in other Apache Projects is to add the following notice to our 
website: 
   -
   WEBSITE USAGE PRIVACY POLICY
   Information about your use of this website is collected using server access 
logs and a tracking cookie. The collected information consists of the following:
   - The IP address from which you access the website;
   - The type of browser and operating system you use to access our site;
   - The date and time you access our site;
   - The pages you visit; and
   - The addresses of pages from where you followed a link to our site.
   Part of this information is gathered using a tracking cookie set by the 
Google Analytics service and handled by Google as described in their privacy 
policy. See your browser documentation for instructions on how to disable the 
cookie if you prefer not to share this data with Google.
   We use the gathered information to help us make our site more useful to 
visitors and to better understand how and when our site is used. We do not 
track or collect personally identifiable information or associate gathered data 
with any personally identifying information from other sources.
   By using this website, you consent to the collection of this data in the 
manner and for the purpose described above.
   The ASF welcomes your questions or comments regarding this Privacy Policy. 
Send them to d...@airflow.apache.org
   -
   And lastly, I strongly -1 the alternative that you suggested. I don't see the 
point of abandoning the 'standard' web analytics tool, which we can configure to 
be GDPR compliant, in favour of a far less popular project that I personally 
don't know how to use and have no bandwidth to study. 
   
   [1] https://www.cookiebot.com/en/google-analytics-gdpr/
   [2] https://www.apache.org/foundation/policies/privacy.html
   [3] https://activemq.apache.org/privacy-policy.html
   [4] https://mahout.apache.org/general/privacy-policy


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5207) Mark Success and Mark Failed views error out due to DAG reassignment

2019-08-13 Thread Marcus Levine (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcus Levine updated AIRFLOW-5207:
---
Description: 
When trying to clear a task after upgrading to `1.10.4`, I get the following 
traceback:
{code:java}
File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1451, 
in failed future, past, State.FAILED) File 
"/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
_mark_task_instance_state task.dag = dag File 
"/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
airflow.exceptions.AirflowException: The DAG assigned to  can not be changed.{code}
This should be a simple fix by either dropping the `task.dag = dag` line, or if 
it is required to keep things working, just set the private attribute instead: 
`task._dag = dag`

  was:
When trying to clear a task after upgrading to `1.10.4`, I get the following 
traceback:
```

File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1451, 
in failed future, past, State.FAILED) File 
"/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
_mark_task_instance_state task.dag = dag File 
"/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
airflow.exceptions.AirflowException: The DAG assigned to  can not be changed.
```

This should be a simple fix by either dropping the `task.dag = dag` line, or if 
it is required to keep things working, just set the private attribute instead: 
`task._dag = dag`


> Mark Success and Mark Failed views error out due to DAG reassignment
> 
>
> Key: AIRFLOW-5207
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5207
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.4
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Major
> Fix For: 1.10.5
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> When trying to clear a task after upgrading to `1.10.4`, I get the following 
> traceback:
> {code:java}
> File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 
> 1451, in failed future, past, State.FAILED) File 
> "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
> _mark_task_instance_state task.dag = dag File 
> "/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
> 509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
> airflow.exceptions.AirflowException: The DAG assigned to 
>  can not be changed.{code}
> This should be a simple fix by either dropping the `task.dag = dag` line, or 
> if it is required to keep things working, just set the private attribute 
> instead: `task._dag = dag`



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Updated] (AIRFLOW-5207) Mark Success and Mark Failed views error out due to DAG reassignment

2019-08-13 Thread Marcus Levine (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5207?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcus Levine updated AIRFLOW-5207:
---
Description: 
When trying to clear a task after upgrading to `1.10.4`, I get the following 
traceback:
{code:java}
File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1451, 
in failed future, past, State.FAILED) File 
"/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
_mark_task_instance_state task.dag = dag File 
"/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
airflow.exceptions.AirflowException: The DAG assigned to  can not be changed.{code}
This should be a simple fix by either dropping the offending line, or if it is 
required to keep things working, just set the private attribute instead:
{code:java}
task._dag = dag
{code}

  was:
When trying to clear a task after upgrading to `1.10.4`, I get the following 
traceback:
{code:java}
File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1451, 
in failed future, past, State.FAILED) File 
"/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
_mark_task_instance_state task.dag = dag File 
"/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
airflow.exceptions.AirflowException: The DAG assigned to  can not be changed.{code}
This should be a simple fix by either dropping the `task.dag = dag` line, or if 
it is required to keep things working, just set the private attribute instead: 
`task._dag = dag`


> Mark Success and Mark Failed views error out due to DAG reassignment
> 
>
> Key: AIRFLOW-5207
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5207
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.4
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Major
> Fix For: 1.10.5
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> When trying to clear a task after upgrading to `1.10.4`, I get the following 
> traceback:
> {code:java}
> File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 
> 1451, in failed future, past, State.FAILED) File 
> "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
> _mark_task_instance_state task.dag = dag File 
> "/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
> 509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
> airflow.exceptions.AirflowException: The DAG assigned to 
>  can not be changed.{code}
> This should be a simple fix by either dropping the offending line, or if it 
> is required to keep things working, just set the private attribute instead:
> {code:java}
> task._dag = dag
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Created] (AIRFLOW-5207) Mark Success and Mark Failed views error out due to DAG reassignment

2019-08-13 Thread Marcus Levine (JIRA)
Marcus Levine created AIRFLOW-5207:
--

 Summary: Mark Success and Mark Failed views error out due to DAG 
reassignment
 Key: AIRFLOW-5207
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5207
 Project: Apache Airflow
  Issue Type: Bug
  Components: ui
Affects Versions: 1.10.4
Reporter: Marcus Levine
Assignee: Marcus Levine
 Fix For: 1.10.5


When trying to clear a task after upgrading to `1.10.4`, I get the following 
traceback:
```

File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1451, 
in failed future, past, State.FAILED) File 
"/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1396, in 
_mark_task_instance_state task.dag = dag File 
"/usr/local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 
509, in dag "The DAG assigned to {} can not be changed.".format(self)) 
airflow.exceptions.AirflowException: The DAG assigned to  can not be changed.
```

This should be a simple fix by either dropping the `task.dag = dag` line, or if 
it is required to keep things working, just set the private attribute instead: 
`task._dag = dag`



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] mik-laj commented on issue #5789: [AIRFLOW-4222] Add cli autocomplete for bash & zsh

2019-08-13 Thread GitBox
mik-laj commented on issue #5789: [AIRFLOW-4222] Add cli autocomplete for bash 
& zsh
URL: https://github.com/apache/airflow/pull/5789#issuecomment-520921311
 
 
   Docs errors:
   ```
   /opt/airflow/docs/howto/cli-completion.rst:43: WARNING: image file not 
readable: howto/img/cli_completion.gif
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #5805: [AIRFLOW-5202][depend on AIRFLOW-5183] Move GCP MLEngine to core

2019-08-13 Thread GitBox
mik-laj commented on issue #5805: [AIRFLOW-5202][depend on AIRFLOW-5183] Move 
GCP MLEngine to core
URL: https://github.com/apache/airflow/pull/5805#issuecomment-520920930
 
 
   Pylint is sad.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #5799: [AIRFLOW-5195][depend on AIRFLOW-5183] Move GCP Dataflow to core

2019-08-13 Thread GitBox
mik-laj commented on issue #5799:  [AIRFLOW-5195][depend on AIRFLOW-5183] Move 
GCP Dataflow to core
URL: https://github.com/apache/airflow/pull/5799#issuecomment-520920768
 
 
   ```
   ==
   45) ERROR: testSuccessfulRun 
(tests.contrib.utils.test_mlengine_operator_utils.CreateEvaluateOpsTest)
   --
  Traceback (most recent call last):
   tests/contrib/utils/test_mlengine_operator_utils.py line 100 in 
testSuccessfulRun
 with patch('airflow.contrib.operators.dataflow_operator.'
   /usr/local/lib/python3.6/unittest/mock.py line 1247 in __enter__
 original, local = self.get_original()
   /usr/local/lib/python3.6/unittest/mock.py line 1221 in get_original
 "%s does not have the attribute %r" % (target, name)
  AttributeError:  does not 
have the attribute 'DataFlowHook'
   >> begin captured logging << 
  py.warnings: WARNING: 
/opt/airflow/airflow/contrib/operators/dataflow_operator.py:32: 
DeprecationWarning: This module is deprecated. Please use 
`airflow.gcp.operators.dataflow`.
DeprecationWarning,
  
  - >> end captured logging << -
   ```
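
   (The error above is the usual symptom of patching a class at its old 
location: `mock.patch` resolves the target string only when the test runs, so 
once the hook has moved out of the contrib module the attribute lookup fails. A 
hedged illustration follows; the new module path is assumed from this PR 
series, not verified.)

   ```
   # Hedged illustration of the failure above: mock.patch looks the attribute up
   # on the named module at test run time, so the old contrib path no longer
   # exposes DataFlowHook once it has moved. The new path below is an assumption
   # based on this PR series.
   from unittest import mock

   # Old target, as in the failing test -> AttributeError after the move:
   #   mock.patch("airflow.contrib.operators.dataflow_operator.DataFlowHook")

   # Patch the hook at the location it moved to instead:
   with mock.patch("airflow.gcp.hooks.dataflow.DataFlowHook") as hook_mock:
       hook_mock.return_value.get_conn.return_value = None  # configure the mock as needed
   ```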


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #5801: [AIRFLOW-5197][depend on AIRFLOW-5183] Move GCP Datastore to core

2019-08-13 Thread GitBox
mik-laj commented on issue #5801: [AIRFLOW-5197][depend on AIRFLOW-5183] Move 
GCP Datastore to core
URL: https://github.com/apache/airflow/pull/5801#issuecomment-520920115
 
 
   Pylint is sad.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #5803: [AIRFLOW-5200][depend on AIRFLOW 5183] Move GCP PubSub to core

2019-08-13 Thread GitBox
mik-laj commented on issue #5803: [AIRFLOW-5200][depend on AIRFLOW 5183] Move 
GCP PubSub to core
URL: https://github.com/apache/airflow/pull/5803#issuecomment-520920021
 
 
   Pylint is sad :-/ 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Closed] (AIRFLOW-2308) Method poll_job_complete in BigQueryBaseCursor class doesn't work outside its class

2019-08-13 Thread Kamil Bregula (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula closed AIRFLOW-2308.
--
Resolution: Cannot Reproduce

> Method poll_job_complete in BigQueryBaseCursor class doesn't work outside its 
> class
> ---
>
> Key: AIRFLOW-2308
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2308
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib, gcp
>Affects Versions: 2.0.0
>Reporter: Guillermo Rodríguez Cano
>Priority: Major
>
> We have encountered a strange behaviour in the aforementioned method 
> `poll_job_complete` of the `BigQueryBaseCursor` class when we were creating a 
> sensor that should simply poll to check for the completion of a BigQuery job.
> After creating a `BigQueryBaseCursor` object, when we call the method 
> `poll_job_complete` on that object we get the following error: 
> `AttributeError: 'BigQueryBaseCursor' object has no attribute 
> 'poll_job_complete'`.
> However, if we copy and paste the code contained in the `poll_job_complete` 
> function into our sensor, it works.
> We are not sure what the problem is exactly, or why it occurs.
> Ideally we would like to do something like this, but we get the error above:
> {code:java}
> hook = BigQueryHook(bigquery_conn_id=self.bigquery_conn_id, 
> delegate_to=self.delegate_to)
> service = hook.get_service()
> bqCursor = BigQueryBaseCursor(service, self.project_id)
> return bqCursor.poll_job_complete(self.job_id)
> {code}
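
A minimal sketch of the kind of sensor the reporter appears to be building, added for context only; the module paths assume the 1.10 contrib layout, and whether `poll_job_complete` is available on `BigQueryBaseCursor` depends on the installed Airflow version:

{code:python}
# Hypothetical sensor sketch, not the reporter's actual code. Assumes the
# Airflow 1.10 contrib layout and an installed version whose
# BigQueryBaseCursor exposes poll_job_complete.
from airflow.contrib.hooks.bigquery_hook import BigQueryBaseCursor, BigQueryHook
from airflow.sensors.base_sensor_operator import BaseSensorOperator


class BigQueryJobCompleteSensor(BaseSensorOperator):
    def __init__(self, job_id, project_id, bigquery_conn_id='bigquery_default',
                 delegate_to=None, *args, **kwargs):
        super(BigQueryJobCompleteSensor, self).__init__(*args, **kwargs)
        self.job_id = job_id
        self.project_id = project_id
        self.bigquery_conn_id = bigquery_conn_id
        self.delegate_to = delegate_to

    def poke(self, context):
        hook = BigQueryHook(bigquery_conn_id=self.bigquery_conn_id,
                            delegate_to=self.delegate_to)
        cursor = BigQueryBaseCursor(hook.get_service(), self.project_id)
        # True once the BigQuery job has finished, so the sensor stops poking.
        return cursor.poll_job_complete(self.job_id)
{code}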



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (AIRFLOW-2308) Method poll_job_complete in BigQueryBaseCursor class doesn't work outside its class

2019-08-13 Thread Kamil Bregula (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906400#comment-16906400
 ] 

Kamil Bregula commented on AIRFLOW-2308:


It seems that the author is not interested in solving the problem. I am closing 
the ticket. It can be reopened when the author provides more information that 
allows the problem to be reproduced.

> Method poll_job_complete in BigQueryBaseCursor class doesn't work outside its 
> class
> ---
>
> Key: AIRFLOW-2308
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2308
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib, gcp
>Affects Versions: 2.0.0
>Reporter: Guillermo Rodríguez Cano
>Priority: Major
>
> We have encountered a strange behaviour in the aforementioned method 
> `poll_job_complete` of the `BigQueryBaseCursor` class when we were creating a 
> sensor that should simply poll to check for the completion of a BigQuery job.
> After creating a `BigQueryBaseCursor` object, when we call the method 
> `poll_job_complete` on that object we get the following error: 
> `AttributeError: 'BigQueryBaseCursor' object has no attribute 
> 'poll_job_complete'`.
> However, if we copy and paste the code contained in the `poll_job_complete` 
> function into our sensor, it works.
> We are not sure what the problem is exactly, or why it occurs.
> Ideally we would like to do something like this, but we get the error above:
> {code:java}
> hook = BigQueryHook(bigquery_conn_id=self.bigquery_conn_id, 
> delegate_to=self.delegate_to)
> service = hook.get_service()
> bqCursor = BigQueryBaseCursor(service, self.project_id)
> return bqCursor.poll_job_complete(self.job_id)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] iroddis commented on issue #5787: [AIRFLOW-5172] Add choice of interval edge scheduling

2019-08-13 Thread GitBox
iroddis commented on issue #5787: [AIRFLOW-5172] Add choice of interval edge 
scheduling
URL: https://github.com/apache/airflow/pull/5787#issuecomment-520903928
 
 
   Yup, I'll make those updates. After more thought, I'm also going to change 
the variable from a str to a bool, and likely the name to something like 
schedule_at_interval_end. Any suggestions for better names?
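
   For readers following the thread, a small illustration of the two interval edges being discussed; the dates are arbitrary, and `schedule_at_interval_end` is only the name floated above, not an existing Airflow option:
   
   ```python
   from datetime import datetime, timedelta
   
   # With a daily schedule, the run whose execution_date is 2019-08-12 covers the
   # interval [2019-08-12, 2019-08-13). Airflow currently triggers that run once
   # the interval has ended; the proposal adds the option to trigger at its start.
   interval_start = datetime(2019, 8, 12)
   interval_end = interval_start + timedelta(days=1)
   
   print("current behaviour: run fires at", interval_end)    # interval end
   print("proposed option:   run fires at", interval_start)  # interval start
   ```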


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all missing type annotation errors from dmypy

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all 
missing type annotation errors from dmypy
URL: https://github.com/apache/airflow/pull/5664#discussion_r313477375
 
 

 ##
 File path: tests/core.py
 ##
 @@ -41,6 +42,7 @@
 from numpy.testing import assert_array_almost_equal
 from pendulum import utcnow
 
+import airflow
 
 Review comment:
   Please import airflow.hooks.hdfs_hook instead as that is what we actually use


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all missing type annotation errors from dmypy

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all 
missing type annotation errors from dmypy
URL: https://github.com/apache/airflow/pull/5664#discussion_r313476797
 
 

 ##
 File path: tests/contrib/utils/gcp_authenticator.py
 ##
 @@ -55,7 +55,7 @@ class GcpAuthenticator(LoggingCommandExecutor):
 connection - it can authenticate with the gcp key name specified
 """
 
-original_account = None
+original_account = None  # type: None
 
 Review comment:
   ```suggestion
   original_account = None  # type: Optional[str]
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all missing type annotation errors from dmypy

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all 
missing type annotation errors from dmypy
URL: https://github.com/apache/airflow/pull/5664#discussion_r313476178
 
 

 ##
 File path: tests/contrib/hooks/test_gcp_cloud_build_hook.py
 ##
 @@ -46,7 +47,7 @@
 
 
 class TestCloudBuildHookWithPassedProjectId(unittest.TestCase):
-hook = None
+hook = None  # type: Optional[CloudBuildHook]
 
 Review comment:
   Similar here - although this is `None` here it is always set to a 
CloudBuildHook for every test case (cos of setUp). I forget how mypy behaves 
with none-but-initialized-later...?
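
   For what it's worth, a minimal sketch of how mypy typically treats this pattern; the class names are illustrative, not Airflow's:
   
   ```python
   from typing import Optional
   
   
   class Hook:
       def run(self) -> int:
           return 1
   
   
   class FakeTestCase:
       # Declared Optional because it starts as None at class level...
       hook = None  # type: Optional[Hook]
   
       def setUp(self):
           # ...but is always assigned before any test body runs.
           self.hook = Hook()
   
       def test_run(self):
           # mypy still sees Optional[Hook] here, so it wants a narrowing step
           # (such as this assert) before allowing attribute access.
           assert self.hook is not None
           self.hook.run()
   ```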


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all missing type annotation errors from dmypy

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all 
missing type annotation errors from dmypy
URL: https://github.com/apache/airflow/pull/5664#discussion_r313475553
 
 

 ##
 File path: setup.py
 ##
 @@ -358,6 +358,8 @@ def do_setup():
 'tzlocal>=1.4,<2.0.0',
 'unicodecsv>=0.14.1',
 'zope.deprecation>=4.0, <5.0',
+# this can be removed once we drop support for python <= 3.6
+'typing-extensions>=3.7.4',
 
 Review comment:
   ```suggestion
   'typing-extensions>=3.7.4;python_version<"3.7"',
   ```
   
   perhaps?
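
   For reference, a minimal sketch of how a conditional dependency is usually written with a PEP 508 environment marker in `setup.py`; the project name is a placeholder and the pin is simply the one under discussion:
   
   ```python
   from setuptools import setup
   
   setup(
       name='example-project',  # placeholder, not Airflow's setup.py
       install_requires=[
           # Only installed on interpreters older than 3.7, where
           # typing_extensions backports the newer typing features.
           'typing-extensions>=3.7.4;python_version<"3.7"',
       ],
   )
   ```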


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all missing type annotation errors from dmypy

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5664: [AIRFLOW-5140] fix all 
missing type annotation errors from dmypy
URL: https://github.com/apache/airflow/pull/5664#discussion_r313475114
 
 

 ##
 File path: airflow/settings.py
 ##
 @@ -63,13 +67,13 @@
 LOG_FORMAT = conf.get('core', 'log_format')
 SIMPLE_LOG_FORMAT = conf.get('core', 'simple_log_format')
 
-SQL_ALCHEMY_CONN = None
-DAGS_FOLDER = None
-PLUGINS_FOLDER = None
-LOGGING_CLASS_PATH = None
+SQL_ALCHEMY_CONN = None  # type: Optional[str]
+DAGS_FOLDER = None  # type: Optional[str]
+PLUGINS_FOLDER = None  # type: Optional[str]
 
 Review comment:
   Do these need to be `Optional[]`? By the time `import airflow` is done these 
will _always_ be set.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb edited a comment on issue #5787: [AIRFLOW-5172] Add choice of interval edge scheduling

2019-08-13 Thread GitBox
ashb edited a comment on issue #5787: [AIRFLOW-5172] Add choice of interval 
edge scheduling
URL: https://github.com/apache/airflow/pull/5787#issuecomment-520888290
 
 
   See also https://github.com/apache/airflow/pull/4768/files for previous 
discussion about this. I think I'm okay with this generally, but we'll need to 
make sure the docs about it are clear - i.e. more than just a comment in a 
config file; likely a section in that scheduler page with some images/pictures 
of timelines.
   
   Page 14 of 
https://drive.google.com/open?id=1DVN4HXtOC-HXvv00sEkoB90mxLDnCIKc may be a 
starting point for that image.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on issue #5787: [AIRFLOW-5172] Add choice of interval edge scheduling

2019-08-13 Thread GitBox
ashb commented on issue #5787: [AIRFLOW-5172] Add choice of interval edge 
scheduling
URL: https://github.com/apache/airflow/pull/5787#issuecomment-520888290
 
 
   See also https://github.com/apache/airflow/pull/4768/files for previous 
discussion about this. I think I'm okay with this generally, but we'll need to 
make sure the docs about it are clear - i.e. more than just a comment in a 
config file; likely a section in that scheduler page with some images/pictures 
of timelines.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] george-miller commented on issue #5475: [AIRFLOW-4846] Allow specification of an existing secret containing git credentials for init containers

2019-08-13 Thread GitBox
george-miller commented on issue #5475: [AIRFLOW-4846] Allow specification of 
an existing secret containing git credentials for init containers
URL: https://github.com/apache/airflow/pull/5475#issuecomment-520886163
 
 
   @ashb Sorry to bother you.  Mind looking at this again?  Thanks!


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on issue #5787: [AIRFLOW-5172] Add choice of interval edge scheduling

2019-08-13 Thread GitBox
ashb commented on issue #5787: [AIRFLOW-5172] Add choice of interval edge 
scheduling
URL: https://github.com/apache/airflow/pull/5787#issuecomment-520886052
 
 
   This needs some docs re-writing as a result - 
https://airflow.apache.org/scheduler.html (which lives under docs/ somewhere) 
at the least.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk opened a new pull request #5809: [AIRFLOW-5206] Licences for md files depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204, AIRFLOW-52065

2019-08-13 Thread GitBox
potiuk opened a new pull request #5809:  [AIRFLOW-5206] Licences for md files 
depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204, AIRFLOW-52065
URL: https://github.com/apache/airflow/pull/5809
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5206
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5206) .md files should have all common licence

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5206?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906312#comment-16906312
 ] 

ASF GitHub Bot commented on AIRFLOW-5206:
-

potiuk commented on pull request #5809:  [AIRFLOW-5206] Licences for md files 
depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204, AIRFLOW-52065
URL: https://github.com/apache/airflow/pull/5809
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5206
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> .md files should have all common licence
> 
>
> Key: AIRFLOW-5206
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5206
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Created] (AIRFLOW-5206) .md files should have all common licence

2019-08-13 Thread Jarek Potiuk (JIRA)
Jarek Potiuk created AIRFLOW-5206:
-

 Summary: .md files should have all common licence
 Key: AIRFLOW-5206
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5206
 Project: Apache Airflow
  Issue Type: Improvement
  Components: ci
Affects Versions: 2.0.0
Reporter: Jarek Potiuk






--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (AIRFLOW-5175) Update pagination symbols for dag run and task instances

2019-08-13 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5175?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906307#comment-16906307
 ] 

Ash Berlin-Taylor commented on AIRFLOW-5175:


This may be harder to fix as those tables are generated by Flask-AppBuilder.

> Update pagination symbols for dag run and task instances
> 
>
> Key: AIRFLOW-5175
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5175
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.4
>Reporter: Thomas Hillyer
>Assignee: Thomas Hillyer
>Priority: Trivial
> Fix For: 1.10.5
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> same as -AIRFLOW-5067-
> (for DAGS page) but for dag runs and task instances pages



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Resolved] (AIRFLOW-5100) Airflow scheduler does not respect safe mode setting

2019-08-13 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5100.

   Resolution: Fixed
Fix Version/s: 1.10.5

> Airflow scheduler does not respect safe mode setting
> 
>
> Key: AIRFLOW-5100
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5100
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.3
>Reporter: Jonathan Lange
>Priority: Major
> Fix For: 1.10.5
>
>
> We recently disabled safe mode in our Airflow 1.10.3 deployment and then 
> removed some needless comments from our DAGs that mentioned "airflow" and 
> "DAG".
> After deploying (and after several days!), we found that although these DAGs 
> still appeared in the UI, they were not running. They didn't have "squares" 
> in the tree view indicating that they should be run. 
> We restored the words "airflow" and "DAG" to these jobs, and they were 
> scheduled again.
> After digging into the code, it looks like the {{SchedulerJob}} calls 
> {{list_py_file_paths}} without specifying {{safe_mode}}, and 
> {{list_py_file_paths}} defaults to {{safe_mode=True}}, rather than consulting 
> the configuration as it does for {{include_examples}}:
> [https://github.com/apache/airflow/blob/master/airflow/jobs/scheduler_job.py#L1278]
> [https://github.com/apache/airflow/blob/master/airflow/utils/dag_processing.py#L291-L304]
> I suggest the following change, to make the behaviour of 
> {{list_py_file_paths}} more consistent with itself:
> {code:python}
> modified   airflow/utils/dag_processing.py
> @@ -287,7 +287,7 @@ def correct_maybe_zipped(fileloc):
>  COMMENT_PATTERN = re.compile(r"\s*#.*")
>  
>  
> -def list_py_file_paths(directory, safe_mode=True,
> +def list_py_file_paths(directory, safe_mode=None,
> include_examples=None):
>  """
>  Traverse a directory and look for Python files.
> @@ -299,6 +299,8 @@ def list_py_file_paths(directory, safe_mode=True,
>  :return: a list of paths to Python files in the specified directory
>  :rtype: list[unicode]
>  """
> +if safe_mode is None:
> +safe_mode = conf.getboolean('core', 'DAG_DISCOVERY_SAFE_MODE')
>  if include_examples is None:
>  include_examples = conf.getboolean('core', 'LOAD_EXAMPLES')
>  file_paths = []
> {code}
> I tried to find a way to write tests for this, but I couldn't figure it out. 
> I sort of expected a function that looked at a bunch of files and returned a 
> collection of DAGs, but I couldn't find it, and couldn't really get the theme 
> behind {{DagFileProcessorAgent}} and friends.
>  
> I haven't tried to produce a minimal example of this error, and have not 
> confirmed that the above patch fixes the problem.
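
A rough sketch of the safe-mode heuristic being described, not Airflow's exact implementation: in safe mode a .py file is only treated as a DAG candidate if its raw source mentions both "airflow" and "DAG", which is why deleting those comments hid the DAGs from the scheduler.

{code:python}
# Rough sketch of the safe-mode filter described above (illustrative only):
# a file is skipped unless its raw contents contain both tokens.
def might_contain_dag(path, safe_mode=True):
    if not safe_mode:
        return True
    with open(path, 'rb') as f:
        content = f.read()
    return all(token in content for token in (b'DAG', b'airflow'))
{code}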



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] Fokko commented on issue #5808: [AIRFLOW-5205] Check xml files depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204,

2019-08-13 Thread GitBox
Fokko commented on issue #5808:  [AIRFLOW-5205] Check xml files depends on  
AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180,  AIRFLOW-5204, 
URL: https://github.com/apache/airflow/pull/5808#issuecomment-520877649
 
 
   What's the point of opening this PR? It is impossible to review with so 
many dependencies on other PRs.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5807: [AIRFLOW-5204] Shellcheck + common licence in shell files (depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180)

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5807:  [AIRFLOW-5204] Shellcheck + 
common licence in shell files (depends on  AIRFLOW-5161,  AIRFLOW-5170,  
AIRFLOW-5180) 
URL: https://github.com/apache/airflow/pull/5807#discussion_r313453766
 
 

 ##
 File path: airflow/example_dags/entrypoint.sh
 ##
 @@ -1,20 +1,20 @@
-# -*- coding: utf-8 -*-
+#!/usr/bin/env bash
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
 #
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
+#http://www.apache.org/licenses/LICENSE-2.0
 #
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
 
-["/bin/bash", "-c", "/bin/sleep 30; /bin/mv {{params.source_location}}/{{ 
ti.xcom_pull('view_file') }} {{params.target_location}}; /bin/echo 
'{{params.target_location}}/{{ ti.xcom_pull('view_file') }}';"]
+# TODO: Uncomment this code when we start using it
+#[ "/bin/bash", "-c", "/bin/sleep 30; /bin/mv {{params.source_location}}/{{ 
ti.xcom_pull('view_file') }} {{params.target_location}}; /bin/echo 
'{{params.target_location}}/{{ ti.xcom_pull('view_file') }}';" ]  # shellcheck 
disable=SC1073,SC1072,SC1035
 
 Review comment:
   What's gone on here?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5808: [AIRFLOW-5205] Check xml files depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204,

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5808:  [AIRFLOW-5205] Check xml 
files depends on  AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180,  AIRFLOW-5204, 
URL: https://github.com/apache/airflow/pull/5808#discussion_r313452455
 
 

 ##
 File path: airflow/_vendor/slugify/slugify.py
 ##
 @@ -1,3 +1,6 @@
+# -*- coding: utf-8 -*-
+# pylint: skip-file
+"""Slugify !"""
 
 Review comment:
   (Oh this change is probably from one of the other PRs isn't it? So that 
comment applies somewhere else)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #5808: [AIRFLOW-5205] Check xml files depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204,

2019-08-13 Thread GitBox
ashb commented on a change in pull request #5808:  [AIRFLOW-5205] Check xml 
files depends on  AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180,  AIRFLOW-5204, 
URL: https://github.com/apache/airflow/pull/5808#discussion_r313452023
 
 

 ##
 File path: airflow/_vendor/slugify/slugify.py
 ##
 @@ -1,3 +1,6 @@
+# -*- coding: utf-8 -*-
+# pylint: skip-file
+"""Slugify !"""
 
 Review comment:
   I'd rather we don't touch the files under airflow/_vendor/ unless we _have_ 
to.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk opened a new pull request #5808: [AIRFLOW-5205] Check xml files depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180, AIRFLOW-5204,

2019-08-13 Thread GitBox
potiuk opened a new pull request #5808:  [AIRFLOW-5205] Check xml files depends 
on  AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180,  AIRFLOW-5204, 
URL: https://github.com/apache/airflow/pull/5808
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5205
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5205) XML files automatically checked with xmllint

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5205?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906299#comment-16906299
 ] 

ASF GitHub Bot commented on AIRFLOW-5205:
-

potiuk commented on pull request #5808:  [AIRFLOW-5205] Check xml files depends 
on  AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180,  AIRFLOW-5204, 
URL: https://github.com/apache/airflow/pull/5808
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5205
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> XML files automatically checked with xmllint
> 
>
> Key: AIRFLOW-5205
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5205
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Created] (AIRFLOW-5205) XML files automatically checked with xmllint

2019-08-13 Thread Jarek Potiuk (JIRA)
Jarek Potiuk created AIRFLOW-5205:
-

 Summary: XML files automatically checked with xmllint
 Key: AIRFLOW-5205
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5205
 Project: Apache Airflow
  Issue Type: Improvement
  Components: ci
Affects Versions: 2.0.0
Reporter: Jarek Potiuk






--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] nuclearpinguin commented on issue #5769: [AIRFLOW-4964][WIP-DONT-MERGE] Add BigQuery Data Transfer Hook and Operator

2019-08-13 Thread GitBox
nuclearpinguin commented on issue #5769: [AIRFLOW-4964][WIP-DONT-MERGE] Add 
BigQuery Data Transfer Hook and Operator
URL: https://github.com/apache/airflow/pull/5769#issuecomment-520868184
 
 
   Depends on #5791.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5204) Shell files should be checked with shellcheck and have identical licence

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5204?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906278#comment-16906278
 ] 

ASF GitHub Bot commented on AIRFLOW-5204:
-

potiuk commented on pull request #5807:  [AIRFLOW-5204] Shellcheck + common 
licence in shell files (depends on  AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180) 
URL: https://github.com/apache/airflow/pull/5807
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5204
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Shell files should be checked with shellcheck and have identical licence
> 
>
> Key: AIRFLOW-5204
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5204
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] potiuk opened a new pull request #5807: [AIRFLOW-5204] Shellcheck + common licence in shell files (depends on AIRFLOW-5161, AIRFLOW-5170, AIRFLOW-5180)

2019-08-13 Thread GitBox
potiuk opened a new pull request #5807:  [AIRFLOW-5204] Shellcheck + common 
licence in shell files (depends on  AIRFLOW-5161,  AIRFLOW-5170,  AIRFLOW-5180) 
URL: https://github.com/apache/airflow/pull/5807
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5204
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5204) Shell files should be checked with shellcheck and have identical licence

2019-08-13 Thread Jarek Potiuk (JIRA)
Jarek Potiuk created AIRFLOW-5204:
-

 Summary: Shell files should be checked with shellcheck and have 
identical licence
 Key: AIRFLOW-5204
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5204
 Project: Apache Airflow
  Issue Type: Improvement
  Components: ci
Affects Versions: 2.0.0
Reporter: Jarek Potiuk






--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Resolved] (AIRFLOW-4843) Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`

2019-08-13 Thread Kamil Bregula (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4843?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-4843.

   Resolution: Fixed
Fix Version/s: 2.0.0

> Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`
> --
>
> Key: AIRFLOW-4843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4843
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Akshesh Doshi
>Assignee: Akshesh Doshi
>Priority: Major
>  Labels: Docker, docker, orchestration, swarm
> Fix For: 2.0.0
>
>
> Currently, Airflow supports spawning Docker containers for running tasks via 
> the {color:#707070}_DockerOperator_{color} but these containers are run on 
> the same node as the scheduler.
> It would be helpful for our use-case to be able to spawn these tasks wherever 
> resources are available in our Docker Swarm cluster.
>  
> This can be achieved by creating a Docker swarm service, waiting for its run 
> and removing it after it has completed execution.
> This approach has been suggested/discussed at various places (and implemented 
> in Golang for Swarm-cronjob):
> [https://blog.alexellis.io/containers-on-swarm/]
> [https://forums.docker.com/t/running-one-off-commands-in-swarm-containers/42436/3]
> [https://gist.github.com/alexellis/e11321b8fbfc595c208ea3e74bf5e54b]
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (AIRFLOW-4843) Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`

2019-08-13 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4843?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906256#comment-16906256
 ] 

ASF subversion and git services commented on AIRFLOW-4843:
--

Commit 3e2a02751cf890b780bc26b40c7cee7f1f4e0bd9 in airflow's branch 
refs/heads/master from Akshesh Doshi
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=3e2a027 ]

[AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator) (#5489)

* [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator)

Add support for running Docker containers via Docker Swarm
which allows the task to run on any machine (node) which
is a part of your Swarm cluster

More details: https://issues.apache.org/jira/browse/AIRFLOW-4843

Built with <3 at Agoda!

> Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`
> --
>
> Key: AIRFLOW-4843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4843
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Akshesh Doshi
>Assignee: Akshesh Doshi
>Priority: Major
>  Labels: Docker, docker, orchestration, swarm
>
> Currently, Airflow supports spawning Docker containers for running tasks via 
> the {color:#707070}_DockerOperator_{color} but these containers are run on 
> the same node as the scheduler.
> It would be helpful for our use-case to be able to spawn these tasks wherever 
> resources are available in our Docker Swarm cluster.
>  
> This can be achieved by creating a Docker swarm service, waiting for its run 
> and removing it after it has completed execution.
> This approach has been suggested/discussed at various places (and implemented 
> in Golang for Swarm-cronjob):
> [https://blog.alexellis.io/containers-on-swarm/]
> [https://forums.docker.com/t/running-one-off-commands-in-swarm-containers/42436/3]
> [https://gist.github.com/alexellis/e11321b8fbfc595c208ea3e74bf5e54b]
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (AIRFLOW-4843) Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`

2019-08-13 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4843?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906258#comment-16906258
 ] 

ASF subversion and git services commented on AIRFLOW-4843:
--

Commit 3e2a02751cf890b780bc26b40c7cee7f1f4e0bd9 in airflow's branch 
refs/heads/master from Akshesh Doshi
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=3e2a027 ]

[AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator) (#5489)

* [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator)

Add support for running Docker containers via Docker Swarm
which allows the task to run on any machine (node) which
is a part of your Swarm cluster

More details: https://issues.apache.org/jira/browse/AIRFLOW-4843

Built with <3 at Agoda!

> Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`
> --
>
> Key: AIRFLOW-4843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4843
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Akshesh Doshi
>Assignee: Akshesh Doshi
>Priority: Major
>  Labels: Docker, docker, orchestration, swarm
>
> Currently, Airflow supports spawning Docker containers for running tasks via 
> the {color:#707070}_DockerOperator_{color} but these containers are run on 
> the same node as the scheduler.
> It would be helpful for our use-case to be able to spawn these tasks wherever 
> resources are available in our Docker Swarm cluster.
>  
> This can be achieved by creating a Docker swarm service, waiting for its run 
> and removing it after it has completed execution.
> This approach has been suggested/discussed at various places (and implemented 
> in Golang for Swarm-cronjob):
> [https://blog.alexellis.io/containers-on-swarm/]
> [https://forums.docker.com/t/running-one-off-commands-in-swarm-containers/42436/3]
> [https://gist.github.com/alexellis/e11321b8fbfc595c208ea3e74bf5e54b]
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (AIRFLOW-4843) Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4843?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906255#comment-16906255
 ] 

ASF GitHub Bot commented on AIRFLOW-4843:
-

mik-laj commented on pull request #5489: [AIRFLOW-4843] Allow orchestration via 
Docker Swarm (SwarmOperator)
URL: https://github.com/apache/airflow/pull/5489
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`
> --
>
> Key: AIRFLOW-4843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4843
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Akshesh Doshi
>Assignee: Akshesh Doshi
>Priority: Major
>  Labels: Docker, docker, orchestration, swarm
>
> Currently, Airflow supports spawning Docker containers for running tasks via 
> the {color:#707070}_DockerOperator_{color} but these containers are run on 
> the same node as the scheduler.
> It would be helpful for our use-case to be able to spawn these tasks wherever 
> resources are available in our Docker Swarm cluster.
>  
> This can be achieved by creating a Docker swarm service, waiting for its run 
> and removing it after it has completed execution.
> This approach has been suggested/discussed at various places (and implemented 
> in Golang for Swarm-cronjob):
> [https://blog.alexellis.io/containers-on-swarm/]
> [https://forums.docker.com/t/running-one-off-commands-in-swarm-containers/42436/3]
> [https://gist.github.com/alexellis/e11321b8fbfc595c208ea3e74bf5e54b]
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (AIRFLOW-4843) Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`

2019-08-13 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4843?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906257#comment-16906257
 ] 

ASF subversion and git services commented on AIRFLOW-4843:
--

Commit 3e2a02751cf890b780bc26b40c7cee7f1f4e0bd9 in airflow's branch 
refs/heads/master from Akshesh Doshi
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=3e2a027 ]

[AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator) (#5489)

* [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator)

Add support for running Docker containers via Docker Swarm
which allows the task to run on any machine (node) which
is a part of your Swarm cluster

More details: https://issues.apache.org/jira/browse/AIRFLOW-4843

Built with <3 at Agoda!

> Allow orchestration of tasks with Docker Swarm aka `SwarmOperator`
> --
>
> Key: AIRFLOW-4843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4843
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.10.3
>Reporter: Akshesh Doshi
>Assignee: Akshesh Doshi
>Priority: Major
>  Labels: Docker, docker, orchestration, swarm
>
> Currently, Airflow supports spawning Docker containers for running tasks via 
> the {color:#707070}_DockerOperator_{color} but these containers are run on 
> the same node as the scheduler.
> It would be helpful for our use-case to be able to spawn these tasks wherever 
> resources are available in our Docker Swarm cluster.
>  
> This can be achieved by creating a Docker swarm service, waiting for its run 
> and removing it after it has completed execution.
> This approach has been suggested/discussed at various places (and implemented 
> in Golang for Swarm-cronjob):
> [https://blog.alexellis.io/containers-on-swarm/]
> [https://forums.docker.com/t/running-one-off-commands-in-swarm-containers/42436/3]
> [https://gist.github.com/alexellis/e11321b8fbfc595c208ea3e74bf5e54b]
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] mik-laj merged pull request #5489: [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator)

2019-08-13 Thread GitBox
mik-laj merged pull request #5489: [AIRFLOW-4843] Allow orchestration via 
Docker Swarm (SwarmOperator)
URL: https://github.com/apache/airflow/pull/5489
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #5489: [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator)

2019-08-13 Thread GitBox
mik-laj commented on issue #5489: [AIRFLOW-4843] Allow orchestration via Docker 
Swarm (SwarmOperator)
URL: https://github.com/apache/airflow/pull/5489#issuecomment-520858118
 
 
   No additional approvals are needed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] akki commented on issue #5489: [AIRFLOW-4843] Allow orchestration via Docker Swarm (SwarmOperator)

2019-08-13 Thread GitBox
akki commented on issue #5489: [AIRFLOW-4843] Allow orchestration via Docker 
Swarm (SwarmOperator)
URL: https://github.com/apache/airflow/pull/5489#issuecomment-520856661
 
 
   @mik-laj Thank you for reviewing the PR in so much detail and approval.
   
   As far as I understand, I need another approval to move this PR ahead. 
@potiuk since you already had a look earlier and I addressed your requested 
changes, can you please have another look at it?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5203) Move GCP BigTable to core

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5203?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16906249#comment-16906249
 ] 

ASF GitHub Bot commented on AIRFLOW-5203:
-

nuclearpinguin commented on pull request #5806: [AIRFLOW-5203][depend on 
AIRFLOW-5183] Move GCP BigTable to core
URL: https://github.com/apache/airflow/pull/5806
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Move GCP BigTable to core
> -
>
> Key: AIRFLOW-5203
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5203
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] nuclearpinguin opened a new pull request #5806: [AIRFLOW-5203][depend on AIRFLOW-5183] Move GCP BigTable to core

2019-08-13 Thread GitBox
nuclearpinguin opened a new pull request #5806: [AIRFLOW-5203][depend on 
AIRFLOW-5183] Move GCP BigTable to core
URL: https://github.com/apache/airflow/pull/5806
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nuclearpinguin commented on issue #5806: [AIRFLOW-5203][depend on AIRFLOW-5183] Move GCP BigTable to core

2019-08-13 Thread GitBox
nuclearpinguin commented on issue #5806: [AIRFLOW-5203][depend on AIRFLOW-5183] 
Move GCP BigTable to core
URL: https://github.com/apache/airflow/pull/5806#issuecomment-520854777
 
 
   Depends on #5791.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5203) Move GCP BigTable to core

2019-08-13 Thread Tomasz Urbaszek (JIRA)
Tomasz Urbaszek created AIRFLOW-5203:


 Summary: Move GCP BigTable to core
 Key: AIRFLOW-5203
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5203
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 2.0.0
Reporter: Tomasz Urbaszek
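
A minimal sketch of what such a move usually means for DAG authors: the import
path changes while the DAG code itself stays the same. Both module paths in the
comments below are assumptions made for illustration; they are not taken from
this issue or its pull request.

```python
from datetime import datetime

from airflow import DAG

# Before the move (Airflow 1.10.x contrib namespace, assumed path):
# from airflow.contrib.operators.gcp_bigtable_operator import BigtableInstanceCreateOperator
#
# After the move to core (assumed target path, named only for illustration):
# from airflow.gcp.operators.bigtable import BigtableInstanceCreateOperator

with DAG(
    dag_id="bigtable_move_example",
    start_date=datetime(2019, 8, 1),
    schedule_interval=None,
) as dag:
    # Task definitions themselves do not change with the move;
    # only the import path above differs.
    pass
```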






--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] nuclearpinguin commented on issue #5805: [AIRFLOW-5202][depend on AIRFLOW-5183] Move GCP MLEngine to core

2019-08-13 Thread GitBox
nuclearpinguin commented on issue #5805: [AIRFLOW-5202][depend on AIRFLOW-5183] 
Move GCP MLEngine to core
URL: https://github.com/apache/airflow/pull/5805#issuecomment-520843012
 
 
   This PR will require changes after #5799 due to usage of Dataflow in 
MLEngine utils.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nuclearpinguin commented on issue #5805: [AIRFLOW-5202][depend on AIRFLOW-5183] Move GCP MLEngine to core

2019-08-13 Thread GitBox
nuclearpinguin commented on issue #5805: [AIRFLOW-5202][depend on AIRFLOW-5183] 
Move GCP MLEngine to core
URL: https://github.com/apache/airflow/pull/5805#issuecomment-520834833
 
 
   Depends on #5791.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5202) Move GCP MLEngine to core

2019-08-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16906198#comment-16906198
 ] 

ASF GitHub Bot commented on AIRFLOW-5202:
-

nuclearpinguin commented on pull request #5805: [AIRFLOW-5202][depend on 
AIRFLOW-5183] Move GCP MLEngine to core
URL: https://github.com/apache/airflow/pull/5805
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Move GCP MLEngine to core
> -
>
> Key: AIRFLOW-5202
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5202
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[GitHub] [airflow] nuclearpinguin opened a new pull request #5805: [AIRFLOW-5202][depend on AIRFLOW-5183] Move GCP MLEngine to core

2019-08-13 Thread GitBox
nuclearpinguin opened a new pull request #5805: [AIRFLOW-5202][depend on 
AIRFLOW-5183] Move GCP MLEngine to core
URL: https://github.com/apache/airflow/pull/5805
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

