[jira] [Commented] (AIRFLOW-3977) Incorrect example about the interaction between skipped tasks and trigger rules in documentation.

2019-03-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782306#comment-16782306
 ] 

ASF subversion and git services commented on AIRFLOW-3977:
--

Commit 45d24e79eab98589b1b0509e920811cbf778048b in airflow's branch 
refs/heads/master from Chen Tong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=45d24e7 ]

[AIRFLOW-3977] Add examples of trigger rules in doc (#4805)

The current LatestOnlyOperator will skip all downstream tasks blindly.
The doc describes the wrong behavior and shows an incorrect example of
the interaction between skipped tasks and trigger rules. I replace it with
another example using BranchingOperator at the schedule level.

This fix can resolve this ticket:
https://issues.apache.org/jira/browse/AIRFLOW-3977
Also,
https://issues.apache.org/jira/browse/AIRFLOW-1784

> Incorrect example about the interaction between skipped tasks and trigger 
> rules in documentation.
> -
>
> Key: AIRFLOW-3977
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3977
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.8.2, 1.9.0, 1.10.0, 1.10.1, 1.10.2
>Reporter: cixuuz
>Assignee: cixuuz
>Priority: Major
>  Labels: documentaion
> Fix For: 1.10.3
>
>
> The current LatestOnlyOperator will skip all downstream tasks blindly. 
> BranchingOperator could be a better example to show how trigger rules 
> interact with skipped tasks at the schedule level. 
> This fix can also resolve this ticket:
> https://issues.apache.org/jira/browse/AIRFLOW-1784 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3977) Incorrect example about the interaction between skipped tasks and trigger rules in documentation.

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782305#comment-16782305
 ] 

ASF GitHub Bot commented on AIRFLOW-3977:
-

feng-tao commented on pull request #4805: [AIRFLOW-3977] Fix/Add examples of 
how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Incorrect example about the interaction between skipped tasks and trigger 
> rules in documentation.
> -
>
> Key: AIRFLOW-3977
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3977
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.8.2, 1.9.0, 1.10.0, 1.10.1, 1.10.2
>Reporter: cixuuz
>Assignee: cixuuz
>Priority: Major
>  Labels: documentaion
> Fix For: 1.10.3
>
>
> The current LatestOnlyOperator will skip all downstream tasks blindly. 
> BranchingOperator could be a better example to show how trigger rules 
> interact with skipped tasks at the schedule level. 
> This fix can also resolve this ticket:
> https://issues.apache.org/jira/browse/AIRFLOW-1784 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1784) SKIPPED status is being cascading wrongly

2019-03-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782308#comment-16782308
 ] 

ASF subversion and git services commented on AIRFLOW-1784:
--

Commit 45d24e79eab98589b1b0509e920811cbf778048b in airflow's branch 
refs/heads/master from Chen Tong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=45d24e7 ]

[AIRFLOW-3977] Add examples of trigger rules in doc (#4805)

The current LatestOnlyOperator will skip all downstream tasks blindly.
The doc describes the wrong behavior and shows an incorrect example of
the interaction between skipped tasks and trigger rules. I replace it with
another example using BranchingOperator at the schedule level.

This fix can resolve this ticket:
https://issues.apache.org/jira/browse/AIRFLOW-3977
Also,
https://issues.apache.org/jira/browse/AIRFLOW-1784

> SKIPPED status is being cascading wrongly
> -
>
> Key: AIRFLOW-1784
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1784
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.8.2
> Environment: Ubuntu 16.04.3 LTS 
> Python 2.7.12 
> CeleryExecutor: 2-nodes cluster
>Reporter: Dmytro Kulyk
>Priority: Critical
>  Labels: documentation, latestonly, operators
> Attachments: Capture_graph.JPG, Capture_tree2.JPG, cube_update.py
>
>
> After the implementation of AIRFLOW-1296 in 1.8.2, LatestOnlyOperator behaves 
> incorrectly: it forces the SKIPPED status to cascade even though 
> TriggerRule='all_done' is set.
> This is the opposite of the behavior documented 
> [here|https://airflow.incubator.apache.org/concepts.html#latest-run-only]
> *Expected Behavior:*
> The dummy task and all its downstreams (update_*) should not be skipped.
> Full listings are attached.
> 1.8.1 did not have this issue.
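
For readers without the attachments, a minimal sketch of the reported setup follows. It is illustrative only (not the attached cube_update.py); the DAG id, schedule and task ids are assumptions.

{code:python}
# Hypothetical reproduction sketch for the reported behavior, not the attached DAG.
from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.latest_only_operator import LatestOnlyOperator
from airflow.utils.dates import days_ago

dag = DAG(dag_id='latest_only_with_all_done',
          schedule_interval='@hourly',
          start_date=days_ago(2))

latest_only = LatestOnlyOperator(task_id='latest_only', dag=dag)

# Expected per the docs: 'all_done' fires once upstreams reach any terminal
# state (including skipped), so this task should run even on non-latest runs.
# The reported bug is that the SKIPPED status cascades onto it anyway.
dummy = DummyOperator(task_id='dummy', trigger_rule='all_done', dag=dag)

latest_only >> dummy
{code}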



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3977) Incorrect example about the interaction between skipped tasks and trigger rules in documentation.

2019-03-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782307#comment-16782307
 ] 

ASF subversion and git services commented on AIRFLOW-3977:
--

Commit 45d24e79eab98589b1b0509e920811cbf778048b in airflow's branch 
refs/heads/master from Chen Tong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=45d24e7 ]

[AIRFLOW-3977] Add examples of trigger rules in doc (#4805)

The current LatestOnlyOperator will skip all downstream tasks blindly.
The doc describes the wrong behavior and shows an incorrect example of
the interaction between skipped tasks and trigger rules. I replace it with
another example using BranchingOperator at the schedule level.

This fix can resolve this ticket:
https://issues.apache.org/jira/browse/AIRFLOW-3977
Also,
https://issues.apache.org/jira/browse/AIRFLOW-1784

> Incorrect example about the interaction between skipped tasks and trigger 
> rules in documentation.
> -
>
> Key: AIRFLOW-3977
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3977
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.8.2, 1.9.0, 1.10.0, 1.10.1, 1.10.2
>Reporter: cixuuz
>Assignee: cixuuz
>Priority: Major
>  Labels: documentaion
> Fix For: 1.10.3
>
>
> The current LatestOnlyOperator will skip all downstream tasks blindly. 
> BranchingOperator could be a better example to show how trigger rules 
> interact with skipped tasks at the schedule level. 
> This fix can also resolve this ticket:
> https://issues.apache.org/jira/browse/AIRFLOW-1784 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3977) Incorrect example about the interaction between skipped tasks and trigger rules in documentation.

2019-03-01 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng resolved AIRFLOW-3977.
---
Resolution: Fixed

> Incorrect example about the interaction between skipped tasks and trigger 
> rules in documentation.
> -
>
> Key: AIRFLOW-3977
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3977
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.8.2, 1.9.0, 1.10.0, 1.10.1, 1.10.2
>Reporter: cixuuz
>Assignee: cixuuz
>Priority: Major
>  Labels: documentaion
> Fix For: 1.10.3
>
>
> The current LatestOnlyOperator will skip all downstream tasks blindly. 
> BranchingOperator could be a better example to show how trigger rules 
> interact with skipped tasks at the schedule level. 
> This fix can also resolve this ticket:
> https://issues.apache.org/jira/browse/AIRFLOW-1784 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] feng-tao merged pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
feng-tao merged pull request #4805: [AIRFLOW-3977] Fix/Add examples of how 
trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3995) Tests for examples

2019-03-01 Thread Kamil Bregula (JIRA)
Kamil Bregula created AIRFLOW-3995:
--

 Summary: Tests for examples
 Key: AIRFLOW-3995
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3995
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Kamil Bregula






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] fenglu-g commented on a change in pull request #4749: [AIRFLOW-3934] Increase standard Dataproc PD size

2019-03-01 Thread GitBox
fenglu-g commented on a change in pull request #4749: [AIRFLOW-3934] Increase 
standard Dataproc PD size
URL: https://github.com/apache/airflow/pull/4749#discussion_r261815528
 
 

 ##
 File path: airflow/contrib/operators/dataproc_operator.py
 ##
 @@ -160,10 +160,10 @@ def __init__(self,
  properties=None,
  master_machine_type='n1-standard-4',
  master_disk_type='pd-standard',
- master_disk_size=500,
+ master_disk_size=1024,
 
 Review comment:
  @DanSedov what are your thoughts? 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] fenglu-g commented on issue #4790: [AIRFLOW-3968][2/3] Refactor base GCP hook

2019-03-01 Thread GitBox
fenglu-g commented on issue #4790: [AIRFLOW-3968][2/3] Refactor base GCP hook
URL: https://github.com/apache/airflow/pull/4790#issuecomment-468892112
 
 
   @mik-laj please help make sure all changes are integration tested, 
especially the dataflow_hook. Thanks. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3975) Handle null values in attr renderers

2019-03-01 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng resolved AIRFLOW-3975.
---
Resolution: Fixed

> Handle null values in attr renderers
> 
>
> Key: AIRFLOW-3975
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3975
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
>
> Some renderers in `attr_renderers` raise unhandled exceptions when given null 
> inputs. For example, the `python_callable` renderer raises an error if passed 
> `None`. Some operators allow null values for this attribute, such as 
> `TriggerDagRunOperator`. I think all renderers should handle null input by 
> returning the empty string and not raising an exception.
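
As an illustration of the proposal, a null-tolerant renderer could guard both the highlighting call and `inspect.getsource`. This is a sketch only; the `render` helper and `render_python_callable` names are assumptions, not the code merged into airflow.www.utils.

{code:python}
# Sketch of a null-tolerant attribute renderer (illustrative, not Airflow's code).
import inspect

from pygments import highlight
from pygments.formatters import HtmlFormatter
from pygments.lexers import PythonLexer


def render(obj, lexer):
    """Return highlighted HTML, or an empty string for null input."""
    if obj is None:
        return ''
    return highlight(obj, lexer, HtmlFormatter(noclasses=True))


def render_python_callable(fn):
    # inspect.getsource raises TypeError on None, so guard before calling it.
    source = inspect.getsource(fn) if fn is not None else None
    return render(source, PythonLexer())
{code}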



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] feng-tao merged pull request #4799: [AIRFLOW-3975] Handle null inputs in attribute renderers.

2019-03-01 Thread GitBox
feng-tao merged pull request #4799: [AIRFLOW-3975] Handle null inputs in 
attribute renderers.
URL: https://github.com/apache/airflow/pull/4799
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3975) Handle null values in attr renderers

2019-03-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3975?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782296#comment-16782296
 ] 

ASF subversion and git services commented on AIRFLOW-3975:
--

Commit 4ceb27186a9f2a3b5a9b0161f980ec5bcbfebce2 in airflow's branch 
refs/heads/master from Joshua Carp
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=4ceb271 ]

[AIRFLOW-3975] Handle null inputs in attribute renderers. (#4799)



> Handle null values in attr renderers
> 
>
> Key: AIRFLOW-3975
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3975
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
>
> Some renderers in `attr_renderers` raise unhandled exceptions when given null 
> inputs. For example, the `python_callable` renderer raises an error if passed 
> `None`. Some operators allow null values for this attribute, such as 
> `TriggerDagRunOperator`. I think all renderers should handle null input by 
> returning the empty string and not raising an exception.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3983) Exclude node_modules from being linted by flake8

2019-03-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782294#comment-16782294
 ] 

ASF subversion and git services commented on AIRFLOW-3983:
--

Commit ba5289271c33f0b02fccda4c4b223c33c102d401 in airflow's branch 
refs/heads/master from Felix
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ba52892 ]

[AIRFLOW-3983] Exclude node_modules from being linted by flake8 (#4809)



> Exclude node_modules from being linted by flake8
> 
>
> Key: AIRFLOW-3983
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3983
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Minor
>
> After installing node modules via {noformat}npm install{noformat}, Flake8 
> (our current linting tool for Python) will also lint these packages, which we 
> are not responsible for keeping clean.
> So node_modules should be excluded from our flake8 linting in general.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] jmcarp commented on a change in pull request #4799: [AIRFLOW-3975] Handle null inputs in attribute renderers.

2019-03-01 Thread GitBox
jmcarp commented on a change in pull request #4799: [AIRFLOW-3975] Handle null 
inputs in attribute renderers.
URL: https://github.com/apache/airflow/pull/4799#discussion_r261814337
 
 

 ##
 File path: airflow/www/utils.py
 ##
 @@ -351,9 +355,8 @@ def get_attr_renderer():
 'doc_yaml': lambda x: render(x, lexers.YamlLexer),
 'doc_md': wrapped_markdown,
 'python_callable': lambda x: render(
-inspect.getsource(x), lexers.PythonLexer),
+inspect.getsource(x) if x is not None else None, 
lexers.PythonLexer),
 
 Review comment:
   The behavior is correct either way: `PythonLexer` can handle `None`, but 
`inspect.getsource` can't. The unit tests make sure that this is the case.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3983) Exclude node_modules from being linted by flake8

2019-03-01 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3983?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng resolved AIRFLOW-3983.
---
Resolution: Fixed

> Exclude node_modules from being linted by flake8
> 
>
> Key: AIRFLOW-3983
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3983
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Minor
>
> After installing node modules via {noformat}npm install{noformat}, Flake8 
> (our current linting tool for Python) will also lint these packages, which we 
> are not responsible for keeping clean.
> So node_modules should be excluded from our flake8 linting in general.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3983) Exclude node_modules from being linted by flake8

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782293#comment-16782293
 ] 

ASF GitHub Bot commented on AIRFLOW-3983:
-

feng-tao commented on pull request #4809: [AIRFLOW-3983] Exclude node_modules 
from being linted by flake8
URL: https://github.com/apache/airflow/pull/4809
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Exclude node_modules from being linted by flake8
> 
>
> Key: AIRFLOW-3983
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3983
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Minor
>
> After installing node modules via {noformat}npm install{noformat}, Flake8 
> (our current linting tool for Python) will also lint these packages, which we 
> are not responsible for keeping clean.
> So node_modules should be excluded from our flake8 linting in general.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] feng-tao commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
feng-tao commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261814113
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   @mik-laj, I don't think this should be a blocker as we have done it for 
other doc files as well. If we want to extract this file from the doc and put it in 
the example folder, we could do it in a later PR. 
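
For context, a minimal sketch of the kind of `dags/branch_without_trigger.py` example under discussion is shown below. It assumes `BranchPythonOperator` and the `none_failed` trigger rule and is not the exact snippet added in PR #4805.

```python
# Illustrative sketch of a branch_without_trigger-style DAG (names assumed,
# not the snippet added by PR #4805).
from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import BranchPythonOperator
from airflow.utils.dates import days_ago

dag = DAG(dag_id='branch_without_trigger',
          schedule_interval='@once',
          start_date=days_ago(1))

run_this_first = DummyOperator(task_id='run_this_first', dag=dag)

# Always choose branch_a, so branch_b gets skipped by the scheduler.
branching = BranchPythonOperator(task_id='branching',
                                 python_callable=lambda: 'branch_a',
                                 dag=dag)

branch_a = DummyOperator(task_id='branch_a', dag=dag)
follow_branch_a = DummyOperator(task_id='follow_branch_a', dag=dag)
branch_b = DummyOperator(task_id='branch_b', dag=dag)

# With the default all_success rule 'join' would be skipped because branch_b
# is skipped; none_failed lets it run once upstreams are done or skipped.
join = DummyOperator(task_id='join', trigger_rule='none_failed', dag=dag)

run_this_first >> branching
branching >> branch_a >> follow_branch_a >> join
branching >> branch_b >> join
```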


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261812936
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   I will try to extract all examples to separate files and create automatic 
tests. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261812565
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   Files can be stored in the `airflow/example_dags` directory or the 
`airflow/contrib/example_dags` directory. Files from these directories can be 
automatically tested to confirm their correctness.
   Example: 
   ```
   .. literalinclude:: ../../airflow/example_dags/example_python_operator.py
   :language: python
   :start-after: [START howto_operator_python_kwargs]
   :end-before: [END howto_operator_python_kwargs]
   ```
   Source: 
https://raw.githubusercontent.com/apache/airflow/master/docs/howto/operator.rst
   
   Other scripts are stored in py files.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261812565
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   Files can be stored in the `airflow/example_dags` directory or the 
`airflow/contrib/example_dags` directory. Files from these directories can be 
automatically tested to confirm their correctness.
   Example: 
   ```
   .. literalinclude:: ../../airflow/example_dags/example_python_operator.py
   :language: python
   :start-after: [START howto_operator_python_kwargs]
   :end-before: [END howto_operator_python_kwargs]
   ```
   Source: 
https://raw.githubusercontent.com/apache/airflow/master/docs/howto/operator.rst
   
   Other scripts are stored in py files.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261812565
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   Files can be stored in the `airflow/example_dags` directory or the 
`airflow/contrib/example_dags` directory. Files from these directories can be 
automatically tested to confirm their correctness.
   Example: 
   ```
   .. literalinclude:: ../../airflow/example_dags/example_python_operator.py
   :language: python
   :start-after: [START howto_operator_python_kwargs]
   :end-before: [END howto_operator_python_kwargs]
   ```
   Source: 
https://raw.githubusercontent.com/apache/airflow/master/docs/howto/operator.rst


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261812565
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   Files can be stored in the `airflow/example_dags` directory. Files from this 
directory can be automatically tested to confirm their correctness.
   Example: 
   ```
   .. literalinclude:: ../../airflow/example_dags/example_python_operator.py
   :language: python
   :start-after: [START howto_operator_python_kwargs]
   :end-before: [END howto_operator_python_kwargs]
   ```
   Source: 
https://raw.githubusercontent.com/apache/airflow/master/docs/howto/operator.rst


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261812565
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   Files can be stored in the `airflow/example_dags` directory.
   Example: 
   ```
   .. literalinclude:: ../../airflow/example_dags/example_python_operator.py
   :language: python
   :start-after: [START howto_operator_python_kwargs]
   :end-before: [END howto_operator_python_kwargs]
   ```
   Source: 
https://raw.githubusercontent.com/apache/airflow/master/docs/howto/operator.rst


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3994) Oracle varchar maxlength exceed

2019-03-01 Thread Grant McKenzie (JIRA)
Grant McKenzie created AIRFLOW-3994:
---

 Summary: Oracle varchar maxlength exceed
 Key: AIRFLOW-3994
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3994
 Project: Apache Airflow
  Issue Type: Improvement
  Components: database
Affects Versions: 1.10.1
Reporter: Grant McKenzie


Hi

I am setting up Airflow against an Oracle server. In Oracle, varchar columns are 
limited to 4000 characters, but Airflow creates columns with 5000 characters. Should we 
consider a config setting for the maximum varchar length in the database? I'll attempt 
to manually create the table with 4000 characters.

 

sqlalchemy.exc.DatabaseError: (cx_Oracle.DatabaseError) ORA-00910: specified length too long for its datatype
[SQL: '\nCREATE TABLE connection (\n\tid INTEGER NOT NULL, \n\tconn_id VARCHAR2(250 CHAR), \n\tconn_type VARCHAR2(500 CHAR), \n\thost VARCHAR2(500 CHAR), \n\tschema VARCHAR2(500 CHAR), \n\tlogin VARCHAR2(500 CHAR), \n\tpassword VARCHAR2(500 CHAR), \n\tport INTEGER, \n\textra VARCHAR2(5000 CHAR), \n\tPRIMARY KEY (id)\n)\n\n']

 

Thanks!
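
As a sketch of the manual workaround, the table could be declared with the `extra` column capped at Oracle's 4000-character limit. This is illustrative SQLAlchemy only, not Airflow's model code; column names and sizes are copied from the error above.

{code:python}
from sqlalchemy import Column, Integer, MetaData, String, Table

metadata = MetaData()

# Same columns as the failing DDL, with `extra` shrunk from 5000 to 4000
# so the generated VARCHAR2 stays within Oracle's limit.
connection = Table(
    'connection', metadata,
    Column('id', Integer, primary_key=True),
    Column('conn_id', String(250)),
    Column('conn_type', String(500)),
    Column('host', String(500)),
    Column('schema', String(500)),
    Column('login', String(500)),
    Column('password', String(500)),
    Column('port', Integer),
    Column('extra', String(4000)),
)

# metadata.create_all(oracle_engine) would then emit VARCHAR2(4000) for `extra`.
{code}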

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2221) Fill up DagBag from remote locations

2019-03-01 Thread Chao-Han Tsai (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782187#comment-16782187
 ] 

Chao-Han Tsai commented on AIRFLOW-2221:


Got it!

> Fill up DagBag from remote locations
> 
>
> Key: AIRFLOW-2221
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2221
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: configuration, core
>Affects Versions: 2.0.0
>Reporter: Diogo Franco
>Assignee: Chao-Han Tsai
>Priority: Major
> Fix For: 2.0.0
>
>
> The ability to fill up the DagBag from remote locations (HDFS, S3...) seems 
> to be deemed useful, e.g. facilitating deployment processes.
> This JIRA is to propose an implementation of a *DagFetcher* abstraction on 
> the DagBag, where the collect_dags method can delegate the walking to a 
> *FileSystemDagFetcher*, *GitRepoDagFetcher*, *S3DagFetcher*, 
> *HDFSDagFetcher*, *GCSDagFetcher*, *ArtifactoryDagFetcher* or even 
> *TarballInS3DagFetcher*.
> This was briefly discussed in [this mailing list 
> thread|https://lists.apache.org/thread.html/03ddcd3a42b7fd6e3dad9711e8adea37fc00391f6053762f73af5b6a@%3Cdev.airflow.apache.org%3E]
> I'm happy to start work on this and provide an initial implementation for 
> review.
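
A rough sketch of what the proposed abstraction could look like follows; class and method names mirror the proposal, but none of this exists in Airflow yet.

{code:python}
# Illustrative sketch of the proposed DagFetcher abstraction (not existing Airflow code).
import abc
import os


class DagFetcher(abc.ABC):
    """Fetches DAG files from some location and yields local file paths."""

    @abc.abstractmethod
    def fetch(self):
        """Yield local paths to DAG files for the DagBag to parse."""


class FileSystemDagFetcher(DagFetcher):
    def __init__(self, dag_folder):
        self.dag_folder = dag_folder

    def fetch(self):
        for root, _dirs, files in os.walk(self.dag_folder):
            for name in files:
                if name.endswith('.py'):
                    yield os.path.join(root, name)


class S3DagFetcher(DagFetcher):
    def __init__(self, bucket, prefix, local_cache='/tmp/dags'):
        self.bucket, self.prefix, self.local_cache = bucket, prefix, local_cache

    def fetch(self):
        # Download objects under the prefix into local_cache and yield the
        # resulting paths (the boto3 plumbing is omitted in this sketch).
        raise NotImplementedError


# DagBag.collect_dags could then iterate over fetcher.fetch() instead of
# walking the local dags folder directly.
{code}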



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3992) run-ci.sh should be re-runable

2019-03-01 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng resolved AIRFLOW-3992.
---
Resolution: Fixed

> run-ci.sh should be re-runable
> --
>
> Key: AIRFLOW-3992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3992
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Chao-Han Tsai
>Assignee: Chao-Han Tsai
>Priority: Major
>
> I am following the development setup in 
> https://github.com/apache/airflow/blob/master/CONTRIBUTING.md
> and when I try to rerun:
> {code}
> /app/scripts/ci/run-ci.sh
> {code}
> inside the container it failed with:
> {code}
> + ln -s /home/airflow/.ssh/authorized_keys /home/airflow/.ssh/authorized_keys2
> ln: failed to create symbolic link '/home/airflow/.ssh/authorized_keys2': 
> File exists
> ERROR: InvocationError for command '/app/scripts/ci/1-setup-env.sh' (exited 
> with code 1)
> __ summary 
> ___
> ERROR:   py27-backend_sqlite-env_docker: commands failed
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3993) Add tests for SalesforceHook

2019-03-01 Thread Felix Uellendall (JIRA)
Felix Uellendall created AIRFLOW-3993:
-

 Summary: Add tests for SalesforceHook
 Key: AIRFLOW-3993
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3993
 Project: Apache Airflow
  Issue Type: Test
Reporter: Felix Uellendall
Assignee: Felix Uellendall






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io commented on issue #4813: [AIRFLOW-3990] Compile regular expressions.

2019-03-01 Thread GitBox
codecov-io commented on issue #4813: [AIRFLOW-3990] Compile regular expressions.
URL: https://github.com/apache/airflow/pull/4813#issuecomment-468830079
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4813?src=pr=h1) 
Report
   > Merging 
[#4813](https://codecov.io/gh/apache/airflow/pull/4813?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/693f620b40fff738a0a1a8197c7821f5dab59b21?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4813/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4813?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4813      +/-   ##
   ==========================================
   + Coverage   74.49%   74.49%    +<.01%
   ==========================================
     Files         450      450
     Lines       28997    28999        +2
   ==========================================
   + Hits        21600    21602        +2
     Misses       7397     7397
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4813?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/utils.py](https://codecov.io/gh/apache/airflow/pull/4813/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdXRpbHMucHk=)
 | `74.09% <100%> (+0.13%)` | :arrow_up: |
   | 
[airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/4813/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==)
 | `59.15% <100%> (ø)` | :arrow_up: |
   | 
[airflow/utils/helpers.py](https://codecov.io/gh/apache/airflow/pull/4813/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9oZWxwZXJzLnB5)
 | `82.87% <100%> (+0.11%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4813?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4813?src=pr=footer). 
Last update 
[693f620...d4c26d1](https://codecov.io/gh/apache/airflow/pull/4813?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3992) run-ci.sh should be re-runable

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782151#comment-16782151
 ] 

ASF GitHub Bot commented on AIRFLOW-3992:
-

milton0825 commented on pull request #4817: [AIRFLOW-3992] 1-setup-env.sh 
should be re-runable
URL: https://github.com/apache/airflow/pull/4817
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> run-ci.sh should be re-runable
> --
>
> Key: AIRFLOW-3992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3992
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Chao-Han Tsai
>Assignee: Chao-Han Tsai
>Priority: Major
>
> I am following the development setup in 
> https://github.com/apache/airflow/blob/master/CONTRIBUTING.md
> and when I try to rerun:
> {code}
> /app/scripts/ci/run-ci.sh
> {code}
> inside the container it failed with:
> {code}
> + ln -s /home/airflow/.ssh/authorized_keys /home/airflow/.ssh/authorized_keys2
> ln: failed to create symbolic link '/home/airflow/.ssh/authorized_keys2': 
> File exists
> ERROR: InvocationError for command '/app/scripts/ci/1-setup-env.sh' (exited 
> with code 1)
> __ summary 
> ___
> ERROR:   py27-backend_sqlite-env_docker: commands failed
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3992) run-ci.sh should be re-runable

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782152#comment-16782152
 ] 

ASF GitHub Bot commented on AIRFLOW-3992:
-

milton0825 commented on pull request #4817: [AIRFLOW-3992] 1-setup-env.sh 
should be re-runable
URL: https://github.com/apache/airflow/pull/4817
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3992
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   We may rerun `/app/scripts/ci/run-ci.sh` to launch tests multiple times 
during development. Making it `ln -f` so that it does not fail with:
   ```
   + ln -s /home/airflow/.ssh/authorized_keys 
/home/airflow/.ssh/authorized_keys2
   ln: failed to create symbolic link '/home/airflow/.ssh/authorized_keys2': 
File exists
   ERROR: InvocationError for command '/app/scripts/ci/1-setup-env.sh' (exited 
with code 1)
   __ summary 
___
   ERROR:   py27-backend_sqlite-env_docker: commands failed
   ```
   
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> run-ci.sh should be re-runable
> --
>
> Key: AIRFLOW-3992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3992
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Chao-Han Tsai
>Assignee: Chao-Han Tsai
>Priority: Major
>
> I am following the development setup in 
> https://github.com/apache/airflow/blob/master/CONTRIBUTING.md
> and when I try to rerun:
> {code}
> /app/scripts/ci/run-ci.sh
> {code}
> inside the container it failed with:
> {code}
> + ln -s /home/airflow/.ssh/authorized_keys /home/airflow/.ssh/authorized_keys2
> ln: failed to create symbolic link '/home/airflow/.ssh/authorized_keys2': 
> File exists
> ERROR: InvocationError for command '/app/scripts/ci/1-setup-env.sh' (exited 
> with code 1)
> __ summary 
> ___
> ERROR:   py27-backend_sqlite-env_docker: commands failed
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] milton0825 closed pull request #4817: [AIRFLOW-3992] 1-setup-env.sh should be re-runable

2019-03-01 Thread GitBox
milton0825 closed pull request #4817: [AIRFLOW-3992] 1-setup-env.sh should be 
re-runable
URL: https://github.com/apache/airflow/pull/4817
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] milton0825 opened a new pull request #4817: [AIRFLOW-3992] 1-setup-env.sh should be re-runable

2019-03-01 Thread GitBox
milton0825 opened a new pull request #4817: [AIRFLOW-3992] 1-setup-env.sh 
should be re-runable
URL: https://github.com/apache/airflow/pull/4817
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3992
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   We may rerun `/app/scripts/ci/run-ci.sh` to launch tests multiple times 
during development. Making it `ln -f` so that it does not fail with:
   ```
   + ln -s /home/airflow/.ssh/authorized_keys 
/home/airflow/.ssh/authorized_keys2
   ln: failed to create symbolic link '/home/airflow/.ssh/authorized_keys2': 
File exists
   ERROR: InvocationError for command '/app/scripts/ci/1-setup-env.sh' (exited 
with code 1)
   __ summary 
___
   ERROR:   py27-backend_sqlite-env_docker: commands failed
   ```
   
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3421) CSS issue on Airflow 1.10 Tree view UI

2019-03-01 Thread Felix Uellendall (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782147#comment-16782147
 ] 

Felix Uellendall commented on AIRFLOW-3421:
---

[~msadityan] Hey, can you check if this gets fixed for you when you

change 
{code:css}
#svg_container {
// overflow: scroll;
}
{code}
to
{code:css}
#svg_container {
overflow: scroll;
}
{code}
You can find it in the tree.css file, but you can also change it in the web 
console of your browser. Just change the svg_container overflow to scroll.

I noticed this kind of behaviour you mentioned also when you display a huge 
amount of dag runs.

> CSS issue on Airflow 1.10 Tree view UI
> --
>
> Key: AIRFLOW-3421
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3421
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10.0, 1.10.1
>Reporter: Adityan
>Priority: Minor
>  Labels: beginner
> Attachments: Screen Shot 2018-11-07 at 1.27.05 PM.png
>
>
> After upgrading to airflow 1.10, hovering the mouse over the rectangles in 
> tree view causes the tooltip to pop up much higher than it used to. In the 
> screenshot attached, the right lowermost rectangle is the one that is being 
> hovered over.
>  
>  Once you scroll down on the tree view, the tooltip starts floating up. 
>   
>  Things I have tried to fix this behavior:
>  1. Change the css themes in webserver_config.py and restart the web server
>  2. Inspected the tooltip in Chrome, it seems to be a dynamically generated 
> CSS class. The CSS class controlling this behavior seems to be the same in 
> Airflow 1.9. 
>  
> *NOTE: This issue only happens when rbac is set to True in airflow.cfg. If 
> you turn off rbac, then this issue doesn't occur. Also, the dag needs to be 
> sufficiently large (vertically) so that you need to scroll with your mouse 
> for this issue to occur.* 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Comment Edited] (AIRFLOW-3421) CSS issue on Airflow 1.10 Tree view UI

2019-03-01 Thread Felix Uellendall (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782147#comment-16782147
 ] 

Felix Uellendall edited comment on AIRFLOW-3421 at 3/1/19 10:05 PM:


[~msadityan] Hey, can you check if this gets fixed for you when you

change 
{code:css}
#svg_container {
// overflow: scroll;
}
{code}
to
{code:css}
#svg_container {
overflow: scroll;
}
{code}
You can find it in the tree.css file, but you can also change it in the web 
console of your browser. Just change the svg_container overflow to scroll.

I noticed this kind of behaviour you mentioned also when you display a huge 
amount of dag runs.


was (Author: feluelle):
[~msadityan] Hey, can you check if this gets fixed for you when you

change 
{code:css}
#svg_container {
// overflow: scroll;
}
{code}
to
{code:css}
#svg_container {
overflow: scroll;
}
{code}
You can find it in the tree.cs file, but you can also change it in the web 
console of your browser. Just change the svg_container overflow to scroll.

I noticed this kind of behaviour you mentioned also when you display a huge 
amount of dag runs.

> CSS issue on Airflow 1.10 Tree view UI
> --
>
> Key: AIRFLOW-3421
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3421
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10.0, 1.10.1
>Reporter: Adityan
>Priority: Minor
>  Labels: beginner
> Attachments: Screen Shot 2018-11-07 at 1.27.05 PM.png
>
>
> After upgrading to airflow 1.10, hovering the mouse over the rectangles in 
> tree view causes the tooltip to pop up much higher than it used to. In the 
> screenshot attached, the right lowermost rectangle is the one that is being 
> hovered over.
>  
>  Once you scroll down on the tree view, the tooltip starts floating up. 
>   
>  Things I have tried to fix this behavior:
>  1. Change the css themes in webserver_config.py and restart the web server
>  2. Inspected the tooltip in Chrome, it seems to be a dynamically generated 
> CSS class. The CSS class controlling this behavior seems to be the same in 
> Airflow 1.9. 
>  
> *NOTE: This issue only happens when rbac is set to True in airflow.cfg. If 
> you turn off rbac, then this issue doesn't occur. Also, the dag needs to be 
> sufficiently large (vertically) so that you need to scroll with your mouse 
> for this issue to occur.* 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] RosterIn commented on a change in pull request #4802: [AIRFLOW-3978] - Add TIME, BINARY, VARBINARY to MySqlToGoogleCloudStorageOperator

2019-03-01 Thread GitBox
RosterIn commented on a change in pull request #4802: [AIRFLOW-3978] - Add 
TIME, BINARY, VARBINARY to MySqlToGoogleCloudStorageOperator
URL: https://github.com/apache/airflow/pull/4802#discussion_r261769873
 
 

 ##
 File path: airflow/contrib/operators/mysql_to_gcs.py
 ##
 @@ -273,6 +273,8 @@ def type_map(cls, mysql_type):
 d = {
 FIELD_TYPE.INT24: 'INTEGER',
 FIELD_TYPE.TINY: 'INTEGER',
+FIELD_TYPE.BINARY: 'BINARY',
+FIELD_TYPE.VARBINARY: 'BINARY',
 
 Review comment:
   @feng-tao @XD-DENG  it's the other way around. They said it is supported and 
the file you were referring to has nothing to do with it.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3992) run-ci.sh should be re-runable

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782095#comment-16782095
 ] 

ASF GitHub Bot commented on AIRFLOW-3992:
-

milton0825 commented on pull request #4817: [AIRFLOW-3992] 1-setup-env.sh 
should be re-runable
URL: https://github.com/apache/airflow/pull/4817
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> run-ci.sh should be re-runable
> --
>
> Key: AIRFLOW-3992
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3992
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Chao-Han Tsai
>Assignee: Chao-Han Tsai
>Priority: Major
>
> I am following the development setup in 
> https://github.com/apache/airflow/blob/master/CONTRIBUTING.md
> and when I try to rerun:
> {code}
> /app/scripts/ci/run-ci.sh
> {code}
> inside the container it failed with:
> {code}
> + ln -s /home/airflow/.ssh/authorized_keys /home/airflow/.ssh/authorized_keys2
> ln: failed to create symbolic link '/home/airflow/.ssh/authorized_keys2': 
> File exists
> ERROR: InvocationError for command '/app/scripts/ci/1-setup-env.sh' (exited 
> with code 1)
> __ summary 
> ___
> ERROR:   py27-backend_sqlite-env_docker: commands failed
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] milton0825 opened a new pull request #4817: [AIRFLOW-3992] 1-setup-env.sh should be re-runable

2019-03-01 Thread GitBox
milton0825 opened a new pull request #4817: [AIRFLOW-3992] 1-setup-env.sh 
should be re-runable
URL: https://github.com/apache/airflow/pull/4817
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3992) run-ci.sh should be re-runable

2019-03-01 Thread Chao-Han Tsai (JIRA)
Chao-Han Tsai created AIRFLOW-3992:
--

 Summary: run-ci.sh should be re-runable
 Key: AIRFLOW-3992
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3992
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Chao-Han Tsai
Assignee: Chao-Han Tsai


I am following the development setup in 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.md

and when I try to rerun:
{code}
/app/scripts/ci/run-ci.sh
{code}
inside the container it failed with:
{code}
+ ln -s /home/airflow/.ssh/authorized_keys /home/airflow/.ssh/authorized_keys2
ln: failed to create symbolic link '/home/airflow/.ssh/authorized_keys2': File 
exists
ERROR: InvocationError for command '/app/scripts/ci/1-setup-env.sh' (exited 
with code 1)
__ summary 
___
ERROR:   py27-backend_sqlite-env_docker: commands failed
{code}




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] feng-tao merged pull request #4815: [AIRFLOW-XXX] Pin version of tornado pulled in by Celery.

2019-03-01 Thread GitBox
feng-tao merged pull request #4815: [AIRFLOW-XXX] Pin version of tornado pulled 
in by Celery.
URL: https://github.com/apache/airflow/pull/4815
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ttanay commented on a change in pull request #4601: [AIRFLOW-3758] Fix circular import in WasbTaskHandler

2019-03-01 Thread GitBox
ttanay commented on a change in pull request #4601: [AIRFLOW-3758] Fix circular 
import in WasbTaskHandler
URL: https://github.com/apache/airflow/pull/4601#discussion_r261756279
 
 

 ##
 File path: tests/test_logging_config.py
 ##
 @@ -251,6 +251,14 @@ def test_1_9_config(self):
 finally:
 conf.remove_option('core', 'task_log_reader', remove_default=False)
 
+def test_loading_remote_logging_with_wasb_handler(self):
+"""Test if logging can be configured successfully for Azure Blob 
Storage"""
+from airflow.logging_config import configure_logging
+conf.set('core', 'remote_logging', 'True')
+conf.set('core', 'remote_log_conn_id', 'some_wasb')
+conf.set('core', 'remote_base_log_folder', 'wasb://some-folder')
+configure_logging()
 
 Review comment:
   Hey @ashb 
   Didn't get a chance to look at it. Exams :/
   Will get it done by Tuesday.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3991) Retrieve Airflow config from command in environment variable

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3991?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16782057#comment-16782057
 ] 

ASF GitHub Bot commented on AIRFLOW-3991:
-

sdevani commented on pull request #4816: [AIRFLOW-3991] Retrieve configs from 
commands in environment
URL: https://github.com/apache/airflow/pull/4816
 
 
   https://issues.apache.org/jira/browse/AIRFLOW-3991
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   When looking for configurations via commands, airflow can now look at 
environment variables to find the command.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   test_command_from_env
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Retrieve Airflow config from command in environment variable
> 
>
> Key: AIRFLOW-3991
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3991
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: configuration
>Reporter: Shehzan Devani
>Assignee: Shehzan Devani
>Priority: Minor
>  Labels: easyfix
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Airflow configurations are loaded from either environment variables, config 
> file, commands, or defaults. The command feature can only be added in the 
> config file. It would be useful to allow the command to be set as an 
> environment variable.
> For example, I'd like to set the variable 
> AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD="some script".



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] sdevani opened a new pull request #4816: [AIRFLOW-3991] Retrieve configs from commands in environment

2019-03-01 Thread GitBox
sdevani opened a new pull request #4816: [AIRFLOW-3991] Retrieve configs from 
commands in environment
URL: https://github.com/apache/airflow/pull/4816
 
 
   https://issues.apache.org/jira/browse/AIRFLOW-3991
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   When looking for configurations via commands, airflow can now look at 
environment variables to find the command.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   test_command_from_env
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3991) Retrieve Airflow config from command in environment variable

2019-03-01 Thread Shehzan Devani (JIRA)
Shehzan Devani created AIRFLOW-3991:
---

 Summary: Retrieve Airflow config from command in environment 
variable
 Key: AIRFLOW-3991
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3991
 Project: Apache Airflow
  Issue Type: Improvement
  Components: configuration
Reporter: Shehzan Devani
Assignee: Shehzan Devani


Airflow configurations are loaded from either environment variables, config 
file, commands, or defaults. The command feature can only be added in the 
config file. It would be useful to allow the command to be set as an 
environment variable.

For example, I'd like to set the variable 
AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD="some script".
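
A minimal sketch of how such a lookup could work (an illustration only; the helper 
below is hypothetical and not the actual Airflow implementation): read the 
AIRFLOW__<SECTION>__<KEY>_CMD environment variable and use the command's stdout as 
the configuration value.

{code}
# Hypothetical helper, not actual Airflow code: resolve a config value from a
# command supplied through an AIRFLOW__<SECTION>__<KEY>_CMD environment variable.
import os
import subprocess


def get_config_from_env_cmd(section, key):
    env_var = 'AIRFLOW__{}__{}_CMD'.format(section.upper(), key.upper())
    command = os.environ.get(env_var)
    if command is None:
        return None
    # Run the command and use its stripped stdout as the configuration value.
    return subprocess.check_output(command, shell=True).decode('utf-8').strip()


# Example, assuming:
#   export AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD="cat /run/secrets/sql_alchemy_conn"
# sql_alchemy_conn = get_config_from_env_cmd('core', 'sql_alchemy_conn')
{code}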



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] mik-laj commented on a change in pull request #4792: [AIRFLOW-3659] Create Google Cloud Transfer Service Operators

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4792:  [AIRFLOW-3659] Create 
Google Cloud Transfer Service Operators 
URL: https://github.com/apache/airflow/pull/4792#discussion_r261743037
 
 

 ##
 File path: docs/howto/operator.rst
 ##
 @@ -2327,3 +2327,403 @@ More information
 
 See `Google Cloud Vision Product delete documentation
 
`_.
+
+Google Cloud Transfer Service Operators
 
 Review comment:
   I created a PR that splits the documentation for operators into multiple 
files: https://github.com/apache/airflow/pull/4814
   When that change is accepted, I will expand the descriptions for each set of 
operators to explain what service the operators are integrating with. 
Currently, there is no place to put such information.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feng-tao commented on issue #4815: [AIRFLOW-XXX] Pin version of tornado pulled in by Celery.

2019-03-01 Thread GitBox
feng-tao commented on issue #4815: [AIRFLOW-XXX] Pin version of tornado pulled 
in by Celery.
URL: https://github.com/apache/airflow/pull/4815#issuecomment-468787658
 
 
   thanks. lgtm


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #4815: [AIRFLOW-XXX] Pin version of tornado pulled in by Celery.

2019-03-01 Thread GitBox
ashb commented on issue #4815: [AIRFLOW-XXX] Pin version of tornado pulled in 
by Celery.
URL: https://github.com/apache/airflow/pull/4815#issuecomment-468787471
 
 
   @feng-tao @Fokko We've got another module to pin. Dep-of-a-dep this time :(
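
   For illustration, pinning a dependency-of-a-dependency usually means listing it 
explicitly in setup.py. A rough sketch of what such a pin could look like is below; 
the extra name and version bounds are assumptions for illustration, not necessarily 
the actual change in #4815:

```python
# Hypothetical sketch only -- the exact bounds used in PR #4815 are not shown in
# this thread. The idea: cap tornado below 6.0, because 6.0.0 breaks on the Python
# version used in CI, even though tornado is only pulled in transitively through
# Celery/flower.
celery = [
    'celery>=4.1.1, <4.3',
    'flower>=0.7.3, <1.0',
    'tornado>=4.2.0, <6.0',  # dep of a dep, pinned explicitly
]
```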


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb opened a new pull request #4815: [AIRFLOW-XXX] Pin version of tornado pulled in by Celery.

2019-03-01 Thread GitBox
ashb opened a new pull request #4815: [AIRFLOW-XXX] Pin version of tornado 
pulled in by Celery.
URL: https://github.com/apache/airflow/pull/4815
 
 
   https://github.com/tornadoweb/tornado/issues/2604
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj opened a new pull request #4814: [AIRFLOW-XXX] Split guide for operators to multiple files

2019-03-01 Thread GitBox
mik-laj opened a new pull request #4814: [AIRFLOW-XXX] Split guide for 
operators to multiple files
URL: https://github.com/apache/airflow/pull/4814
 
 
   I divided the operator guides into several files because the page had become 
very long, which made it difficult to use. In addition, I corrected the table of 
contents to include only the information that is needed. It is worth adding that 
I used a glob to create the table of contents for the GCP operators, so adding a 
new service is just a matter of adding a new file.
   
   Preview: http://tasteful-able.surge.sh/


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] cixuuz commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
cixuuz commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261724152
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+at the schedule level. Skipped tasks will cascade through the trigger rules
+``all_success`` and ``all_failed``, but not through ``all_done``, ``one_failed``,
+``one_success``, ``none_failed`` and ``dummy``.
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   This is a very good idea, but there is no such folder for those scripts. 
What do you think about making a dir under docs, like 
/docs/example_dags/branch_without_trigger.py ?
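
   For reference, a minimal sketch of the kind of DAG the new section describes (an 
assumption for illustration -- the actual contents of branch_without_trigger.py are 
not shown in this thread): a BranchPythonOperator skips one branch, and the 
downstream join task only runs because its trigger rule is ``none_failed`` rather 
than the default ``all_success``.

```python
# Sketch under the assumptions above, using Airflow 1.10-era imports; not
# necessarily the exact example added in the PR.
from datetime import datetime

from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import BranchPythonOperator

dag = DAG(
    dag_id='branch_without_trigger',
    schedule_interval='@once',
    start_date=datetime(2019, 2, 28),
)

run_this_first = DummyOperator(task_id='run_this_first', dag=dag)

# Always choose branch_a, so branch_b ends up skipped.
branching = BranchPythonOperator(
    task_id='branching',
    python_callable=lambda: 'branch_a',
    dag=dag,
)

branch_a = DummyOperator(task_id='branch_a', dag=dag)
follow_branch_a = DummyOperator(task_id='follow_branch_a', dag=dag)
branch_b = DummyOperator(task_id='branch_b', dag=dag)

# With the default 'all_success' rule this join would be skipped because branch_b
# is skipped; 'none_failed' lets it run as long as no upstream task actually failed.
join = DummyOperator(task_id='join', trigger_rule='none_failed', dag=dag)

run_this_first >> branching
branching >> branch_a >> follow_branch_a >> join
branching >> branch_b >> join
```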


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb edited a comment on issue #4769: [AIRFLOW-2511] Fix improper failed session commit handling

2019-03-01 Thread GitBox
ashb edited a comment on issue #4769: [AIRFLOW-2511] Fix improper failed 
session commit handling
URL: https://github.com/apache/airflow/pull/4769#issuecomment-468763186
 
 
   @fenglu-g this PR might have caused a failure on py3 on master. Could you 
check? (I'm on mobile right now, so hard to say if it was this PR or another 
one.)
   
   Never mind - it's Tornado 6.0.0, which was released 4 hours ago and breaks on 
our version of Python.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #4796: [AIRFLOW-3970] Pull out the action buttons from the tabs

2019-03-01 Thread GitBox
ashb commented on issue #4796: [AIRFLOW-3970] Pull out the action buttons from 
the tabs
URL: https://github.com/apache/airflow/pull/4796#issuecomment-468771793
 
 
   I would go with "not adding any new ones" personally - there are some, but 
not a lot (82 in the rbac tree of 1.10, which is what I had checked out).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3990) Use compiled regular expressions

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781961#comment-16781961
 ] 

ASF GitHub Bot commented on AIRFLOW-3990:
-

jmcarp commented on pull request #4813: [AIRFLOW-3990] Compile regular 
expressions.
URL: https://github.com/apache/airflow/pull/4813
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3990
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Use compiled regular expressions
> 
>
> Key: AIRFLOW-3990
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3990
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
>
> Some regular expressions are evaluated many times. To save a little time, we 
> should compile regular expressions with `re.compile` in advance instead of 
> potentially recompiling for each use.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] jmcarp opened a new pull request #4813: [AIRFLOW-3990] Compile regular expressions.

2019-03-01 Thread GitBox
jmcarp opened a new pull request #4813: [AIRFLOW-3990] Compile regular 
expressions.
URL: https://github.com/apache/airflow/pull/4813
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3990
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3990) Use compiled regular expressions

2019-03-01 Thread Josh Carp (JIRA)
Josh Carp created AIRFLOW-3990:
--

 Summary: Use compiled regular expressions
 Key: AIRFLOW-3990
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3990
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Josh Carp
Assignee: Josh Carp


Some regular expressions are evaluated many times. To save a little time, we 
should compile regular expressions with `re.compile` in advance instead of 
potentially recompiling for each use.
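
A small, generic illustration of the idea (the pattern and helper name below are 
made up for this sketch, not taken from the Airflow code base):

{code}
# Compile the pattern once at import time and reuse it, instead of passing the raw
# pattern string to the re module on every call.
import re

KEY_PATTERN = re.compile(r'^[\w.-]+$')


def is_valid_key(key):
    # Reuses the precompiled pattern; calling re.match(r'...', key) here would
    # recompile the pattern (or at best hit the re module's internal cache).
    return bool(KEY_PATTERN.match(key))
{code}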



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #4769: [AIRFLOW-2511] Fix improper failed session commit handling

2019-03-01 Thread GitBox
ashb commented on issue #4769: [AIRFLOW-2511] Fix improper failed session 
commit handling
URL: https://github.com/apache/airflow/pull/4769#issuecomment-468763186
 
 
   @fenglu-g this PR might have caused a failure on py3 on master. Could you 
check? (I'm on mobile right now, so hard to say if it was this PR or another 
one.)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4796: [AIRFLOW-3970] Pull out the action buttons from the tabs

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4796: [AIRFLOW-3970] Pull out 
the action buttons from the tabs
URL: https://github.com/apache/airflow/pull/4796#discussion_r261699310
 
 

 ##
 File path: airflow/www/templates/airflow/dag.html
 ##
 @@ -26,24 +26,52 @@
 
 {% block content %}
 
-
-  {% if dag.parent_dag is defined and dag.parent_dag %}
-SUBDAG:   {{ dag.dag_id 
}}
-  {% else %}
-
-DAG:   {{ dag.dag_id }} 
 {{ dag.description }} 
-  {% endif %}
-  {% if root %}
-ROOT:   {{ root }}
-  {% endif %}
-
-
-  
-schedule: {{ dag.schedule_interval }}
-  
-
+  
+
+  
+
+  
+  Trigger DAG
+
+
+  
+  Refresh
+
+
+  
+  Delete
+
+  
+
+
+  
+{% if dag.parent_dag is defined and dag.parent_dag %}
+  SUBDAG:   {{ dag.dag_id 
}}
+{% else %}
+  
+  DAG:   {{ dag.dag_id 
}}  {{ dag.description }} 
+{% endif %}
+{% if root %}
+  ROOT:   {{ root }}
+{% endif %}
+  
+  
 
 Review comment:
   I am counting on your understanding that this code is enough for the moment. 
However, this is not a long-term solution.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4796: [AIRFLOW-3970] Pull out the action buttons from the tabs

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4796: [AIRFLOW-3970] Pull out 
the action buttons from the tabs
URL: https://github.com/apache/airflow/pull/4796#discussion_r261698848
 
 

 ##
 File path: airflow/www/templates/airflow/dag.html
 ##
 @@ -26,24 +26,52 @@
 
 {% block content %}
 
-
-  {% if dag.parent_dag is defined and dag.parent_dag %}
-SUBDAG:   {{ dag.dag_id 
}}
-  {% else %}
-
-DAG:   {{ dag.dag_id }} 
 {{ dag.description }} 
-  {% endif %}
-  {% if root %}
-ROOT:   {{ root }}
-  {% endif %}
-
-
-  
-schedule: {{ dag.schedule_interval }}
-  
-
+  
+
+  
+
+  
+  Trigger DAG
+
+
+  
+  Refresh
+
+
+  
+  Delete
+
+  
+
+
+  
+{% if dag.parent_dag is defined and dag.parent_dag %}
+  SUBDAG:   {{ dag.dag_id 
}}
+{% else %}
+  
+  DAG:   {{ dag.dag_id 
}}  {{ dag.description }} 
+{% endif %}
+{% if root %}
+  ROOT:   {{ root }}
+{% endif %}
+  
+  
 
 Review comment:
   I would like to update Bootstrap to a newer version soon. Then I will use the 
Bootstrap spacing classes.
   
   See: 
   https://getbootstrap.com/docs/4.2/utilities/spacing/


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on issue #4796: [AIRFLOW-3970] Pull out the action buttons from the tabs

2019-03-01 Thread GitBox
mik-laj commented on issue #4796: [AIRFLOW-3970] Pull out the action buttons 
from the tabs
URL: https://github.com/apache/airflow/pull/4796#issuecomment-468751665
 
 
   @ashb Such styles also occur in other places in the project. I would like to 
delete them, but in another PR. For now I would prefer to stay consistent with 
the existing style of the project. I am planning many more UI changes, which I 
will introduce gradually.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on a change in pull request #4796: [AIRFLOW-3970] Pull out the action buttons from the tabs

2019-03-01 Thread GitBox
ashb commented on a change in pull request #4796: [AIRFLOW-3970] Pull out the 
action buttons from the tabs
URL: https://github.com/apache/airflow/pull/4796#discussion_r261696019
 
 

 ##
 File path: airflow/www/templates/airflow/dag.html
 ##
 @@ -26,24 +26,52 @@
 
 {% block content %}
 
-
-  {% if dag.parent_dag is defined and dag.parent_dag %}
-SUBDAG:   {{ dag.dag_id 
}}
-  {% else %}
-
-DAG:   {{ dag.dag_id }} 
 {{ dag.description }} 
-  {% endif %}
-  {% if root %}
-ROOT:   {{ root }}
-  {% endif %}
-
-
-  
-schedule: {{ dag.schedule_interval }}
-  
-
+  
+
+  
+
+  
+  Trigger DAG
+
+
+  
+  Refresh
+
+
+  
+  Delete
+
+  
+
+
+  
+{% if dag.parent_dag is defined and dag.parent_dag %}
+  SUBDAG:   {{ dag.dag_id 
}}
+{% else %}
+  
+  DAG:   {{ dag.dag_id 
}}  {{ dag.description }} 
+{% endif %}
+{% if root %}
+  ROOT:   {{ root }}
+{% endif %}
+  
+  
 
 Review comment:
   Minor style point: I'd prefer it if we didn't use inline styles.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google cloud bigquery

2019-03-01 Thread GitBox
potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google 
cloud bigquery
URL: https://github.com/apache/airflow/pull/4607#discussion_r261692915
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -1409,108 +1057,21 @@ def run_table_upsert(self, dataset_id, 
table_resource, project_id=None):
 :param table_resource: a table resource. see
 https://cloud.google.com/bigquery/docs/reference/v2/tables#resource
 :type table_resource: dict
-:param project_id: the project to upsert the table into.  If None,
+:param project_id: the project to upsert the table into. If None,
 project will be self.project_id.
 :return:
 """
-# check to see if the table exists
-table_id = table_resource['tableReference']['tableId']
 project_id = project_id if project_id is not None else self.project_id
 
 Review comment:
   And here :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google cloud bigquery

2019-03-01 Thread GitBox
potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google 
cloud bigquery
URL: https://github.com/apache/airflow/pull/4607#discussion_r261693178
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -1716,252 +1219,63 @@ def insert_all(self, project_id, dataset_id, table_id,
 even if any insertion errors occur.
 :type fail_on_error: bool
 """
+project_id = project_id if project_id is not None else self.project_id
 
 Review comment:
   here too :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google cloud bigquery

2019-03-01 Thread GitBox
potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google 
cloud bigquery
URL: https://github.com/apache/airflow/pull/4607#discussion_r253853551
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -119,113 +109,37 @@ def get_pandas_df(self, sql, parameters=None, 
dialect=None):
 verbose=False,
 private_key=private_key)
 
-def table_exists(self, project_id, dataset_id, table_id):
+def table_exists(self, dataset_id, table_id, project_id=None):
 """
 Checks for the existence of a table in Google BigQuery.
 
-:param project_id: The Google cloud project in which to look for the
-table. The connection supplied to the hook must provide access to
-the specified project.
-:type project_id: str
 :param dataset_id: The name of the dataset in which to look for the
 table.
 :type dataset_id: str
 :param table_id: The name of the table to check the existence of.
 :type table_id: str
+:param project_id: The Google cloud project in which to look for the
+table. The connection supplied to the hook must provide access to
+the specified project.
+:type project_id: str
 """
-service = self.get_service()
+project_id = project_id if project_id is not None else self.project_id
 
 Review comment:
   I think you could use the fallback_to_default_project_id decorator. It has 
additional logic to raise an exception if neither of the project_ids is 
specified, and you could then remove this if altogether. It forces the use of 
keyword parameters, though.
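
   Roughly, the decorator being referred to works like the sketch below 
(simplified and written as an approximation -- the real implementation in 
GoogleCloudBaseHook may differ in its details):

```python
# Approximate sketch of a fallback_to_default_project_id-style decorator; not the
# exact Airflow implementation.
import functools

from airflow.exceptions import AirflowException


def fallback_to_default_project_id(func):
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        # Force keyword arguments so project_id can be located reliably.
        if args:
            raise AirflowException(
                "Use keyword arguments with methods wrapped by this decorator")
        if kwargs.get('project_id') is None:
            kwargs['project_id'] = self.project_id
        if not kwargs['project_id']:
            raise AirflowException(
                "project_id must be passed as a keyword argument or be available "
                "on the hook (e.g. from the GCP connection)")
        return func(self, *args, **kwargs)
    return wrapper
```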


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google cloud bigquery

2019-03-01 Thread GitBox
potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google 
cloud bigquery
URL: https://github.com/apache/airflow/pull/4607#discussion_r261689198
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -54,33 +50,27 @@ class BigQueryHook(GoogleCloudBaseHook, DbApiHook, 
LoggingMixin):
 def __init__(self,
  bigquery_conn_id='bigquery_default',
  delegate_to=None,
- use_legacy_sql=True,
+ use_legacy_sql=False,
  location=None):
 super(BigQueryHook, self).__init__(
 gcp_conn_id=bigquery_conn_id, delegate_to=delegate_to)
 self.use_legacy_sql = use_legacy_sql
 self.location = location
 
-def get_conn(self):
-"""
-Returns a BigQuery PEP 249 connection object.
-"""
-service = self.get_service()
-project = self._get_field('project')
-return BigQueryConnection(
-service=service,
-project_id=project,
-use_legacy_sql=self.use_legacy_sql,
+def get_client(self, project_id=None):
+project_id = project_id if project_id is not None else self.project_id
 
 Review comment:
   Same as below - the fallback_to_default_project_id decorator is a nicer way, I think.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google cloud bigquery

2019-03-01 Thread GitBox
potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google 
cloud bigquery
URL: https://github.com/apache/airflow/pull/4607#discussion_r261691725
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -706,29 +566,10 @@ def run_query(self,
 
https://cloud.google.com/bigquery/docs/locations#specifying_your_location
 :type location: str
 """
+project_id = project_id if project_id is not None else self.project_id
 
 Review comment:
   Again - I think using the fallback decorator is nicer, as it keeps the 
project_id logic in one place.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google cloud bigquery

2019-03-01 Thread GitBox
potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google 
cloud bigquery
URL: https://github.com/apache/airflow/pull/4607#discussion_r261692414
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -1066,6 +870,8 @@ def run_load(self,
 time_partitioning. The order of columns given determines the sort 
order.
 :type cluster_fields: list of str
 """
+project_id = project_id if project_id is not None else self.project_id
 
 Review comment:
   And here.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google cloud bigquery

2019-03-01 Thread GitBox
potiuk commented on a change in pull request #4607: WIP: [AIRFLOW-1894] Google 
cloud bigquery
URL: https://github.com/apache/airflow/pull/4607#discussion_r261691017
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -419,91 +319,46 @@ def create_external_table(self,
  "Please use one of the following types: {1}"
  .format(compression, allowed_compressions))
 
-table_resource = {
-'externalDataConfiguration': {
-'autodetect': autodetect,
-'sourceFormat': source_format,
-'sourceUris': source_uris,
-'compression': compression,
-'ignoreUnknownValues': ignore_unknown_values
-},
-'tableReference': {
-'projectId': project_id,
-'datasetId': dataset_id,
-'tableId': external_table_id,
-}
-}
+external_config = bigquery.ExternalConfig(source_format)
+external_config.autodetect = autodetect
+external_config.compression = compression
+external_config.source_uris = source_uris
+external_config.ignore_unknown_values = ignore_unknown_values
 
-if schema_fields:
-table_resource['externalDataConfiguration'].update({
-'schema': {
-'fields': schema_fields
-}
-})
+if external_config_options is not None:
+if not isinstance(external_config_options, 
type(external_config.options)):
 
 Review comment:
   I think you should use dict explicitly here. What if external_config.options 
is None (that seems to be the default)?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feng-tao commented on issue #4772: [AIRFLOW-3937] KubernetesPodOperator support for envFrom configMapRef…

2019-03-01 Thread GitBox
feng-tao commented on issue #4772: [AIRFLOW-3937] KubernetesPodOperator support 
for envFrom configMapRef…
URL: https://github.com/apache/airflow/pull/4772#issuecomment-468749005
 
 
   @ashb may know better for the date.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #4769: [AIRFLOW-2511] Fix improper failed session commit handling

2019-03-01 Thread GitBox
ashb commented on issue #4769: [AIRFLOW-2511] Fix improper failed session 
commit handling
URL: https://github.com/apache/airflow/pull/4769#issuecomment-468748736
 
 
   (Sorry - inbox bankruptcy)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2511) Subdag failed by scheduler deadlock

2019-03-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2511?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781911#comment-16781911
 ] 

ASF subversion and git services commented on AIRFLOW-2511:
--

Commit 2ac689c73a78bfcfe61cea6be1676c82e62b703f in airflow's branch 
refs/heads/master from fenglu-g
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=2ac689c ]

[AIRFLOW-2511] Fix improper failed session commit handling causing deadlocks 
(#4769)
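
In general terms, the pattern the commit title refers to looks like the sketch 
below (a generic illustration, not the code from commit 2ac689c): roll the session 
back when a commit fails, so later statements do not hit SQLAlchemy's "This 
Session's transaction has been rolled back" error seen in the traceback quoted 
below.

{code}
# Generic illustration only. Rolling back after a failed commit keeps the
# SQLAlchemy session usable instead of leaving it in the invalidated state.
from sqlalchemy.exc import OperationalError


def commit_or_rollback(session):
    try:
        session.commit()
    except OperationalError:
        # e.g. MySQL error 1213 "Deadlock found when trying to get lock"
        session.rollback()
        raise
{code}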



> Subdag failed by scheduler deadlock
> ---
>
> Key: AIRFLOW-2511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2511
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Yohei Onishi
>Assignee: Feng Lu
>Priority: Major
>
> I am using subdag and sometimes main dag marked failed because of the 
> following error. In this case, tasks in the subdag stopped.
> {code:java}
> hourly_dag = DAG(
>   hourly_dag_name,
>   default_args=dag_default_args,
>   params=dag_custom_params,
>   schedule_interval=config_values.hourly_job_interval,
>   max_active_runs=2)
> hourly_subdag = SubDagOperator(
>   task_id='s3_to_hive',
>   subdag=LoadFromS3ToHive(
>   hourly_dag,
>   's3_to_hive'),
>   dag=hourly_dag)
> {code}
> I got this error in main dag. bug in scheduler?
> {code:java}
> [2018-05-22 21:52:19,683] {models.py:1595} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> Traceback (most recent call last):
> sqlalchemy.exc.InvalidRequestError: This Session's transaction has been 
> rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> [2018-05-22 21:52:19,687] {models.py:1624} INFO - Marking task as FAILED.
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> [2018-05-22 21:52:19,688] {slack_hook.py:143} INFO - Message is prepared: 
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> {"attachments": [{"color": "danger", "text": "", "fields": [{"title": "DAG", 
> "value": 
> "",
>  "short": true}, {"title": "Owner", "value": "airflow", "short": true}, 
> {"title": "Task", "value": "s3_to_hive", "short": false}, {"title": "Status", 
> "value": "FAILED", "short": false}, {"title": "Execution Time", "value": 
> "2018-05-07T05:02:00", "short": true}, {"title": "Duration", "value": 
> "826.305929", "short": true}, {"value": 
> "  Task Log>", "short": false}]}]}
> [2018-05-22 21:52:19,688] {models.py:1638} ERROR - Failed at executing 
> callback
> [2018-05-22 21:52:19,688] {models.py:1639} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> {code}
> 

[jira] [Commented] (AIRFLOW-2511) Subdag failed by scheduler deadlock

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2511?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781910#comment-16781910
 ] 

ASF GitHub Bot commented on AIRFLOW-2511:
-

ashb commented on pull request #4769: [AIRFLOW-2511] Fix improper failed 
session commit handling
URL: https://github.com/apache/airflow/pull/4769
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Subdag failed by scheduler deadlock
> ---
>
> Key: AIRFLOW-2511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2511
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Yohei Onishi
>Assignee: Feng Lu
>Priority: Major
>
> I am using subdag and sometimes main dag marked failed because of the 
> following error. In this case, tasks in the subdag stopped.
> {code:java}
> hourly_dag = DAG(
>   hourly_dag_name,
>   default_args=dag_default_args,
>   params=dag_custom_params,
>   schedule_interval=config_values.hourly_job_interval,
>   max_active_runs=2)
> hourly_subdag = SubDagOperator(
>   task_id='s3_to_hive',
>   subdag=LoadFromS3ToHive(
>   hourly_dag,
>   's3_to_hive'),
>   dag=hourly_dag)
> {code}
> I got this error in main dag. bug in scheduler?
> {code:java}
> [2018-05-22 21:52:19,683] {models.py:1595} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> Traceback (most recent call last):
> sqlalchemy.exc.InvalidRequestError: This Session's transaction has been 
> rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> [2018-05-22 21:52:19,687] {models.py:1624} INFO - Marking task as FAILED.
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> [2018-05-22 21:52:19,688] {slack_hook.py:143} INFO - Message is prepared: 
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> {"attachments": [{"color": "danger", "text": "", "fields": [{"title": "DAG", 
> "value": 
> "",
>  "short": true}, {"title": "Owner", "value": "airflow", "short": true}, 
> {"title": "Task", "value": "s3_to_hive", "short": false}, {"title": "Status", 
> "value": "FAILED", "short": false}, {"title": "Execution Time", "value": 
> "2018-05-07T05:02:00", "short": true}, {"title": "Duration", "value": 
> "826.305929", "short": true}, {"value": 
> "  Task Log>", "short": false}]}]}
> [2018-05-22 21:52:19,688] {models.py:1638} ERROR - Failed at executing 
> callback
> [2018-05-22 21:52:19,688] {models.py:1639} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 

[jira] [Commented] (AIRFLOW-2511) Subdag failed by scheduler deadlock

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2511?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781912#comment-16781912
 ] 

ASF GitHub Bot commented on AIRFLOW-2511:
-

ashb commented on pull request #4807: AIRFLOW-2511 - Investigation to mitigate 
Deadlocking
URL: https://github.com/apache/airflow/pull/4807
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Subdag failed by scheduler deadlock
> ---
>
> Key: AIRFLOW-2511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2511
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Yohei Onishi
>Assignee: Feng Lu
>Priority: Major
>
> I am using subdag and sometimes main dag marked failed because of the 
> following error. In this case, tasks in the subdag stopped.
> {code:java}
> hourly_dag = DAG(
>   hourly_dag_name,
>   default_args=dag_default_args,
>   params=dag_custom_params,
>   schedule_interval=config_values.hourly_job_interval,
>   max_active_runs=2)
> hourly_subdag = SubDagOperator(
>   task_id='s3_to_hive',
>   subdag=LoadFromS3ToHive(
>   hourly_dag,
>   's3_to_hive'),
>   dag=hourly_dag)
> {code}
> I got this error in main dag. bug in scheduler?
> {code:java}
> [2018-05-22 21:52:19,683] {models.py:1595} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> Traceback (most recent call last):
> sqlalchemy.exc.InvalidRequestError: This Session's transaction has been 
> rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> [2018-05-22 21:52:19,687] {models.py:1624} INFO - Marking task as FAILED.
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> [2018-05-22 21:52:19,688] {slack_hook.py:143} INFO - Message is prepared: 
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> {"attachments": [{"color": "danger", "text": "", "fields": [{"title": "DAG", 
> "value": 
> "",
>  "short": true}, {"title": "Owner", "value": "airflow", "short": true}, 
> {"title": "Task", "value": "s3_to_hive", "short": false}, {"title": "Status", 
> "value": "FAILED", "short": false}, {"title": "Execution Time", "value": 
> "2018-05-07T05:02:00", "short": true}, {"title": "Duration", "value": 
> "826.305929", "short": true}, {"value": 
> "  Task Log>", "short": false}]}]}
> [2018-05-22 21:52:19,688] {models.py:1638} ERROR - Failed at executing 
> callback
> [2018-05-22 21:52:19,688] {models.py:1639} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 

[jira] [Resolved] (AIRFLOW-2511) Subdag failed by scheduler deadlock

2019-03-01 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-2511.

   Resolution: Fixed
Fix Version/s: 1.10.3

> Subdag failed by scheduler deadlock
> ---
>
> Key: AIRFLOW-2511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2511
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Yohei Onishi
>Assignee: Feng Lu
>Priority: Major
> Fix For: 1.10.3
>
>
> I am using subdag and sometimes main dag marked failed because of the 
> following error. In this case, tasks in the subdag stopped.
> {code:java}
> hourly_dag = DAG(
>   hourly_dag_name,
>   default_args=dag_default_args,
>   params=dag_custom_params,
>   schedule_interval=config_values.hourly_job_interval,
>   max_active_runs=2)
> hourly_subdag = SubDagOperator(
>   task_id='s3_to_hive',
>   subdag=LoadFromS3ToHive(
>   hourly_dag,
>   's3_to_hive'),
>   dag=hourly_dag)
> {code}
> I got this error in main dag. bug in scheduler?
> {code:java}
> [2018-05-22 21:52:19,683] {models.py:1595} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> Traceback (most recent call last):
> sqlalchemy.exc.InvalidRequestError: This Session's transaction has been 
> rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> [2018-05-22 21:52:19,687] {models.py:1624} INFO - Marking task as FAILED.
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> [2018-05-22 21:52:19,688] {slack_hook.py:143} INFO - Message is prepared: 
> [2018-05-22 21:52:19,688] {base_task_runner.py:98} INFO - Subtask: 
> {"attachments": [{"color": "danger", "text": "", "fields": [{"title": "DAG", 
> "value": 
> "",
>  "short": true}, {"title": "Owner", "value": "airflow", "short": true}, 
> {"title": "Task", "value": "s3_to_hive", "short": false}, {"title": "Status", 
> "value": "FAILED", "short": false}, {"title": "Execution Time", "value": 
> "2018-05-07T05:02:00", "short": true}, {"title": "Duration", "value": 
> "826.305929", "short": true}, {"value": 
> "  Task Log>", "short": false}]}]}
> [2018-05-22 21:52:19,688] {models.py:1638} ERROR - Failed at executing 
> callback
> [2018-05-22 21:52:19,688] {models.py:1639} ERROR - This Session's transaction 
> has been rolled back due to a previous exception during flush. To begin a new 
> transaction with this Session, first issue Session.rollback(). Original 
> exception was: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found 
> when trying to get lock; try restarting transaction') [SQL: 'UPDATE 
> task_instance SET state=%s WHERE task_instance.task_id = %s AND 
> task_instance.dag_id = %s AND task_instance.execution_date = %s'] 
> [parameters: ('queued', 'transfer_from_tmp_table_into_cleaned_table', 
> 'rfid_warehouse_carton_wh_g_dl_dwh_csv_uqjp_1h.s3_to_hive', 
> datetime.datetime(2018, 5, 7, 5, 2))] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> {code}
>  
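The error above is MySQL's deadlock error (1213), and the message itself points at the standard remedy: roll back the failed session and retry the transaction. A minimal sketch of that retry pattern, purely illustrative and not the change merged in #4769, assuming a SQLAlchemy session like the one in the traceback:

```python
# Illustrative retry-on-deadlock helper; the function name and retry policy are
# assumptions of this sketch, not Airflow's actual code.
import time

from sqlalchemy.exc import OperationalError

MYSQL_DEADLOCK = 1213


def commit_with_retry(session, attempts=3, backoff=1.0):
    """Commit, rolling back and retrying when MySQL reports a deadlock (1213)."""
    for attempt in range(1, attempts + 1):
        try:
            session.commit()
            return
        except OperationalError as exc:
            session.rollback()  # the session must be rolled back before reuse
            if exc.orig.args[0] != MYSQL_DEADLOCK or attempt == attempts:
                raise
            time.sleep(backoff * attempt)
```

The key detail is the `rollback()` before the session is reused; skipping it is what produces the `InvalidRequestError` seen in the log above.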



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #4807: AIRFLOW-2511 - Investigation to mitigate Deadlocking

2019-03-01 Thread GitBox
ashb commented on issue #4807: AIRFLOW-2511 - Investigation to mitigate 
Deadlocking
URL: https://github.com/apache/airflow/pull/4807#issuecomment-468748566
 
 
   Closing in favour of #4769 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb closed pull request #4807: AIRFLOW-2511 - Investigation to mitigate Deadlocking

2019-03-01 Thread GitBox
ashb closed pull request #4807: AIRFLOW-2511 - Investigation to mitigate 
Deadlocking
URL: https://github.com/apache/airflow/pull/4807
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb merged pull request #4769: [AIRFLOW-2511] Fix improper failed session commit handling

2019-03-01 Thread GitBox
ashb merged pull request #4769: [AIRFLOW-2511] Fix improper failed session 
commit handling
URL: https://github.com/apache/airflow/pull/4769
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #4636: [AIRFLOW-3737] Kubernetes executor cannot handle long dag/task names

2019-03-01 Thread GitBox
ashb commented on issue #4636: [AIRFLOW-3737] Kubernetes executor cannot handle 
long dag/task names
URL: https://github.com/apache/airflow/pull/4636#issuecomment-468747076
 
 
   @PaulW great. Ping me when you have pushed the changes!


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
mik-laj commented on a change in pull request #4805: [AIRFLOW-3977] Fix/Add 
examples of how trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#discussion_r261684802
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -752,6 +752,67 @@ Note that these can be used in conjunction with 
``depends_on_past`` (boolean)
 that, when set to ``True``, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+One must be aware of the interaction between trigger rules and skipped tasks
+in schedule level. Skipped tasks will cascade through trigger rules 
+``all_success`` and ``all_failed`` but not ``all_done``, ``one_failed``, 
``one_success``,
+``none_failed`` and ``dummy``. 
+
+For example, consider the following DAG:
+
+.. code:: python
+
+  #dags/branch_without_trigger.py
 
 Review comment:
   If we have this file in the repo, we should include it instead of copying its 
content. That keeps the example always working and fresh. 
   See: 
   http://www.sphinx-doc.org/en/1.5/markup/code.html#directive-literalinclude
   https://github.com/apache/airflow/blob/master/docs/howto/operator.rst
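For context, a minimal sketch of what a `dags/branch_without_trigger.py` example along the lines of the quoted doc text might contain; the task ids and the branching callable are assumptions of this sketch, not the exact file referenced in the PR:

```python
# Hedged sketch of a branch DAG illustrating trigger rules vs. skipped tasks.
from datetime import datetime

from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import BranchPythonOperator

dag = DAG(
    dag_id='branch_without_trigger',
    schedule_interval='@once',
    start_date=datetime(2019, 2, 28),
)

run_this_first = DummyOperator(task_id='run_this_first', dag=dag)
branching = BranchPythonOperator(
    task_id='branching',
    python_callable=lambda: 'branch_a',  # always follow branch_a; branch_b is skipped
    dag=dag,
)

branch_a = DummyOperator(task_id='branch_a', dag=dag)
follow_branch_a = DummyOperator(task_id='follow_branch_a', dag=dag)
branch_b = DummyOperator(task_id='branch_b', dag=dag)

# With the default all_success rule the skip of branch_b cascades to join;
# setting trigger_rule='none_failed' here is the kind of change the doc discusses.
join = DummyOperator(task_id='join', dag=dag)

run_this_first >> branching
branching >> branch_a >> follow_branch_a >> join
branching >> branch_b >> join
```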


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] PaulW edited a comment on issue #4636: [AIRFLOW-3737] Kubernetes executor cannot handle long dag/task names

2019-03-01 Thread GitBox
PaulW edited a comment on issue #4636: [AIRFLOW-3737] Kubernetes executor 
cannot handle long dag/task names
URL: https://github.com/apache/airflow/pull/4636#issuecomment-468724986
 
 
   @pgagnon Yes sorry I understand what you mean now.  I've reworked the 
changes now to reflect your suggestion & once tested I'll add them to the PR.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] PaulW commented on issue #4636: [AIRFLOW-3737] Kubernetes executor cannot handle long dag/task names

2019-03-01 Thread GitBox
PaulW commented on issue #4636: [AIRFLOW-3737] Kubernetes executor cannot 
handle long dag/task names
URL: https://github.com/apache/airflow/pull/4636#issuecomment-468724986
 
 
   @pgagnon Yes sorry I understand what you mean now.  I've reworked the 
changes now to reflect your suggestion & once tested I'll re-submit the changes.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko opened a new pull request #4812: [Airflow-XXX] Add Daniel to the list of GoDataDriven

2019-03-01 Thread GitBox
Fokko opened a new pull request #4812: [Airflow-XXX] Add Daniel to the list of 
GoDataDriven
URL: https://github.com/apache/airflow/pull/4812
 
 
   Took a while, but Daniel joined GDD :-)
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3981) Make Airflow UI timezone aware

2019-03-01 Thread zhongjiajie (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3981?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781842#comment-16781842
 ] 

zhongjiajie commented on AIRFLOW-3981:
--

If no one has picked this up by the time I finish my other PR in hand, I will do it.

> Make Airflow UI timezone aware
> --
>
> Key: AIRFLOW-3981
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3981
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Tao Feng
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on a change in pull request #4601: [AIRFLOW-3758] Fix circular import in WasbTaskHandler

2019-03-01 Thread GitBox
ashb commented on a change in pull request #4601: [AIRFLOW-3758] Fix circular 
import in WasbTaskHandler
URL: https://github.com/apache/airflow/pull/4601#discussion_r261652145
 
 

 ##
 File path: tests/test_logging_config.py
 ##
 @@ -251,6 +251,14 @@ def test_1_9_config(self):
 finally:
 conf.remove_option('core', 'task_log_reader', remove_default=False)
 
+def test_loading_remote_logging_with_wasb_handler(self):
+"""Test if logging can be configured successfully for Azure Blob 
Storage"""
+from airflow.logging_config import configure_logging
+conf.set('core', 'remote_logging', 'True')
+conf.set('core', 'remote_log_conn_id', 'some_wasb')
+conf.set('core', 'remote_base_log_folder', 'wasb://some-folder')
+configure_logging()
 
 Review comment:
   @ttanay Any chance you could take a look at this last change?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3918) Adding a parameter in Airflow-kubernetes config to support git-sync with SSH credential

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3918?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781748#comment-16781748
 ] 

ASF GitHub Bot commented on AIRFLOW-3918:
-

ashb commented on pull request #4777: [AIRFLOW-3918] Add ssh private-key 
support to git-sync for KubernetesExecutor
URL: https://github.com/apache/airflow/pull/4777
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Adding a parameter in Airflow-kubernetes config to support git-sync with SSH 
> credential
> ---
>
> Key: AIRFLOW-3918
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3918
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Daniel Mateus Pires
>Assignee: Daniel Mateus Pires
>Priority: Minor
>
> It's the preferred pattern in my work place to integrate deployment systems 
> with GitHub using the SSH deploy key feature that can easily be scoped to 
> read-only on a single repository
> I would like to support this feature by supporting a "git_ssh_key_file" 
> parameter in the kubernetes section of the config, which would be an 
> alternate authentication method to the already supported git_user + 
> git_password
> It will use the following feature: 
> https://github.com/kubernetes/git-sync/blob/7bb3262084ac1ad64321856c1e769358cf18f67d/cmd/git-sync/main.go#L88
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #4777: [AIRFLOW-3918] Add ssh private-key support to git-sync for KubernetesExecutor

2019-03-01 Thread GitBox
ashb commented on issue #4777: [AIRFLOW-3918] Add ssh private-key support to 
git-sync for KubernetesExecutor
URL: https://github.com/apache/airflow/pull/4777#issuecomment-468693066
 
 
   I'll mark this for 10.3 - and we'll attempt to cherry-pick it in to the 
release branch.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3918) Adding a parameter in Airflow-kubernetes config to support git-sync with SSH credential

2019-03-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3918?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781749#comment-16781749
 ] 

ASF subversion and git services commented on AIRFLOW-3918:
--

Commit f4253a29513d6506132e55ffabc31117f916d78a in airflow's branch 
refs/heads/master from Daniel Mateus Pires
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=f4253a2 ]

[AIRFLOW-3918] Add ssh private-key support to git-sync for KubernetesExecutor 
(#4777)

Add configuration for git SSH auth and update git-sync version in
template (mutually exclusive of user authentication).

Update git-sync version to the latest, current version did not
support SSH authentication environment variables

Security context was required to read the mounted SSH key for
git-sync SSH authentication

Add example of configmap + Kubernetes secret snippet in config template
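For reference, the new `kubernetes` section keys named in the tests quoted later in this thread (`git_ssh_key_secret_name`, `git_ssh_key_secret_key`) can, like any Airflow option, also be supplied through the `AIRFLOW__<SECTION>__<KEY>` environment-variable convention; a hedged sketch with purely illustrative values:

```python
# Illustrative only: values are sample placeholders, not a real deployment;
# the key names mirror the conf.get('kubernetes', ...) lookups in the tests below.
import os

os.environ["AIRFLOW__KUBERNETES__GIT_REPO"] = "git@github.com:apache/airflow.git"
os.environ["AIRFLOW__KUBERNETES__GIT_BRANCH"] = "master"
os.environ["AIRFLOW__KUBERNETES__GIT_DAGS_FOLDER_MOUNT_POINT"] = "/usr/local/airflow/dags"
os.environ["AIRFLOW__KUBERNETES__GIT_SSH_KEY_SECRET_NAME"] = "airflow-secrets"
os.environ["AIRFLOW__KUBERNETES__GIT_SSH_KEY_SECRET_KEY"] = "gitSshKey"
```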

> Adding a parameter in Airflow-kubernetes config to support git-sync with SSH 
> credential
> ---
>
> Key: AIRFLOW-3918
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3918
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Daniel Mateus Pires
>Assignee: Daniel Mateus Pires
>Priority: Minor
>
> It's the preferred pattern in my work place to integrate deployment systems 
> with GitHub using the SSH deploy key feature that can easily be scoped to 
> read-only on a single repository
> I would like to support this feature by supporting a "git_ssh_key_file" 
> parameter in the kubernetes section of the config, which would be an 
> alternate authentication method to the already supported git_user + 
> git_password
> It will use the following feature: 
> https://github.com/kubernetes/git-sync/blob/7bb3262084ac1ad64321856c1e769358cf18f67d/cmd/git-sync/main.go#L88
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb merged pull request #4777: [AIRFLOW-3918] Add ssh private-key support to git-sync for KubernetesExecutor

2019-03-01 Thread GitBox
ashb merged pull request #4777: [AIRFLOW-3918] Add ssh private-key support to 
git-sync for KubernetesExecutor
URL: https://github.com/apache/airflow/pull/4777
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-3989) No way to set Kubernetes node_selectors or annotations as environment variables

2019-03-01 Thread Thorsten K (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thorsten K updated AIRFLOW-3989:

Description: 
Hey,

After checking out this PR: [https://github.com/apache/airflow/pull/4589]

I realized that there is no way to set kubernetes_node_selectors or 
kubernetes_annotations via environment variables, as the respective sections 
won't be looked up in environment variables.

Is it possible to add this functionality? It blocks us from using 
environment-based configuration.

I saw that there's the `getsection()` function, but it's not being used...

  was:
Hey,

After checking out this PR: [https://github.com/apache/airflow/pull/4589]

I realized that there is no way to set kubernetes_node_selectors or 
kubernetes_annotations via environment variables, as the respective sections 
won't be looked up in environment variables.

Is it possible to add this functionality? It blocks us from using 
environment-based configuration.


> No way to set Kubernetes node_selectors or annotations as environment 
> variables
> ---
>
> Key: AIRFLOW-3989
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3989
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: kubernetes
>Affects Versions: 1.10.2
>Reporter: Thorsten K
>Priority: Critical
>  Labels: Kubernetes, kubernetes
>
> Hey,
> After checking out this PR: [https://github.com/apache/airflow/pull/4589]
> I realized that there is no way to set kubernetes_node_selectors or 
> kubernetes_annotations via environment variables, as the respective sections 
> won't be looked up in environment variables.
> Is it possible to add this functionality? It blocks us from using 
> environment-based configuration.
> I saw that there's the `getsection()` function, but it's not being used...
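A short sketch of the behaviour being requested, assuming the `AIRFLOW__<SECTION>__<KEY>` convention were honoured for these whole-section lookups (per this ticket, it currently is not):

```python
# Hypothetical usage only; per the ticket, getsection() does not yet read
# these sections from environment variables.
import os

os.environ["AIRFLOW__KUBERNETES_NODE_SELECTORS__DISKTYPE"] = "ssd"

from airflow.configuration import conf

# Desired result: {'disktype': 'ssd'} available without editing airflow.cfg.
node_selectors = conf.getsection("kubernetes_node_selectors")
```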



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on a change in pull request #4777: [AIRFLOW-3918] Add ssh private-key support to git-sync for KubernetesExecutor

2019-03-01 Thread GitBox
ashb commented on a change in pull request #4777: [AIRFLOW-3918] Add ssh 
private-key support to git-sync for KubernetesExecutor
URL: https://github.com/apache/airflow/pull/4777#discussion_r261629071
 
 

 ##
 File path: tests/contrib/executors/test_kubernetes_executor.py
 ##
 @@ -161,6 +165,39 @@ def test_worker_configuration_no_subpaths(self):
 "subPath shouldn't be defined"
 )
 
+@mock.patch.object(conf, 'get')
+@mock.patch.object(configuration, 'as_dict')
+def test_worker_configuration_auth_both_ssh_and_user(self, 
mock_config_as_dict, mock_conf_get):
+def get_conf(*args, **kwargs):
+if(args[0] == 'core'):
+return '1'
+if(args[0] == 'kubernetes'):
+if(args[1] == 'airflow_configmap'):
+return 'airflow-configmap'
+if(args[1] == 'git_ssh_key_secret_name'):
+return 'airflow-secrets'
+if(args[1] == 'git_ssh_key_secret_key'):
+return 'gitSshKey'
+if(args[1] == 'git_user'):
+return 'some-user'
+if(args[1] == 'git_password'):
+return 'some-password'
+if(args[1] == 'git_repo'):
+return 'g...@github.com:apache/airflow.git'
+if(args[1] == 'git_branch'):
+return 'master'
+if(args[1] == 'git_dags_folder_mount_point'):
+return '/usr/local/airflow/dags'
+if(args[1] == 'delete_worker_pods'):
+return True
+return '1'
+return None
+
+mock_conf_get.side_effect = get_conf
+mock_config_as_dict.return_value = {'core': ''}
+with self.assertRaises(AirflowConfigException):
+KubeConfig()
 
 Review comment:
   Thanks.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-3989) No way to set Kubernetes node_selectors or annotations as environment variables

2019-03-01 Thread Thorsten K (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thorsten K updated AIRFLOW-3989:

Description: 
Hey,

After checking out this PR: [https://github.com/apache/airflow/pull/4589]

I realized that there is no way to set kubernetes_node_selectors or 
kubernetes_annotations via environment variables, as the respective sections 
won't be looked up in environment variables.

Is it possible to add this functionality? It blocks us from using 
environment-based configuration.

  was:
Hey,

After checking out this PR: [https://github.com/apache/airflow/pull/4589
]I realized that there is no way to set kubernetes_node_selectors or 
kubernetes_annotations via environment variables, as the respective sections 
won't be looked up in environment variables.

Is it possible to add this functionality? It blocks us from using 
environment-based configuration.


> No way to set Kubernetes node_selectors or annotations as environment 
> variables
> ---
>
> Key: AIRFLOW-3989
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3989
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: kubernetes
>Affects Versions: 1.10.2
>Reporter: Thorsten K
>Priority: Critical
>  Labels: Kubernetes, kubernetes
>
> Hey,
> After checking out this PR: [https://github.com/apache/airflow/pull/4589]
> I realized that there is no way to set kubernetes_node_selectors or 
> kubernetes_annotations via environment variables, as the respective sections 
> won't be looked up in environment variables.
> Is it possible to add this functionality? It blocks us from using 
> environment-based configuration.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3989) No way to set Kubernetes node_selectors or annotations as environment variables

2019-03-01 Thread Thorsten K (JIRA)
Thorsten K created AIRFLOW-3989:
---

 Summary: No way to set Kubernetes node_selectors or annotations as 
environment variables
 Key: AIRFLOW-3989
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3989
 Project: Apache Airflow
  Issue Type: Wish
  Components: kubernetes
Affects Versions: 1.10.2
Reporter: Thorsten K


Hey,

After checking out this PR: [https://github.com/apache/airflow/pull/4589
]I realized that there is no way to set kubernetes_node_selectors or 
kubernetes_annotations via environment variables, as the respective sections 
won't be looked up in environment variables.

Is it possible to add this functionality? It blocks us from using 
environment-based configuration.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4777: [AIRFLOW-3918] Add ssh private-key support to git-sync for KubernetesExecutor

2019-03-01 Thread GitBox
codecov-io edited a comment on issue #4777: [AIRFLOW-3918] Add ssh private-key 
support to git-sync for KubernetesExecutor
URL: https://github.com/apache/airflow/pull/4777#issuecomment-467378309
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4777?src=pr=h1) 
Report
   > Merging 
[#4777](https://codecov.io/gh/apache/airflow/pull/4777?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/0044260be1b6d52220e17fd9e45fcafb16b19250?src=pr=desc)
 will **increase** coverage by `0.03%`.
   > The diff coverage is `97.22%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4777/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4777?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4777      +/-   ##
   ==========================================
   + Coverage   74.44%   74.47%   +0.03%     
   ==========================================
     Files         450      450              
     Lines       28970    28994      +24     
   ==========================================
   + Hits        21567    21594      +27     
   + Misses       7403     7400       -3
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4777?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[.../kubernetes\_request\_factory/pod\_request\_factory.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2t1YmVybmV0ZXMva3ViZXJuZXRlc19yZXF1ZXN0X2ZhY3RvcnkvcG9kX3JlcXVlc3RfZmFjdG9yeS5weQ==)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/kubernetes/pod.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2t1YmVybmV0ZXMvcG9kLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/kubernetes/worker\_configuration.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2t1YmVybmV0ZXMvd29ya2VyX2NvbmZpZ3VyYXRpb24ucHk=)
 | `93.45% <100%> (+2.06%)` | :arrow_up: |
   | 
[airflow/contrib/executors/kubernetes\_executor.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4ZWN1dG9ycy9rdWJlcm5ldGVzX2V4ZWN1dG9yLnB5)
 | `65.12% <100%> (+1.17%)` | :arrow_up: |
   | 
[...etes\_request\_factory/kubernetes\_request\_factory.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2t1YmVybmV0ZXMva3ViZXJuZXRlc19yZXF1ZXN0X2ZhY3Rvcnkva3ViZXJuZXRlc19yZXF1ZXN0X2ZhY3RvcnkucHk=)
 | `72.63% <66.66%> (-0.2%)` | :arrow_down: |
   | 
[airflow/operators/python\_operator.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uX29wZXJhdG9yLnB5)
 | `95.83% <0%> (ø)` | :arrow_up: |
   | 
[...irflow/contrib/example\_dags/example\_gcp\_spanner.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2djcF9zcGFubmVyLnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/utils/gcp\_field\_validator.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3V0aWxzL2djcF9maWVsZF92YWxpZGF0b3IucHk=)
 | `91.52% <0%> (ø)` | :arrow_up: |
   | 
[airflow/ti\_deps/dep\_context.py](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcF9jb250ZXh0LnB5)
 | `100% <0%> (ø)` | :arrow_up: |
   | ... and [8 
more](https://codecov.io/gh/apache/airflow/pull/4777/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4777?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4777?src=pr=footer). 
Last update 
[0044260...9cf874f](https://codecov.io/gh/apache/airflow/pull/4777?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services



[GitHub] codecov-io commented on issue #4810: [AIRFLOW-3985] Added unary + operator for easier linear DAG building

2019-03-01 Thread GitBox
codecov-io commented on issue #4810: [AIRFLOW-3985] Added unary + operator for 
easier linear DAG building
URL: https://github.com/apache/airflow/pull/4810#issuecomment-468680866
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4810?src=pr=h1) 
Report
   > Merging 
[#4810](https://codecov.io/gh/apache/airflow/pull/4810?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/b51712ca9377163fea554b298ba68dcf64712267?src=pr=desc)
 will **increase** coverage by `0.01%`.
   > The diff coverage is `25.8%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4810/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4810?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4810      +/-   ##
   ==========================================
   + Coverage   74.44%   74.45%   +0.01%     
   ==========================================
     Files         450      450              
     Lines       28971    28977       +6     
   ==========================================
   + Hits        21568    21576       +8     
   + Misses       7403     7401       -2
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4810?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/contrib/example\_dags/example\_gcp\_sql.py](https://codecov.io/gh/apache/airflow/pull/4810/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2djcF9zcWwucHk=)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4810/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.67% <100%> (+0.02%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4810?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4810?src=pr=footer). 
Last update 
[b51712c...7acd8c5](https://codecov.io/gh/apache/airflow/pull/4810?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4805: [AIRFLOW-3977] Fix/Add examples of how trigger rules interacted with skipped tasks

2019-03-01 Thread GitBox
codecov-io commented on issue #4805: [AIRFLOW-3977] Fix/Add examples of how 
trigger rules interacted with skipped tasks
URL: https://github.com/apache/airflow/pull/4805#issuecomment-468671653
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4805?src=pr=h1) 
Report
   > Merging 
[#4805](https://codecov.io/gh/apache/airflow/pull/4805?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/991d1cfc24b718ece51d7763746c3c9657b9aec1?src=pr=desc)
 will **increase** coverage by `0.11%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4805/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4805?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4805      +/-   ##
   ==========================================
   + Coverage   74.33%   74.44%   +0.11%     
   ==========================================
     Files         450      450              
     Lines       28971    28971              
   ==========================================
   + Hits        21536    21568      +32     
   + Misses       7435     7403      -32
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4805?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4805/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.64% <0%> (+0.05%)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/airflow/pull/4805/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `76.46% <0%> (+0.71%)` | :arrow_up: |
   | 
[airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/4805/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==)
 | `59.15% <0%> (+1.05%)` | :arrow_up: |
   | 
[airflow/executors/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4805/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvX19pbml0X18ucHk=)
 | `63.46% <0%> (+3.84%)` | :arrow_up: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/4805/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `81.81% <0%> (+4.54%)` | :arrow_up: |
   | 
[airflow/executors/sequential\_executor.py](https://codecov.io/gh/apache/airflow/pull/4805/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvc2VxdWVudGlhbF9leGVjdXRvci5weQ==)
 | `100% <0%> (+50%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4805?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4805?src=pr=footer). 
Last update 
[991d1cf...80dbf9a](https://codecov.io/gh/apache/airflow/pull/4805?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] dmateusp commented on a change in pull request #4777: [AIRFLOW-3918] Add ssh private-key support to git-sync for KubernetesExecutor

2019-03-01 Thread GitBox
dmateusp commented on a change in pull request #4777: [AIRFLOW-3918] Add ssh 
private-key support to git-sync for KubernetesExecutor
URL: https://github.com/apache/airflow/pull/4777#discussion_r261607868
 
 

 ##
 File path: tests/contrib/executors/test_kubernetes_executor.py
 ##
 @@ -161,6 +165,39 @@ def test_worker_configuration_no_subpaths(self):
 "subPath shouldn't be defined"
 )
 
+@mock.patch.object(conf, 'get')
+@mock.patch.object(configuration, 'as_dict')
+def test_worker_configuration_auth_both_ssh_and_user(self, 
mock_config_as_dict, mock_conf_get):
+def get_conf(*args, **kwargs):
+if(args[0] == 'core'):
+return '1'
+if(args[0] == 'kubernetes'):
+if(args[1] == 'airflow_configmap'):
+return 'airflow-configmap'
+if(args[1] == 'git_ssh_key_secret_name'):
+return 'airflow-secrets'
+if(args[1] == 'git_ssh_key_secret_key'):
+return 'gitSshKey'
+if(args[1] == 'git_user'):
+return 'some-user'
+if(args[1] == 'git_password'):
+return 'some-password'
+if(args[1] == 'git_repo'):
+return 'g...@github.com:apache/airflow.git'
+if(args[1] == 'git_branch'):
+return 'master'
+if(args[1] == 'git_dags_folder_mount_point'):
+return '/usr/local/airflow/dags'
+if(args[1] == 'delete_worker_pods'):
+return True
+return '1'
+return None
+
+mock_conf_get.side_effect = get_conf
+mock_config_as_dict.return_value = {'core': ''}
+with self.assertRaises(AirflowConfigException):
+KubeConfig()
 
 Review comment:
   is it better?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3985) Adding linear DAG relationships with unary operator

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3985?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781682#comment-16781682
 ] 

ASF GitHub Bot commented on AIRFLOW-3985:
-

potiuk commented on pull request #4810: [AIRFLOW-3985] Added unary + operator 
for easier linear DAG building
URL: https://github.com/apache/airflow/pull/4810
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. 
 - https://issues.apache.org/jira/browse/AIRFLOW-3985
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   The unary operator makes it easy to do development and testing with linear 
DAGs
   
   ### Tests
   
   - [x] My PR adds the following unit tests:
   tests.models.test_unary_compose_pos_operator
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Adding linear DAG relationships with unary operator
> ---
>
> Key: AIRFLOW-3985
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3985
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Jarek Potiuk
>Priority: Major
>
> It would be nice for development and testing to be able to build linear DAG 
> dependencies with a unary + operator rather than with <<, with an automated 
> upstream dependency on the previously added task.
> Often during development we need to inject or remove tasks between two other 
> tasks, and with a unary operator it is very easy to comment out whole lines 
> of code.
> Example:
> {code:java}
>  +op1
>  +op2
>  +op3{code}
> might be equivalent to:
> {code:java}
> op1 >> op2 >> op3{code}
> This way we can comment out one line:
> {code:java}
>  +op1
>  # +op2
>  +op3{code}
>  equivalent to:
> {code:java}
> op1 >> op3{code}
>  
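One way such a unary operator could be wired up is Python's `__pos__` hook plus a register of the previously added task; the sketch below is an assumption about one possible implementation, not the code from PR #4810:

```python
# Illustrative only: a mixin so that writing +op1, +op2, +op3 in sequence
# produces op1 >> op2 >> op3. The module-level register is part of this sketch.
_previously_added = None


class LinearComposeMixin:
    def __pos__(self):
        global _previously_added
        if _previously_added is not None:
            _previously_added.set_downstream(self)  # same effect as prev >> self
        _previously_added = self
        return self
```

Commenting out the `+op2` line then leaves `op1 >> op3`, which is the editing convenience the ticket describes.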



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] potiuk closed pull request #4810: [AIRFLOW-3985] Added unary + operator for easier linear DAG building

2019-03-01 Thread GitBox
potiuk closed pull request #4810: [AIRFLOW-3985] Added unary + operator for 
easier linear DAG building
URL: https://github.com/apache/airflow/pull/4810
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] PaulW commented on issue #4807: AIRFLOW-2511 - Investigation to mitigate Deadlocking

2019-03-01 Thread GitBox
PaulW commented on issue #4807: AIRFLOW-2511 - Investigation to mitigate 
Deadlocking
URL: https://github.com/apache/airflow/pull/4807#issuecomment-468663579
 
 
   @fenglu-g Yes, it looks that way. There was no PR attached to the Jira 
ticket, so I assumed nothing had been raised.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] fenglu-g commented on issue #4807: AIRFLOW-2511 - Investigation to mitigate Deadlocking

2019-03-01 Thread GitBox
fenglu-g commented on issue #4807: AIRFLOW-2511 - Investigation to mitigate 
Deadlocking
URL: https://github.com/apache/airflow/pull/4807#issuecomment-468662705
 
 
   @PaulW I made a similar PR a few days back, how about we consolidate and 
move the discussion to https://github.com/apache/airflow/pull/4769? 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3984) Add tests for WinRMHook

2019-03-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3984?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16781664#comment-16781664
 ] 

ASF GitHub Bot commented on AIRFLOW-3984:
-

feluelle commented on pull request #4811: [AIRFLOW-3984] Add tests for WinRMHook
URL: https://github.com/apache/airflow/pull/4811
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3984
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   - fix docs
   - refactoring code
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   Add tests for
   * test_get_conn_exists
   * test_get_conn_missing_remote_host
   * test_get_conn_error
   * test_get_conn_from_connection
   * test_get_conn_no_username
   * test_get_conn_no_endpoint
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add tests for WinRMHook
> ---
>
> Key: AIRFLOW-3984
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3984
> Project: Apache Airflow
>  Issue Type: Test
>Reporter: Felix Uellendall
>Assignee: Felix Uellendall
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

