[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697589#comment-16697589
 ] 

ASF GitHub Bot commented on AIRFLOW-3066:
-

hugoprudente opened a new pull request #4231: [AIRFLOW-3066] Adding support for 
AWS Batch parameters
URL: https://github.com/apache/incubator-airflow/pull/4231
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3066) issues and references 
them in the PR "\[AIRFLOW-3066\] Add job parameters to AWSbatch Operator"
   
   ### Description
   
   - [x] Added the parameters for using dynamic commands on Job Execution
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   ```
   nosetests -v tests/contrib/operators/test_awsbatch_operator.py
   test_check_success_task_not_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_failed 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_multiple 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_pending 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_with_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_without_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_init 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_template_fields_overrides 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_wait_end_tasks 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   
   --
   Ran 10 tests in 0.146s
   
   OK
   [2018-11-24 00:59:23,032] {settings.py:203} DEBUG - Disposing DB connection 
pool (PID 12221)
   ```
   
   ### Commits
   
   - [x] My commits all reference Jira issues: [AIRFLOW-3066] Adding support 
for AWS Batch parameters
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add job parameters to AWSbatch Operator
> ---
>
> Key: AIRFLOW-3066
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3066
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, operators
>Reporter: Raphael Norman-Tenazas
>Assignee: Hugo Prudente
>Priority: Minor
>  Labels: AWS, easyfix, features, newbie
>   Original Estimate: 5m
>  Remaining Estimate: 5m
>
> Sometimes it is necessary to add parameters at runtime to AWS Batch jobs in a 
> workflow. Currently, the AWSBatchOperator does not support this and will use 
> the default parameter values defined in the AWS job definition.
> This can be implemented by adding a parameters={} argument to the 
> AWSBatchOperator's __init__ and passing it into the client.submit_job call 
> as the parameters keyword argument.
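
A minimal boto3 sketch of the submit_job call described above (the client setup and all names are hypothetical placeholders, not from the issue):

```python
import boto3

client = boto3.client('batch')

# AWS Batch substitutes Ref::input in the job definition's command with the
# value supplied in 'parameters' at submission time.
response = client.submit_job(
    jobName='airflow-run',            # hypothetical job name
    jobQueue='my-job-queue',          # hypothetical queue
    jobDefinition='my-job-def',       # hypothetical job definition
    containerOverrides={'command': ['echo', 'Ref::input']},
    parameters={'input': 'hello'})
print(response['jobId'])
```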



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator

2018-11-23 Thread Hugo Prudente (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697590#comment-16697590
 ] 

Hugo Prudente commented on AIRFLOW-3066:


PR submitted: https://github.com/apache/incubator-airflow/pull/4231

> Add job parameters to AWSbatch Operator
> ---
>
> Key: AIRFLOW-3066
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3066
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, operators
>Reporter: Raphael Norman-Tenazas
>Assignee: Hugo Prudente
>Priority: Minor
>  Labels: AWS, easyfix, features, newbie
>   Original Estimate: 5m
>  Remaining Estimate: 5m
>
> Sometimes it is necessary to add parameters at runtime to AWS Batch jobs in a 
> workflow. Currently, the AWSBatchOperator does not support this and will use 
> the default parameter values defined in the AWS job definition.
> This can be implemented by adding a parameters={} argument to the 
> AWSBatchOperator's __init__ and passing it into the client.submit_job call 
> as the parameters keyword argument.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)



[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697578#comment-16697578
 ] 

ASF GitHub Bot commented on AIRFLOW-3066:
-

hugoprudente opened a new pull request #4230: [AIRFLOW-3066] Adding support for 
Parameters on AWS Batch Operator
URL: https://github.com/apache/incubator-airflow/pull/4230
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3066) issues and references 
them in the PR "\[AIRFLOW-3066\] Add job parameters to AWSbatch Operator"
   
   ### Description
   
   - [x] Added the parameters for using dynamic commands on Job Execution
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   ```
   nosetests -v tests/contrib/operators/test_awsbatch_operator.py
   test_check_success_task_not_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_failed 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_multiple 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_pending 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_with_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_without_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_init 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_template_fields_overrides 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_wait_end_tasks 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   
   --
   Ran 10 tests in 0.146s
   
   OK
   [2018-11-24 00:59:23,032] {settings.py:203} DEBUG - Disposing DB connection 
pool (PID 12221)
   ```
   
   ### Commits
   
   - [x] My commits all reference Jira issues: [AIRFLOW-3066] Adding support 
for AWS Batch parameters
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add job parameters to AWSbatch Operator
> ---
>
> Key: AIRFLOW-3066
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3066
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, operators
>Reporter: Raphael Norman-Tenazas
>Assignee: Hugo Prudente
>Priority: Minor
>  Labels: AWS, easyfix, features, newbie
>   Original Estimate: 5m
>  Remaining Estimate: 5m
>
> Sometimes it is necessary to add parameters at runtime to AWS Batch jobs in a 
> workflow. Currently, the AWSBatchOperator does not support this and will use 
> the default parameter values defined in the AWS job definition.
> This can be implemented by adding a parameters={} argument to the 
> AWSBatchOperator's __init__ and passing it into the client.submit_job call 
> as the parameters keyword argument.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16697579#comment-16697579
 ] 

ASF GitHub Bot commented on AIRFLOW-3066:
-

hugoprudente closed pull request #4230: [AIRFLOW-3066] Adding support for 
Parameters on AWS Batch Operator
URL: https://github.com/apache/incubator-airflow/pull/4230
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/example_dags/example_awsbatch_operator.py 
b/airflow/contrib/example_dags/example_awsbatch_operator.py
new file mode 100644
index 00..6174873118
--- /dev/null
+++ b/airflow/contrib/example_dags/example_awsbatch_operator.py
@@ -0,0 +1,95 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import airflow
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.models import DAG
+from datetime import timedelta
+
+log = LoggingMixin().log
+
+try:
+    # AWS Batch is optional, so not available in vanilla Airflow
+    # pip install apache-airflow[boto3]
+    from airflow.contrib.operators.awsbatch_operator import AWSBatchOperator
+
+    default_args = {
+        'owner': 'airflow',
+        'depends_on_past': False,
+        'start_date': airflow.utils.dates.days_ago(2),
+        'email': ['airf...@airflow.com'],
+        'email_on_failure': False,
+        'email_on_retry': False,
+        'retries': 1,
+        'retry_delay': timedelta(minutes=5),
+    }
+
+    dag = DAG(
+        'example_awsbatch_dag', default_args=default_args, schedule_interval=timedelta(1))
+
+    # vanilla example
+    t0 = AWSBatchOperator(
+        task_id='airflow-vanilla',
+        job_name='airflow-vanilla',
+        job_queue='airflow',
+        job_definition='airflow',
+        overrides={},
+        queue='airflow',
+        dag=dag)
+
+    # overrides example
+    t1 = AWSBatchOperator(
+        job_name='airflow-overrides',
+        task_id='airflow-overrides',
+        job_queue='airflow',
+        job_definition='airflow',
+        overrides={
+            "command": [
+                "echo",
+                "overrides"
+            ]
+        },
+        queue='airflow',
+        dag=dag)
+
+    # parameters example
+    t2 = AWSBatchOperator(
+        job_name='airflow-parameters',
+        task_id='airflow-parameters',
+        job_queue='airflow',
+        job_definition='airflow',
+        overrides={
+            "command": [
+                "echo",
+                "Ref::input"
+            ]
+        },
+        parameters={
+            "input": "Airflow2000"
+        },
+        queue='airflow',
+        dag=dag)
+
+    t0.set_upstream(t1)
+    t1.set_upstream(t2)
+
+except ImportError as e:
+    log.warn("Could not import AWSBatchOperator: " + str(e))
+    log.warn("Install AWS Batch dependencies with: "
+             "pip install apache-airflow[boto3]")
diff --git a/airflow/contrib/operators/awsbatch_operator.py 
b/airflow/contrib/operators/awsbatch_operator.py
index 3c778e6e68..8852861a24 100644
--- a/airflow/contrib/operators/awsbatch_operator.py
+++ b/airflow/contrib/operators/awsbatch_operator.py
@@ -46,6 +46,10 @@ class AWSBatchOperator(BaseOperator):
         containerOverrides (templated):
         http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job
     :type overrides: dict
+    :param parameters: the same parameter that boto3 will receive on
+        parameters (templated):
+        http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job
+    :type parameters: dict
     :param max_retries: exponential backoff retries while waiter is not
         merged, 4200 = 48 hours
     :type max_retries: int
@@ -61,10 +65,11 @@ class AWSBatchOperator(BaseOperator):
     ui_color = '#c3dae0'
     client = None
     arn = None
-    template_fields = ('job_name', 'overrides',)
+    template_fields = ('job_name', 'overrides', 'parameters',)
 
     @apply_defaults
-    def __init__(self, job_name, job_definition, job_queue, overrides, max_retries=4200,
+    def __init__(self, job_name, job_definition, job_queue, overrides,
+                 parameters=None, max_retries=4200,

[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697575#comment-16697575
 ] 

ASF GitHub Bot commented on AIRFLOW-3066:
-

hugoprudente opened a new pull request #4230: [AIRFLOW-3066] Adding support for 
Parameters on AWS Batch Operator
URL: https://github.com/apache/incubator-airflow/pull/4230
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3066) issues and references 
them in the PR "\[AIRFLOW-3066\] Add job parameters to AWSbatch Operator"
   
   ### Description
   
   - [ ] Added the parameters for using dynamic commands on Job Execution
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   ```
   nosetests -v tests/contrib/operators/test_awsbatch_operator.py
   test_check_success_task_not_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_failed 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_multiple 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_pending 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_with_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_without_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_init 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_template_fields_overrides 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_wait_end_tasks 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   
   --
   Ran 10 tests in 0.146s
   
   OK
   [2018-11-24 00:59:23,032] {settings.py:203} DEBUG - Disposing DB connection 
pool (PID 12221)
   ```
   
   ### Commits
   
   - [ ] My commits all reference Jira issues: [AIRFLOW-3066] Adding support 
for AWS Batch parameters
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add job parameters to AWSbatch Operator
> ---
>
> Key: AIRFLOW-3066
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3066
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, operators
>Reporter: Raphael Norman-Tenazas
>Assignee: Hugo Prudente
>Priority: Minor
>  Labels: AWS, easyfix, features, newbie
>   Original Estimate: 5m
>  Remaining Estimate: 5m
>
> Sometimes it is necessary to add parameters at runtime to AWS Batch jobs in a 
> workflow. Currently, the AWSBatchOperator does not support this and will use 
> the default parameter values defined in the AWS job definition.
> This can be implemented by adding a parameters={} argument to the 
> AWSBatchOperator's __init__ and passing it into the client.submit_job call 
> as the parameters keyword argument.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697571#comment-16697571
 ] 

ASF GitHub Bot commented on AIRFLOW-3066:
-

hugoprudente closed pull request #4229: [AIRFLOW-3066] Adding support for AWS 
Batch parameters
URL: https://github.com/apache/incubator-airflow/pull/4229
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):



 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add job parameters to AWSbatch Operator
> ---
>
> Key: AIRFLOW-3066
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3066
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, operators
>Reporter: Raphael Norman-Tenazas
>Assignee: Hugo Prudente
>Priority: Minor
>  Labels: AWS, easyfix, features, newbie
>   Original Estimate: 5m
>  Remaining Estimate: 5m
>
> Sometimes it is necessary to add parameters at runtime to AWS Batch jobs in a 
> workflow. Currently, the AWSBatchOperator does not support this and will use 
> the default parameter values defined in the AWS job definition.
> This can be implemented by adding a parameters={} argument to the 
> AWSBatchOperator's __init__ and passing it into the client.submit_job call 
> as the parameters keyword argument.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697570#comment-16697570
 ] 

ASF GitHub Bot commented on AIRFLOW-3066:
-

hugoprudente opened a new pull request #4229: [AIRFLOW-3066] Adding support for 
AWS Batch parameters
URL: https://github.com/apache/incubator-airflow/pull/4229
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3066) "\[AIRFLOW-3066\] Add 
job parameters to AWSbatch Operator"
   
   ### Description
   
   - [ ] This PR adds support for Ref::NAME parameters, enabling dynamic 
commands
   
   ### Tests
   
   - [ ] My PR adds the following unit tests 
   ```
nosetests -v tests/contrib/operators/test_awsbatch_operator.py
   test_check_success_task_not_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_failed 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_multiple 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_check_success_tasks_raises_pending 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_with_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_execute_without_failures 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_init 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_template_fields_overrides 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   test_wait_end_tasks 
(tests.contrib.operators.test_awsbatch_operator.TestAWSBatchOperator) ... ok
   
   --
   Ran 10 tests in 0.146s
   
   OK
   [2018-11-24 00:59:23,032] {settings.py:203} DEBUG - Disposing DB connection 
pool (PID 12221)
   ```
   
   ### Commits
   
   - [ ] My commit: [AIRFLOW-3066] Adding support for AWS Batch parameters
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - Updated inline documentation
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add job parameters to AWSbatch Operator
> ---
>
> Key: AIRFLOW-3066
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3066
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, operators
>Reporter: Raphael Norman-Tenazas
>Assignee: Hugo Prudente
>Priority: Minor
>  Labels: AWS, easyfix, features, newbie
>   Original Estimate: 5m
>  Remaining Estimate: 5m
>
> Sometimes it is necessary to add parameters at runtime to AWS Batch jobs in a 
> workflow. Currently, the AWSBatchOperator does not support this and will use 
> the default parameter values defined in the AWS job definition.
> This can be implemented by adding a parameters={} argument to the 
> AWSBatchOperator's __init__ and passing it into the client.submit_job call 
> as the parameters keyword argument.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Commented] (AIRFLOW-3384) Allow higher versions of sqlalchemy and jinja

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697553#comment-16697553
 ] 

ASF GitHub Bot commented on AIRFLOW-3384:
-

jlricon closed pull request #4227: [AIRFLOW-3384] Allow higher versions of 
Sqlalchemy and Jinja2
URL: https://github.com/apache/incubator-airflow/pull/4227
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index 95ce629d3b..9ab2348cc2 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -2440,7 +2440,7 @@ class derived from this one results in the creation of a task object,
     :param trigger_rule: defines the rule by which dependencies are applied
         for the task to get triggered. Options are:
         ``{ all_success | all_failed | all_done | one_success |
-        one_failed | dummy}``
+        one_failed | none_failed | dummy}``
         default is ``all_success``. Options can be set as string or
         using the constants defined in the static class
         ``airflow.utils.TriggerRule``
diff --git a/airflow/ti_deps/deps/trigger_rule_dep.py b/airflow/ti_deps/deps/trigger_rule_dep.py
index 8c9505db71..f1d58d0057 100644
--- a/airflow/ti_deps/deps/trigger_rule_dep.py
+++ b/airflow/ti_deps/deps/trigger_rule_dep.py
@@ -152,6 +152,11 @@ def _evaluate_trigger_rule(
             elif tr == TR.ONE_FAILED:
                 if upstream_done and not (failed or upstream_failed):
                     ti.set_state(State.SKIPPED, session)
+            elif tr == TR.NONE_FAILED:
+                if upstream_failed or failed:
+                    ti.set_state(State.UPSTREAM_FAILED, session)
+                elif skipped == upstream:
+                    ti.set_state(State.SKIPPED, session)
 
         if tr == TR.ONE_SUCCESS:
             if successes <= 0:
@@ -194,6 +199,15 @@ def _evaluate_trigger_rule(
                     "upstream_task_ids={3}"
                     .format(tr, upstream_done, upstream_tasks_state,
                             task.upstream_task_ids))
+        elif tr == TR.NONE_FAILED:
+            num_failures = upstream - successes - skipped
+            if num_failures > 0:
+                yield self._failing_status(
+                    reason="Task's trigger rule '{0}' requires all upstream "
+                    "tasks to have succeeded or been skipped, but found {1} non-success(es). "
+                    "upstream_tasks_state={2}, upstream_task_ids={3}"
+                    .format(tr, num_failures, upstream_tasks_state,
+                            task.upstream_task_ids))
         else:
             yield self._failing_status(
                 reason="No strategy to evaluate trigger rule '{0}'.".format(tr))
diff --git a/airflow/utils/trigger_rule.py b/airflow/utils/trigger_rule.py
index 7fdcbc8ca8..4f7db65f7b 100644
--- a/airflow/utils/trigger_rule.py
+++ b/airflow/utils/trigger_rule.py
@@ -29,6 +29,7 @@ class TriggerRule(object):
     ONE_SUCCESS = 'one_success'
     ONE_FAILED = 'one_failed'
     DUMMY = 'dummy'
+    NONE_FAILED = 'none_failed'
 
     _ALL_TRIGGER_RULES = {}
 
diff --git a/docs/concepts.rst b/docs/concepts.rst
index 8753958af3..8a497e499c 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -652,6 +652,7 @@ while creating tasks:
 * ``all_done``: all parents are done with their execution
 * ``one_failed``: fires as soon as at least one parent has failed, it does not wait for all parents to be done
 * ``one_success``: fires as soon as at least one parent succeeds, it does not wait for all parents to be done
+* ``none_failed``: all parents have not failed (``failed`` or ``upstream_failed``) i.e. all parents have succeeded or been skipped
 * ``dummy``: dependencies are just for show, trigger at will
 
 Note that these can be used in conjunction with ``depends_on_past`` (boolean)
diff --git a/setup.py b/setup.py
index e651f5a66e..0c21f7d9ec 100644
--- a/setup.py
+++ b/setup.py
@@ -299,7 +299,7 @@ def do_setup():
         'dill>=0.2.2, <0.3',
         'flask>=0.12.4, <0.13',
         'flask-appbuilder==1.12.1',
-        'flask-admin==1.4.1',
+        'flask-admin==1.5.2',
         'flask-caching>=1.3.3, <1.4.0',
         'flask-login>=0.3, <0.5',
         'flask-swagger==0.2.13',
@@ -310,7 +310,7 @@ def do_setup():
         'gunicorn>=19.4.0, <20.0',
         'iso8601>=0.1.12',
         'json-merge-patch==0.2',
-        'jinja2>=2.7.3, <2.9.0',
+        'jinja2>=2.7.3, <=2.10.0',
         'lxml>=4.0.0',
         'markdown>=2.5.2, <3.0',
         'pandas>=0.17.1, <1.0.0',
@@ -322,7 +322,7 @@ def do_setup():
         'python-nvd3==0.15.0',
         'requests>=2.5.1, <3',
         'setproctitle>=1.1.8, <2',
-        'sqlalchemy>=1.1.15, <1.2.0',
+        'sqlalchemy>=1.1.15, <1.3.0',
         'tabulate>=0.7.5, <=0.8.2',
         'tenacity==4.8.0',
         'thrift>=0.9.2',
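
A short usage sketch of the new trigger rule added above (the DAG and task names are hypothetical; import paths follow the contrib-era layout this diff targets):

```python
from datetime import timedelta

import airflow
from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.trigger_rule import TriggerRule

dag = DAG('example_none_failed',
          start_date=airflow.utils.dates.days_ago(1),
          schedule_interval=timedelta(days=1))

branch_a = DummyOperator(task_id='branch_a', dag=dag)
branch_b = DummyOperator(task_id='branch_b', dag=dag)

# Runs when no parent failed: every upstream task either succeeded or was
# skipped (useful downstream of a BranchPythonOperator).
join = DummyOperator(
    task_id='join',
    trigger_rule=TriggerRule.NONE_FAILED,
    dag=dag)

join.set_upstream([branch_a, branch_b])
```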

[jira] [Commented] (AIRFLOW-3384) Allow higher versions of sqlalchemy and jinja

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16697554#comment-16697554
 ] 

ASF GitHub Bot commented on AIRFLOW-3384:
-

jlricon opened a new pull request #4227: [AIRFLOW-3384] Allow higher versions 
of Sqlalchemy and Jinja2
URL: https://github.com/apache/incubator-airflow/pull/4227
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - My PR addresses this JIRA issue: [AIRFLOW-3384]
   
   ### Description
   
   - This PR bumps up the allowed versions of sqlalchemy and jinja2
   
   ### Tests
   
   - This PR does not add new functionality to be tested, nor does it change 
the codebase.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow higher versions of sqlalchemy and jinja
> -
>
> Key: AIRFLOW-3384
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3384
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: dependencies
>Reporter: Jose Luis Ricon
>Assignee: Jose Luis Ricon
>Priority: Major
>
> At the moment airflow doesn't allow the installation of sqlalchemy version 
> 1.2.11 or jinja2==2.10. Airflow works with both, and there is no reason not 
> to allow higher versions. Downstream projects that currently force the 
> installation of said versions, overriding airflow's dependencies, will 
> benefit from this, as it will allow version-compatible installations 
> without loss of functionality.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)



[GitHub] jlricon opened a new pull request #4227: [AIRFLOW-3384] Allow higher versions of Sqlalchemy and Jinja2

2018-11-23 Thread GitBox
jlricon opened a new pull request #4227: [AIRFLOW-3384] Allow higher versions 
of Sqlalchemy and Jinja2
URL: https://github.com/apache/incubator-airflow/pull/4227
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses this JIRA issue: [AIRFLOW-3384]
   
   ### Description
   
   - This PR bumps up the allowed versions of sqlalchemy and jinja2.
   
   ### Tests
   
   - This PR does not add new functionality to be tested, nor does it change 
the codebase.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (AIRFLOW-3388) AWS Batch Operator, add support to Array Jobs

2018-11-23 Thread Hugo Prudente (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3388?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hugo Prudente reassigned AIRFLOW-3388:
--

Assignee: Hugo Prudente

> AWS Batch Operator, add support to Array Jobs
> -
>
> Key: AIRFLOW-3388
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3388
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Hugo Prudente
>Assignee: Hugo Prudente
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3388) AWS Batch Operator, add support to Array Jobs

2018-11-23 Thread Hugo Prudente (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3388?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hugo Prudente updated AIRFLOW-3388:
---
Description: Add support to Array Jobs

> AWS Batch Operator, add support to Array Jobs
> -
>
> Key: AIRFLOW-3388
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3388
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Hugo Prudente
>Assignee: Hugo Prudente
>Priority: Major
>
> Add support to Array Jobs



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3388) AWS Batch Operator, add support to Array Jobs

2018-11-23 Thread Hugo Prudente (JIRA)
Hugo Prudente created AIRFLOW-3388:
--

 Summary: AWS Batch Operator, add support to Array Jobs
 Key: AIRFLOW-3388
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3388
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Hugo Prudente






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3389) AWS Batch Operator, add support to Dependencies

2018-11-23 Thread Hugo Prudente (JIRA)
Hugo Prudente created AIRFLOW-3389:
--

 Summary: AWS Batch Operator, add support to Dependencies
 Key: AIRFLOW-3389
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3389
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Hugo Prudente
Assignee: Hugo Prudente






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3387) Add JobRetry and JobTimeout to AWS Batch Operator

2018-11-23 Thread Hugo Prudente (JIRA)
Hugo Prudente created AIRFLOW-3387:
--

 Summary: Add JobRetry and JobTimeout to AWS Batch Operator 
 Key: AIRFLOW-3387
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3387
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Hugo Prudente
Assignee: Hugo Prudente


Add the AWS Batch retry strategy and job timeout options to the AWS Batch 
operator 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3386) AWS Batch Operator, add support to Multi-Node processing -

2018-11-23 Thread Hugo Prudente (JIRA)
Hugo Prudente created AIRFLOW-3386:
--

 Summary: AWS Batch Operator, add support to Multi-Node processing 
- 
 Key: AIRFLOW-3386
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3386
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Hugo Prudente
Assignee: Hugo Prudente


Adding support for the new multi-node parallel jobs feature:

https://docs.aws.amazon.com/batch/latest/userguide/multi-node-parallel-jobs.html



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
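For context on this cluster of issues (AIRFLOW-3386 through AIRFLOW-3389):
array jobs, dependencies, retry, and timeout all surface as arguments to
boto3's submit_job call, which the AWS Batch operator wraps. A rough sketch of
the underlying API, with made-up queue, job-definition, and job-id values
(multi-node jobs from AIRFLOW-3386 would additionally use node properties):

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")

response = batch.submit_job(
    jobName="example-array-job",
    jobQueue="example-queue",                  # hypothetical queue
    jobDefinition="example-job-def:1",         # hypothetical job definition
    arrayProperties={"size": 10},              # AIRFLOW-3388: array jobs
    dependsOn=[                                # AIRFLOW-3389: dependencies
        {"jobId": "11111111-2222-3333-4444-555555555555"},
    ],
    retryStrategy={"attempts": 3},             # AIRFLOW-3387: job retry
    timeout={"attemptDurationSeconds": 3600},  # AIRFLOW-3387: job timeout
)
print(response["jobId"])
```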


[GitHub] codecov-io edited a comment on issue #4228: [AIRFLOW-1196][AIRFLOW-2399] Add templated field in TriggerDagRunOperator

2018-11-23 Thread GitBox
codecov-io edited a comment on issue #4228: [AIRFLOW-1196][AIRFLOW-2399] Add 
templated field in TriggerDagRunOperator
URL: 
https://github.com/apache/incubator-airflow/pull/4228#issuecomment-441311757
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=h1)
 Report
   > Merging 
[#4228](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/5955db1c765aae1cd187abe7401294bd72493521?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4228/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4228     +/-   ##
   =========================================
   + Coverage   77.79%    77.8%   +<.01%     
   =========================================
     Files         201      201             
     Lines       16360    16360             
   =========================================
   + Hits        12728    12729      +1     
   + Misses       3632     3631      -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/dagrun\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4228/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZGFncnVuX29wZXJhdG9yLnB5)
 | `96.15% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4228/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=footer).
 Last update 
[5955db1...817e737](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4228: [AIRFLOW-1196][AIRFLOW-2399] Add templated field in TriggerDagRunOperator

2018-11-23 Thread GitBox
codecov-io commented on issue #4228: [AIRFLOW-1196][AIRFLOW-2399] Add templated 
field in TriggerDagRunOperator
URL: 
https://github.com/apache/incubator-airflow/pull/4228#issuecomment-441311757
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=h1)
 Report
   > Merging 
[#4228](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/5955db1c765aae1cd187abe7401294bd72493521?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4228/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4228     +/-   ##
   =========================================
   + Coverage   77.79%    77.8%   +<.01%     
   =========================================
     Files         201      201             
     Lines       16360    16360             
   =========================================
   + Hits        12728    12729      +1     
   + Misses       3632     3631      -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/dagrun\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4228/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZGFncnVuX29wZXJhdG9yLnB5)
 | `96.15% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4228/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=footer).
 Last update 
[5955db1...817e737](https://codecov.io/gh/apache/incubator-airflow/pull/4228?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-1196) Make trigger_dag_id a templated field of TriggerDagRunOperator

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16697460#comment-16697460
 ] 

ASF GitHub Bot commented on AIRFLOW-1196:
-

kaxil opened a new pull request #4228: [AIRFLOW-1196][AIRFLOW-2399] Add 
templated field in TriggerDagRunOperator
URL: https://github.com/apache/incubator-airflow/pull/4228
 
 
   Make trigger_dag_id a templated field of TriggerDagRunOperator
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title.
 - https://issues.apache.org/jira/browse/AIRFLOW-1196
 - https://issues.apache.org/jira/browse/AIRFLOW-2399
   
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   Make `trigger_dag_id` a templated field of `TriggerDagRunOperator`
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   n/a
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Make trigger_dag_id a templated field of TriggerDagRunOperator
> --
>
> Key: AIRFLOW-1196
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1196
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Reporter: Arthur Vigil
>Assignee: Arthur Vigil
>Priority: Trivial
>  Labels: easyfix, improvement
>
> TriggerDagRunOperator currently has no templated fields. Adding 
> `trigger_dag_id` as a templated field should be a trivial change that 
> improves its flexibility and usefulness.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
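A minimal sketch of what the change enables, assuming the patched operator
(the DAG ids below are made up): once trigger_dag_id is in template_fields,
Jinja in its value is rendered per run.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dagrun_operator import TriggerDagRunOperator

dag = DAG("trigger_example", start_date=datetime(2018, 1, 1),
          schedule_interval="@daily")

trigger = TriggerDagRunOperator(
    task_id="trigger_target",
    # Rendered at runtime, e.g. "target_dag_20181123" for the 2018-11-23 run.
    trigger_dag_id="target_dag_{{ ds_nodash }}",
    dag=dag,
)
```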


[GitHub] kaxil opened a new pull request #4228: [AIRFLOW-1196][AIRFLOW-2399] Add templated field in TriggerDagRunOperator

2018-11-23 Thread GitBox
kaxil opened a new pull request #4228: [AIRFLOW-1196][AIRFLOW-2399] Add 
templated field in TriggerDagRunOperator
URL: https://github.com/apache/incubator-airflow/pull/4228
 
 
   Make trigger_dag_id a templated field of TriggerDagRunOperator
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title.
 - https://issues.apache.org/jira/browse/AIRFLOW-1196
 - https://issues.apache.org/jira/browse/AIRFLOW-2399
   
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   Make `trigger_dag_id` a templated field of `TriggerDagRunOperator`
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   n/a
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Closed] (AIRFLOW-156) Add date option to trigger_dag

2018-11-23 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik closed AIRFLOW-156.
--
Resolution: Duplicate

> Add date option to trigger_dag
> --
>
> Key: AIRFLOW-156
> URL: https://issues.apache.org/jira/browse/AIRFLOW-156
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: cli
>Reporter: Jeffrey Picard
>Priority: Minor
>
> Currently the trigger_dag command always sets the execution date to
> datetime.now(). This seems like a rather arbitrary restriction and there
> are use cases when running dags ad-hoc where one may wish to set a
> different execution date.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
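The restriction described above has since been lifted (hence the Duplicate
resolution): newer releases accept an explicit date, both on the CLI
(`airflow trigger_dag -e <date> <dag_id>`) and through the experimental API.
A short sketch of the latter, with a made-up dag_id:

```python
from datetime import datetime

from airflow.api.common.experimental.trigger_dag import trigger_dag

# Trigger "my_dag" with an explicit execution date instead of datetime.now().
trigger_dag("my_dag", execution_date=datetime(2018, 1, 1))
```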


[GitHub] codecov-io edited a comment on issue #4182: [AIRFLOW-3336] Add new TriggerRule that will consider skipped ancestors as success

2018-11-23 Thread GitBox
codecov-io edited a comment on issue #4182: [AIRFLOW-3336] Add new TriggerRule 
that will consider skipped ancestors as success 
URL: 
https://github.com/apache/incubator-airflow/pull/4182#issuecomment-439204135
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=h1)
 Report
   > Merging 
[#4182](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/2b707aba3cf7aa78fff81065432e9fbebf3b15ca?src=pr=desc)
 will **increase** coverage by `0.1%`.
   > The diff coverage is `60%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4182/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=tree)
   
   ```diff
   @@           Coverage Diff            @@
   ##           master    #4182     +/-   ##
   =========================================
   + Coverage    77.7%    77.8%    +0.1%     
   =========================================
     Files         199      201      +2     
     Lines       16312    16360     +48     
   =========================================
   + Hits        12675    12729     +54     
   + Misses       3637     3631      -6
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <ø> (ø)` | :arrow_up: |
   | 
[airflow/utils/trigger\_rule.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy90cmlnZ2VyX3J1bGUucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/trigger\_rule\_dep.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvdHJpZ2dlcl9ydWxlX2RlcC5weQ==)
 | `90.14% <55.55%> (-5.03%)` | :arrow_down: |
   | 
[airflow/utils/helpers.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9oZWxwZXJzLnB5)
 | `82.22% <0%> (-2.16%)` | :arrow_down: |
   | 
[airflow/api/common/experimental/trigger\_dag.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY29tbW9uL2V4cGVyaW1lbnRhbC90cmlnZ2VyX2RhZy5weQ==)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/sensors/\_\_init\_\_.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy9zZW5zb3JzL19faW5pdF9fLnB5)
 | `100% <0%> (ø)` | |
   | 
[airflow/operators/\_\_init\_\_.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvX19pbml0X18ucHk=)
 | `100% <0%> (ø)` | |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `69.37% <0%> (+0.01%)` | :arrow_up: |
   | 
[airflow/operators/python\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uX29wZXJhdG9yLnB5)
 | `95.09% <0%> (+0.06%)` | :arrow_up: |
   | ... and [6 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=footer).
 Last update 
[2b707ab...8178702](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Closed] (AIRFLOW-859) airflow trigger_dag not working

2018-11-23 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-859?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik closed AIRFLOW-859.
--
Resolution: Not A Problem

> airflow trigger_dag  not working
> --
>
> Key: AIRFLOW-859
> URL: https://issues.apache.org/jira/browse/AIRFLOW-859
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.7.1.3
> Environment: Using Airflow 1.7.1.3 on Amazon Web Services (`date +%Z` 
> displays the timezone name UTC).
> Airflow uses MariaDB, also on AWS (MariaDB system_time_zone set to UTC).
>Reporter: Stefano Tiani-Tanzi
>Priority: Major
>
> Hi,
> I have a simple DAG (code at end of this message).
> The DAG loads OK using CLI "airflow list_dags".
> In the Airflow webpage the DAG is set to "On".
> I try to trigger the dag using CLI "airflow trigger_dag trigger_dag_v1" and 
> get feedback as follows:
> [2017-02-10 18:20:12,969] {__init__.py:36} INFO - Using executor LocalExecutor
> [2017-02-10 18:20:13,374] {cli.py:142} INFO - Created <DagRun trigger_dag_v1 
> @ 2017-02-10 18:20:13.342199: manual__2017-02-10T18:20:13.342199, externally 
> triggered: True>
> However, the dag does not start/run.
> I can see in the web interface (using Browse | Dag Runs) that the run is at 
> "Failed" status but there are no logs associated with the run. Also, in the 
> Tree view no runs appear.
> The Airflow server and db are both set to UTC and the DAG is switched on in 
> the web UI.
> Have I missed something or am I doing something incorrectly?
> Any help would be much appreciated.
> Many thanks
> Stefano
> import json
> from airflow import DAG
> from airflow.models import Variable
> from airflow.operators import DummyOperator
> from datetime import datetime, timedelta
> default_args = {
>     'owner': 'stef',
>     'start_date': datetime(2017, 1, 1),
>     'retries': 1,
>     'retry_delay': timedelta(minutes=3),
>     'depends_on_past': False,
>     'wait_for_downstream': True,
>     'provide_context': True
> }
> dag = DAG('trigger_dag_v1', default_args=default_args, schedule_interval=None)
> task_dummy = DummyOperator(
>     task_id='dummy',
>     dag=dag)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3336) Add ability for "skipped" state to be considered success

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16697432#comment-16697432
 ] 

ASF GitHub Bot commented on AIRFLOW-3336:
-

kaxil closed pull request #4182: [AIRFLOW-3336] Add new TriggerRule that will 
consider skipped ancestors as success 
URL: https://github.com/apache/incubator-airflow/pull/4182
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index bb068499fe..df349ac996 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -2440,7 +2440,7 @@ class derived from this one results in the creation of a task object,
     :param trigger_rule: defines the rule by which dependencies are applied
         for the task to get triggered. Options are:
         ``{ all_success | all_failed | all_done | one_success |
-        one_failed | dummy}``
+        one_failed | none_failed | dummy}``
         default is ``all_success``. Options can be set as string or
         using the constants defined in the static class
         ``airflow.utils.TriggerRule``
diff --git a/airflow/ti_deps/deps/trigger_rule_dep.py b/airflow/ti_deps/deps/trigger_rule_dep.py
index 8c9505db71..f1d58d0057 100644
--- a/airflow/ti_deps/deps/trigger_rule_dep.py
+++ b/airflow/ti_deps/deps/trigger_rule_dep.py
@@ -152,6 +152,11 @@ def _evaluate_trigger_rule(
             elif tr == TR.ONE_FAILED:
                 if upstream_done and not (failed or upstream_failed):
                     ti.set_state(State.SKIPPED, session)
+            elif tr == TR.NONE_FAILED:
+                if upstream_failed or failed:
+                    ti.set_state(State.UPSTREAM_FAILED, session)
+                elif skipped == upstream:
+                    ti.set_state(State.SKIPPED, session)
 
         if tr == TR.ONE_SUCCESS:
             if successes <= 0:
@@ -194,6 +199,15 @@ def _evaluate_trigger_rule(
                     "upstream_task_ids={3}"
                     .format(tr, upstream_done, upstream_tasks_state,
                             task.upstream_task_ids))
+        elif tr == TR.NONE_FAILED:
+            num_failures = upstream - successes - skipped
+            if num_failures > 0:
+                yield self._failing_status(
+                    reason="Task's trigger rule '{0}' requires all upstream "
+                           "tasks to have succeeded or been skipped, but found {1} non-success(es). "
+                           "upstream_tasks_state={2}, upstream_task_ids={3}"
+                    .format(tr, num_failures, upstream_tasks_state,
+                            task.upstream_task_ids))
         else:
             yield self._failing_status(
                 reason="No strategy to evaluate trigger rule '{0}'.".format(tr))
diff --git a/airflow/utils/trigger_rule.py b/airflow/utils/trigger_rule.py
index 7fdcbc8ca8..4f7db65f7b 100644
--- a/airflow/utils/trigger_rule.py
+++ b/airflow/utils/trigger_rule.py
@@ -29,6 +29,7 @@ class TriggerRule(object):
     ONE_SUCCESS = 'one_success'
     ONE_FAILED = 'one_failed'
     DUMMY = 'dummy'
+    NONE_FAILED = 'none_failed'
 
     _ALL_TRIGGER_RULES = {}
 
diff --git a/docs/concepts.rst b/docs/concepts.rst
index 2896010248..13aa508f1d 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -652,6 +652,7 @@ while creating tasks:
 * ``all_done``: all parents are done with their execution
 * ``one_failed``: fires as soon as at least one parent has failed, it does not wait for all parents to be done
 * ``one_success``: fires as soon as at least one parent succeeds, it does not wait for all parents to be done
+* ``none_failed``: all parents have not failed (``failed`` or ``upstream_failed``) i.e. all parents have succeeded or been skipped
 * ``dummy``: dependencies are just for show, trigger at will
 
 Note that these can be used in conjunction with ``depends_on_past`` (boolean)
diff --git a/tests/ti_deps/deps/test_trigger_rule_dep.py b/tests/ti_deps/deps/test_trigger_rule_dep.py
index 14dc01756c..2f3258f810 100644
--- a/tests/ti_deps/deps/test_trigger_rule_dep.py
+++ b/tests/ti_deps/deps/test_trigger_rule_dep.py
@@ -163,6 +163,44 @@ def test_all_success_tr_failure(self):
         self.assertEqual(len(dep_statuses), 1)
         self.assertFalse(dep_statuses[0].passed)
 
+    def test_none_failed_tr_success(self):
+        """
+        All success including skip trigger rule success
+        """
+        ti = self._get_task_instance(TriggerRule.NONE_FAILED,
+                                     upstream_task_ids=["FakeTaskID",
+                                                        "OtherFakeTaskID"])
+        dep_statuses = tuple(TriggerRuleDep()._evaluate_trigger_rule(
+            ti=ti,
+

[jira] [Resolved] (AIRFLOW-3336) Add ability for "skipped" state to be considered success

2018-11-23 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3336?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-3336.
-
   Resolution: Fixed
Fix Version/s: 2.0.0

Resolved by https://github.com/apache/incubator-airflow/pull/4182

> Add ability for "skipped" state to be considered success
> 
>
> Key: AIRFLOW-3336
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3336
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: DAG
>Reporter: Ryan Nowacoski
>Assignee: Ryan Nowacoski
>Priority: Trivial
>  Labels: beginner, usability
> Fix For: 2.0.0
>
>
> Take the case where a task has 2 or more upstream parents and 1 or more of 
> them can be skipped. If TriggerRule ALL_DONE is used then the task will 
> trigger even when upstream tasks fail. However, if TriggerRule ALL_SUCCESS is 
> used, the task won't be triggered if any upstream tasks are skipped. This 
> creates a gap in functionality where it is necessary for "skipped" to be 
> treated as "success" so that the task only runs if all parents succeed or are 
> skipped. Said another way, this allows tasks to be run if all ancestors do 
> NOT fail.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
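A minimal usage sketch of the new rule (not part of the PR; DAG and task names
are made up), assuming a build that includes this change:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.trigger_rule import TriggerRule

dag = DAG("none_failed_example", start_date=datetime(2018, 1, 1),
          schedule_interval=None)

branch_a = DummyOperator(task_id="branch_a", dag=dag)  # may end up skipped
branch_b = DummyOperator(task_id="branch_b", dag=dag)

# Fires when every parent either succeeded or was skipped; the default
# all_success rule would block this task if any parent were skipped.
join = DummyOperator(task_id="join", trigger_rule=TriggerRule.NONE_FAILED,
                     dag=dag)

branch_a >> join
branch_b >> join
```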


[GitHub] kaxil closed pull request #4182: [AIRFLOW-3336] Add new TriggerRule that will consider skipped ancestors as success

2018-11-23 Thread GitBox
kaxil closed pull request #4182: [AIRFLOW-3336] Add new TriggerRule that will 
consider skipped ancestors as success 
URL: https://github.com/apache/incubator-airflow/pull/4182
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index bb068499fe..df349ac996 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -2440,7 +2440,7 @@ class derived from this one results in the creation of a task object,
     :param trigger_rule: defines the rule by which dependencies are applied
         for the task to get triggered. Options are:
         ``{ all_success | all_failed | all_done | one_success |
-        one_failed | dummy}``
+        one_failed | none_failed | dummy}``
         default is ``all_success``. Options can be set as string or
         using the constants defined in the static class
         ``airflow.utils.TriggerRule``
diff --git a/airflow/ti_deps/deps/trigger_rule_dep.py b/airflow/ti_deps/deps/trigger_rule_dep.py
index 8c9505db71..f1d58d0057 100644
--- a/airflow/ti_deps/deps/trigger_rule_dep.py
+++ b/airflow/ti_deps/deps/trigger_rule_dep.py
@@ -152,6 +152,11 @@ def _evaluate_trigger_rule(
             elif tr == TR.ONE_FAILED:
                 if upstream_done and not (failed or upstream_failed):
                     ti.set_state(State.SKIPPED, session)
+            elif tr == TR.NONE_FAILED:
+                if upstream_failed or failed:
+                    ti.set_state(State.UPSTREAM_FAILED, session)
+                elif skipped == upstream:
+                    ti.set_state(State.SKIPPED, session)
 
         if tr == TR.ONE_SUCCESS:
             if successes <= 0:
@@ -194,6 +199,15 @@ def _evaluate_trigger_rule(
                     "upstream_task_ids={3}"
                     .format(tr, upstream_done, upstream_tasks_state,
                             task.upstream_task_ids))
+        elif tr == TR.NONE_FAILED:
+            num_failures = upstream - successes - skipped
+            if num_failures > 0:
+                yield self._failing_status(
+                    reason="Task's trigger rule '{0}' requires all upstream "
+                           "tasks to have succeeded or been skipped, but found {1} non-success(es). "
+                           "upstream_tasks_state={2}, upstream_task_ids={3}"
+                    .format(tr, num_failures, upstream_tasks_state,
+                            task.upstream_task_ids))
         else:
             yield self._failing_status(
                 reason="No strategy to evaluate trigger rule '{0}'.".format(tr))
diff --git a/airflow/utils/trigger_rule.py b/airflow/utils/trigger_rule.py
index 7fdcbc8ca8..4f7db65f7b 100644
--- a/airflow/utils/trigger_rule.py
+++ b/airflow/utils/trigger_rule.py
@@ -29,6 +29,7 @@ class TriggerRule(object):
     ONE_SUCCESS = 'one_success'
     ONE_FAILED = 'one_failed'
     DUMMY = 'dummy'
+    NONE_FAILED = 'none_failed'
 
     _ALL_TRIGGER_RULES = {}
 
diff --git a/docs/concepts.rst b/docs/concepts.rst
index 2896010248..13aa508f1d 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -652,6 +652,7 @@ while creating tasks:
 * ``all_done``: all parents are done with their execution
 * ``one_failed``: fires as soon as at least one parent has failed, it does not wait for all parents to be done
 * ``one_success``: fires as soon as at least one parent succeeds, it does not wait for all parents to be done
+* ``none_failed``: all parents have not failed (``failed`` or ``upstream_failed``) i.e. all parents have succeeded or been skipped
 * ``dummy``: dependencies are just for show, trigger at will
 
 Note that these can be used in conjunction with ``depends_on_past`` (boolean)
diff --git a/tests/ti_deps/deps/test_trigger_rule_dep.py b/tests/ti_deps/deps/test_trigger_rule_dep.py
index 14dc01756c..2f3258f810 100644
--- a/tests/ti_deps/deps/test_trigger_rule_dep.py
+++ b/tests/ti_deps/deps/test_trigger_rule_dep.py
@@ -163,6 +163,44 @@ def test_all_success_tr_failure(self):
         self.assertEqual(len(dep_statuses), 1)
         self.assertFalse(dep_statuses[0].passed)
 
+    def test_none_failed_tr_success(self):
+        """
+        All success including skip trigger rule success
+        """
+        ti = self._get_task_instance(TriggerRule.NONE_FAILED,
+                                     upstream_task_ids=["FakeTaskID",
+                                                        "OtherFakeTaskID"])
+        dep_statuses = tuple(TriggerRuleDep()._evaluate_trigger_rule(
+            ti=ti,
+            successes=1,
+            skipped=1,
+            failed=0,
+            upstream_failed=0,
+            done=2,
+            flag_upstream_failed=False,
+            session="Fake Session"))
+        self.assertEqual(len(dep_statuses), 0)
+
+    def 

[GitHub] kaxil commented on issue #2635: [AIRFLOW-1561] Fix scheduler to pick up example DAGs without other DAGs

2018-11-23 Thread GitBox
kaxil commented on issue #2635: [AIRFLOW-1561] Fix scheduler to pick up example 
DAGs without other DAGs
URL: 
https://github.com/apache/incubator-airflow/pull/2635#issuecomment-441300173
 
 
   @ashb @Fokko Any comments? If none, I'll merge this. I tested this PR and 
it works fine and fixes the issue.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] jlricon commented on issue #4227: [AIRFLOW-3384] Allow higher versions of Sqlalchemy and Jinja2

2018-11-23 Thread GitBox
jlricon commented on issue #4227: [AIRFLOW-3384] Allow higher versions of 
Sqlalchemy and Jinja2
URL: 
https://github.com/apache/incubator-airflow/pull/4227#issuecomment-441290699
 
 
   Ok, that should do it. Master is broken at the moment; I guess I'll pass 
that final test config when master builds again.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-3385) Airflow DAGs in Kubernetes ask OAUTH Google Cloud Platform when access PostgreSQL database

2018-11-23 Thread RIo (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3385?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

RIo updated AIRFLOW-3385:
-
Description: 
I have set up Airflow in Kubernetes. This Airflow uses a PostgreSQL database 
running in other pods in the same cluster.

When I try to run some DAGs that connect to the same PostgreSQL with the 
configured connection, the connection to the PostgreSQL pods asks for OAuth 
authorization:
{code:java}
Reading local file: 
/root/airflow/logs/anu_funnel_analysis_v2/task_4_get_bigquery_pandas/2018-06-29T02:30:00+00:00/1.log
[2018-11-23 15:56:22,796] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,820] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,821] {models.py:1547} INFO - 

Starting attempt 1 of 6


[2018-11-23 15:56:22,838] {models.py:1569} INFO - Executing 
 on 2018-06-29T02:30:00+00:00
[2018-11-23 15:56:22,839] {base_task_runner.py:124} INFO - Running: ['bash', 
'-c', u'airflow run anu_funnel_analysis_v2 task_4_get_bigquery_pandas 
2018-06-29T02:30:00+00:00 --job_id 82 --raw -sd 
DAGS_FOLDER/bigquery_for_funnel.py --cfg_path /tmp/tmpJ1Pe0O']
[2018-11-23 15:56:23,656] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,655] {settings.py:174} INFO - 
setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-23 15:56:23,817] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,816] {__init__.py:51} INFO - 
Using executor SequentialExecutor
[2018-11-23 15:56:23,952] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,950] {models.py:258} INFO - 
Filling up the DagBag from /root/airflow/dags/bigquery_for_funnel.py
[2018-11-23 15:56:24,737] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas 
/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/bigquery_operator.py:148:
 DeprecationWarning: Deprecated parameter `bql` used in Task id: 
task_2_bq_read_new_table_funnel. Use `sql` parameter instead to pass the sql to 
be executed. `bql` parameter is deprecated and will be removed in a future 
version of Airflow.
[2018-11-23 15:56:24,738] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas   category=DeprecationWarning)
[2018-11-23 15:56:24,778] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:24,777] {cli.py:492} INFO - 
Running  on host airflow-5d87576f5b-v5qmg
[2018-11-23 15:56:24,837] {logging_mixin.py:95} INFO - [2018-11-23 
15:56:24,837] {base_hook.py:83} INFO - Using connection to: 
anudata-postgresql-service

[2018-11-23 15:56:25,335] {logging_mixin.py:95} INFO - Please visit this URL to 
authorize this application: 
https://accounts.google.com/o/oauth2/auth?response_type=code_id=495642085510-k0tmvj2m941jhre2nbqka17vqpjfddtd.apps.googleusercontent.com

{code}
 
Why does this happen? In the same pods, why does one ask for OAuth and the 
other not?
 

  was:
I have set up Airflow in Kubernetes. This Airflow uses a PostgreSQL database 
running in other pods in the same cluster.

When I try to run some DAGs that connect to the same PostgreSQL with the 
configured connection, the connection to the PostgreSQL pods asks for OAuth 
authorization:
{code:java}
Reading local file: 
/root/airflow/logs/anu_funnel_analysis_v2/task_4_get_bigquery_pandas/2018-06-29T02:30:00+00:00/1.log
[2018-11-23 15:56:22,796] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,820] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,821] {models.py:1547} INFO - 

Starting attempt 1 of 6


[2018-11-23 15:56:22,838] {models.py:1569} INFO - Executing 
 on 2018-06-29T02:30:00+00:00
[2018-11-23 15:56:22,839] {base_task_runner.py:124} INFO - Running: ['bash', 
'-c', u'airflow run anu_funnel_analysis_v2 task_4_get_bigquery_pandas 
2018-06-29T02:30:00+00:00 --job_id 82 --raw -sd 
DAGS_FOLDER/bigquery_for_funnel.py --cfg_path /tmp/tmpJ1Pe0O']
[2018-11-23 15:56:23,656] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,655] {settings.py:174} INFO - 
setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-23 15:56:23,817] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,816] {__init__.py:51} INFO - 
Using executor SequentialExecutor
[2018-11-23 15:56:23,952] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,950] {models.py:258} INFO - 
Filling up the DagBag from 

[jira] [Updated] (AIRFLOW-3385) Airflow DAGs in Kubernetes ask OAUTH Google Cloud Platform when access PostgreSQL database

2018-11-23 Thread RIo (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3385?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

RIo updated AIRFLOW-3385:
-
Description: 
I have set up Airflow in Kubernetes. This Airflow uses a PostgreSQL database 
running in other pods in the same cluster.

When I try to run some DAGs that connect to the same PostgreSQL with the 
configured connection, the connection to the PostgreSQL pods asks for OAuth 
authorization:
{code:java}
Reading local file: 
/root/airflow/logs/anu_funnel_analysis_v2/task_4_get_bigquery_pandas/2018-06-29T02:30:00+00:00/1.log
[2018-11-23 15:56:22,796] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,820] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,821] {models.py:1547} INFO - 

Starting attempt 1 of 6


[2018-11-23 15:56:22,838] {models.py:1569} INFO - Executing 
 on 2018-06-29T02:30:00+00:00
[2018-11-23 15:56:22,839] {base_task_runner.py:124} INFO - Running: ['bash', 
'-c', u'airflow run anu_funnel_analysis_v2 task_4_get_bigquery_pandas 
2018-06-29T02:30:00+00:00 --job_id 82 --raw -sd 
DAGS_FOLDER/bigquery_for_funnel.py --cfg_path /tmp/tmpJ1Pe0O']
[2018-11-23 15:56:23,656] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,655] {settings.py:174} INFO - 
setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-23 15:56:23,817] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,816] {__init__.py:51} INFO - 
Using executor SequentialExecutor
[2018-11-23 15:56:23,952] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,950] {models.py:258} INFO - 
Filling up the DagBag from /root/airflow/dags/bigquery_for_funnel.py
[2018-11-23 15:56:24,737] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas 
/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/bigquery_operator.py:148:
 DeprecationWarning: Deprecated parameter `bql` used in Task id: 
task_2_bq_read_new_table_funnel. Use `sql` parameter instead to pass the sql to 
be executed. `bql` parameter is deprecated and will be removed in a future 
version of Airflow.
[2018-11-23 15:56:24,738] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas   category=DeprecationWarning)
[2018-11-23 15:56:24,778] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:24,777] {cli.py:492} INFO - 
Running  on host airflow-5d87576f5b-v5qmg
[2018-11-23 15:56:24,837] {logging_mixin.py:95} INFO - [2018-11-23 
15:56:24,837] {base_hook.py:83} INFO - Using connection to: 
anudata-postgresql-service

[2018-11-23 15:56:25,335] {logging_mixin.py:95} INFO - Please visit this URL to 
authorize this application: 
https://accounts.google.com/o/oauth2/auth?response_type=code_id=495642085510-k0tmvj2m941jhre2nbqka17vqpjfddtd.apps.googleusercontent.com_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery=zI3Uv6bt3ncdAGIPgWQ5PF36V3uMq0=consent_type=offline

{code}
 
Why does this happen? In the same pods, why does one ask for OAuth and the 
other not?
 

  was:
I have set up Airflow in Kubernetes. This Airflow uses a PostgreSQL database 
running in other pods in the same cluster.

When I try to run some DAGs that connect to the same PostgreSQL with the 
configured connection, the connection to the PostgreSQL pods asks for OAuth 
authorization:
{code:java}
Reading local file: 
/root/airflow/logs/anu_funnel_analysis_v2/task_4_get_bigquery_pandas/2018-06-29T02:30:00+00:00/1.log
[2018-11-23 15:56:22,796] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,820] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,821] {models.py:1547} INFO - 

Starting attempt 1 of 6


[2018-11-23 15:56:22,838] {models.py:1569} INFO - Executing 
 on 2018-06-29T02:30:00+00:00
[2018-11-23 15:56:22,839] {base_task_runner.py:124} INFO - Running: ['bash', 
'-c', u'airflow run anu_funnel_analysis_v2 task_4_get_bigquery_pandas 
2018-06-29T02:30:00+00:00 --job_id 82 --raw -sd 
DAGS_FOLDER/bigquery_for_funnel.py --cfg_path /tmp/tmpJ1Pe0O']
[2018-11-23 15:56:23,656] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,655] {settings.py:174} INFO - 
setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-23 15:56:23,817] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,816] {__init__.py:51} INFO - 
Using executor SequentialExecutor
[2018-11-23 15:56:23,952] {base_task_runner.py:107} INFO - Job 82: 

[jira] [Created] (AIRFLOW-3385) Airflow DAGs in Kubernetes ask OAUTH Google Cloud Platform when access PostgreSQL database

2018-11-23 Thread RIo (JIRA)
RIo created AIRFLOW-3385:


 Summary: Airflow DAGs in Kubernetes ask OAUTH Google Cloud 
Platform when access PostgreSQL database
 Key: AIRFLOW-3385
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3385
 Project: Apache Airflow
  Issue Type: Bug
  Components: authentication, kubernetes
Affects Versions: 1.8.1
Reporter: RIo


I have set up Airflow in Kubernetes. This Airflow uses a PostgreSQL database 
running in other pods in the same cluster.

When I try to run some DAGs that connect to the same PostgreSQL with the 
configured connection, the connection to the PostgreSQL pods asks for OAuth 
authorization:
{code:java}
Reading local file: 
/root/airflow/logs/anu_funnel_analysis_v2/task_4_get_bigquery_pandas/2018-06-29T02:30:00+00:00/1.log
[2018-11-23 15:56:22,796] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,820] {models.py:1335} INFO - Dependencies all met for 

[2018-11-23 15:56:22,821] {models.py:1547} INFO - 

Starting attempt 1 of 6


[2018-11-23 15:56:22,838] {models.py:1569} INFO - Executing 
 on 2018-06-29T02:30:00+00:00
[2018-11-23 15:56:22,839] {base_task_runner.py:124} INFO - Running: ['bash', 
'-c', u'airflow run anu_funnel_analysis_v2 task_4_get_bigquery_pandas 
2018-06-29T02:30:00+00:00 --job_id 82 --raw -sd 
DAGS_FOLDER/bigquery_for_funnel.py --cfg_path /tmp/tmpJ1Pe0O']
[2018-11-23 15:56:23,656] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,655] {settings.py:174} INFO - 
setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-23 15:56:23,817] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,816] {__init__.py:51} INFO - 
Using executor SequentialExecutor
[2018-11-23 15:56:23,952] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:23,950] {models.py:258} INFO - 
Filling up the DagBag from /root/airflow/dags/bigquery_for_funnel.py
[2018-11-23 15:56:24,737] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas 
/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/bigquery_operator.py:148:
 DeprecationWarning: Deprecated parameter `bql` used in Task id: 
task_2_bq_read_new_table_funnel. Use `sql` parameter instead to pass the sql to 
be executed. `bql` parameter is deprecated and will be removed in a future 
version of Airflow.
[2018-11-23 15:56:24,738] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas   category=DeprecationWarning)
[2018-11-23 15:56:24,778] {base_task_runner.py:107} INFO - Job 82: Subtask 
task_4_get_bigquery_pandas [2018-11-23 15:56:24,777] {cli.py:492} INFO - 
Running  on host airflow-5d87576f5b-v5qmg
[2018-11-23 15:56:24,837] {logging_mixin.py:95} INFO - [2018-11-23 
15:56:24,837] {base_hook.py:83} INFO - Using connection to: 
anudata-postgresql-service

[2018-11-23 15:56:25,335] {logging_mixin.py:95} INFO - Please visit this URL to 
authorize this application: 
https://accounts.google.com/o/oauth2/auth?response_type=code_id=495642085510-k0tmvj2m941jhre2nbqka17vqpjfddtd.apps.googleusercontent.com_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery=zI3Uv6bt3ncdAGIPgWQ5PF36V3uMq0=consent_type=offline

{code}
 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
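One plausible reading of the report, offered as an assumption rather than a
confirmed diagnosis: the interactive prompt comes from the BigQuery task
rather than PostgreSQL, because its Google Cloud connection carries no
service-account key, so the hook falls back to the installed-application
OAuth flow. A sketch of attaching a key to the connection via the ORM (the
connection id, project, and key path are hypothetical):

```python
import json

from airflow import settings
from airflow.models import Connection

session = settings.Session()
conn = (session.query(Connection)
        .filter(Connection.conn_id == "google_cloud_default")
        .one_or_none())
if conn is None:
    conn = Connection(conn_id="google_cloud_default",
                      conn_type="google_cloud_platform")
    session.add(conn)

# With a key_path set, the GCP hook can authenticate non-interactively.
conn.extra = json.dumps({
    "extra__google_cloud_platform__key_path": "/etc/gcp/service-account.json",
    "extra__google_cloud_platform__project": "my-project",
    "extra__google_cloud_platform__scope": "https://www.googleapis.com/auth/bigquery",
})
session.commit()
```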


[GitHub] mrkm4ntr commented on issue #2635: [AIRFLOW-1561] Fix scheduler to pick up example DAGs without other DAGs

2018-11-23 Thread GitBox
mrkm4ntr commented on issue #2635: [AIRFLOW-1561] Fix scheduler to pick up 
example DAGs without other DAGs
URL: 
https://github.com/apache/incubator-airflow/pull/2635#issuecomment-441270330
 
 
   @kaxil I rebased. Please check it.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #2635: [AIRFLOW-1561] Fix scheduler to pick up example DAGs without other DAGs

2018-11-23 Thread GitBox
codecov-io edited a comment on issue #2635: [AIRFLOW-1561] Fix scheduler to 
pick up example DAGs without other DAGs
URL: 
https://github.com/apache/incubator-airflow/pull/2635#issuecomment-332123323
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/2635?src=pr=h1)
 Report
   > Merging 
[#2635](https://codecov.io/gh/apache/incubator-airflow/pull/2635?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/77da1cc989fd6d67c337ae934c892e5e165167a2?src=pr=desc)
 will **increase** coverage by `0.01%`.
   > The diff coverage is `87.5%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/2635/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/2635?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #2635      +/-   ##
   ==========================================
   + Coverage   77.81%   77.83%   +0.01%     
   ==========================================
     Files         201      201              
     Lines       16350    16351       +1     
   ==========================================
   + Hits        12723    12726       +3     
   + Misses       3627     3625       -2
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/2635?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/2635/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.36% <0%> (+0.27%)` | :arrow_up: |
   | 
[airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/incubator-airflow/pull/2635/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==)
 | `57.85% <100%> (+0.32%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/2635/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.28% <100%> (-0.05%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/2635?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/2635?src=pr=footer).
 Last update 
[77da1cc...d1f6f55](https://codecov.io/gh/apache/incubator-airflow/pull/2635?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3384) Allow higher versions of sqlalchemy and jinja

2018-11-23 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16696680#comment-16696680
 ] 

ASF GitHub Bot commented on AIRFLOW-3384:
-

jlricon opened a new pull request #4227: [AIRFLOW-3384] Allow higher versions 
of Sqlalchemy and Jinja2
URL: https://github.com/apache/incubator-airflow/pull/4227
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following Airflow Jira issue: [AIRFLOW-3384]
   
   ### Description
   
   - This PR bumps up the allowed versions of sqlalchemy and jinja2.
   
   ### Tests
   
   - This PR does not add new functionality to be tested, nor does it change 
the codebase.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow higher versions of sqlalchemy and jinja
> -
>
> Key: AIRFLOW-3384
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3384
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: dependencies
>Reporter: Jose Luis Ricon
>Assignee: Jose Luis Ricon
>Priority: Major
>
> At the moment airflow doesn't allow the installation of sqlalchemy version 
> 1.2.11 and jinja2==2.10. Airflow works with both, and there is no reason not 
> to allow higher versions. Projects downstream that are currently forcing the 
> installation of said versions, overriding airflow's dependencies, will 
> benefit from this, as it will allow for version-compatible installations 
> without loss of functionality.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] jlricon opened a new pull request #4227: [AIRFLOW-3384] Allow higher versions of Sqlalchemy and Jinja2

2018-11-23 Thread GitBox
jlricon opened a new pull request #4227: [AIRFLOW-3384] Allow higher versions 
of Sqlalchemy and Jinja2
URL: https://github.com/apache/incubator-airflow/pull/4227
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following Airflow Jira issue: [AIRFLOW-3384]
   
   ### Description
   
   - This PR bumps up the allowed versions of sqlalchemy and jinja2.
   
   ### Tests
   
   - This PR does not add new functionality to be tested, nor does it change 
the codebase.
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3384) Allow higher versions of sqlalchemy and jinja

2018-11-23 Thread Jose Luis Ricon (JIRA)
Jose Luis Ricon created AIRFLOW-3384:


 Summary: Allow higher versions of sqlalchemy and jinja
 Key: AIRFLOW-3384
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3384
 Project: Apache Airflow
  Issue Type: Improvement
  Components: dependencies
Reporter: Jose Luis Ricon
Assignee: Jose Luis Ricon


At the moment airflow doesn't allow the installation of sqlalchemy version 
1.2.11 and jinja2==2.10. Airflow works with both, and there is no reason not to 
allow these higher versions. Projects downstream that are currently forcing the 
installation of said versions, overriding airflow's dependencies, will benefit 
from this as it will allow for version-compatible installations without loss of 
functionality.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] samdjstephens commented on issue #3675: [AIRFLOW-2834] fix build script for k8s docker

2018-11-23 Thread GitBox
samdjstephens commented on issue #3675: [AIRFLOW-2834] fix build script for k8s 
docker
URL: 
https://github.com/apache/incubator-airflow/pull/3675#issuecomment-441206942
 
 
   @ashb @yeluolei I just went through the pain of trying to use 
`scripts/ci/kubernetes` to test the kubernetes executor locally, debugging it 
and eventually finding this PR. Can we get this merged so no one else has to go 
through this again?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4223: [AIRFLOW-XXX] Remove unnecessary usage of "# noqa" in airflow/bin/cli.py

2018-11-23 Thread GitBox
codecov-io edited a comment on issue #4223: [AIRFLOW-XXX] Remove unnecessary 
usage of "# noqa" in airflow/bin/cli.py
URL: 
https://github.com/apache/incubator-airflow/pull/4223#issuecomment-441178549
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=h1)
 Report
   > Merging 
[#4223](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/77da1cc989fd6d67c337ae934c892e5e165167a2?src=pr&el=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4223/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4223      +/-   ##
   ==========================================
   - Coverage   77.81%   77.81%   -0.01%     
   ==========================================
     Files         201      201              
     Lines       16350    16350              
   ==========================================
   - Hits        12723    12722       -1     
   - Misses       3627     3628       +1     
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/4223/diff?src=pr&el=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `64.59% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4223/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.29% <0%> (-0.05%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=footer).
 Last update 
[77da1cc...e837c7f](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4223: [AIRFLOW-XXX] Remove unnecessary usage of "# noqa" in airflow/bin/cli.py

2018-11-23 Thread GitBox
codecov-io commented on issue #4223: [AIRFLOW-XXX] Remove unnecessary usage of 
"# noqa" in airflow/bin/cli.py
URL: 
https://github.com/apache/incubator-airflow/pull/4223#issuecomment-441178549
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=h1)
 Report
   > Merging 
[#4223](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/77da1cc989fd6d67c337ae934c892e5e165167a2?src=pr&el=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4223/graphs/tree.svg?width=650&token=WdLKlKHOAU&height=150&src=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4223      +/-   ##
   ==========================================
   - Coverage   77.81%   77.81%   -0.01%     
   ==========================================
     Files         201      201              
     Lines       16350    16350              
   ==========================================
   - Hits        12723    12722       -1     
   - Misses       3627     3628       +1     
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/4223/diff?src=pr&el=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `64.59% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4223/diff?src=pr&el=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.29% <0%> (-0.05%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=footer).
 Last update 
[77da1cc...e837c7f](https://codecov.io/gh/apache/incubator-airflow/pull/4223?src=pr&el=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
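

For context on the cleanup these coverage reports cover: a bare `# noqa` 
comment tells flake8 to skip every check on its line, which becomes dead 
weight once the line no longer triggers any warning. The sketch below is 
illustrative only and is not the actual diff from `airflow/bin/cli.py`:

```python
# Illustrative only -- these lines are not the actual diff from
# airflow/bin/cli.py.

# A bare "# noqa" silences *every* flake8 check on its line; if the line
# is already clean (this import is used below), the directive does nothing
# and can simply be deleted:
from collections import defaultdict  # noqa

# When a suppression really is needed, scoping it to one error code keeps
# the rest of the line checked:
import json  # noqa: F401  (unused here; hypothetically kept for re-export)

counts = defaultdict(int)
counts["tasks"] += 1
```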