[jira] [Commented] (AIRFLOW-3900) Prevent undefined variables in templates

2020-06-22 Thread ASF subversion and git services (Jira)


[ https://issues.apache.org/jira/browse/AIRFLOW-3900?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17142404#comment-17142404 ]

ASF subversion and git services commented on AIRFLOW-3900:
--

Commit 53c4e1cfd7f24d68207d9bed63862099c5e6401f in airflow's branch refs/heads/v1-10-test from Joshua Carp
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=53c4e1c ]

[AIRFLOW-3900] Error on undefined template variables in unit tests. (#4719)

(cherry-picked from a7586648726aa99f0976150ad376c0ce553544b0)


> Prevent undefined variables in templates
> 
>
> Key: AIRFLOW-3900
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3900
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Joshua Carp
>Assignee: Joshua Carp
>Priority: Trivial
> Fix For: 2.0.0
>
>
> To ensure that we haven't introduced regressions related to undefined 
> variables in templates, run unit tests with jinja2 configured to raise an 
> exception on undefined variables. Now that flask-appbuilder has been updated 
> to fix undefined variables in base templates, we can be confident that any 
> undefined variables are caused by regressions in airflow.
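
The behaviour described above is plain Jinja2 configuration; a minimal sketch (not Airflow's actual test harness) of switching an environment from the silent default to ``StrictUndefined``:

```python
from jinja2 import Environment, StrictUndefined
from jinja2.exceptions import UndefinedError

# Default: an undefined variable silently renders as an empty string,
# which is how template regressions slip through unnoticed.
lenient = Environment()
assert lenient.from_string("Hello {{ missing }}!").render() == "Hello !"

# StrictUndefined: referencing an undefined variable raises UndefinedError,
# so running the test suite with this setting surfaces regressions as failures.
strict = Environment(undefined=StrictUndefined)
try:
    strict.from_string("Hello {{ missing }}!").render()
except UndefinedError as exc:
    print("caught:", exc)
```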



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[airflow] branch v1-10-test updated: [AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 1249b50  [AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)
1249b50 is described below

commit 1249b501ee315e579b462f30970aef74f3c99121
Author: Jacob Shao <41271167+realradi...@users.noreply.github.com>
AuthorDate: Mon May 25 09:09:02 2020 -0400

[AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)

Closes #8842
---
 airflow/www_rbac/forms.py| 61 +---
 airflow/www_rbac/utils.py|  7 +++--
 tests/www_rbac/test_views.py | 73 +---
 3 files changed, 123 insertions(+), 18 deletions(-)

diff --git a/airflow/www_rbac/forms.py b/airflow/www_rbac/forms.py
index c4f9796..48b9142 100644
--- a/airflow/www_rbac/forms.py
+++ b/airflow/www_rbac/forms.py
@@ -23,33 +23,75 @@ from __future__ import print_function
 from __future__ import unicode_literals
 
 import json
+from datetime import datetime as dt
 
+import pendulum
 from flask_appbuilder.fieldwidgets import (
     BS3PasswordFieldWidget, BS3TextAreaFieldWidget, BS3TextFieldWidget, Select2Widget,
 )
 from flask_appbuilder.forms import DynamicForm
 from flask_babel import lazy_gettext
 from flask_wtf import FlaskForm
+from wtforms import validators, widgets
+from wtforms.fields import (
+    BooleanField, Field, IntegerField, PasswordField, SelectField, StringField, TextAreaField,
+)
 
-from wtforms import validators
-from wtforms.fields import (IntegerField, SelectField, TextAreaField, PasswordField,
-                            StringField, DateTimeField, BooleanField)
+from airflow.configuration import conf
 from airflow.models import Connection
 from airflow.utils import timezone
 from airflow.www_rbac.validators import ValidJson
 from airflow.www_rbac.widgets import AirflowDateTimePickerWidget
 
 
+class DateTimeWithTimezoneField(Field):
+    """
+    A text field which stores a `datetime.datetime` matching a format.
+    """
+    widget = widgets.TextInput()
+
+    def __init__(self, label=None, validators=None, format=None, **kwargs):
+        super(DateTimeWithTimezoneField, self).__init__(label, validators, **kwargs)
+        self.format = format or "%Y-%m-%d %H:%M:%S%Z"
+
+    def _value(self):
+        if self.raw_data:
+            return ' '.join(self.raw_data)
+        else:
+            return self.data and self.data.strftime(self.format) or ''
+
+    def process_formdata(self, valuelist):
+        if valuelist:
+            date_str = ' '.join(valuelist)
+            try:
+                # Check if the datetime string is in the format without timezone, if so convert it to the
+                # default timezone
+                if len(date_str) == 19:
+                    parsed_datetime = dt.strptime(date_str, '%Y-%m-%d %H:%M:%S')
+                    default_timezone = pendulum.timezone('UTC')
+                    tz = conf.get("core", "default_timezone")
+                    if tz == "system":
+                        default_timezone = pendulum.local_timezone()
+                    else:
+                        default_timezone = pendulum.timezone(tz)
+                    self.data = default_timezone.convert(parsed_datetime)
+                else:
+                    self.data = pendulum.parse(date_str)
+            except ValueError:
+                self.data = None
+                raise ValueError(self.gettext('Not a valid datetime value'))
+
+
 class DateTimeForm(FlaskForm):
     # Date filter form needed for task views
-    execution_date = DateTimeField(
+    execution_date = DateTimeWithTimezoneField(
         "Execution date", widget=AirflowDateTimePickerWidget())
 
 
 class DateTimeWithNumRunsForm(FlaskForm):
     # Date time and number of runs form for tree view, task duration
     # and landing times
-    base_date = DateTimeField(
+    base_date = DateTimeWithTimezoneField(
         "Anchor date", widget=AirflowDateTimePickerWidget(), default=timezone.utcnow())
     num_runs = SelectField("Number of runs", default=25, choices=(
         (5, "5"),
@@ -70,10 +112,10 @@ class DagRunForm(DynamicForm):
         lazy_gettext('Dag Id'),
         validators=[validators.DataRequired()],
         widget=BS3TextFieldWidget())
-    start_date = DateTimeField(
+    start_date = DateTimeWithTimezoneField(
         lazy_gettext('Start Date'),
         widget=AirflowDateTimePickerWidget())
-    end_date = DateTimeField(
+    end_date = DateTimeWithTimezoneField(
         lazy_gettext('End Date'),
         widget=AirflowDateTimePickerWidget())
     run_id = StringField(
@@ -84,7 +126,7 @@ class DagRunForm(DynamicForm):
         lazy_gettext('State'),
         choices=(('success', 'success'), ('running', 'running'), 
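
A stdlib-only sketch of the branch in ``process_formdata`` above (using ``datetime.timezone`` in place of pendulum; ``parse_execution_date`` is an illustrative name, not Airflow's API):

```python
from datetime import datetime, timezone

def parse_execution_date(date_str, default_tz=timezone.utc):
    """Parse a date-picker string; naive values get the configured default timezone."""
    if len(date_str) == 19:  # matches '%Y-%m-%d %H:%M:%S', i.e. no timezone suffix
        naive = datetime.strptime(date_str, '%Y-%m-%d %H:%M:%S')
        return naive.replace(tzinfo=default_tz)
    # Otherwise expect an explicit UTC offset, e.g. '2018-07-06 05:04:03-04:00'
    return datetime.strptime(date_str, '%Y-%m-%d %H:%M:%S%z')

# A naive string is localized to the default timezone (UTC here) ...
print(parse_execution_date('2018-07-06 05:04:03'))
# ... while an explicit offset is preserved, so EDT 05:04 is UTC 09:04.
print(parse_execution_date('2018-07-06 05:04:03-04:00').astimezone(timezone.utc))
```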

[jira] [Commented] (AIRFLOW-3900) Prevent undefined variables in templates

2020-06-22 Thread ASF subversion and git services (Jira)


[ https://issues.apache.org/jira/browse/AIRFLOW-3900?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17142438#comment-17142438 ]

ASF subversion and git services commented on AIRFLOW-3900:
--

Commit bb34c057244720cf25d1745297b1d89da5c8f85c in airflow's branch refs/heads/v1-10-test from Kaxil Naik
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=bb34c05 ]

fixup! [AIRFLOW-3900] Error on undefined template variables in unit tests. (#4719)




[airflow] branch v1-10-test updated: fixup! [AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new c0be8c4  fixup! [AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)
c0be8c4 is described below

commit c0be8c4a93eefa60a09069af41d9fbca713d82e1
Author: Kaxil Naik 
AuthorDate: Mon Jun 22 23:45:45 2020 +0100

fixup! [AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)
---
 tests/www_rbac/test_views.py | 12 ++--
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index fb48cfe..d5eb040 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -28,7 +28,7 @@ import sys
 import tempfile
 import unittest
 import urllib
-from datetime import datetime as dt, timedelta, timezone as tz
+from datetime import datetime as dt, timedelta
 
 import pytest
 import six
@@ -2503,7 +2503,7 @@ class TestDagRunModelView(TestBase):
 
         dr = self.session.query(models.DagRun).one()
 
-        self.assertEqual(dr.execution_date, dt(2018, 7, 6, 5, 4, 3, tzinfo=tz.utc))
+        self.assertEqual(dr.execution_date, timezone.datetime(2018, 7, 6, 5, 4, 3))
 
     def test_create_dagrun_execution_date_with_timezone_edt(self):
         data = {
@@ -2519,7 +2519,7 @@ class TestDagRunModelView(TestBase):
 
         dr = self.session.query(models.DagRun).one()
 
-        self.assertEqual(dr.execution_date, dt(2018, 7, 6, 5, 4, 3, tzinfo=tz(timedelta(hours=-4))))
+        self.assertEqual(dr.execution_date, timezone.datetime(2018, 7, 6, 9, 4, 3))
 
     def test_create_dagrun_execution_date_with_timezone_pst(self):
         data = {
@@ -2535,7 +2535,7 @@ class TestDagRunModelView(TestBase):
 
         dr = self.session.query(models.DagRun).one()
 
-        self.assertEqual(dr.execution_date, dt(2018, 7, 6, 5, 4, 3, tzinfo=tz(timedelta(hours=-8))))
+        self.assertEqual(dr.execution_date, timezone.datetime(2018, 7, 6, 13, 4, 3))
 
     @conf_vars({("core", "default_timezone"): "America/Toronto"})
     def test_create_dagrun_execution_date_without_timezone_default_edt(self):
@@ -2552,7 +2552,7 @@ class TestDagRunModelView(TestBase):
 
         dr = self.session.query(models.DagRun).one()
 
-        self.assertEqual(dr.execution_date, dt(2018, 7, 6, 5, 4, 3, tzinfo=tz(timedelta(hours=-4))))
+        self.assertEqual(dr.execution_date, timezone.datetime(2018, 7, 6, 5, 4, 3))
 
     def test_create_dagrun_execution_date_without_timezone_default_utc(self):
         data = {
@@ -2568,7 +2568,7 @@ class TestDagRunModelView(TestBase):
 
         dr = self.session.query(models.DagRun).one()
 
-        self.assertEqual(dr.execution_date, dt(2018, 7, 6, 5, 4, 3, tzinfo=tz.utc))
+        self.assertEqual(dr.execution_date, dt(2018, 7, 6, 5, 4, 3, tzinfo=timezone.TIMEZONE))
 
     def test_create_dagrun_valid_conf(self):
         conf_value = dict(Valid=True)



[airflow] branch v1-10-test updated: Use Markup for htmlcontent for landing_times (#9242)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new b006690  Use Markup for htmlcontent for landing_times (#9242)
b006690 is described below

commit b006690fec9dbd8a4c5dca72d93e35d147eaa649
Author: Kaxil Naik 
AuthorDate: Fri Jun 12 12:28:26 2020 +0100

Use Markup for htmlcontent for landing_times (#9242)

(cherry picked from commit dcf65765e58fb8beba206454075bbd6675b65721)
---
 airflow/www/views.py  | 2 +-
 airflow/www_rbac/views.py | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index 6238f02..9d8e37a 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -2022,7 +2022,7 @@ class Airflow(AirflowViewMixin, BaseView):
 return self.render(
 'airflow/chart.html',
 dag=dag,
-chart=chart.htmlcontent,
+chart=Markup(chart.htmlcontent),
 height=str(chart_height + 100) + "px",
 demo_mode=conf.getboolean('webserver', 'demo_mode'),
 root=root,
diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index 7c601ba..be7a384 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -1890,7 +1890,7 @@ class Airflow(AirflowBaseView):
 return self.render_template(
 'airflow/chart.html',
 dag=dag,
-chart=chart.htmlcontent,
+chart=Markup(chart.htmlcontent),
 height=str(chart_height + 100) + "px",
 demo_mode=conf.getboolean('webserver', 'demo_mode'),
 root=root,
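
For context, the one-line change above matters because these templates render with autoescaping enabled: a plain string of chart HTML gets entity-escaped, while ``Markup`` (from MarkupSafe, which Flask re-exports) marks it as safe. A minimal sketch:

```python
from markupsafe import Markup, escape

chart_html = "<svg><rect/></svg>"  # stand-in for chart.htmlcontent

# Autoescaping escapes a plain string, so the chart markup would show as text...
assert str(escape(chart_html)) == "&lt;svg&gt;&lt;rect/&gt;&lt;/svg&gt;"

# ...while a Markup-wrapped string passes through untouched and actually renders.
assert str(escape(Markup(chart_html))) == "<svg><rect/></svg>"
```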



[airflow] 01/05: Use existing DagBag for 'dag_details' & `trigger` Endpoints (#8501)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b4f7360eb4a0c10caa391f9921fc94570605e245
Author: Kaxil Naik 
AuthorDate: Tue Apr 21 22:24:58 2020 +0100

Use existing DagBag for 'dag_details' & `trigger` Endpoints (#8501)
---
 airflow/www_rbac/views.py|  4 +---
 tests/www_rbac/test_views.py | 25 +
 2 files changed, 26 insertions(+), 3 deletions(-)

diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index bf33f2b..e78b61b 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -562,9 +562,7 @@ class Airflow(AirflowBaseView):
     @provide_session
     def dag_details(self, session=None):
         dag_id = request.args.get('dag_id')
-        dag_orm = DagModel.get_dagmodel(dag_id, session=session)
-        # FIXME: items needed for this view should move to the database
-        dag = dag_orm.get_dag(STORE_SERIALIZED_DAGS)
+        dag = dagbag.get_dag(dag_id)
         title = "DAG details"
         root = request.args.get('root', '')
 
diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index b705cad..68a605a 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -565,6 +565,19 @@ class TestAirflowBaseViews(TestBase):
         resp = self.client.get(url, follow_redirects=True)
         self.check_content_in_response('DAG details', resp)
 
+    @parameterized.expand(["graph", "tree", "dag_details"])
+    @mock.patch('airflow.www_rbac.views.dagbag.get_dag')
+    def test_view_uses_existing_dagbag(self, endpoint, mock_get_dag):
+        """
+        Test that Graph, Tree & Dag Details View uses the DagBag already created in views.py
+        instead of creating a new one.
+        """
+        mock_get_dag.return_value = DAG(dag_id='example_bash_operator')
+        url = '{}?dag_id=example_bash_operator'.format(endpoint)
+        resp = self.client.get(url, follow_redirects=True)
+        mock_get_dag.assert_called_once_with('example_bash_operator')
+        self.check_content_in_response('example_bash_operator', resp)
+
     def test_dag_details_trigger_origin_tree_view(self):
         dag = self.dagbag.dags['test_tree_view']
         dag.create_dagrun(
@@ -2207,6 +2220,18 @@ class TestTriggerDag(TestBase):
         self.check_content_in_response(
             'Triggered example_bash_operator, it should start any moment now.', response)
 
+    @mock.patch('airflow.www_rbac.views.dagbag.get_dag')
+    def test_trigger_endpoint_uses_existing_dagbag(self, mock_get_dag):
+        """
+        Test that Trigger Endpoint uses the DagBag already created in views.py
+        instead of creating a new one.
+        """
+        mock_get_dag.return_value = DAG(dag_id='example_bash_operator')
+        url = 'trigger?dag_id=example_bash_operator'
+        resp = self.client.post(url, data={}, follow_redirects=True)
+        mock_get_dag.assert_called_once_with('example_bash_operator')
+        self.check_content_in_response('example_bash_operator', resp)
+
 
 class TestExtraLinks(TestBase):
     def setUp(self):



[airflow] branch v1-10-test updated (d65053c -> 92a1040)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from d65053c  [AIRFLOW-4357] Fix SVG tooltip positioning with custom scripting (#8269)
 new b4f7360  Use existing DagBag for 'dag_details' & `trigger` Endpoints (#8501)
 new 8405787  Make hive macros py3 compatible (#8598)
 new 3127b0a  Enhanced documentation around Cluster Policy (#8661)
 new 731ee51  Improve tutorial - Include all imports statements (#8670)
 new 92a1040  Fix docs on creating CustomOperator (#8678)

The 5 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/example_dags/tutorial.py |  2 +-
 airflow/hooks/hive_hooks.py  |  3 ++-
 airflow/www_rbac/views.py|  4 +---
 docs/concepts.rst|  7 +++
 docs/howto/custom-operator.rst   |  6 +++---
 tests/hooks/test_hive_hook.py| 15 ---
 tests/www_rbac/test_views.py | 25 +
 7 files changed, 51 insertions(+), 11 deletions(-)



[airflow] 04/05: Improve tutorial - Include all imports statements (#8670)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 731ee51ae1ba6a8ab51f3ac861ab2b9b73bcb094
Author: Lyalpha 
AuthorDate: Sun May 3 17:25:39 2020 +0100

Improve tutorial - Include all imports statements (#8670)

(cherry picked from commit 62796b9e0154daf38de72ebca36e3175001fbf03)
---
 airflow/example_dags/tutorial.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/example_dags/tutorial.py b/airflow/example_dags/tutorial.py
index 18685d1..8858452 100644
--- a/airflow/example_dags/tutorial.py
+++ b/airflow/example_dags/tutorial.py
@@ -23,9 +23,9 @@ Documentation that goes along with the Airflow tutorial located
 [here](https://airflow.apache.org/tutorial.html)
 """
 # [START tutorial]
+# [START import_module]
 from datetime import timedelta
 
-# [START import_module]
 # The DAG object; we'll need this to instantiate a DAG
 from airflow import DAG
 # Operators; we need this to operate!



[airflow] 05/05: Fix docs on creating CustomOperator (#8678)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 92a1040ce152118997fe75231acc4069e3668b48
Author: Jonny Fuller 
AuthorDate: Sat May 9 20:33:45 2020 -0400

Fix docs on creating CustomOperator (#8678)

(cherry picked from commit 5e1c33a1baf0725eeb695a96b29ddd9585df51e4)
---
 docs/howto/custom-operator.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/howto/custom-operator.rst b/docs/howto/custom-operator.rst
index 7713468..a9733d2 100644
--- a/docs/howto/custom-operator.rst
+++ b/docs/howto/custom-operator.rst
@@ -148,7 +148,7 @@ the operator.
             self.name = name
 
         def execute(self, context):
-            message = "Hello from {}".format(name)
+            message = "Hello from {}".format(self.name)
             print(message)
             return message
 
@@ -157,9 +157,9 @@ You can use the template as follows:
 .. code:: python
 
     with dag:
-        hello_task = HelloOperator(task_id='task_id_1', dag=dag, name='{{ task_id }}')
+        hello_task = HelloOperator(task_id='task_id_1', dag=dag, name='{{ task_instance.task_id }}')
 
-In this example, Jinja looks for the ``name`` parameter and substitutes ``{{ task_id }}`` with
+In this example, Jinja looks for the ``name`` parameter and substitutes ``{{ task_instance.task_id }}`` with
 ``task_id_1``.
 
 



[airflow] 02/05: Make hive macros py3 compatible (#8598)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8405787b6915d67be09ed3a0b18bacb2d130d03e
Author: Ace Haidrey 
AuthorDate: Tue Jun 16 13:19:41 2020 -0700

Make hive macros py3 compatible (#8598)

Co-authored-by: Ace Haidrey 

(cherry-picked from c78e2a5)
---
 airflow/hooks/hive_hooks.py   |  3 ++-
 tests/hooks/test_hive_hook.py | 15 ---
 2 files changed, 14 insertions(+), 4 deletions(-)

diff --git a/airflow/hooks/hive_hooks.py b/airflow/hooks/hive_hooks.py
index 4e41834..ec37dfb 100644
--- a/airflow/hooks/hive_hooks.py
+++ b/airflow/hooks/hive_hooks.py
@@ -691,6 +691,7 @@ class HiveMetastoreHook(BaseHook):
                                pairs will be considered as candidates of max partition.
         :type filter_map: map
         :return: Max partition or None if part_specs is empty.
+        :rtype: basestring
         """
         if not part_specs:
             return None
@@ -714,7 +715,7 @@ class HiveMetastoreHook(BaseHook):
         if not candidates:
             return None
         else:
-            return max(candidates).encode('utf-8')
+            return max(candidates)
 
 def max_partition(self, schema, table_name, field=None, filter_map=None):
 """
diff --git a/tests/hooks/test_hive_hook.py b/tests/hooks/test_hive_hook.py
index 98c024f..011038a 100644
--- a/tests/hooks/test_hive_hook.py
+++ b/tests/hooks/test_hive_hook.py
@@ -362,7 +362,7 @@ class TestHiveMetastoreHook(TestHiveEnvironment):
                 None)
 
         # No partition will be filtered out.
-        self.assertEqual(max_partition, b'value3')
+        self.assertEqual(max_partition, 'value3')
 
     def test_get_max_partition_from_valid_part_specs(self):
         max_partition = \
@@ -371,7 +371,16 @@ class TestHiveMetastoreHook(TestHiveEnvironment):
                  {'key1': 'value3', 'key2': 'value4'}],
                 'key1',
                 self.VALID_FILTER_MAP)
-        self.assertEqual(max_partition, b'value1')
+        self.assertEqual(max_partition, 'value1')
+
+    def test_get_max_partition_from_valid_part_specs_return_type(self):
+        max_partition = \
+            HiveMetastoreHook._get_max_partition_from_part_specs(
+                [{'key1': 'value1', 'key2': 'value2'},
+                 {'key1': 'value3', 'key2': 'value4'}],
+                'key1',
+                self.VALID_FILTER_MAP)
+        self.assertIsInstance(max_partition, str)
 
     @patch("airflow.hooks.hive_hooks.HiveMetastoreHook.get_connection",
            return_value=[Connection(host="localhost", port="9802")])
@@ -522,7 +531,7 @@ class TestHiveMetastoreHook(TestHiveEnvironment):
             table_name=self.table,
             field=self.partition_by,
             filter_map=filter_map)
-        self.assertEqual(partition, DEFAULT_DATE_DS.encode('utf-8'))
+        self.assertEqual(partition, DEFAULT_DATE_DS)
 
 metastore.get_table.assert_called_with(
 dbname=self.database, tbl_name=self.table)
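
The removed ``.encode('utf-8')`` is the heart of the py3 fix: under Python 2 encoding produced a ``str``, but under Python 3 it yields ``bytes``, which never compare equal to the ``str`` partition names callers expect. A quick illustration:

```python
candidates = ['value1', 'value3']

# In Python 3, max() over str candidates is already a str; no encode needed.
assert max(candidates) == 'value3'

# Encoding turns the result into bytes, and bytes never compare equal to str
# in Python 3, which is why the test assertions above dropped the b'' prefixes.
assert max(candidates).encode('utf-8') == b'value3'
assert b'value3' != 'value3'
```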



[airflow] 03/05: Enhanced documentation around Cluster Policy (#8661)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3127b0a071c66ec0d53adaef055c9cf4bdcd1fe4
Author: Vardan Gupta 
AuthorDate: Sat May 2 01:49:53 2020 +0530

Enhanced documentation around Cluster Policy (#8661)

(cherry picked from commit 6560f29fa206fe1fcc99d0ee4093d678caf74511)
---
 docs/concepts.rst | 7 +++
 1 file changed, 7 insertions(+)

diff --git a/docs/concepts.rst b/docs/concepts.rst
index 068321e..89479d4 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -1017,6 +1017,13 @@ may look like inside your ``airflow_local_settings.py``:
         if task.timeout > timedelta(hours=48):
             task.timeout = timedelta(hours=48)
 
+To define policy, add a ``airflow_local_settings`` module to your PYTHONPATH
+or to AIRFLOW_HOME/config folder that defines this ``policy`` function. It receives a ``TaskInstance``
+object and can alter it where needed.
+
+Please note, cluster policy currently applies to task only though you can access DAG via ``task.dag`` property.
+Also, cluster policy will have precedence over task attributes defined in DAG
+meaning if ``task.sla`` is defined in dag and also mutated via cluster policy then later will have precedence.
 
 Documentation & Notes
 =====================
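
Putting the documented pieces together, an ``airflow_local_settings.py`` might look like the sketch below (the ``policy`` hook name comes from the docs above; the queue rule and the ``FakeTask`` stub are illustrative only, so the snippet runs without an Airflow installation):

```python
from datetime import timedelta

def policy(task):
    """Cluster policy hook: called for each task as it is loaded.

    Mirrors the docs above: cap long timeouts and route some DAGs to a queue.
    """
    if getattr(task, 'timeout', None) and task.timeout > timedelta(hours=48):
        task.timeout = timedelta(hours=48)
    if task.dag_id.startswith('etl_'):
        task.queue = 'etl'

# Stand-in for a real operator, just to exercise the hook:
class FakeTask:
    dag_id = 'etl_orders'
    timeout = timedelta(hours=72)
    queue = 'default'

task = FakeTask()
policy(task)
print(task.timeout, task.queue)
```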



[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443754771



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,198 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager
 
+from airflow import DAG
+from airflow.models.baseoperator import BaseOperatorLink
+from airflow.models.dagrun import DagRun
+from airflow.models.xcom import XCom
+from airflow.plugins_manager import AirflowPlugin
+from airflow.providers.google.cloud.operators.bigquery import BigQueryExecuteQueryOperator
+from airflow.utils.dates import days_ago
+from airflow.utils.session import provide_session
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 from airflow.www import app
+from tests.test_utils.db import clear_db_runs, clear_db_xcom
 
 
 class TestGetExtraLinks(unittest.TestCase):
     @classmethod
     def setUpClass(cls) -> None:
         super().setUpClass()
-        cls.app = app.create_app(testing=True)  # type:ignore
+        with mock.patch.dict('os.environ', SKIP_DAGS_PARSING='True'):
+            cls.app = app.create_app(testing=True)  # type:ignore
+
+    @provide_session
+    def setUp(self, session) -> None:
+        self.now = datetime(2020, 1, 1)

Review comment:
   ```suggestion
   self.default_time = datetime(2020, 1, 1)
   ```
   I suggest we use default_time instead of now





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch v1-10-test updated (c2e1270 -> 215fc24)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from c2e1270  Fix displaying Executor Class Name in "Base Job" table (#8679)
 new 5787102  Prevent clickable sorting on non sortable columns in TI view (#8681)
 new 215fc24  Fix connection add/edit for spark (#8685)

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/models/connection.py  | 1 +
 airflow/www/static/connection_form.js | 4 
 airflow/www_rbac/static/js/connection_form.js | 4 
 airflow/www_rbac/views.py | 2 ++
 4 files changed, 11 insertions(+)



[airflow] 02/02: Fix connection add/edit for spark (#8685)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 215fc24b12b9923ccf572da8d10e528caae46904
Author: Xiaodong DENG 
AuthorDate: Sun May 3 20:17:21 2020 +0200

Fix connection add/edit for spark (#8685)

connection add/edit UI pages were not working correctly for Spark connections.

The root-cause is that "spark" is not listed in models.Connection._types.
So when www/forms.py tries to produce the UI,
"spark" is not available and it always tried to "fall back" to the option list
whose first entry is "Docker"

In addition, we should hide irrelevant entries for
spark connections ("schema", "login", and "password")

(cherry picked from commit 0b598a2162e9d27160f0afe19e8908df735068af)
---
 airflow/models/connection.py  | 1 +
 airflow/www/static/connection_form.js | 4 
 airflow/www_rbac/static/js/connection_form.js | 4 
 3 files changed, 9 insertions(+)

diff --git a/airflow/models/connection.py b/airflow/models/connection.py
index a1c2253..0ab305f 100644
--- a/airflow/models/connection.py
+++ b/airflow/models/connection.py
@@ -108,6 +108,7 @@ class Connection(Base, LoggingMixin):
 ('gcpcloudsql', 'Google Cloud SQL'),
 ('grpc', 'GRPC Connection'),
 ('yandexcloud', 'Yandex Cloud'),
+('spark', 'Spark'),
 ]
 
 def __init__(
diff --git a/airflow/www/static/connection_form.js b/airflow/www/static/connection_form.js
index 0f668ae..2ffab82 100644
--- a/airflow/www/static/connection_form.js
+++ b/airflow/www/static/connection_form.js
@@ -74,6 +74,10 @@
         hidden_fields: ['host', 'schema', 'login', 'password', 'port', 'extra'],
         relabeling: {},
     },
+    spark: {
+        hidden_fields: ['schema', 'login', 'password'],
+        relabeling: {},
+    },
   }
   function connTypeChange(connectionType) {
 $("div.form-group").removeClass("hide");
diff --git a/airflow/www_rbac/static/js/connection_form.js b/airflow/www_rbac/static/js/connection_form.js
index 8e371fa..95f3ab1 100644
--- a/airflow/www_rbac/static/js/connection_form.js
+++ b/airflow/www_rbac/static/js/connection_form.js
@@ -61,6 +61,10 @@ $(document).ready(function () {
 'login': 'Username',
   }
 },
+spark: {
+  hidden_fields: ['schema', 'login', 'password'],
+  relabeling: {},
+},
   };
 
   function connTypeChange(connectionType) {



[airflow] 01/02: Prevent clickable sorting on non sortable columns in TI view (#8681)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 57871027ab15a72e70bfb2694c7b87c3b554ae67
Author: Ace Haidrey 
AuthorDate: Mon May 4 15:00:02 2020 -0700

Prevent clickable sorting on non sortable columns in TI view (#8681)

Co-authored-by: Ace Haidrey 

(cherry-picked from b31ad51)
---
 airflow/www_rbac/views.py | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index e78b61b..7c601ba 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -2646,6 +2646,8 @@ class TaskInstanceModelView(AirflowModelView):
                     'unixname', 'priority_weight', 'queue', 'queued_dttm', 'try_number',
                     'pool', 'log_url']
 
+    order_columns = [item for item in list_columns if item not in ['try_number', 'log_url']]
+
     search_columns = ['state', 'dag_id', 'task_id', 'execution_date', 'hostname',
                       'queue', 'pool', 'operator', 'start_date', 'end_date']
 



[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443754097



##
File path: airflow/api_connexion/endpoints/extra_link_endpoint.py
##
@@ -18,9 +18,41 @@
 # TODO(mik-laj): We have to implement it.
 # Do you want to help? Please look at: https://github.com/apache/airflow/issues/8140
 
+from flask import current_app
 
-def get_extra_links():
+from airflow import DAG
+from airflow.api_connexion.exceptions import NotFound
+from airflow.exceptions import TaskNotFound
+from airflow.models.dagbag import DagBag
+from airflow.models.dagrun import DagRun as DR
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_extra_links(dag_id: str, dag_run_id: str, task_id: str, session):
     """
     Get extra links for task instance
     """
-    raise NotImplementedError("Not implemented yet.")
+    dagbag: DagBag = current_app.dag_bag
+    dag: DAG = dagbag.get_dag(dag_id)
+    if not dag:
+        raise NotFound("DAG not found")
+
+    try:
+        task = dag.get_task(task_id)
+    except TaskNotFound:
+        raise NotFound("Task not found")
+
+    execution_date = (
+        session.query(DR.execution_date).filter(DR.dag_id == dag_id).filter(DR.run_id == dag_run_id).scalar()

Review comment:
   ```suggestion
        session.query(DR.execution_date).filter(DR.dag_id == dag_id, DR.run_id == dag_run_id).one()
   ```
   I'm wrong about using `one`. `scalar()` or `first()` is better





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
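
For readers skimming the thread: ``Query.one()`` raises when a query matches zero (or several) rows, while ``first()`` and ``scalar()`` return ``None`` on an empty result, which is what lets the endpoint map a missing dag run to a 404. The contract, mimicked in plain Python (these helpers are illustrative stand-ins, not SQLAlchemy itself):

```python
def one(rows):
    """Mimic SQLAlchemy's Query.one(): exactly one row, or an error."""
    if not rows:
        raise LookupError("NoResultFound")        # SQLAlchemy raises NoResultFound
    if len(rows) > 1:
        raise LookupError("MultipleResultsFound")
    return rows[0]

def first(rows):
    """Mimic Query.first(): the first row, or None when nothing matched."""
    return rows[0] if rows else None

assert first([]) is None     # None is easy to turn into a NotFound response
try:
    one([])                  # an unhandled exception would bubble up as a 500
except LookupError as exc:
    print("raised:", exc)
```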




[GitHub] [airflow] jhtimmins commented on pull request #9476: Remove PATCH /dags/{dag_id}/dagRuns/{dag_run_id} endpoint

2020-06-22 Thread GitBox


jhtimmins commented on pull request #9476:
URL: https://github.com/apache/airflow/pull/9476#issuecomment-647782731


   This code looks fine. @mik-laj do you know if there was a reason the spec included the ability to update dag runs via PATCH?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] jhtimmins commented on pull request #9472: Add drop_partition functionality for HiveMetastoreHook

2020-06-22 Thread GitBox


jhtimmins commented on pull request #9472:
URL: https://github.com/apache/airflow/pull/9472#issuecomment-647790321


   Looks like this is failing static checks. I recommend running pre-commit 
checks







[GitHub] [airflow] jhtimmins removed a comment on pull request #9476: Remove PATCH /dags/{dag_id}/dagRuns/{dag_run_id} endpoint

2020-06-22 Thread GitBox


jhtimmins removed a comment on pull request #9476:
URL: https://github.com/apache/airflow/pull/9476#issuecomment-647790169


   Looks like this is failing static checks. I recommend running 
[pre-commit](https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#pre-commit-hooks)
 checks







[GitHub] [airflow] jhtimmins commented on pull request #9476: Remove PATCH /dags/{dag_id}/dagRuns/{dag_run_id} endpoint

2020-06-22 Thread GitBox


jhtimmins commented on pull request #9476:
URL: https://github.com/apache/airflow/pull/9476#issuecomment-647790169


   Looks like this is failing static checks. I recommend running 
[pre-commit](https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#pre-commit-hooks)
 checks







[GitHub] [airflow] kaxil opened a new pull request #9479: Remove need of datetime.timezone in test_views.py

2020-06-22 Thread GitBox


kaxil opened a new pull request #9479:
URL: https://github.com/apache/airflow/pull/9479


   `datetime.timezone` can easily be replaced by the helpers in `airflow.utils.timezone`
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   







[GitHub] [airflow] OmairK commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


OmairK commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443746796



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,198 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager
 
+from airflow import DAG
+from airflow.models.baseoperator import BaseOperatorLink
+from airflow.models.dagrun import DagRun
+from airflow.models.xcom import XCom
+from airflow.plugins_manager import AirflowPlugin
+from airflow.providers.google.cloud.operators.bigquery import 
BigQueryExecuteQueryOperator
+from airflow.utils.dates import days_ago
+from airflow.utils.session import provide_session
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 from airflow.www import app
+from tests.test_utils.db import clear_db_runs, clear_db_xcom
 
 
 class TestGetExtraLinks(unittest.TestCase):
 @classmethod
 def setUpClass(cls) -> None:
 super().setUpClass()
-cls.app = app.create_app(testing=True)  # type:ignore
+with mock.patch.dict('os.environ', SKIP_DAGS_PARSING='True'):
+cls.app = app.create_app(testing=True)  # type:ignore
+
+@provide_session
+def setUp(self, session) -> None:
+self.now = datetime(2020, 1, 1)
+
+clear_db_runs()
+clear_db_xcom()
+
+self.dag = self._create_dag()
+self.app.dag_bag.dags = {self.dag.dag_id: self.dag}  # type: ignore  # pylint: disable=no-member
+self.app.dag_bag.sync_to_db()   # type: ignore  # pylint: disable=no-member
+
+dr = DagRun(
+dag_id=self.dag.dag_id,
+run_id="TEST_DAG_RUN_ID",
+execution_date=self.now,
+run_type=DagRunType.MANUAL.value,
+)
+session.add(dr)
+session.commit()
 
-def setUp(self) -> None:
 self.client = self.app.test_client()  # type:ignore
 
-@pytest.mark.skip(reason="Not implemented yet")
+def tearDown(self) -> None:
+super().tearDown()
+clear_db_runs()
+clear_db_xcom()
+
+@staticmethod
+def _create_dag():
+with DAG(dag_id="TEST_DAG_ID", default_args=dict(start_date=days_ago(2),)) as dag:
+BigQueryExecuteQueryOperator(task_id="TEST_SINGLE_QUERY", sql="SELECT 1")
+BigQueryExecuteQueryOperator(task_id="TEST_MULTIPLE_QUERY", sql=["SELECT 1", "SELECT 2"])
+return dag
+
+@parameterized.expand(
+[
+(
+"/api/v1/dags/INVALID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links",
+'DAG not found'
+),
+(
+"/api/v1/dags/TEST_DAG_ID/dagRuns/INVALID/taskInstances/TEST_SINGLE_QUERY/links",
+"DAG Run not found"
+),
+(
+"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/INVALID/links",
+"Task not found"
+),
+]
+)
+def test_should_response_404_on_invalid_task_id(self, url, expected_title):
+response = self.client.get(url)
+
+self.assertEqual(404, response.status_code)
+self.assertEqual({'detail': None, 'status': 404, 'title': expected_title, 'type': 'about:blank'}, response.json)
+
+@mock_plugin_manager(plugins=[])
 def test_should_response_200(self):
+XCom.set(
+key="job_id",
+value="TEST_JOB_ID",
+execution_date=self.now,
+task_id="TEST_SINGLE_QUERY",
+dag_id=self.dag.dag_id,
+)
 response = self.client.get(
-"/dags/TEST_DG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_TASK_ID/links"
+"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links"
+)
+
+self.assertEqual(200, response.status_code, response.data)

Review comment:
   ```suggestion
   self.assertEqual(200, response.status_code)
   ```
   I have never used it, but I guess `response.data` returns `response.json` as a bytes literal?









[GitHub] [airflow] mik-laj opened a new pull request #9478: Add link to ADC docs in use-alternative-secret-backend.rst

2020-06-22 Thread GitBox


mik-laj opened a new pull request #9478:
URL: https://github.com/apache/airflow/pull/9478


   The ADC is very complex, and a thorough description of the process is beyond the scope of our 
documentation, so this PR adds links to the Google documentation. Unfortunately, Google also 
does not have a single public document that fully describes the ADC, so I had to add two links.
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [X] Description above provides context of the change
   - [X] Unit tests coverage for changes (not needed for documentation changes)
   - [X] Target Github ISSUE in description if exists
   - [X] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [X] Relevant documentation is updated including usage instructions.
   - [X] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   







[airflow] 02/02: Pinning max pandas version to 2.0 (lesser than) to allow pandas 1.0. (#7954)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d095b5870168a6522b84f381beed0fb63f7342db
Author: JPFrancoia 
AuthorDate: Mon Mar 30 22:52:04 2020 +0200

Pinning max pandas version to 2.0 (lesser than) to allow pandas 1.0. (#7954)

(cherry picked from commit 1428c0f796e5bed5724e26388485c173e18bdad5)
---
 requirements/requirements-python2.7.txt | 6 +++---
 requirements/requirements-python3.5.txt | 6 +++---
 requirements/requirements-python3.6.txt | 8 
 requirements/requirements-python3.7.txt | 8 
 requirements/setup-2.7.md5  | 2 +-
 requirements/setup-3.5.md5  | 2 +-
 requirements/setup-3.6.md5  | 2 +-
 requirements/setup-3.7.md5  | 2 +-
 setup.py| 2 +-
 9 files changed, 19 insertions(+), 19 deletions(-)

diff --git a/requirements/requirements-python2.7.txt 
b/requirements/requirements-python2.7.txt
index 4929fb8..5d8c1df 100644
--- a/requirements/requirements-python2.7.txt
+++ b/requirements/requirements-python2.7.txt
@@ -69,9 +69,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 bleach==3.1.5
 blinker==1.4
-boto3==1.14.7
+boto3==1.14.8
 boto==2.49.0
-botocore==1.17.7
+botocore==1.17.8
 cached-property==1.5.1
 cachetools==3.1.1
 cassandra-driver==3.20.2
@@ -112,7 +112,7 @@ email-validator==1.1.1
 entrypoints==0.3
 enum34==1.1.10
 execnet==1.7.1
-fastavro==0.23.4
+fastavro==0.23.5
 filelock==3.0.12
 flake8-colors==0.1.6
 flake8==3.8.3
diff --git a/requirements/requirements-python3.5.txt 
b/requirements/requirements-python3.5.txt
index d145814..4f12b75 100644
--- a/requirements/requirements-python3.5.txt
+++ b/requirements/requirements-python3.5.txt
@@ -60,9 +60,9 @@ bcrypt==3.1.7
 beautifulsoup4==4.7.1
 billiard==3.6.3.0
 blinker==1.4
-boto3==1.14.7
+boto3==1.14.8
 boto==2.49.0
-botocore==1.17.7
+botocore==1.17.8
 cached-property==1.5.1
 cachetools==4.1.0
 cassandra-driver==3.20.2
@@ -99,7 +99,7 @@ elasticsearch==5.5.3
 email-validator==1.1.1
 entrypoints==0.3
 execnet==1.7.1
-fastavro==0.23.4
+fastavro==0.23.5
 filelock==3.0.12
 flake8-colors==0.1.6
 flake8==3.8.3
diff --git a/requirements/requirements-python3.6.txt 
b/requirements/requirements-python3.6.txt
index 8c400d1..85fb5f4 100644
--- a/requirements/requirements-python3.6.txt
+++ b/requirements/requirements-python3.6.txt
@@ -62,9 +62,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 black==19.10b0
 blinker==1.4
-boto3==1.14.7
+boto3==1.14.8
 boto==2.49.0
-botocore==1.17.7
+botocore==1.17.8
 cached-property==1.5.1
 cachetools==4.1.0
 cassandra-driver==3.20.2
@@ -101,7 +101,7 @@ elasticsearch==5.5.3
 email-validator==1.1.1
 entrypoints==0.3
 execnet==1.7.1
-fastavro==0.23.4
+fastavro==0.23.5
 filelock==3.0.12
 flake8-colors==0.1.6
 flake8==3.8.3
@@ -202,7 +202,7 @@ oauthlib==3.1.0
 oscrypto==1.2.0
 packaging==20.4
 pandas-gbq==0.13.2
-pandas==0.25.3
+pandas==1.0.5
 papermill==2.1.1
 parameterized==0.7.4
 paramiko==2.7.1
diff --git a/requirements/requirements-python3.7.txt 
b/requirements/requirements-python3.7.txt
index 484b1eb..1483b27 100644
--- a/requirements/requirements-python3.7.txt
+++ b/requirements/requirements-python3.7.txt
@@ -62,9 +62,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 black==19.10b0
 blinker==1.4
-boto3==1.14.7
+boto3==1.14.8
 boto==2.49.0
-botocore==1.17.7
+botocore==1.17.8
 cached-property==1.5.1
 cachetools==4.1.0
 cassandra-driver==3.20.2
@@ -101,7 +101,7 @@ elasticsearch==5.5.3
 email-validator==1.1.1
 entrypoints==0.3
 execnet==1.7.1
-fastavro==0.23.4
+fastavro==0.23.5
 filelock==3.0.12
 flake8-colors==0.1.6
 flake8==3.8.3
@@ -201,7 +201,7 @@ oauthlib==3.1.0
 oscrypto==1.2.0
 packaging==20.4
 pandas-gbq==0.13.2
-pandas==0.25.3
+pandas==1.0.5
 papermill==2.1.1
 parameterized==0.7.4
 paramiko==2.7.1
diff --git a/requirements/setup-2.7.md5 b/requirements/setup-2.7.md5
index e89e341..0cea857 100644
--- a/requirements/setup-2.7.md5
+++ b/requirements/setup-2.7.md5
@@ -1 +1 @@
-cb2e93d35d589fe61d94c9c6deb731f0  /opt/airflow/setup.py
+77db06fc6e178c2ddc7e84f3c63d4c63  /opt/airflow/setup.py
diff --git a/requirements/setup-3.5.md5 b/requirements/setup-3.5.md5
index e89e341..0cea857 100644
--- a/requirements/setup-3.5.md5
+++ b/requirements/setup-3.5.md5
@@ -1 +1 @@
-cb2e93d35d589fe61d94c9c6deb731f0  /opt/airflow/setup.py
+77db06fc6e178c2ddc7e84f3c63d4c63  /opt/airflow/setup.py
diff --git a/requirements/setup-3.6.md5 b/requirements/setup-3.6.md5
index e89e341..0cea857 100644
--- a/requirements/setup-3.6.md5
+++ b/requirements/setup-3.6.md5
@@ -1 +1 @@
-cb2e93d35d589fe61d94c9c6deb731f0  /opt/airflow/setup.py
+77db06fc6e178c2ddc7e84f3c63d4c63  /opt/airflow/setup.py
diff --git a/requirements/setup-3.7.md5 b/requirements/setup-3.7.md5
index e89e341..0cea857 100644
--- a/requirements/setup-3.7.md5
+++ b/requirements/setup-3.7.md5
@@ -1 +1 @@
-cb2e93d35d589fe61d94c9c6deb731f0  /opt/airflow/setup.py

[airflow] 01/02: Update example webserver_config.py to show correct CSRF config (#8944)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c036cdf29b5551626c5354df19d905fc166a847b
Author: Ash Berlin-Taylor 
AuthorDate: Thu May 21 12:12:29 2020 +0100

Update example webserver_config.py to show correct CSRF config (#8944)

CSRF_ENABLED does nothing.

Thankfully, due to sensible defaults in flask-wtf, CSRF is on by
default, but we should set this correctly.

Fixes #8915

(cherry picked from commit 16206cd6262a1e4d51bc425d52cfa61141aaaffc)
---
 airflow/config_templates/default_webserver_config.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/config_templates/default_webserver_config.py 
b/airflow/config_templates/default_webserver_config.py
index 23d3985..33eef9f 100644
--- a/airflow/config_templates/default_webserver_config.py
+++ b/airflow/config_templates/default_webserver_config.py
@@ -32,7 +32,7 @@ basedir = os.path.abspath(os.path.dirname(__file__))
 SQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN')
 
 # Flask-WTF flag for CSRF
-CSRF_ENABLED = True
+WTF_CSRF_ENABLED = True
 
 # 
 # AUTHENTICATION CONFIG
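
As the commit message notes, Flask-WTF only reads the `WTF_CSRF_ENABLED` key; a minimal sketch of the corrected `webserver_config.py` fragment:

```python
# webserver_config.py (sketch). Flask-WTF reads WTF_CSRF_ENABLED; the legacy
# CSRF_ENABLED key shown before this fix did nothing, and CSRF protection
# stayed on only because flask-wtf defaults the setting to True.
WTF_CSRF_ENABLED = True
```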



[airflow] branch v1-10-test updated (215fc24 -> d095b58)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 215fc24  Fix connection add/edit for spark (#8685)
 new c036cdf  Update example webserver_config.py to show correct CSRF 
config (#8944)
 new d095b58  Pinning max pandas version to 2.0 (lesser than) to allow 
pandas 1.0. (#7954)

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/config_templates/default_webserver_config.py | 2 +-
 requirements/requirements-python2.7.txt  | 6 +++---
 requirements/requirements-python3.5.txt  | 6 +++---
 requirements/requirements-python3.6.txt  | 8 
 requirements/requirements-python3.7.txt  | 8 
 requirements/setup-2.7.md5   | 2 +-
 requirements/setup-3.5.md5   | 2 +-
 requirements/setup-3.6.md5   | 2 +-
 requirements/setup-3.7.md5   | 2 +-
 setup.py | 2 +-
 10 files changed, 20 insertions(+), 20 deletions(-)



[GitHub] [airflow] Acehaidrey commented on a change in pull request #9472: Add drop_partition functionality for HiveMetastoreHook

2020-06-22 Thread GitBox


Acehaidrey commented on a change in pull request #9472:
URL: https://github.com/apache/airflow/pull/9472#discussion_r443822020



##
File path: airflow/providers/apache/hive/hooks/hive.py
##
@@ -775,6 +775,20 @@ def table_exists(self, table_name, db='default'):
 except Exception:  # pylint: disable=broad-except
 return False
 
+def drop_partitions(self, table_name, part_vals, delete_data=False, db='default'):
+"""
+Drop partitions matching param_names input
+>>> hh = HiveMetastoreHook()
+>>> hh.drop_partitions(db='airflow', table_name='static_babynames', part_vals="['2020-05-01']")
+True
+"""
+if self.table_exists(table_name, db):
+with self.metastore as client:
+return client.drop_partition(db, table_name, part_vals, delete_data)
+else:
+self.log.info("Table %s.%s does not exist!" % (db, table_name))

Review comment:
   Maybe also log whether the partition exists. Do we want to check that too before 
we try to drop the partition? There is a function in the hook for this as well.
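
   A hypothetical sketch of the reviewer's suggestion, with a made-up `FakeMetastoreClient` standing in for the real Hive metastore Thrift client (the `has_partition` method name is invented for illustration): check that the partition exists before attempting to drop it, and log when it does not.

```python
class FakeMetastoreClient:
    """Hypothetical stand-in for the Hive metastore Thrift client."""
    def __init__(self, partitions):
        # partitions: {(db, table): [part_vals, ...]}
        self.partitions = partitions

    def has_partition(self, db, table, part_vals):
        return part_vals in self.partitions.get((db, table), [])

    def drop_partition(self, db, table, part_vals, delete_data):
        self.partitions[(db, table)].remove(part_vals)
        return True

def drop_partitions(client, table_name, part_vals, delete_data=False, db="default"):
    """Drop a partition only after verifying it exists, per the review above."""
    if not client.has_partition(db, table_name, part_vals):
        print("Partition %s.%s %s does not exist!" % (db, table_name, part_vals))
        return False
    return client.drop_partition(db, table_name, part_vals, delete_data)

client = FakeMetastoreClient({("airflow", "static_babynames"): [["2020-05-01"]]})
print(drop_partitions(client, "static_babynames", ["2020-05-01"], db="airflow"))  # True
# Second attempt: the partition is gone, so the guard logs and returns False.
print(drop_partitions(client, "static_babynames", ["2020-05-01"], db="airflow"))  # False
```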









[airflow] branch v1-10-test updated: fixup! [AIRFLOW-3900] Error on undefined template variables in unit tests. (#4719)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new bb34c05  fixup! [AIRFLOW-3900] Error on undefined template variables 
in unit tests. (#4719)
bb34c05 is described below

commit bb34c057244720cf25d1745297b1d89da5c8f85c
Author: Kaxil Naik 
AuthorDate: Sun May 10 12:18:39 2020 +0100

fixup! [AIRFLOW-3900] Error on undefined template variables in unit tests. 
(#4719)
---
 airflow/www_rbac/templates/airflow/dag.html | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/www_rbac/templates/airflow/dag.html 
b/airflow/www_rbac/templates/airflow/dag.html
index 97338b0..c10e293 100644
--- a/airflow/www_rbac/templates/airflow/dag.html
+++ b/airflow/www_rbac/templates/airflow/dag.html
@@ -33,7 +33,7 @@
   {% else %}
 
 DAG: {{ dag.dag_id }}
- {{ dag.description_unicode[0:150] + '...' 
if dag.description_unicode and dag.description_unicode|length > 150 else 
dag.description_unicode|default('', true) }} 
+ {{ dag.description[0:150] + '...' if 
dag.description and dag.description|length > 150 else 
dag.description|default('', true) }} 
   {% endif %}
   {% if root %}
 ROOT: {{ root }}



[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443745818



##
File path: airflow/api_connexion/endpoints/extra_link_endpoint.py
##
@@ -18,9 +18,41 @@
 # TODO(mik-laj): We have to implement it.
 # Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8140

Review comment:
   ```suggestion
   
   ```









[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443754097



##
File path: airflow/api_connexion/endpoints/extra_link_endpoint.py
##
@@ -18,9 +18,41 @@
 # TODO(mik-laj): We have to implement it.
 # Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8140
 
+from flask import current_app
 
-def get_extra_links():
+from airflow import DAG
+from airflow.api_connexion.exceptions import NotFound
+from airflow.exceptions import TaskNotFound
+from airflow.models.dagbag import DagBag
+from airflow.models.dagrun import DagRun as DR
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_extra_links(dag_id: str, dag_run_id: str, task_id: str, session):
 """
 Get extra links for task instance
 """
-raise NotImplementedError("Not implemented yet.")
+dagbag: DagBag = current_app.dag_bag
+dag: DAG = dagbag.get_dag(dag_id)
+if not dag:
+raise NotFound("DAG not found")
+
+try:
+task = dag.get_task(task_id)
+except TaskNotFound:
+raise NotFound("Task not found")
+
+execution_date = (
+session.query(DR.execution_date).filter(DR.dag_id == dag_id).filter(DR.run_id == dag_run_id).scalar()

Review comment:
   ```suggestion
   session.query(DR.execution_date).filter(DR.dag_id == dag_id, DR.run_id == dag_run_id).one()
   ```
   `.scalar()` implicitly calls `.one()` and `one` is most suitable for Rest 
Api that wants to raise 404 according to sqlalchemy tutorial here 
https://docs.sqlalchemy.org/en/13/orm/tutorial.html#returning-lists-and-scalars
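
   For illustration, a minimal, hypothetical sketch of the `@provide_session` pattern the endpoint above relies on (the real decorator in `airflow.utils.session` creates and closes a SQLAlchemy session; `FakeSession` here is a made-up stand-in):

```python
import functools

class FakeSession:
    """Made-up stand-in for a SQLAlchemy session."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

def provide_session(func):
    """If the caller does not pass a session, create one and close it afterwards."""
    @functools.wraps(func)
    def wrapper(*args, session=None, **kwargs):
        if session is not None:
            # Caller-supplied session: the caller owns its lifecycle.
            return func(*args, session=session, **kwargs)
        session = FakeSession()
        try:
            return func(*args, session=session, **kwargs)
        finally:
            session.close()
    return wrapper

@provide_session
def get_extra_links(dag_id, session=None):
    return "links for %s via %s" % (dag_id, type(session).__name__)

print(get_extra_links("TEST_DAG_ID"))  # links for TEST_DAG_ID via FakeSession
```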









[GitHub] [airflow] mik-laj commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


mik-laj commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443765208



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,198 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager
 
+from airflow import DAG
+from airflow.models.baseoperator import BaseOperatorLink
+from airflow.models.dagrun import DagRun
+from airflow.models.xcom import XCom
+from airflow.plugins_manager import AirflowPlugin
+from airflow.providers.google.cloud.operators.bigquery import 
BigQueryExecuteQueryOperator
+from airflow.utils.dates import days_ago
+from airflow.utils.session import provide_session
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 from airflow.www import app
+from tests.test_utils.db import clear_db_runs, clear_db_xcom
 
 
 class TestGetExtraLinks(unittest.TestCase):
 @classmethod
 def setUpClass(cls) -> None:
 super().setUpClass()
-cls.app = app.create_app(testing=True)  # type:ignore
+with mock.patch.dict('os.environ', SKIP_DAGS_PARSING='True'):
+cls.app = app.create_app(testing=True)  # type:ignore
+
+@provide_session
+def setUp(self, session) -> None:
+self.now = datetime(2020, 1, 1)
+
+clear_db_runs()
+clear_db_xcom()
+
+self.dag = self._create_dag()
+self.app.dag_bag.dags = {self.dag.dag_id: self.dag}  # type: ignore  # pylint: disable=no-member
+self.app.dag_bag.sync_to_db()   # type: ignore  # pylint: disable=no-member
+
+dr = DagRun(
+dag_id=self.dag.dag_id,
+run_id="TEST_DAG_RUN_ID",
+execution_date=self.now,
+run_type=DagRunType.MANUAL.value,
+)
+session.add(dr)
+session.commit()
 
-def setUp(self) -> None:
 self.client = self.app.test_client()  # type:ignore
 
-@pytest.mark.skip(reason="Not implemented yet")
+def tearDown(self) -> None:
+super().tearDown()
+clear_db_runs()
+clear_db_xcom()
+
+@staticmethod
+def _create_dag():
+with DAG(dag_id="TEST_DAG_ID", default_args=dict(start_date=days_ago(2),)) as dag:
+BigQueryExecuteQueryOperator(task_id="TEST_SINGLE_QUERY", sql="SELECT 1")
+BigQueryExecuteQueryOperator(task_id="TEST_MULTIPLE_QUERY", sql=["SELECT 1", "SELECT 2"])

Review comment:
   BigQueryInsertJobOperator doesn't have extra links. 









[airflow] branch v1-10-test updated: [AIRFLOW-3900] Error on undefined template variables in unit tests. (#4719)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 53c4e1c  [AIRFLOW-3900] Error on undefined template variables in unit 
tests. (#4719)
53c4e1c is described below

commit 53c4e1cfd7f24d68207d9bed63862099c5e6401f
Author: Joshua Carp 
AuthorDate: Sat Feb 16 00:03:28 2019 -0500

[AIRFLOW-3900] Error on undefined template variables in unit tests. (#4719)

(cherry-picked from a7586648726aa99f0976150ad376c0ce553544b0)
---
 airflow/www_rbac/templates/airflow/dag.html | 2 +-
 tests/www_rbac/test_views.py| 2 ++
 2 files changed, 3 insertions(+), 1 deletion(-)

diff --git a/airflow/www_rbac/templates/airflow/dag.html 
b/airflow/www_rbac/templates/airflow/dag.html
index b5f4d00..7de2cc8 100644
--- a/airflow/www_rbac/templates/airflow/dag.html
+++ b/airflow/www_rbac/templates/airflow/dag.html
@@ -28,7 +28,7 @@
 {% block content %}
 
 
-  {% if dag.parent_dag %}
+  {% if dag.parent_dag is defined and dag.parent_dag %}
 SUBDAG:   {{ dag.dag_id 
}}
   {% else %}
 
diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index 68a605a..9a199f6 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -32,6 +32,7 @@ from datetime import timedelta
 
 import pytest
 import six
+import jinja2
 from flask import Markup, session as flask_session, url_for
 from flask._compat import PY2
 from parameterized import parameterized
@@ -66,6 +67,7 @@ class TestBase(unittest.TestCase):
 def setUpClass(cls):
 cls.app, cls.appbuilder = application.create_app(session=Session, 
testing=True)
 cls.app.config['WTF_CSRF_ENABLED'] = False
+cls.app.jinja_env.undefined = jinja2.StrictUndefined
 settings.configure_orm()
 cls.session = Session
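
A minimal sketch of the behaviour this test change enables (assuming jinja2 is installed): with the default `Undefined`, an unknown template variable silently renders as an empty string, which hides regressions; with `StrictUndefined` it raises `UndefinedError` instead.

```python
import jinja2

lenient = jinja2.Environment()  # default Undefined: missing names render as ''
strict = jinja2.Environment(undefined=jinja2.StrictUndefined)

print(repr(lenient.from_string("x={{ missing }}").render()))  # 'x='

try:
    strict.from_string("x={{ missing }}").render()
    raised = False
except jinja2.exceptions.UndefinedError:
    raised = True
print(raised)  # True
```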
 



[airflow] branch v1-10-test updated: Decrypt secrets from SystemsManagerParameterStoreBackend (#9214)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 0a4ec72  Decrypt secrets from SystemsManagerParameterStoreBackend 
(#9214)
0a4ec72 is described below

commit 0a4ec7253ff20644fb04faf80e0ac3d23bd69389
Author: Nathan Toups 
AuthorDate: Sun Jun 14 10:35:59 2020 -0600

Decrypt secrets from SystemsManagerParameterStoreBackend (#9214)

(cherry picked from commit ffb85740373f7adb70d28ec7d5a8886380170e5e)
---
 airflow/contrib/secrets/aws_systems_manager.py|  2 +-
 tests/contrib/secrets/test_aws_systems_manager.py | 12 
 2 files changed, 13 insertions(+), 1 deletion(-)

diff --git a/airflow/contrib/secrets/aws_systems_manager.py 
b/airflow/contrib/secrets/aws_systems_manager.py
index 971ad18..862fa96 100644
--- a/airflow/contrib/secrets/aws_systems_manager.py
+++ b/airflow/contrib/secrets/aws_systems_manager.py
@@ -100,7 +100,7 @@ class SystemsManagerParameterStoreBackend(BaseSecretsBackend, LoggingMixin):
 ssm_path = self.build_path(path_prefix, secret_id)
 try:
 response = self.client.get_parameter(
-Name=ssm_path, WithDecryption=False
+Name=ssm_path, WithDecryption=True
 )
 value = response["Parameter"]["Value"]
 return value
diff --git a/tests/contrib/secrets/test_aws_systems_manager.py 
b/tests/contrib/secrets/test_aws_systems_manager.py
index 975e298..9801f19 100644
--- a/tests/contrib/secrets/test_aws_systems_manager.py
+++ b/tests/contrib/secrets/test_aws_systems_manager.py
@@ -81,6 +81,18 @@ class TestSystemsManagerParameterStoreBackend(unittest.TestCase):
 self.assertEqual('world', returned_uri)
 
 @mock_ssm
+def test_get_variable_secret_string(self):
+param = {
+'Name': '/airflow/variables/hello',
+'Type': 'SecureString',
+'Value': 'world'
+}
+ssm_backend = SystemsManagerParameterStoreBackend()
+ssm_backend.client.put_parameter(**param)
+returned_uri = ssm_backend.get_variable('hello')
+self.assertEqual('world', returned_uri)
+
+@mock_ssm
 def test_get_variable_non_existent_key(self):
 """
 Test that if Variable key is not present in SSM,
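
A hypothetical sketch of why `WithDecryption=True` matters (the `FakeSSMClient` below is a made-up stand-in for boto3's SSM client, and the `<encrypted>` value only simulates ciphertext): a `SecureString` parameter comes back undecrypted unless the flag is set, which is exactly what the one-line fix above changes.

```python
class FakeSSMClient:
    """Hypothetical stand-in for boto3's SSM client; real code uses boto3."""
    def __init__(self):
        # name -> (type, plaintext value)
        self._store = {"/airflow/variables/hello": ("SecureString", "world")}

    def get_parameter(self, Name, WithDecryption=False):
        ptype, value = self._store[Name]
        if ptype == "SecureString" and not WithDecryption:
            value = "<encrypted>"  # simulate KMS ciphertext being returned
        return {"Parameter": {"Name": Name, "Type": ptype, "Value": value}}

client = FakeSSMClient()
# Without the flag the backend would hand Airflow the ciphertext:
print(client.get_parameter("/airflow/variables/hello")["Parameter"]["Value"])
# With WithDecryption=True (the fix) the plaintext value is returned:
print(client.get_parameter("/airflow/variables/hello",
                           WithDecryption=True)["Parameter"]["Value"])
```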



[airflow] branch v1-10-test updated: fixup! fixup! [AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new c232972  fixup! fixup! [AIRFLOW-8902] Fix Dag Run UI execution date 
with timezone cannot be saved issue (#8902)
c232972 is described below

commit c2329721385b15984ab5acddad92767348c763e0
Author: Kaxil Naik 
AuthorDate: Tue Jun 23 00:22:26 2020 +0100

fixup! fixup! [AIRFLOW-8902] Fix Dag Run UI execution date with timezone 
cannot be saved issue (#8902)
---
 tests/www_rbac/test_views.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index d5eb040..2824184 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -2552,7 +2552,7 @@ class TestDagRunModelView(TestBase):
 
 dr = self.session.query(models.DagRun).one()
 
-self.assertEqual(dr.execution_date, timezone.datetime(2018, 7, 6, 5, 4, 3))
+self.assertEqual(dr.execution_date, timezone.datetime(2018, 7, 6, 9, 4, 3))
 
 def test_create_dagrun_execution_date_without_timezone_default_utc(self):
 data = {

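The corrected assertion above reflects normalization to UTC; a minimal stdlib sketch, assuming the submitted execution date carried a UTC-4 offset (the offset is an assumption for illustration, not taken from the test data):

```python
from datetime import datetime, timedelta, timezone

# An execution date entered as 2018-07-06 05:04:03 at UTC-4 (assumed offset)
# is stored as 09:04:03 UTC once the form value is normalized.
local = datetime(2018, 7, 6, 5, 4, 3, tzinfo=timezone(timedelta(hours=-4)))
utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # 2018-07-06T09:04:03+00:00
```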


[airflow] branch master updated: Remove unused recurse_tasks function (#9465)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new ee36142  Remove unused recurse_tasks function (#9465)
ee36142 is described below

commit ee361427416197da20b7f0db6df3cf66b7e82912
Author: Kaxil Naik 
AuthorDate: Tue Jun 23 00:34:39 2020 +0100

Remove unused recurse_tasks function (#9465)
---
 airflow/www/utils.py | 19 ---
 1 file changed, 19 deletions(-)

diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index a65260e..37d47c1 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -30,8 +30,6 @@ from pygments import highlight, lexers
 from pygments.formatters import HtmlFormatter
 
 from airflow.configuration import conf
-from airflow.models.baseoperator import BaseOperator
-from airflow.operators.subdag_operator import SubDagOperator
 from airflow.utils import timezone
 from airflow.utils.code_utils import get_python_source
 from airflow.utils.json import AirflowJsonEncoder
@@ -343,23 +341,6 @@ def get_attr_renderer():
 }
 
 
-def recurse_tasks(tasks, task_ids, dag_ids, task_id_to_dag):# noqa: D103
-if isinstance(tasks, list):
-for task in tasks:
-recurse_tasks(task, task_ids, dag_ids, task_id_to_dag)
-return
-if isinstance(tasks, SubDagOperator):
-subtasks = tasks.subdag.tasks
-dag_ids.append(tasks.subdag.dag_id)
-for subtask in subtasks:
-if subtask.task_id not in task_ids:
-task_ids.append(subtask.task_id)
-task_id_to_dag[subtask.task_id] = tasks.subdag
-recurse_tasks(subtasks, task_ids, dag_ids, task_id_to_dag)
-if isinstance(tasks, BaseOperator):
-task_id_to_dag[tasks.task_id] = tasks.dag
-
-
 def get_chart_height(dag):
 """
TODO(aoen): See [AIRFLOW-1263] We use the number of tasks in the DAG as a heuristic to



[GitHub] [airflow] kaxil commented on pull request #9465: Remove unused recurse_tasks function

2020-06-22 Thread GitBox


kaxil commented on pull request #9465:
URL: https://github.com/apache/airflow/pull/9465#issuecomment-647821191


   FYI: This function was added in https://github.com/apache/airflow/commit/91900c995fa8fc373d88974a97cf931a14d7527e but was made redundant by https://github.com/apache/airflow/commit/28cfd2c541c12468b3e4f634545dfa31a77b0091



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil merged pull request #9465: Remove unused recurse_tasks function

2020-06-22 Thread GitBox


kaxil merged pull request #9465:
URL: https://github.com/apache/airflow/pull/9465


   







[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443758402



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,198 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager

Review comment:
   ```suggestion
   from tests.test_utils.mock_plugins import mock_plugin_manager
   ```









[airflow] branch v1-10-test updated: Fix displaying Executor Class Name in "Base Job" table (#8679)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new c2e1270  Fix displaying Executor Class Name in "Base Job" table (#8679)
c2e1270 is described below

commit c2e12700c801b393fcbd06f275b67ca5391d0652
Author: Kaxil Naik 
AuthorDate: Sat May 2 18:30:45 2020 +0100

Fix displaying Executor Class Name in "Base Job" table (#8679)

(cherry picked from commit 0a7b5004ac5df830389e16f9344ef549d27e0353)
---
 airflow/jobs/base_job.py|  2 +-
 tests/jobs/test_base_job.py | 24 +++-
 2 files changed, 24 insertions(+), 2 deletions(-)

diff --git a/airflow/jobs/base_job.py b/airflow/jobs/base_job.py
index 1135696..94f5969 100644
--- a/airflow/jobs/base_job.py
+++ b/airflow/jobs/base_job.py
@@ -84,7 +84,7 @@ class BaseJob(Base, LoggingMixin):
 *args, **kwargs):
 self.hostname = get_hostname()
 self.executor = executor or executors.get_default_executor()
-self.executor_class = executor.__class__.__name__
+self.executor_class = self.executor.__class__.__name__
 self.start_date = timezone.utcnow()
 self.latest_heartbeat = timezone.utcnow()
 if heartrate is not None:
diff --git a/tests/jobs/test_base_job.py b/tests/jobs/test_base_job.py
index 7487b13..8ffa5bb 100644
--- a/tests/jobs/test_base_job.py
+++ b/tests/jobs/test_base_job.py
@@ -24,10 +24,12 @@ import unittest
 from sqlalchemy.exc import OperationalError
 
 from airflow.jobs import BaseJob
+from airflow.executors.sequential_executor import SequentialExecutor
 from airflow.utils import timezone
 from airflow.utils.db import create_session
 from airflow.utils.state import State
 from tests.compat import Mock, patch
+from tests.test_utils.config import conf_vars
 
 
 class BaseJobTest(unittest.TestCase):
@@ -119,4 +121,24 @@ class BaseJobTest(unittest.TestCase):
 
 job.heartbeat()
 
-self.assertEqual(job.latest_heartbeat, when, "attriubte not updated when heartbeat fails")
+self.assertEqual(job.latest_heartbeat, when, "attribute not updated when heartbeat fails")
+
+@conf_vars({('scheduler', 'max_tis_per_query'): '100'})
+@patch('airflow.jobs.base_job.executors.get_default_executor')
+@patch('airflow.jobs.base_job.get_hostname')
+@patch('airflow.jobs.base_job.getpass.getuser')
+def test_essential_attr(self, mock_getuser, mock_hostname, mock_default_executor):
+mock_sequential_executor = SequentialExecutor()
+mock_hostname.return_value = "test_hostname"
+mock_getuser.return_value = "testuser"
+mock_default_executor.return_value = mock_sequential_executor
+
+test_job = self.TestJob(None, heartrate=10, dag_id="example_dag", state=State.RUNNING)
+self.assertEqual(test_job.executor_class, "SequentialExecutor")
+self.assertEqual(test_job.heartrate, 10)
+self.assertEqual(test_job.dag_id, "example_dag")
+self.assertEqual(test_job.hostname, "test_hostname")
+self.assertEqual(test_job.max_tis_per_query, 100)
+self.assertEqual(test_job.unixname, "testuser")
+self.assertEqual(test_job.state, "running")
+self.assertEqual(test_job.executor, mock_sequential_executor)
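The one-line fix in this commit derives the class name from the *resolved* `self.executor` attribute rather than the raw `executor` parameter, which may be `None` when the default-executor fallback kicks in. A minimal, self-contained sketch of that pattern (the class and function names below are simplified stand-ins, not the real Airflow implementations):

```python
class SequentialExecutor:
    """Stand-in for an Airflow executor class."""

class BaseJob:
    def __init__(self, executor=None):
        # Resolve the fallback first, then derive the class name from the
        # resolved attribute. The buggy version read
        # executor.__class__.__name__, which yields "NoneType" whenever no
        # executor is passed in.
        self.executor = executor or SequentialExecutor()
        self.executor_class = self.executor.__class__.__name__

job = BaseJob()
print(job.executor_class)  # -> SequentialExecutor
```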



[airflow] branch v1-10-test updated: Add TaskInstance state to TI Tooltip to be colour-blind friendlier (#8910)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new a732559  Add TaskInstance state to TI Tooltip to be colour-blind friendlier (#8910)
a732559 is described below

commit a7325594e70b0bd7be2eeb9bfec5c522642fcb3b
Author: Joe Harris 
AuthorDate: Thu May 21 10:53:54 2020 +0100

Add TaskInstance state to TI Tooltip to be colour-blind friendlier (#8910)

Currently there is no way to determine the state of a TaskInstance in the graph view or tree view for people with colour blindness

Approximately 4.5% of people experience some form of colour vision deficiency

(cherry picked from commit f3f74c73201d947ce9334c50630b9cf5debbc66a)
---
 airflow/www_rbac/static/js/task-instances.js | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/airflow/www_rbac/static/js/task-instances.js b/airflow/www_rbac/static/js/task-instances.js
index ec971e0..8b49700 100644
--- a/airflow/www_rbac/static/js/task-instances.js
+++ b/airflow/www_rbac/static/js/task-instances.js
@@ -62,6 +62,9 @@ function generateTooltipDateTimes(startDate, endDate, dagTZ) {
 
 export default function tiTooltip(ti, {includeTryNumber = false} = {}) {
   let tt = '';
+  if (ti.state !== undefined) {
+tt += `Status: ${escapeHtml(ti.state)}`;
+  }
   if (ti.task_id !== undefined) {
 tt += `Task_id: ${escapeHtml(ti.task_id)}`;
   }



[GitHub] [airflow] feluelle commented on a change in pull request #8962: [AIRFLOW-8057] [AIP-31] Add @task decorator

2020-06-22 Thread GitBox


feluelle commented on a change in pull request #8962:
URL: https://github.com/apache/airflow/pull/8962#discussion_r443747672



##
File path: docs/concepts.rst
##
@@ -173,6 +213,62 @@ Each task is a node in our DAG, and there is a dependency from task_1 to task_2:
 We can say that task_1 is *upstream* of task_2, and conversely task_2 is *downstream* of task_1.
 When a DAG Run is created, task_1 will start running and task_2 waits for task_1 to complete successfully before it may start.
 
+.. _concepts:task_decorator:
+
+Python task decorator
+-
+
+Airflow ``task`` decorator converts any Python function to an Airflow operator.
+The decorated function can be called once to set the arguments and keyword arguments for operator execution.
+
+
+.. code-block:: python
+
+  with DAG('my_dag', start_date=datetime(2020, 5, 15)) as dag:
+  @dag.task
+  def hello_world():
+  print('hello world!')
+
+
+  # Also...
+  from airflow.decorators import task
+
+
+  @task
+  def hello_name(name: str):
+  print(f'hello {name}!')
+
+
+  hello_name('Airflow users')
+
+Task decorator captures returned values and sends them to the :ref:`XCom backend `. By default, returned
+value is saved as a single XCom value. You can set ``multiple_outputs`` keyword argument to ``True`` to unroll dictionaries,
+lists or tuples into separate XCom values. This can be used with regular operators to create
+:ref:`functional DAGs `.
+
+Calling a decorated function returns an ``XComArg`` instance. You can use it to set templated fields on downstream
+operators.
+
+You can call a decorated function more than once in a DAG. The decorated function will automatically generate
+a unique ``task_id`` for each generated operator.
+
+.. code-block:: python
+
+  with DAG('my_dag', start_date=datetime(2020, 5, 15)) as dag:
+
+@dag.task
+def update_user(user_id: int):
+  ...
+
+# Avoid generating this list dynamically to keep DAG topology stable between DAG runs
+for user_id in user_ids:
+  update_current(user_id)

Review comment:
   ```suggestion
 update_user(user_id)
   ```
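The quoted docs say the decorator "will automatically generate a unique ``task_id`` for each generated operator" when the function is called more than once. A simplified, hypothetical sketch of that naming behaviour (the real Airflow implementation differs; here the wrapper just returns the id it would assign):

```python
import itertools
from collections import defaultdict

# One counter per function name; first call keeps the bare name,
# later calls get a "__N" suffix.
_counters = defaultdict(itertools.count)

def task(fn):
    def wrapper(*args, **kwargs):
        n = next(_counters[fn.__name__])
        task_id = fn.__name__ if n == 0 else f"{fn.__name__}__{n}"
        return task_id  # a real implementation would build an operator here
    return wrapper

@task
def update_user(user_id):
    ...

ids = [update_user(i) for i in range(3)]
print(ids)  # -> ['update_user', 'update_user__1', 'update_user__2']
```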









[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443754097



##
File path: airflow/api_connexion/endpoints/extra_link_endpoint.py
##
@@ -18,9 +18,41 @@
 # TODO(mik-laj): We have to implement it.
# Do you want to help? Please look at: https://github.com/apache/airflow/issues/8140
 
+from flask import current_app
 
-def get_extra_links():
+from airflow import DAG
+from airflow.api_connexion.exceptions import NotFound
+from airflow.exceptions import TaskNotFound
+from airflow.models.dagbag import DagBag
+from airflow.models.dagrun import DagRun as DR
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_extra_links(dag_id: str, dag_run_id: str, task_id: str, session):
 """
 Get extra links for task instance
 """
-raise NotImplementedError("Not implemented yet.")
+dagbag: DagBag = current_app.dag_bag
+dag: DAG = dagbag.get_dag(dag_id)
+if not dag:
+raise NotFound("DAG not found")
+
+try:
+task = dag.get_task(task_id)
+except TaskNotFound:
+raise NotFound("Task not found")
+
+execution_date = (
+session.query(DR.execution_date).filter(DR.dag_id == dag_id).filter(DR.run_id == dag_run_id).scalar()

Review comment:
   ```suggestion
    session.query(DR.execution_date).filter(DR.dag_id == dag_id, DR.run_id == dag_run_id).scalar()
   ```
   I'm wrong about using `one`. `scalar()` or `first()` is better
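The point of preferring `scalar()` here is that a missing row comes back as `None`, letting the endpoint raise its own `NotFound`, whereas SQLAlchemy's `Query.one()` raises `NoResultFound`. A hedged sketch of that lookup shape using stdlib `sqlite3` as a stand-in for the Airflow session (table and ids mirror the test fixtures quoted elsewhere in this thread):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (dag_id TEXT, run_id TEXT, execution_date TEXT)")
conn.execute("INSERT INTO dag_run VALUES ('TEST_DAG_ID', 'TEST_DAG_RUN_ID', '2020-01-01')")

def get_execution_date(dag_id, run_id):
    # scalar()-like behaviour: return the single column, or None when
    # the row is absent, instead of raising.
    row = conn.execute(
        "SELECT execution_date FROM dag_run WHERE dag_id = ? AND run_id = ?",
        (dag_id, run_id),
    ).fetchone()
    return row[0] if row else None

print(get_execution_date("TEST_DAG_ID", "TEST_DAG_RUN_ID"))  # -> 2020-01-01
print(get_execution_date("TEST_DAG_ID", "INVALID"))  # -> None
```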









[airflow] branch v1-10-test updated: Add a tip to trigger DAG screen (#9049)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new fd3902f  Add a tip to trigger DAG screen (#9049)
fd3902f is described below

commit fd3902f5a4dd69f47f171769621e56a255200d72
Author: Kamil Breguła 
AuthorDate: Thu May 28 13:40:51 2020 +0200

Add a tip to trigger DAG screen (#9049)

(cherry-picked from 52c7862)
---
 airflow/www/templates/airflow/trigger.html  | 3 +++
 airflow/www_rbac/templates/airflow/trigger.html | 3 +++
 2 files changed, 6 insertions(+)

diff --git a/airflow/www/templates/airflow/trigger.html 
b/airflow/www/templates/airflow/trigger.html
index 483981e..9eb9c78 100644
--- a/airflow/www/templates/airflow/trigger.html
+++ b/airflow/www/templates/airflow/trigger.html
@@ -30,6 +30,9 @@
 Configuration JSON (Optional)
 {{ conf }}
   
+  
+To access configuration in your DAG use {{ '{{' }} dag_run.conf {{ '}}' }}.
+  
   
   bail.
 
diff --git a/airflow/www_rbac/templates/airflow/trigger.html 
b/airflow/www_rbac/templates/airflow/trigger.html
index ee4ecc8..73f95c3 100644
--- a/airflow/www_rbac/templates/airflow/trigger.html
+++ b/airflow/www_rbac/templates/airflow/trigger.html
@@ -30,6 +30,9 @@
 Configuration JSON (Optional)
 {{ conf }}
   
+  
+To access configuration in your DAG use {{ '{{' }} dag_run.conf {{ '}}' }}.
+  
   
   bail.
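The tip added above tells users to read the trigger-time configuration via the templated `dag_run.conf`. A hedged sketch of how that template resolves at render time: Airflow injects a `dag_run` object into the Jinja context, so `conf` is plain attribute/dict access. `FakeDagRun` below is a hypothetical stand-in for `airflow.models.DagRun`, used so the rendering can be shown without a running scheduler:

```python
from jinja2 import Template

class FakeDagRun:
    # Stand-in for the conf dict supplied in the trigger screen's
    # "Configuration JSON" box.
    conf = {"name": "world"}

command = Template("echo {{ dag_run.conf['name'] }}").render(dag_run=FakeDagRun())
print(command)  # -> echo world
```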
 



[GitHub] [airflow] vanka56 commented on a change in pull request #9472: Add drop_partition functionality for HiveMetastoreHook

2020-06-22 Thread GitBox


vanka56 commented on a change in pull request #9472:
URL: https://github.com/apache/airflow/pull/9472#discussion_r443823716



##
File path: airflow/providers/apache/hive/hooks/hive.py
##
@@ -775,6 +775,20 @@ def table_exists(self, table_name, db='default'):
 except Exception:  # pylint: disable=broad-except
 return False
 
+def drop_partitions(self, table_name, part_vals, delete_data=False, db='default'):
+"""
+Drop partitions matching param_names input
+>>> hh = HiveMetastoreHook()
+>>> hh.drop_partitions(db='airflow', table_name='static_babynames', part_vals="['2020-05-01']")
+True
+"""
+if self.table_exists(table_name, db):
+with self.metastore as client:
+return client.drop_partition(db, table_name, part_vals, delete_data)
+else:
+self.log.info("Table %s.%s does not exist!" % (db, table_name))

Review comment:
   Got it. I can add that.
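The control flow under review — drop only when the table exists, otherwise log and return a falsy result — can be sketched without a live metastore. `StubMetastoreClient` below is a hypothetical stand-in for the real Hive client, kept only so the guard logic is runnable:

```python
class StubMetastoreClient:
    """Hypothetical stand-in for the Hive metastore client."""
    def __init__(self, tables):
        self.tables = set(tables)

    def drop_partition(self, db, table_name, part_vals, delete_data):
        # The real client talks to the metastore; here we just report
        # success for tables we know about.
        return True

def drop_partitions(client, table_name, part_vals, delete_data=False, db="default"):
    # Guard-then-act: callers get a falsy result when nothing was dropped.
    if f"{db}.{table_name}" in client.tables:
        return client.drop_partition(db, table_name, part_vals, delete_data)
    print("Table %s.%s does not exist!" % (db, table_name))
    return False

client = StubMetastoreClient({"airflow.static_babynames"})
print(drop_partitions(client, "static_babynames", ["2020-05-01"], db="airflow"))  # -> True
print(drop_partitions(client, "missing", ["2020-05-01"]))  # logs warning, -> False
```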









[GitHub] [airflow] OmairK commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


OmairK commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443731489



##
File path: airflow/api_connexion/endpoints/extra_link_endpoint.py
##
@@ -18,9 +18,41 @@
 # TODO(mik-laj): We have to implement it.
# Do you want to help? Please look at: https://github.com/apache/airflow/issues/8140
 
+from flask import current_app
 
-def get_extra_links():
+from airflow import DAG
+from airflow.api_connexion.exceptions import NotFound
+from airflow.exceptions import TaskNotFound
+from airflow.models.dagbag import DagBag
+from airflow.models.dagrun import DagRun as DR
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_extra_links(dag_id: str, dag_run_id: str, task_id: str, session):
 """
 Get extra links for task instance
 """
-raise NotImplementedError("Not implemented yet.")
+dagbag: DagBag = current_app.dag_bag
+dag: DAG = dagbag.get_dag(dag_id)
+if not dag:
+raise NotFound("DAG not found")

Review comment:
   ```suggestion
   raise NotFound(detail=f"DAG with id: '{dag_id}' not found")
   ```
   The errors are a bit more descriptive this way, what do you think?









[airflow] branch v1-10-test updated: UX Fix: Prevent undesired text selection with DAG title selection in Chrome (#8912)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 5e7867f  UX Fix: Prevent undesired text selection with DAG title selection in Chrome (#8912)
5e7867f is described below

commit 5e7867f599c23b487964d2a3016173c3764b41de
Author: Ryan Hamilton 
AuthorDate: Tue May 19 19:23:04 2020 -0400

UX Fix: Prevent undesired text selection with DAG title selection in Chrome (#8912)

Negate user-select in Firefox where behavior is already as desired

(cherry-picked from ce7fdea)
---
 airflow/www_rbac/templates/airflow/dag.html | 14 +++---
 1 file changed, 7 insertions(+), 7 deletions(-)

diff --git a/airflow/www_rbac/templates/airflow/dag.html b/airflow/www_rbac/templates/airflow/dag.html
index 7de2cc8..97338b0 100644
--- a/airflow/www_rbac/templates/airflow/dag.html
+++ b/airflow/www_rbac/templates/airflow/dag.html
@@ -26,21 +26,21 @@
 {% endblock %}
 
 {% block content %}
-
+  
 
   {% if dag.parent_dag is defined and dag.parent_dag %}
-SUBDAG:   {{ dag.dag_id 
}}
+SUBDAG: {{ dag.dag_id }}
   {% else %}
 
-DAG:   {{ dag.dag_id }}
+DAG: {{ dag.dag_id }}
  {{ dag.description_unicode[0:150] + '...' 
if dag.description_unicode and dag.description_unicode|length > 150 else 
dag.description_unicode|default('', true) }} 
   {% endif %}
   {% if root %}
-ROOT:   {{ root }}
+ROOT: {{ root }}
   {% endif %}
 
 
-  
+  
 schedule: {{ dag.schedule_interval }}
   
 
@@ -50,9 +50,9 @@
 {% set base_date_arg = request.args.get('base_date') %}
 {% set num_runs_arg = request.args.get('num_runs') %}
 {% if execution_date is defined %}
- {% set execution_date_arg = execution_date %}
+  {% set execution_date_arg = execution_date %}
 {% else %}
-{% set execution_date_arg = request.args.get('execution_date') %}
+  {% set execution_date_arg = request.args.get('execution_date') %}
 {% endif %}
 
   {% if dag.parent_dag is defined and dag.parent_dag %}



[GitHub] [airflow] OmairK commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


OmairK commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443740331



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,198 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager
 
+from airflow import DAG
+from airflow.models.baseoperator import BaseOperatorLink
+from airflow.models.dagrun import DagRun
+from airflow.models.xcom import XCom
+from airflow.plugins_manager import AirflowPlugin
+from airflow.providers.google.cloud.operators.bigquery import 
BigQueryExecuteQueryOperator
+from airflow.utils.dates import days_ago
+from airflow.utils.session import provide_session
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 from airflow.www import app
+from tests.test_utils.db import clear_db_runs, clear_db_xcom
 
 
 class TestGetExtraLinks(unittest.TestCase):
 @classmethod
 def setUpClass(cls) -> None:
 super().setUpClass()
-cls.app = app.create_app(testing=True)  # type:ignore
+with mock.patch.dict('os.environ', SKIP_DAGS_PARSING='True'):
+cls.app = app.create_app(testing=True)  # type:ignore
+
+@provide_session
+def setUp(self, session) -> None:
+self.now = datetime(2020, 1, 1)
+
+clear_db_runs()
+clear_db_xcom()
+
+self.dag = self._create_dag()
+self.app.dag_bag.dags = {self.dag.dag_id: self.dag}  # type: ignore  # pylint: disable=no-member
+self.app.dag_bag.sync_to_db()   # type: ignore  # pylint: disable=no-member
+
+dr = DagRun(
+dag_id=self.dag.dag_id,
+run_id="TEST_DAG_RUN_ID",
+execution_date=self.now,
+run_type=DagRunType.MANUAL.value,
+)
+session.add(dr)
+session.commit()
 
-def setUp(self) -> None:
 self.client = self.app.test_client()  # type:ignore
 
-@pytest.mark.skip(reason="Not implemented yet")
+def tearDown(self) -> None:
+super().tearDown()
+clear_db_runs()
+clear_db_xcom()
+
+@staticmethod
+def _create_dag():
+with DAG(dag_id="TEST_DAG_ID", default_args=dict(start_date=days_ago(2),)) as dag:
+BigQueryExecuteQueryOperator(task_id="TEST_SINGLE_QUERY", sql="SELECT 1")
+BigQueryExecuteQueryOperator(task_id="TEST_MULTIPLE_QUERY", sql=["SELECT 1", "SELECT 2"])
+return dag
+
+@parameterized.expand(
+[
+(
+
"/api/v1/dags/INVALID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links",
+'DAG not found'
+),
+(
+
"/api/v1/dags/TEST_DAG_ID/dagRuns/INVALID/taskInstances/TEST_SINGLE_QUERY/links",
+"DAG Run not found"
+),
+(
+
"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/INVALID/links",
+"Task not found"
+),
+]
+)
+def test_should_response_404_on_invalid_task_id(self, url, expected_title):
+response = self.client.get(url)

Review comment:
   ```suggestion
   @parameterized.expand(
   [
   (
   "invalid dag_id",
   
"/api/v1/dags/INVALID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links",
   'DAG not found'
   ),
   (
   "invalid run_id",
   
"/api/v1/dags/TEST_DAG_ID/dagRuns/INVALID/taskInstances/TEST_SINGLE_QUERY/links",
   "DAG Run not found"
   ),
   (
   "invalid task_id",
   
"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/INVALID/links",
   "Task not found"
   ),
   ]
   )
   def test_should_response_404_on(self, name, url, expected_title):
   del name
   response = self.client.get(url)
   ```
   These tests deal with multiple invalid ids, so this is IMO more descriptive. What do you think?









[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443764076



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,198 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager
 
+from airflow import DAG
+from airflow.models.baseoperator import BaseOperatorLink
+from airflow.models.dagrun import DagRun
+from airflow.models.xcom import XCom
+from airflow.plugins_manager import AirflowPlugin
+from airflow.providers.google.cloud.operators.bigquery import 
BigQueryExecuteQueryOperator
+from airflow.utils.dates import days_ago
+from airflow.utils.session import provide_session
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 from airflow.www import app
+from tests.test_utils.db import clear_db_runs, clear_db_xcom
 
 
 class TestGetExtraLinks(unittest.TestCase):
 @classmethod
 def setUpClass(cls) -> None:
 super().setUpClass()
-cls.app = app.create_app(testing=True)  # type:ignore
+with mock.patch.dict('os.environ', SKIP_DAGS_PARSING='True'):
+cls.app = app.create_app(testing=True)  # type:ignore
+
+@provide_session
+def setUp(self, session) -> None:
+self.now = datetime(2020, 1, 1)
+
+clear_db_runs()
+clear_db_xcom()
+
+self.dag = self._create_dag()
+self.app.dag_bag.dags = {self.dag.dag_id: self.dag}  # type: ignore  # 
pylint: disable=no-member
+self.app.dag_bag.sync_to_db()   # type: ignore  # pylint: 
disable=no-member
+
+dr = DagRun(
+dag_id=self.dag.dag_id,
+run_id="TEST_DAG_RUN_ID",
+execution_date=self.now,
+run_type=DagRunType.MANUAL.value,
+)
+session.add(dr)
+session.commit()
 
-def setUp(self) -> None:
 self.client = self.app.test_client()  # type:ignore
 
-@pytest.mark.skip(reason="Not implemented yet")
+def tearDown(self) -> None:
+super().tearDown()
+clear_db_runs()
+clear_db_xcom()
+
+@staticmethod
+def _create_dag():
+with DAG(dag_id="TEST_DAG_ID", 
default_args=dict(start_date=days_ago(2),)) as dag:
+BigQueryExecuteQueryOperator(task_id="TEST_SINGLE_QUERY", 
sql="SELECT 1")
+BigQueryExecuteQueryOperator(task_id="TEST_MULTIPLE_QUERY", 
sql=["SELECT 1", "SELECT 2"])

Review comment:
   ```suggestion
   BigQueryInsertJobOperator(task_id="TEST_SINGLE_QUERY", 
sql="SELECT 1")
   BigQueryInsertJobOperator(task_id="TEST_MULTIPLE_QUERY", 
sql=["SELECT 1", "SELECT 2"])
   ```
   This operator `BigQueryExecuteQueryOperator` is deprecated









[GitHub] [airflow] mik-laj commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


mik-laj commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443768531



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,198 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager
 
+from airflow import DAG
+from airflow.models.baseoperator import BaseOperatorLink
+from airflow.models.dagrun import DagRun
+from airflow.models.xcom import XCom
+from airflow.plugins_manager import AirflowPlugin
+from airflow.providers.google.cloud.operators.bigquery import 
BigQueryExecuteQueryOperator
+from airflow.utils.dates import days_ago
+from airflow.utils.session import provide_session
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 from airflow.www import app
+from tests.test_utils.db import clear_db_runs, clear_db_xcom
 
 
 class TestGetExtraLinks(unittest.TestCase):
 @classmethod
 def setUpClass(cls) -> None:
 super().setUpClass()
-cls.app = app.create_app(testing=True)  # type:ignore
+with mock.patch.dict('os.environ', SKIP_DAGS_PARSING='True'):
+cls.app = app.create_app(testing=True)  # type:ignore
+
+@provide_session
+def setUp(self, session) -> None:
+self.now = datetime(2020, 1, 1)
+
+clear_db_runs()
+clear_db_xcom()
+
+self.dag = self._create_dag()
+self.app.dag_bag.dags = {self.dag.dag_id: self.dag}  # type: ignore  # 
pylint: disable=no-member
+self.app.dag_bag.sync_to_db()   # type: ignore  # pylint: 
disable=no-member
+
+dr = DagRun(
+dag_id=self.dag.dag_id,
+run_id="TEST_DAG_RUN_ID",
+execution_date=self.now,
+run_type=DagRunType.MANUAL.value,
+)
+session.add(dr)
+session.commit()
 
-def setUp(self) -> None:
 self.client = self.app.test_client()  # type:ignore
 
-@pytest.mark.skip(reason="Not implemented yet")
+def tearDown(self) -> None:
+super().tearDown()
+clear_db_runs()
+clear_db_xcom()
+
+@staticmethod
+def _create_dag():
+with DAG(dag_id="TEST_DAG_ID", 
default_args=dict(start_date=days_ago(2),)) as dag:
+BigQueryExecuteQueryOperator(task_id="TEST_SINGLE_QUERY", 
sql="SELECT 1")
+BigQueryExecuteQueryOperator(task_id="TEST_MULTIPLE_QUERY", 
sql=["SELECT 1", "SELECT 2"])
+return dag
+
+@parameterized.expand(
+[
+(
+
"/api/v1/dags/INVALID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links",
+'DAG not found'
+),
+(
+
"/api/v1/dags/TEST_DAG_ID/dagRuns/INVALID/taskInstances/TEST_SINGLE_QUERY/links",
+"DAG Run not found"
+),
+(
+
"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/INVALID/links",
+"Task not found"
+),
+]
+)
+def test_should_response_404_on_invalid_task_id(self, url, expected_title):
+response = self.client.get(url)
+
+self.assertEqual(404, response.status_code)
+self.assertEqual({'detail': None, 'status': 404, 'title': 
expected_title, 'type': 'about:blank'}, response.json)
+
+@mock_plugin_manager(plugins=[])
 def test_should_response_200(self):
+XCom.set(
+key="job_id",
+value="TEST_JOB_ID",
+execution_date=self.now,
+task_id="TEST_SINGLE_QUERY",
+dag_id=self.dag.dag_id,
+)
 response = self.client.get(
-
"/dags/TEST_DG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_TASK_ID/links"
+
"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links"
+)
+
+self.assertEqual(200, response.status_code, response.data)

Review comment:
   The third argument is just a message for the developer. It can be in the form of bytes if that is safer and more readable.









[jira] [Commented] (AIRFLOW-6940) Improve test isolation in test_views.py

2020-06-22 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6940?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17142512#comment-17142512
 ] 

ASF subversion and git services commented on AIRFLOW-6940:
--

Commit c80a35d462f8980f1aedcf8fb70900648379a107 in airflow's branch 
refs/heads/v1-10-test from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=c80a35d ]

[AIRFLOW-6940] Improve test isolation in test_views.py (#7564)


> Improve test isolation in test_views.py
> ---
>
> Key: AIRFLOW-6940
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6940
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.9
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[airflow] branch v1-10-test updated (c232972 -> d24f2c6)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from c232972  fixup! fixup! [AIRFLOW-8902] Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)
 new 72af0a1  Fix tree view if config contains " (#9250)
 new fc68cf8  Fix json string escape in tree view (#8551)
 new 22c95f0  Fix failing tests from #9250 (#9307)
 new d24f2c6  Monkey patch greenlet celery pools (#8559)

The 4 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/bin/cli.py   | 10 +-
 airflow/www_rbac/templates/airflow/tree.html |  2 +-
 airflow/www_rbac/views.py|  6 --
 tests/www_rbac/test_views.py | 20 
 4 files changed, 34 insertions(+), 4 deletions(-)



[airflow] 02/04: Fix json string escape in tree view (#8551)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit fc68cf881c1ebf5405c956fef320b60c0584f7d2
Author: QP Hou 
AuthorDate: Mon Apr 27 07:33:12 2020 -0700

Fix json string escape in tree view (#8551)

close #8523.

(cherry-picked from bcbd888)
---
 airflow/www_rbac/views.py|  8 ++--
 tests/www_rbac/test_views.py | 19 +++
 2 files changed, 25 insertions(+), 2 deletions(-)

diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index be7a384..55a10e9 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -1524,6 +1524,11 @@ class Airflow(AirflowBaseView):
 external_logs = conf.get('elasticsearch', 'frontend')
 doc_md = wwwutils.wrapped_markdown(getattr(dag, 'doc_md', None), 
css_class='dag-doc')
 
+# avoid spaces to reduce payload size
+data = htmlsafe_json_dumps(data, separators=(',', ':'))
+# escape slashes to avoid JSON parse error in JS
+data = data.replace('\\', '\\\\')
+
 return self.render_template(
 'airflow/tree.html',
 operators=sorted({op.task_type: op for op in dag.tasks}.values(),
@@ -1532,8 +1537,7 @@ class Airflow(AirflowBaseView):
 form=form,
 dag=dag,
 doc_md=doc_md,
-# avoid spaces to reduce payload size
-data=htmlsafe_json_dumps(data, separators=(',', ':')),
+data=data,
 blur=blur, num_runs=num_runs,
 show_external_logs=bool(external_logs))
 
diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index 2824184..c668227 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -580,6 +580,25 @@ class TestAirflowBaseViews(TestBase):
 mock_get_dag.assert_called_once_with('example_bash_operator')
 self.check_content_in_response('example_bash_operator', resp)
 
+@parameterized.expand([
+("hello\nworld", "hellonworld"),
+("hello'world", "hellou0027world"),
+
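The escaping problem this commit works around can be reproduced in isolation. The sketch below uses plain `json` and treats Python's `unicode_escape` codec as a rough stand-in for the JavaScript string parser's first unescaping pass; the variable names are illustrative, not Airflow's:

```python
import json

data = {"message": "hello\nworld"}

# json.dumps escapes the newline, so the resulting *text* contains a
# backslash followed by "n".
payload = json.dumps(data)

# The old template embedded this text in a JS string literal:
# JSON.parse('{{ data }}').  The JS parser consumes the backslash
# first, so JSON.parse receives a raw newline inside a string, which
# is invalid JSON.
js_sees = payload.encode().decode("unicode_escape")
try:
    json.loads(js_sees)
    naive_embedding_ok = True
except json.JSONDecodeError:
    naive_embedding_ok = False

# The work-around in this commit: double every backslash so one level
# survives the JS unescaping pass and valid JSON reaches JSON.parse.
escaped = payload.replace("\\", "\\\\")
roundtripped = escaped.encode().decode("unicode_escape")

print(naive_embedding_ok)                # False
print(json.loads(roundtripped) == data)  # True
```

As the follow-up commits in this thread show, the manual doubling was later replaced by rendering the payload with Jinja's `tojson` filter, which handles the quoting in one place.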

[airflow] 01/04: Fix tree view if config contains " (#9250)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 72af0a11b88e643ee35d74c537f10aaa9288fddb
Author: Igor Khrol 
AuthorDate: Mon Jun 15 13:08:34 2020 +0300

Fix tree view if config contains " (#9250)

If you run DAG with `{"\"": ""}` configuration tree view will be broken:
```
tree:1 Uncaught SyntaxError: Unexpected string in JSON at position 806
at JSON.parse ()
at tree?dag_id=hightlight_test_runs=25:1190
```

JSON.parse is given incorrectly escaped json string.

(cherry-picked from a8cd23c8f04c29a3bc37e9a1aec2fbdf044116e4)
---
 airflow/www_rbac/templates/airflow/tree.html | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/www_rbac/templates/airflow/tree.html 
b/airflow/www_rbac/templates/airflow/tree.html
index 68723d3..b8d167c 100644
--- a/airflow/www_rbac/templates/airflow/tree.html
+++ b/airflow/www_rbac/templates/airflow/tree.html
@@ -138,7 +138,7 @@ function populate_taskinstance_properties(node) {
 
 var devicePixelRatio = window.devicePixelRatio || 1;
 // JSON.parse is faster for large payloads than an object literal (because the 
JSON grammer is simpler!)
-var data = JSON.parse('{{ data }}');
+var data = JSON.parse({{ data|tojson }});
 var barHeight = 20;
 var axisHeight = 40;
 var square_x = parseInt(500 * devicePixelRatio);



[airflow] 04/04: Monkey patch greenlet celery pools (#8559)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d24f2c6eb5980fcf6bc52d2d06f7c92c44748854
Author: Amir Amangeldi 
AuthorDate: Sun Apr 26 12:48:11 2020 -0400

Monkey patch greenlet celery pools (#8559)

Celery pools of type eventlet and gevent use greenlets,
which requires monkey patching the app:
https://eventlet.net/doc/patching.html#monkey-patch

Otherwise task instances hang on the workers and are never
executed.

(cherry picked from commit 9d7dab4c404b2025729509b85558cf0705cb7ccb)
---
 airflow/bin/cli.py | 10 +-
 1 file changed, 9 insertions(+), 1 deletion(-)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index 4239701..eec216e 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1092,6 +1092,7 @@ def worker(args):
 
 # Celery worker
 from airflow.executors.celery_executor import app as celery_app
+from celery import maybe_patch_concurrency
 from celery.bin import worker
 
 autoscale = args.autoscale
@@ -1112,7 +1113,14 @@ def worker(args):
 }
 
 if conf.has_option("celery", "pool"):
-options["pool"] = conf.get("celery", "pool")
+pool = conf.get("celery", "pool")
+options["pool"] = pool
+# Celery pools of type eventlet and gevent use greenlets, which
+# requires monkey patching the app:
+# https://eventlet.net/doc/patching.html#monkey-patch
+# Otherwise task instances hang on the workers and are never
+# executed.
+maybe_patch_concurrency(['-P', pool])
 
 if args.daemon:
 pid, stdout, stderr, log_file = setup_locations("worker",
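The logic of `celery.maybe_patch_concurrency` used above can be sketched with a small stand-in (the function below is illustrative, not Celery's implementation): greenlet-based pools need the standard library monkey patched before anything else touches sockets or threads, which is why the call has to happen during worker startup.

```python
# Hypothetical stand-in for celery.maybe_patch_concurrency: inspect the
# argv-style -P flag and decide whether the process must be monkey
# patched for a greenlet-based pool.
def maybe_patch_concurrency(argv):
    # Find the value following '-P' in the argv-style list.
    try:
        pool = argv[argv.index("-P") + 1]
    except (ValueError, IndexError):
        return None
    if pool == "eventlet":
        # Real code would do: import eventlet; eventlet.monkey_patch()
        return "eventlet.monkey_patch()"
    if pool == "gevent":
        # Real code would do: from gevent import monkey; monkey.patch_all()
        return "gevent.monkey.patch_all()"
    return None  # prefork/solo pools need no patching

print(maybe_patch_concurrency(["-P", "eventlet"]))  # eventlet.monkey_patch()
print(maybe_patch_concurrency(["-P", "prefork"]))   # None
```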



[airflow] 03/04: Fix failing tests from #9250 (#9307)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 22c95f075c132041b1c8378490399904a32d2682
Author: Kamil Breguła 
AuthorDate: Mon Jun 15 16:23:11 2020 +0200

Fix failing tests from #9250 (#9307)

(cherry-picked from 2c18a3f)
---
 airflow/www_rbac/views.py|  2 --
 tests/www_rbac/test_views.py | 11 ++-
 2 files changed, 6 insertions(+), 7 deletions(-)

diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index 55a10e9..6ae6ef6 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -1526,8 +1526,6 @@ class Airflow(AirflowBaseView):
 
 # avoid spaces to reduce payload size
 data = htmlsafe_json_dumps(data, separators=(',', ':'))
-# escape slashes to avoid JSON parse error in JS
-data = data.replace('\\', '\\\\')
 
 return self.render_template(
 'airflow/tree.html',
diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index c668227..116d3d6 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -581,11 +581,12 @@ class TestAirflowBaseViews(TestBase):
 self.check_content_in_response('example_bash_operator', resp)
 
 @parameterized.expand([
-("hello\nworld", "hellonworld"),
-("hello'world", "hellou0027world"),
-

[GitHub] [airflow] ephraimbuddy opened a new pull request #9482: add crud endpoint for xcom

2020-06-22 Thread GitBox


ephraimbuddy opened a new pull request #9482:
URL: https://github.com/apache/airflow/pull/9482


   ---
   Closes : #9111 
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Target Github ISSUE in description if exists
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on a change in pull request #9482: add crud endpoint for xcom

2020-06-22 Thread GitBox


kaxil commented on a change in pull request #9482:
URL: https://github.com/apache/airflow/pull/9482#discussion_r443906611



##
File path: airflow/api_connexion/endpoints/xcom_endpoint.py
##
@@ -14,16 +14,45 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import List, Optional
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8134
+from flask import Response, request
+from marshmallow import ValidationError
+from sqlalchemy.orm.session import Session
 
+from airflow.api_connexion.exceptions import AlreadyExists, BadRequest, 
NotFound
+from airflow.api_connexion.schemas.xcom_schema import xcom_schema
+from airflow.models import DagRun as DR, XCom
+from airflow.utils.session import provide_session
 
-def delete_xcom_entry():
+
+@provide_session
+def delete_xcom_entry(
+dag_id: str,
+dag_run_id: str,
+task_id: str,
+xcom_key: str,
+session: Session
+) -> Response:
 """
 Delete an XCom entry
 """
-raise NotImplementedError("Not implemented yet.")
+
+dag_run = session.query(DR).filter(DR.run_id == dag_run_id,
+   DR.dag_id == dag_id).first()
+if not dag_run:
+raise NotFound(f'DAGRun with dag_id:{dag_id} and run_id:{dag_run_id} 
not found')
+
+query = session.query(XCom)
+query = query.filter(XCom.dag_id == dag_id,
+ XCom.task_id == task_id,
+ XCom.key == xcom_key)
+entry = query.delete()

Review comment:
   You can chain these SQL Queries as I have done here: 
https://github.com/apache/airflow/pull/9424
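The chaining kaxil suggests can be sketched outside Airflow. The `XCom` model below is a minimal stand-in with illustrative columns only, run against in-memory SQLite (SQLAlchemy 1.4+ import paths assumed):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class XCom(Base):
    __tablename__ = "xcom"
    id = Column(Integer, primary_key=True)
    dag_id = Column(String)
    task_id = Column(String)
    key = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([
    XCom(dag_id="d", task_id="t", key="k"),
    XCom(dag_id="d", task_id="t", key="other"),  # must survive the delete
])
session.commit()

# One chained expression instead of query = ...; query = query.filter(...)
deleted = (
    session.query(XCom)
    .filter(XCom.dag_id == "d", XCom.task_id == "t", XCom.key == "k")
    .delete()
)
print(deleted)  # 1
```

`Query.delete()` returns the number of rows matched, which is also what lets the endpoint distinguish "deleted" from "not found".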





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] vanka56 commented on pull request #9280: Functionality to shuffle HMS connections to be used by HiveMetastoreHook facilitating load balancing

2020-06-22 Thread GitBox


vanka56 commented on pull request #9280:
URL: https://github.com/apache/airflow/pull/9280#issuecomment-647857736


   Hi @ashb @turbaszek,
   Do you mind reviewing this change when you get a chance. Thank you! cc 
@Acehaidrey Acehaidrey



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] zikun opened a new pull request #9483: Correct command for starting Celery Flower

2020-06-22 Thread GitBox


zikun opened a new pull request #9483:
URL: https://github.com/apache/airflow/pull/9483


   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] tag nightly-master updated (2190e50 -> ee36142)

2020-06-22 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to tag nightly-master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


*** WARNING: tag nightly-master was modified! ***

from 2190e50  (commit)
  to ee36142  (commit)
from 2190e50  Move modules in `airflow.contrib.utils.log` to 
`airflow.utils.log` (#9395)
 add c9c0275  Disable schema  ordering (#9471)
 add 2bb40ef  Add __init__ method to Variable class (#9470)
 add 3235670  Add unit tests for OracleOperator (#9469)
 add d7de735  Move out weekday from airflow.contrib (#9388)
 add c7a454a  Add AWS ECS system test (#)
 add 7256f4c  Pylint fixes and deprecation of rare used methods in 
Connection (#9419)
 add 097180b  Remove redundant code from breeze initialization (#9375)
 add ee36142  Remove unused recurse_tasks function (#9465)

No new revisions were added by this update.

Summary of changes:
 UPDATING.md|  33 
 airflow/api_connexion/schemas/dag_run_schema.py|   5 -
 airflow/contrib/utils/weekday.py   |  41 +---
 airflow/hooks/base_hook.py |  12 +-
 airflow/models/connection.py   | 117 ++--
 airflow/models/variable.py |   7 +-
 .../amazon/aws/example_dags/example_ecs_fargate.py |  11 +-
 airflow/providers/google/cloud/hooks/cloud_sql.py  |   3 +-
 airflow/sensors/weekday_sensor.py  |   8 +-
 airflow/{contrib => }/utils/weekday.py |   1 -
 airflow/www/utils.py   |  19 --
 scripts/ci/libraries/_initialization.sh|   6 +-
 scripts/ci/pylint_todo.txt |   1 -
 tests/models/test_connection.py|  14 +-
 .../amazon/aws/operators/test_ecs_system.py|  99 ++
 tests/providers/apache/hive/hooks/test_hive.py |   2 +-
 tests/providers/apache/livy/hooks/test_livy.py |   4 +-
 .../providers/google/cloud/hooks/test_cloud_sql.py |  12 +-
 .../google/cloud/operators/test_cloud_sql.py   |   6 +-
 .../hooks => oracle/operators}/__init__.py |   0
 .../providers/oracle/operators/test_oracle.py  |  30 ++-
 tests/sensors/test_weekday_sensor.py   |   2 +-
 tests/test_project_structure.py|   1 -
 tests/test_utils/amazon_system_helpers.py  | 211 +
 tests/utils/test_weekday.py|   2 +-
 25 files changed, 527 insertions(+), 120 deletions(-)
 copy airflow/{contrib => }/utils/weekday.py (99%)
 create mode 100644 tests/providers/amazon/aws/operators/test_ecs_system.py
 copy tests/providers/{zendesk/hooks => oracle/operators}/__init__.py (100%)
 copy airflow/operators/oracle_operator.py => 
tests/providers/oracle/operators/test_oracle.py (53%)



[GitHub] [airflow] kaxil opened a new pull request #9480: Fix typo in the word "default" in www/forms.py

2020-06-22 Thread GitBox


kaxil opened a new pull request #9480:
URL: https://github.com/apache/airflow/pull/9480


   `defualt_timezone` -> `default_timezone`
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil opened a new pull request #9481: Remove redundant parentheses in /test_datacatalog.py

2020-06-22 Thread GitBox


kaxil opened a new pull request #9481:
URL: https://github.com/apache/airflow/pull/9481


   string doesn't need parentheses
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch v1-10-test updated: [AIRFLOW-6940] Improve test isolation in test_views.py (#7564)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new c80a35d  [AIRFLOW-6940] Improve test isolation in test_views.py (#7564)
c80a35d is described below

commit c80a35d462f8980f1aedcf8fb70900648379a107
Author: Kamil Breguła 
AuthorDate: Thu Feb 27 11:26:13 2020 +0100

[AIRFLOW-6940] Improve test isolation in test_views.py (#7564)
---
 tests/www_rbac/test_views.py | 48 +---
 1 file changed, 9 insertions(+), 39 deletions(-)

diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index 116d3d6..a5210f3 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -58,8 +58,9 @@ from airflow.utils.db import create_session
 from airflow.utils.state import State
 from airflow.utils.timezone import datetime
 from airflow.www_rbac import app as application
-from tests.test_utils.config import conf_vars
 from tests.compat import mock
+from tests.test_utils.config import conf_vars
+from tests.test_utils.db import clear_db_runs
 
 
 class TestBase(unittest.TestCase):
@@ -71,6 +72,10 @@ class TestBase(unittest.TestCase):
 settings.configure_orm()
 cls.session = Session
 
+@classmethod
+def tearDownClass(cls):
+clear_db_runs()
+
 def setUp(self):
 self.client = self.app.test_client()
 self.login()
@@ -379,21 +384,9 @@ class TestAirflowBaseViews(TestBase):
 super(TestAirflowBaseViews, self).setUp()
 self.logout()
 self.login()
-self.cleanup_dagruns()
+clear_db_runs()
 self.prepare_dagruns()
 
-def cleanup_dagruns(self):
-DR = models.DagRun
-dag_ids = ['example_bash_operator',
-   'example_subdag_operator',
-   'example_xcom']
-(self.session
- .query(DR)
- .filter(DR.dag_id.in_(dag_ids))
- .filter(DR.run_id == self.run_id)
- .delete(synchronize_session='fetch'))
-self.session.commit()
-
 def prepare_dagruns(self):
 self.bash_dag = self.dagbag.dags['example_bash_operator']
 self.sub_dag = self.dagbag.dags['example_subdag_operator']
@@ -1349,17 +1342,6 @@ class TestDagACLView(TestBase):
 for dag in dagbag.dags.values():
 dag.sync_to_db()
 
-def cleanup_dagruns(self):
-DR = models.DagRun
-dag_ids = ['example_bash_operator',
-   'example_subdag_operator']
-(self.session
- .query(DR)
- .filter(DR.dag_id.in_(dag_ids))
- .filter(DR.run_id == self.run_id)
- .delete(synchronize_session='fetch'))
-self.session.commit()
-
 def prepare_dagruns(self):
 dagbag = models.DagBag(include_examples=True)
 self.bash_dag = dagbag.dags['example_bash_operator']
@@ -1379,7 +1361,7 @@ class TestDagACLView(TestBase):
 
 def setUp(self):
 super(TestDagACLView, self).setUp()
-self.cleanup_dagruns()
+clear_db_runs()
 self.prepare_dagruns()
 self.logout()
 self.appbuilder.sm.sync_roles()
@@ -2658,21 +2640,9 @@ class TestDecorators(TestBase):
 super(TestDecorators, self).setUp()
 self.logout()
 self.login()
-self.cleanup_dagruns()
+clear_db_runs()
 self.prepare_dagruns()
 
-def cleanup_dagruns(self):
-DR = models.DagRun
-dag_ids = ['example_bash_operator',
-   'example_subdag_operator',
-   'example_xcom']
-(self.session
- .query(DR)
- .filter(DR.dag_id.in_(dag_ids))
- .filter(DR.run_id == self.run_id)
- .delete(synchronize_session='fetch'))
-self.session.commit()
-
 def prepare_dagruns(self):
 dagbag = models.DagBag(include_examples=True)
 self.bash_dag = dagbag.dags['example_bash_operator']
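The pattern this commit adopts is one shared helper that wipes the relevant tables, called from setUp and tearDownClass, instead of three per-class `cleanup_dagruns()` copies that each enumerate dag_ids. A stand-in sketch with a plain list in place of the dag_run table (all names here are illustrative):

```python
dag_run_table = []  # stand-in for the dag_run table

def clear_db_runs():
    """Shared test helper: remove *all* dag runs, whoever created them."""
    dag_run_table.clear()

def prepare_dagruns():
    dag_run_table.append(("example_bash_operator", "some_run_id"))

# setUp for any view test: always start from a known-empty table, so a
# run leaked by another test module cannot change this test's results.
clear_db_runs()
prepare_dagruns()
print(len(dag_run_table))  # 1

# tearDownClass mirrors it, leaving no state behind for later modules.
clear_db_runs()
print(len(dag_run_table))  # 0
```

Deleting everything rather than a hand-maintained dag_id list is what makes the isolation robust: the helper cannot drift out of sync with the fixtures.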



[GitHub] [airflow] yuqian90 commented on a change in pull request #8992: [AIRFLOW-5391] Do not re-run skipped tasks when they are cleared (#7276)

2020-06-22 Thread GitBox


yuqian90 commented on a change in pull request #8992:
URL: https://github.com/apache/airflow/pull/8992#discussion_r443886553



##
File path: tests/ti_deps/deps/test_trigger_rule_dep.py
##
@@ -19,13 +19,13 @@
 
 import unittest
 from datetime import datetime
+from unittest.mock import Mock
 
 from airflow.models import BaseOperator, TaskInstance
 from airflow.utils.trigger_rule import TriggerRule
 from airflow.ti_deps.deps.trigger_rule_dep import TriggerRuleDep
 from airflow.utils.db import create_session
 from airflow.utils.state import State

Review comment:
   Thanks for fixing this!





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[jira] [Commented] (AIRFLOW-5391) Clearing a task skipped by BranchPythonOperator will cause the task to execute

2020-06-22 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17142490#comment-17142490
 ] 

ASF GitHub Bot commented on AIRFLOW-5391:
-

yuqian90 commented on a change in pull request #8992:
URL: https://github.com/apache/airflow/pull/8992#discussion_r443886553



##
File path: tests/ti_deps/deps/test_trigger_rule_dep.py
##
@@ -19,13 +19,13 @@
 
 import unittest
 from datetime import datetime
+from unittest.mock import Mock
 
 from airflow.models import BaseOperator, TaskInstance
 from airflow.utils.trigger_rule import TriggerRule
 from airflow.ti_deps.deps.trigger_rule_dep import TriggerRuleDep
 from airflow.utils.db import create_session
 from airflow.utils.state import State

Review comment:
   Thanks for fixing this!





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Clearing a task skipped by BranchPythonOperator will cause the task to execute
> --
>
> Key: AIRFLOW-5391
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5391
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.10.4
>Reporter: Qian Yu
>Assignee: Qian Yu
>Priority: Major
> Fix For: 2.0.0
>
>
> I tried this on 1.10.3 and 1.10.4, both have this issue: 
> E.g. in this example from the doc, branch_a executed, branch_false was 
> skipped because of branching condition. However if someone Clear 
> branch_false, it'll cause branch_false to execute. 
> !https://airflow.apache.org/_images/branch_good.png!
> This behaviour is understandable given how BranchPythonOperator is 
> implemented. BranchPythonOperator does not store its decision anywhere. It 
> skips its own downstream tasks in the branch at runtime. So there's currently 
> no way for branch_false to know it should be skipped without rerunning the 
> branching task.
> This is obviously counter-intuitive from the user's perspective. In this 
> example, users would not expect branch_false to execute when they clear it 
> because the branching task should have skipped it.
> There are a few ways to improve this:
> Option 1): Make downstream tasks skipped by BranchPythonOperator not 
> clearable without also clearing the upstream BranchPythonOperator. In this 
> example, if someone clears branch_false without clearing branching, the Clear 
> action should just fail with an error telling the user he needs to clear the 
> branching task as well.
> Option 2): Make BranchPythonOperator store the result of its skip condition 
> somewhere. Make downstream tasks check for this stored decision and skip 
> themselves if they should have been skipped by the condition. This probably 
> means the decision of BranchPythonOperator needs to be stored in the db.
>  
> [kevcampb|https://blog.diffractive.io/author/kevcampb/] attempted a 
> workaround and on this blog. And he acknowledged his workaround is not 
> perfect and a better permanent fix is needed:
> [https://blog.diffractive.io/2018/08/07/replacement-shortcircuitoperator-for-airflow/]
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
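Option 2 from the description above can be sketched without Airflow: the branch operator records which branch it chose, and downstream tasks consult that record before running. Every name below is hypothetical; a real implementation would persist the decision in the database (e.g. via XCom) rather than a dict.

```python
decisions = {}  # stand-in for XCom / a db table keyed by run

def branching(run_id):
    chosen = "branch_a"         # result of the branch condition
    decisions[run_id] = chosen  # persist the decision
    return chosen

def run_task(run_id, task_id, fn):
    # Downstream check: skip if the stored decision excludes this task,
    # even when the task is cleared and re-scheduled later.
    if run_id in decisions and decisions[run_id] != task_id:
        return "skipped"
    return fn()

branching("run1")
print(run_task("run1", "branch_a", lambda: "ran"))      # ran
print(run_task("run1", "branch_false", lambda: "ran"))  # skipped
```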


[airflow] branch v1-10-test updated: Optimize count query on /home (#8729)

2020-06-22 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 256d7f6  Optimize count query on /home (#8729)
256d7f6 is described below

commit 256d7f63107ac63c7b82042024218ae0aa2fb912
Author: Kamil Breguła 
AuthorDate: Wed May 6 18:33:12 2020 +0200

Optimize count query on /home (#8729)
---
 airflow/www_rbac/views.py | 17 +++--
 1 file changed, 11 insertions(+), 6 deletions(-)

diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index 6ae6ef6..c7831c4 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -289,12 +289,22 @@ class Airflow(AirflowBaseView):
 active_dags = dags_query.filter(~DagModel.is_paused)
 paused_dags = dags_query.filter(DagModel.is_paused)
 
+is_paused_count = dict(
+all_dags.with_entities(DagModel.is_paused, 
func.count(DagModel.dag_id))
+.group_by(DagModel.is_paused).all()
+)
+status_count_active = is_paused_count.get(False, 0)
+status_count_paused = is_paused_count.get(True, 0)
+all_dags_count = status_count_active + status_count_paused
 if arg_status_filter == 'active':
 current_dags = active_dags
+num_of_all_dags = status_count_active
 elif arg_status_filter == 'paused':
 current_dags = paused_dags
+num_of_all_dags = status_count_paused
 else:
 current_dags = all_dags
+num_of_all_dags = all_dags_count
 
 dags = current_dags.order_by(DagModel.dag_id).options(
 
joinedload(DagModel.tags)).offset(start).limit(dags_per_page).all()
@@ -320,13 +330,8 @@ class Airflow(AirflowBaseView):
 filename=filename),
 "error")
 
-num_of_all_dags = current_dags.count()
 num_of_pages = int(math.ceil(num_of_all_dags / float(dags_per_page)))
 
-status_count_active = active_dags.count()
-status_count_paused = paused_dags.count()
-status_count_all = status_count_active + status_count_paused
-
 return self.render_template(
 'airflow/dags.html',
 dags=dags,
@@ -344,7 +349,7 @@ class Airflow(AirflowBaseView):
 num_runs=num_runs,
 tags=tags,
 status_filter=arg_status_filter,
-status_count_all=status_count_all,
+status_count_all=all_dags_count,
 status_count_active=status_count_active,
 status_count_paused=status_count_paused)
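The optimization above replaces three separate COUNT queries (active, paused, all) with a single GROUP BY on `is_paused`. A sketch with a minimal stand-in `DagModel` on in-memory SQLite (SQLAlchemy 1.4+ import paths assumed; the model is illustrative, not Airflow's):

```python
from sqlalchemy import Boolean, Column, String, create_engine, func
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class DagModel(Base):
    __tablename__ = "dag"
    dag_id = Column(String, primary_key=True)
    is_paused = Column(Boolean)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([
    DagModel(dag_id="a", is_paused=False),
    DagModel(dag_id="b", is_paused=False),
    DagModel(dag_id="c", is_paused=True),
])
session.commit()

# One round trip instead of three: {False: n_active, True: n_paused}
is_paused_count = dict(
    session.query(DagModel.is_paused, func.count(DagModel.dag_id))
    .group_by(DagModel.is_paused)
    .all()
)
active = is_paused_count.get(False, 0)
paused = is_paused_count.get(True, 0)
print(active, paused, active + paused)  # 2 1 3
```

The `.get(..., 0)` defaults matter: if every DAG is paused (or active), the missing group simply never appears in the result.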
 



[GitHub] [airflow] vaddisrinivas commented on issue #9474: Airflow support for S3 compatible storages

2020-06-22 Thread GitBox


vaddisrinivas commented on issue #9474:
URL: https://github.com/apache/airflow/issues/9474#issuecomment-647650974


   It doesn't work for me even as I continue to provide HOST in the connection 
and also other relevant parameters.
   If there is any alternative for the same, please help with that.
   
   Thanks.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] vaddisrinivas edited a comment on issue #9474: Airflow support for S3 compatible storages

2020-06-22 Thread GitBox


vaddisrinivas edited a comment on issue #9474:
URL: https://github.com/apache/airflow/issues/9474#issuecomment-647650974


   It doesn't work for me even as I continue to provide HOST in the connection 
and also other relevant parameters.
   If there is any alternative for the same, please help with that.
   Also @dimon222, can you please share a sample connection screenshot/ 
instruction along with configurations thus enabled to facilitate remote logging?
   Thanks.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk merged pull request #9375: Remove redundant code from breeze initialization

2020-06-22 Thread GitBox


potiuk merged pull request #9375:
URL: https://github.com/apache/airflow/pull/9375


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (7256f4c -> 097180b)

2020-06-22 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 7256f4c  Pylint fixes and deprecation of rare used methods in 
Connection (#9419)
 add 097180b  Remove redundant code from breeze initialization (#9375)

No new revisions were added by this update.

Summary of changes:
 scripts/ci/libraries/_initialization.sh | 6 +-
 1 file changed, 1 insertion(+), 5 deletions(-)



[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9475: Add extra links endpoint

2020-06-22 Thread GitBox


ephraimbuddy commented on a change in pull request #9475:
URL: https://github.com/apache/airflow/pull/9475#discussion_r443728375



##
File path: tests/api_connexion/endpoints/test_extra_link_endpoint.py
##
@@ -15,24 +15,196 @@
 # specific language governing permissions and limitations
 # under the License.
 import unittest
+from unittest import mock
+from urllib.parse import quote
 
-import pytest
-
+from airflow import DAG
+from airflow.models import XCom, BaseOperatorLink
+from airflow.models.dagrun import DagRun
+from airflow.plugins_manager import AirflowPlugin
+from airflow.providers.google.cloud.operators.bigquery import BigQueryExecuteQueryOperator
+from airflow.utils.dates import days_ago
+from airflow.utils.session import provide_session
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 from airflow.www import app
+from parameterized import parameterized
+from test_utils.mock_plugins import mock_plugin_manager
+from tests.test_utils.db import clear_db_runs, clear_db_xcom
 
 
 class TestGetExtraLinks(unittest.TestCase):
 @classmethod
 def setUpClass(cls) -> None:
 super().setUpClass()
-cls.app = app.create_app(testing=True)  # type:ignore
+with mock.patch.dict('os.environ', SKIP_DAGS_PARSING='True'):
+cls.app = app.create_app(testing=True)  # type:ignore
+
+@provide_session
+def setUp(self, session) -> None:
+self.now = datetime(2020, 1, 1)
+
+clear_db_runs()
+clear_db_xcom()
+
+self.dag = self._create_dag()
+self.app.dag_bag.dags = {self.dag.dag_id: self.dag}  # pylint: disable=no-member
+self.app.dag_bag.sync_to_db()  # pylint: disable=no-member
+
+dr = DagRun(
+dag_id=self.dag.dag_id,
+run_id="TEST_DAG_RUN_ID",
+execution_date=self.now,
+run_type=DagRunType.MANUAL.value,
+)
+session.add(dr)
+session.commit()
 
-def setUp(self) -> None:
 self.client = self.app.test_client()  # type:ignore
 
-@pytest.mark.skip(reason="Not implemented yet")
+def tearDown(self) -> None:
+super().tearDown()
+clear_db_runs()
+clear_db_xcom()
+
+@staticmethod
+def _create_dag():
+with DAG(dag_id="TEST_DAG_ID", default_args=dict(start_date=days_ago(2),)) as dag:
+BigQueryExecuteQueryOperator(task_id="TEST_SINGLE_QUERY", sql="SELECT 1")
+BigQueryExecuteQueryOperator(task_id="TEST_MULTIPLE_QUERY", sql=["SELECT 1", "SELECT 2"])
+return dag
+
+@parameterized.expand(
+[
+(
+"/api/v1/dags/INVALID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links",
+'DAG not found'
+),
+(
+"/api/v1/dags/TEST_DAG_ID/dagRuns/INVALID/taskInstances/TEST_SINGLE_QUERY/links",
+"DAG Run not found"
+),
+(
+"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/INVALID/links",
+"Task not found"
+),
+]
+)
+def test_should_response_404_on_invalid_task_id(self, url, expected_title):
+response = self.client.get(url)
+
+self.assertEqual(404, response.status_code)
+self.assertEqual({'detail': None, 'status': 404, 'title': expected_title, 'type': 'about:blank'}, response.json)
+
+@mock_plugin_manager(plugins=[])
 def test_should_response_200(self):
+XCom.set(
+key="job_id",
+value="TEST_JOB_ID",
+execution_date=self.now,
+task_id="TEST_SINGLE_QUERY",
+dag_id=self.dag.dag_id,
+)
+response = self.client.get(
+"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links"
+)
+
+self.assertEqual(200, response.status_code, response.data)
+self.assertEqual(
+{
+"BigQuery Console": "https://console.cloud.google.com/bigquery?j=TEST_JOB_ID",
+},
+response.json,
+)
+
+@mock_plugin_manager(plugins=[])
+def test_should_response_200_missing_xcom(self):
+response = self.client.get(
+"/api/v1/dags/TEST_DAG_ID/dagRuns/TEST_DAG_RUN_ID/taskInstances/TEST_SINGLE_QUERY/links"
+)
+
+self.assertEqual(200, response.status_code, response.data)
+self.assertEqual(
+{
+"BigQuery Console": None,
+},
+response.json,
+)
+
+@mock_plugin_manager(plugins=[])
+def test_should_response_200_multiple_links(self):
+XCom.set(
+key="job_id",
+value=["TEST_JOB_ID_1", "TEST_JOB_ID_2"],
+execution_date=self.now,
+task_id="TEST_MULTIPLE_QUERY",
+dag_id=self.dag.dag_id,
+)
+response = self.client.get(
+

[GitHub] [airflow] potiuk commented on pull request #9469: Add unit tests for OracleOperator

2020-06-22 Thread GitBox


potiuk commented on pull request #9469:
URL: https://github.com/apache/airflow/pull/9469#issuecomment-647341806


   Thanks @chipmyersjr ! What's next :)? 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mikaelfr commented on a change in pull request #9204: Add filtering tasks/task instances by tags

2020-06-22 Thread GitBox


mikaelfr commented on a change in pull request #9204:
URL: https://github.com/apache/airflow/pull/9204#discussion_r443388108



##
File path: airflow/models/baseoperator.py
##
@@ -359,6 +363,8 @@ def __init__(
 self.email = email
 self.email_on_retry = email_on_retry
 self.email_on_failure = email_on_failure
+self._task_tags: Optional[List[str]] = []
+self.task_tags = task_tags or []

Review comment:
   Fixed in e67fd78





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #9474: Airflow support for S3 compatible storages

2020-06-22 Thread GitBox


boring-cyborg[bot] commented on issue #9474:
URL: https://github.com/apache/airflow/issues/9474#issuecomment-647408995


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] turbaszek commented on issue #9351: Consider renaming master branch to something less offensive

2020-06-22 Thread GitBox


turbaszek commented on issue #9351:
URL: https://github.com/apache/airflow/issues/9351#issuecomment-647409405


   @ashb @potiuk who can make the change? 



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] vaddisrinivas opened a new issue #9474: Airflow support for S3 compatible storages

2020-06-22 Thread GitBox


vaddisrinivas opened a new issue #9474:
URL: https://github.com/apache/airflow/issues/9474


   Hi,
   
   Curious to know about the support for S3 compatible storages like DELL ECS, 
MINIO ETC
   
   Thanks



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] SteNicholas commented on issue #9321: Add Signal Based Scheduling To Airflow

2020-06-22 Thread GitBox


SteNicholas commented on issue #9321:
URL: https://github.com/apache/airflow/issues/9321#issuecomment-647414670


   @ruoyu90 Good point. Btw, could you please take discussion on dev mailing 
list? The thread associated with this issue is [[AIP-35] Add Signal Based 
Scheduling To 
Airflow](https://lists.apache.org/thread.html/re1a7e5cfcb1e9f4a0bfac41998da2d88ffb26d4f597036c772a4c86e%40%3Cdev.airflow.apache.org%3E).
 Thanks.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] OmairK commented on pull request #9329: Pool CRUD Endpoints

2020-06-22 Thread GitBox


OmairK commented on pull request #9329:
URL: https://github.com/apache/airflow/pull/9329#issuecomment-647432738


   > We must add more restrictions related to the default pool.
   > 
   > * DEFAULT_POOL_NAME should not be possible to delete.
   > 
   > * The name of the pool with name=DEFAULT_POOL_NAME should not be possible to 
change.
   
   Fixed `0f748af`
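   The restriction above can be sketched as a small guard in the endpoint 
layer. This is illustrative only (the function name and error messages are 
assumptions, not the actual PR code):

```python
DEFAULT_POOL_NAME = "default_pool"  # Airflow's reserved default pool


def validate_pool_change(pool_name, new_name=None, delete=False):
    """Reject deleting or renaming the default pool.

    Illustrative guard, not the actual endpoint implementation.
    """
    if pool_name == DEFAULT_POOL_NAME:
        if delete:
            raise ValueError("Default pool can't be deleted")
        if new_name is not None and new_name != DEFAULT_POOL_NAME:
            raise ValueError("Default pool name can't be modified")
    return True
```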
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] egibbm commented on a change in pull request #9349: [AIRFLOW-9347] Fix QuboleHook unable to add list to tags

2020-06-22 Thread GitBox


egibbm commented on a change in pull request #9349:
URL: https://github.com/apache/airflow/pull/9349#discussion_r443470966



##
File path: tests/test_project_structure.py
##
@@ -43,7 +43,6 @@
 'tests/providers/jenkins/hooks/test_jenkins.py',
 'tests/providers/microsoft/azure/sensors/test_azure_cosmos.py',
 'tests/providers/microsoft/mssql/hooks/test_mssql.py',
-'tests/providers/qubole/hooks/test_qubole.py',

Review comment:
   Because we added test_qubole.py in [this 
commit](https://github.com/apache/airflow/pull/9349/commits/d48ea0a167c4428c37fc2939779f11d03fef9863),
 and when the CI ran I received a message telling me to remove that line.
   
   Please correct me if I'm wrong.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] egibbm commented on a change in pull request #9349: [AIRFLOW-9347] Fix QuboleHook unable to add list to tags

2020-06-22 Thread GitBox


egibbm commented on a change in pull request #9349:
URL: https://github.com/apache/airflow/pull/9349#discussion_r443471571



##
File path: tests/test_project_structure.py
##
@@ -43,7 +43,6 @@
 'tests/providers/jenkins/hooks/test_jenkins.py',
 'tests/providers/microsoft/azure/sensors/test_azure_cosmos.py',
 'tests/providers/microsoft/mssql/hooks/test_mssql.py',
-'tests/providers/qubole/hooks/test_qubole.py',

Review comment:
   let me check. There might be a problem when merging master.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] OmairK opened a new pull request #9473: [WIP] Dag Runs CRUD endpoints

2020-06-22 Thread GitBox


OmairK opened a new pull request #9473:
URL: https://github.com/apache/airflow/pull/9473


   Closes #9110 
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Target Github ISSUE in description if exists
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on issue #9351: Consider renaming master branch to something less offensive

2020-06-22 Thread GitBox


mik-laj commented on issue #9351:
URL: https://github.com/apache/airflow/issues/9351#issuecomment-647416139


   @turbaszek This can be more problematic than we expected. See: 
https://apache-airflow.slack.com/archives/CCPRP7943/p1592582878155700



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] msumit commented on a change in pull request #9349: [AIRFLOW-9347] Fix QuboleHook unable to add list to tags

2020-06-22 Thread GitBox


msumit commented on a change in pull request #9349:
URL: https://github.com/apache/airflow/pull/9349#discussion_r443468608



##
File path: tests/test_project_structure.py
##
@@ -43,7 +43,6 @@
 'tests/providers/jenkins/hooks/test_jenkins.py',
 'tests/providers/microsoft/azure/sensors/test_azure_cosmos.py',
 'tests/providers/microsoft/mssql/hooks/test_mssql.py',
-'tests/providers/qubole/hooks/test_qubole.py',

Review comment:
   why this file has been removed?





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj commented on a change in pull request #9349: [AIRFLOW-9347] Fix QuboleHook unable to add list to tags

2020-06-22 Thread GitBox


mik-laj commented on a change in pull request #9349:
URL: https://github.com/apache/airflow/pull/9349#discussion_r443472316



##
File path: tests/test_project_structure.py
##
@@ -43,7 +43,6 @@
 'tests/providers/jenkins/hooks/test_jenkins.py',
 'tests/providers/microsoft/azure/sensors/test_azure_cosmos.py',
 'tests/providers/microsoft/mssql/hooks/test_mssql.py',
-'tests/providers/qubole/hooks/test_qubole.py',

Review comment:
   This is a test that checks for missing test files. The test file was 
added in this change, so we need to update this list to reflect the current 
state.  
   
   This test was added to inform contributors about the project rules as soon 
as possible, so no reviewer has to tell users that we need unit tests.
   
   It also simplifies the work of project maintainers, because it provides easy 
insight into project defects, which allows engaging other contributors to these 
tasks. More info: https://github.com/apache/airflow/issues/8278
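   The check described here can be sketched as a simple path mapping (the 
mapping rule and paths are illustrative, not the exact logic of 
tests/test_project_structure.py):

```python
import os


def expected_test_path(source_path):
    """Map a source module to the test file a structure check would expect.

    E.g. airflow/providers/oracle/operators/oracle.py ->
    tests/providers/oracle/operators/test_oracle.py (illustrative rule).
    """
    rel = os.path.relpath(source_path, "airflow")
    head, tail = os.path.split(rel)
    return os.path.join("tests", head, "test_" + tail)


print(expected_test_path("airflow/providers/oracle/operators/oracle.py"))
```

   Modules whose expected test file is missing would then be reported, minus 
an explicit whitelist like the one edited in this PR.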





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] egibbm commented on a change in pull request #9349: [AIRFLOW-9347] Fix QuboleHook unable to add list to tags

2020-06-22 Thread GitBox


egibbm commented on a change in pull request #9349:
URL: https://github.com/apache/airflow/pull/9349#discussion_r443474198



##
File path: tests/test_project_structure.py
##
@@ -43,7 +43,6 @@
 'tests/providers/jenkins/hooks/test_jenkins.py',
 'tests/providers/microsoft/azure/sensors/test_azure_cosmos.py',
 'tests/providers/microsoft/mssql/hooks/test_mssql.py',
-'tests/providers/qubole/hooks/test_qubole.py',

Review comment:
   This is the reason why we remove the line:
   
![image](https://user-images.githubusercontent.com/39547572/85279432-98431b80-b4b0-11ea-859b-acdd29f34b87.png)
   
   https://github.com/apache/airflow/pull/9349/checks?check_run_id=781002396
   
   I hope this clears up the issue.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] chattarajoy commented on a change in pull request #9349: [AIRFLOW-9347] Fix QuboleHook unable to add list to tags

2020-06-22 Thread GitBox


chattarajoy commented on a change in pull request #9349:
URL: https://github.com/apache/airflow/pull/9349#discussion_r443417306



##
File path: tests/test_project_structure.py
##
@@ -44,7 +44,6 @@
 'tests/providers/microsoft/azure/sensors/test_azure_cosmos.py',
 'tests/providers/microsoft/mssql/hooks/test_mssql.py',
 'tests/providers/oracle/operators/test_oracle.py',
-'tests/providers/qubole/hooks/test_qubole.py',

Review comment:
   Why is this removed? I see you've added the file





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mik-laj merged pull request #9419: Pylint fixes and deprecation of rare used methods in Connection

2020-06-22 Thread GitBox


mik-laj merged pull request #9419:
URL: https://github.com/apache/airflow/pull/9419


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (c7a454a -> 7256f4c)

2020-06-22 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from c7a454a  Add AWS ECS system test (#)
 add 7256f4c  Pylint fixes and deprecation of rare used methods in 
Connection (#9419)

No new revisions were added by this update.

Summary of changes:
 UPDATING.md|  26 +
 airflow/hooks/base_hook.py |  12 ++-
 airflow/models/connection.py   | 117 ++---
 airflow/providers/google/cloud/hooks/cloud_sql.py  |   3 +-
 scripts/ci/pylint_todo.txt |   1 -
 tests/models/test_connection.py|  14 ++-
 tests/providers/apache/hive/hooks/test_hive.py |   2 +-
 tests/providers/apache/livy/hooks/test_livy.py |   4 +-
 .../providers/google/cloud/hooks/test_cloud_sql.py |  12 +--
 .../google/cloud/operators/test_cloud_sql.py   |   6 +-
 10 files changed, 162 insertions(+), 35 deletions(-)



[GitHub] [airflow] mik-laj commented on pull request #9419: Pylint fixes and deprecation of rare used methods in Connection

2020-06-22 Thread GitBox


mik-laj commented on pull request #9419:
URL: https://github.com/apache/airflow/pull/9419#issuecomment-647377021


   I added a backward-compatibility method that emits a warning but suggests no 
alternative, because this has always been an internal detail and should not be 
used externally.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (2190e50 -> c9c0275)

2020-06-22 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 2190e50  Move modules in `airflow.contrib.utils.log` to 
`airflow.utils.log` (#9395)
 add c9c0275  Disable schema  ordering (#9471)

No new revisions were added by this update.

Summary of changes:
 airflow/api_connexion/schemas/dag_run_schema.py | 5 -
 1 file changed, 5 deletions(-)



[GitHub] [airflow] chattarajoy commented on pull request #9349: [AIRFLOW-9347] Fix QuboleHook unable to add list to tags

2020-06-22 Thread GitBox


chattarajoy commented on pull request #9349:
URL: https://github.com/apache/airflow/pull/9349#issuecomment-647346942


   thanks for taking this up @egibbm . LGTM



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated: Add AWS ECS system test (#8888)

2020-06-22 Thread feluelle
This is an automated email from the ASF dual-hosted git repository.

feluelle pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new c7a454a  Add AWS ECS system test (#)
c7a454a is described below

commit c7a454aa32bf33133d042e8438ac259b32144b21
Author: Mustafa Gök 
AuthorDate: Mon Jun 22 11:18:13 2020 +0300

Add AWS ECS system test (#)
---
 .../amazon/aws/example_dags/example_ecs_fargate.py |  11 +-
 .../amazon/aws/operators/test_ecs_system.py|  99 ++
 tests/test_utils/amazon_system_helpers.py  | 211 +
 3 files changed, 316 insertions(+), 5 deletions(-)

diff --git a/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py 
b/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py
index be55e67..4c75d8f 100644
--- a/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py
+++ b/airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py
@@ -23,6 +23,7 @@ It overrides the command in the `hello-world-container` container.
 """
 
 import datetime
+import os
 
 from airflow import DAG
 from airflow.providers.amazon.aws.operators.ecs import ECSOperator
@@ -50,7 +51,7 @@ dag.doc_md = __doc__
 hello_world = ECSOperator(
 task_id="hello_world",
 dag=dag,
-aws_conn_id="aws_default",
+aws_conn_id="aws_ecs",
 cluster="c",
 task_definition="hello-world",
 launch_type="FARGATE",
@@ -64,8 +65,8 @@ hello_world = ECSOperator(
 },
 network_configuration={
 "awsvpcConfiguration": {
-"securityGroups": ["sg-123abc"],
-"subnets": ["subnet-123456ab"],
+"securityGroups": [os.environ.get("SECURITY_GROUP_ID", "sg-123abc")],
+"subnets": [os.environ.get("SUBNET_ID", "subnet-123456ab")],
 },
 },
 tags={
@@ -75,7 +76,7 @@ hello_world = ECSOperator(
 "Version": "0.0.1",
 "Environment": "Development",
 },
-awslogs_group="/ecs_logs/group_a",
-awslogs_stream_prefix="prefix_b",
+awslogs_group="/ecs/hello-world",
+awslogs_stream_prefix="prefix_b/hello-world-container",  # prefix with container name
 )
 # [END howto_operator_ecs]
diff --git a/tests/providers/amazon/aws/operators/test_ecs_system.py 
b/tests/providers/amazon/aws/operators/test_ecs_system.py
new file mode 100644
index 000..1a6eec7
--- /dev/null
+++ b/tests/providers/amazon/aws/operators/test_ecs_system.py
@@ -0,0 +1,99 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import pytest
+
+from tests.test_utils.amazon_system_helpers import AWS_DAG_FOLDER, AmazonSystemTest
+
+
+@pytest.mark.backend("postgres", "mysql")
+class ECSSystemTest(AmazonSystemTest):
+"""
+ECS System Test to run and test example ECS dags
+
+Required variables.env file content (from your account):
+# Auto-export all variables
+set -a
+
+# aws parameters
+REGION_NAME="eu-west-1"
+REGISTRY_ID="123456789012"
+IMAGE="alpine:3.9"
+SUBNET_ID="subnet-068e9654a3c357a"
+SECURITY_GROUP_ID="sg-054dc69874a651"
+EXECUTION_ROLE_ARN="arn:aws:iam::123456789012:role/FooBarRole"
+
+# remove all created/existing resources flag
+# comment out to keep resources or use empty string
+# REMOVE_RESOURCES="True"
+"""
+
+# should be same as in the example dag
+aws_conn_id = "aws_ecs"
+cluster = "c"
+task_definition = "hello-world"
+container = "hello-world-container"
+awslogs_group = "/ecs/hello-world"
+awslogs_stream_prefix = "prefix_b"  # only prefix without container name
+
+@classmethod
+def setup_class(cls):
+cls.create_connection(
+aws_conn_id=cls.aws_conn_id,
+region=cls._region_name(),
+)
+
+# create ecs cluster if it does not exist
+cls.create_ecs_cluster(
+aws_conn_id=cls.aws_conn_id,
+cluster_name=cls.cluster,
+)
+
+# create task_definition if it does not exist
+task_definition_exists = cls.is_ecs_task_definition_exists(
+aws_conn_id=cls.aws_conn_id,
+

[jira] [Commented] (AIRFLOW-4512) New trigger rule for tasks - At least one successful

2020-06-22 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17141808#comment-17141808
 ] 

ASF GitHub Bot commented on AIRFLOW-4512:
-

jledru commented on pull request #5288:
URL: https://github.com/apache/airflow/pull/5288#issuecomment-647362242


   Hello,
   Is there any way to achieve the goal of atleast_one_success?
   I need this behaviour in one of my DAGs.
   Thanks,



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> New trigger rule for tasks - At least one successful
> 
>
> Key: AIRFLOW-4512
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4512
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: core, dependencies
>Affects Versions: 1.10.3
>Reporter: Bharath Palaksha
>Assignee: Bharath Palaksha
>Priority: Minor
>   Original Estimate: 336h
>  Remaining Estimate: 336h
>
> {{New trigger rule - *atleast_one_success*}}
> {{Trigger rules help in defining better dependencies on upstream tasks. 
> There are a good number of rules currently defined, listed below}}
>  * {{all_success}}: (default) all parents have succeeded
>  * {{all_failed}}: all parents are in a {{failed}} or {{upstream_failed}} 
> state
>  * {{all_done}}: all parents are done with their execution
>  * {{one_failed}}: fires as soon as at least one parent has failed, it does 
> not wait for all parents to be done
>  * {{one_success}}: fires as soon as at least one parent succeeds, it does 
> not wait for all parents to be done
>  * {{none_failed}}: all parents have not failed ({{failed}} or 
> {{upstream_failed}}) i.e. all parents have succeeded or been skipped
>  * {{none_skipped}}: no parent is in a {{skipped}} state, i.e. all parents 
> are in a {{success}}, {{failed}}, or {{upstream_failed}} state
>  * {{dummy}}: dependencies are just for show, trigger at will
>  
> Another rule can be added here: *atleast_one_success*. This rule 
> waits for all parent tasks to complete and triggers the current task if at 
> least one parent succeeded. It differs from one_success in that it waits 
> for all parents to be done. 
> Consider a very common scenario in data pipelines where you have a number of 
> parallel tasks generating some data. As a downstream task to all these 
> generate tasks, you have a task that collates all the data into one collection; 
> it has to run if any of the upstream generate tasks succeeds, and it also has 
> to wait for all of them to be done. one_success can't be used, as it doesn't 
> wait for the other tasks.
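The readiness check the proposal describes can be sketched in a few lines 
(state names mirror Airflow's task-instance states; this is an illustrative 
evaluation function, not scheduler code):

```python
def atleast_one_success_ready(parent_states):
    """Fire only when every parent is finished AND at least one succeeded.

    Unlike one_success, this waits for all parents (illustrative sketch).
    """
    done = {"success", "failed", "upstream_failed", "skipped"}
    if not all(state in done for state in parent_states):
        return False  # some parent still running/queued: keep waiting
    return any(state == "success" for state in parent_states)


print(atleast_one_success_ready(["success", "failed"]))  # parents done, one succeeded
```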



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] feluelle merged pull request #8888: Add AWS ECS system test

2020-06-22 Thread GitBox


feluelle merged pull request #:
URL: https://github.com/apache/airflow/pull/


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] CodingJonas commented on issue #9310: Delay until Docker Swarm recognizes finished service

2020-06-22 Thread GitBox


CodingJonas commented on issue #9310:
URL: https://github.com/apache/airflow/issues/9310#issuecomment-647367141


   I'm currently using the workaround with a subprocess without issues. Not 
saying it is perfect, but I can make a pull request out of it, since it seems 
to me like a sufficient solution.



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk merged pull request #9471: Disable schema ordering

2020-06-22 Thread GitBox


potiuk merged pull request #9471:
URL: https://github.com/apache/airflow/pull/9471


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk merged pull request #9469: Add unit tests for OracleOperator

2020-06-22 Thread GitBox


potiuk merged pull request #9469:
URL: https://github.com/apache/airflow/pull/9469


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated: Add unit tests for OracleOperator (#9469)

2020-06-22 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 3235670  Add unit tests for OracleOperator (#9469)
3235670 is described below

commit 3235670b058681c896ac107e13b5218e9ca930c7
Author: chipmyersjr 
AuthorDate: Mon Jun 22 00:34:34 2020 -0700

Add unit tests for OracleOperator (#9469)
---
 tests/providers/oracle/operators/__init__.py| 16 ++
 tests/providers/oracle/operators/test_oracle.py | 40 +
 tests/test_project_structure.py |  1 -
 3 files changed, 56 insertions(+), 1 deletion(-)

diff --git a/tests/providers/oracle/operators/__init__.py 
b/tests/providers/oracle/operators/__init__.py
new file mode 100644
index 000..13a8339
--- /dev/null
+++ b/tests/providers/oracle/operators/__init__.py
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
diff --git a/tests/providers/oracle/operators/test_oracle.py b/tests/providers/oracle/operators/test_oracle.py
new file mode 100644
index 000..729cced
--- /dev/null
+++ b/tests/providers/oracle/operators/test_oracle.py
@@ -0,0 +1,40 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import unittest
+
+import mock
+
+from airflow.providers.oracle.hooks.oracle import OracleHook
+from airflow.providers.oracle.operators.oracle import OracleOperator
+
+
+class TestOracleOperator(unittest.TestCase):
+    @mock.patch.object(OracleHook, 'run')
+    def test_execute(self, mock_run):
+        sql = 'SELECT * FROM test_table'
+        oracle_conn_id = 'oracle_default'
+        parameters = {'parameter': 'value'}
+        autocommit = False
+        context = "test_context"
+        task_id = "test_task_id"
+
+        operator = OracleOperator(sql=sql, oracle_conn_id=oracle_conn_id, parameters=parameters,
+                                  autocommit=autocommit, task_id=task_id)
+        operator.execute(context=context)
+
+        mock_run.assert_called_once_with(sql, autocommit=autocommit, parameters=parameters)
diff --git a/tests/test_project_structure.py b/tests/test_project_structure.py
index fa693ff..b30fd1a 100644
--- a/tests/test_project_structure.py
+++ b/tests/test_project_structure.py
@@ -43,7 +43,6 @@ MISSING_TEST_FILES = {
 'tests/providers/jenkins/hooks/test_jenkins.py',
 'tests/providers/microsoft/azure/sensors/test_azure_cosmos.py',
 'tests/providers/microsoft/mssql/hooks/test_mssql.py',
-'tests/providers/oracle/operators/test_oracle.py',
 'tests/providers/qubole/hooks/test_qubole.py',
 'tests/providers/samba/hooks/test_samba.py',
 'tests/providers/yandex/hooks/test_yandex.py'
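The test in the commit above relies on `mock.patch.object` to intercept `OracleHook.run`, so `execute` can be exercised without a database. The same pattern can be sketched in isolation with only the standard library; the `Hook` and `Operator` classes below are hypothetical stand-ins, not Airflow's actual classes:

```python
import unittest
from unittest import mock


class Hook:
    """Hypothetical stand-in for OracleHook."""
    def run(self, sql, autocommit=False, parameters=None):
        raise RuntimeError("would hit a real database")


class Operator:
    """Hypothetical stand-in for OracleOperator."""
    def __init__(self, sql, parameters=None, autocommit=False):
        self.sql = sql
        self.parameters = parameters
        self.autocommit = autocommit

    def execute(self):
        # Delegates to the hook; this call is what the patched test verifies.
        Hook().run(self.sql, autocommit=self.autocommit, parameters=self.parameters)


class TestOperator(unittest.TestCase):
    @mock.patch.object(Hook, "run")
    def test_execute(self, mock_run):
        op = Operator(sql="SELECT 1 FROM dual", parameters={"p": 1})
        op.execute()  # no database touched: Hook.run is now a MagicMock
        mock_run.assert_called_once_with(
            "SELECT 1 FROM dual", autocommit=False, parameters={"p": 1}
        )
```

Because the patch is applied at class level, the mock records exactly the arguments the operator forwarded, which is all this kind of delegation test needs to assert.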



[GitHub] [airflow] boring-cyborg[bot] commented on pull request #9469: Add unit tests for OracleOperator

2020-06-22 Thread GitBox


boring-cyborg[bot] commented on pull request #9469:
URL: https://github.com/apache/airflow/pull/9469#issuecomment-647341461


   Awesome work, congrats on your first merged pull request!
   







[GitHub] [airflow] jledru commented on pull request #5288: AIRFLOW-4512 - New trigger rule for tasks - At least one successful

2020-06-22 Thread GitBox


jledru commented on pull request #5288:
URL: https://github.com/apache/airflow/pull/5288#issuecomment-647362242


   Hello,
   Is there any way to achieve at-least-one-success behaviour in the meantime?
   I need this behaviour in one of my DAGs.
   Thanks,





