[jira] [Created] (AIRFLOW-2148) getting error "Field 'execution_date' doesn't have a default value" with airflow using celery/local executor

2018-02-25 Thread Amit Prashant Agrahari (JIRA)
Amit Prashant Agrahari created AIRFLOW-2148:
---

 Summary: getting error "Field 'execution_date' doesn't have a 
default value"  with airflow using celery/local executor 
 Key: AIRFLOW-2148
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2148
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Amit Prashant Agrahari


[2018-02-26 05:55:07,702] {base_task_runner.py:98} INFO - Subtask: 
sqlalchemy.exc.InvalidRequestError: This Session's transaction has been rolled 
back due to a previous exception during flush. To begin a new transaction with 
this Session, first issue Session.rollback(). Original exception was: 
(pymysql.err.InternalError) (1364, "Field 'execution_date' doesn't have a 
default value") [SQL: 'INSERT INTO task_instance (task_id, dag_id, start_date, 
end_date, duration, state, try_number, max_tries, hostname, unixname, job_id, 
pool, queue, priority_weight, operator, queued_dttm, pid) VALUES (%(task_id)s, 
%(dag_id)s, %(start_date)s, %(end_date)s, %(duration)s, %(state)s, 
%(try_number)s, %(max_tries)s, %(hostname)s, %(unixname)s, %(job_id)s, 
%(pool)s, %(queue)s, %(priority_weight)s, %(operator)s, %(queued_dttm)s, 
%(pid)s)'] [parameters: {'job_id': None, 'queue': 'default', 'pid': None, 
'max_tries': 0, 'dag_id': 'example_trigger_target_dag', 'try_number': 0, 
'priority_weight': 4, 'duration': None, 'task_id': 'validation_this', 
'operator': None, 'pool': None, 'state': None, 'hostname': '', 'start_date': 
None, 'unixname': 'root', 'queued_dttm': None, 'end_date': None}] (Background 
on this error at: http://sqlalche.me/e/2j85)

[2018-02-26 05:55:07,655] {base_task_runner.py:98} INFO - Subtask: 
sqlalchemy.exc.IntegrityError: (pymysql.err.IntegrityError) (1062, "Duplicate 
entry 'example_trigger_target_dag-2018-02-26 05:55:07.00' for key 
'dag_id'") [SQL: 'INSERT INTO dag_run (dag_id, execution_date, start_date, 
end_date, state, run_id, external_trigger, conf) VALUES (%(dag_id)s, now(), 
now(), %(end_date)s, %(state)s, %(run_id)s, %(external_trigger)s, %(conf)s)'] 
[parameters: {'dag_id': 'example_trigger_target_dag', 'end_date': None, 
'state': 'running', 'conf': 
b'\x80\x04\x95\x1f\x00\x00\x00\x00\x00\x00\x00}\x94\x8c\x07message\x94\x8c\x0eHello
 World 98\x94s.', 'run_id': 'trig__2018-02-26T05:55:07.485541', 
'external_trigger': 1}] (Background on this error at: http://sqlalche.me/e/gkpj)
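For reference, the first traceback's failure mode can be reproduced outside Airflow: a NOT NULL column with no default that is omitted from the INSERT column list. A minimal sketch using sqlite3 and a stand-in table (not the real Airflow schema):

```python
import sqlite3

# Stand-in table (NOT the real Airflow schema): execution_date is
# NOT NULL with no default, mirroring the MySQL column in the error above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE task_instance (
        task_id TEXT,
        dag_id TEXT,
        execution_date TEXT NOT NULL
    )
""")

# As in the logged INSERT, execution_date is missing from the column
# list, so the row cannot satisfy the NOT NULL constraint.
error = None
try:
    conn.execute(
        "INSERT INTO task_instance (task_id, dag_id) VALUES (?, ?)",
        ("validation_this", "example_trigger_target_dag"),
    )
except sqlite3.IntegrityError as exc:
    error = exc
print(error)  # NOT NULL constraint failed: task_instance.execution_date
```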



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-2097) UnboundLocalError: local variable 'tz' referenced before assignment

2018-02-25 Thread Tao Feng (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2097?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng reassigned AIRFLOW-2097:
-

Assignee: Tao Feng

> UnboundLocalError: local variable 'tz' referenced before assignment
> ---
>
> Key: AIRFLOW-2097
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2097
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: utils
>Reporter: Bryce Drennnan
>Assignee: Tao Feng
>Priority: Minor
>
> The date_range function references the variable tz before it's assigned.  I 
> noticed this while running doctests.
> See this part of the code:
> [https://github.com/apache/incubator-airflow/blob/15b8a36b9011166b06f176f684b71703a4aebddd/airflow/utils/dates.py#L73-L84]
> I believe this bug was introduced here:
> https://github.com/apache/incubator-airflow/commit/518a41acf319af27d49bdc0c84bda64b6b8af0b3#commitcomment-27433613
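The pattern behind the error can be sketched without Airflow (the function below is an illustrative stand-in, not the real date_range): a variable assigned only inside a conditional branch but read unconditionally afterwards.

```python
from datetime import datetime, timezone

# Illustrative stand-in for the bug in airflow.utils.dates.date_range:
# tz is only bound inside the `if` branch, but read after it.
def date_range_buggy(start_date):
    if start_date.tzinfo:
        tz = start_date.tzinfo  # only assignment of tz
    # ... date arithmetic elided ...
    return tz  # UnboundLocalError for naive datetimes

error = None
try:
    date_range_buggy(datetime(2018, 2, 25))  # naive: branch not taken
except UnboundLocalError as exc:
    error = exc
print(error)

# The fix is to bind tz before the branch:
def date_range_fixed(start_date):
    tz = None
    if start_date.tzinfo:
        tz = start_date.tzinfo
    return tz

print(date_range_fixed(datetime(2018, 2, 25, tzinfo=timezone.utc)))
```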



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-2109) scheduler stopped picking up jobs suddenly

2018-02-25 Thread rahul (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

rahul reassigned AIRFLOW-2109:
--

Assignee: leon pang  (was: Achille Fokoue)

> scheduler stopped picking up jobs suddenly
> --
>
> Key: AIRFLOW-2109
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2109
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: Airflow 1.8
> Environment: aws EC2 server
>Reporter: rahul
>Assignee: leon pang
>Priority: Major
>
> Hi,
>  
> The scheduler stopped picking up jobs suddenly. When I ran {{airflow resetdb}}, 
> it started picking them up properly again.
>  
> Could you please let me know why {{airflow resetdb}} resolved this 
> issue?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-2146) Initialize default Google BigQuery Connection with valid conn_type & Fix broken DBApiHook

2018-02-25 Thread Kaxil Naik (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2146?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik reassigned AIRFLOW-2146:
---

Assignee: Kaxil Naik

> Initialize default Google BigQuery Connection with valid conn_type & Fix 
> broken DBApiHook
> -
>
> Key: AIRFLOW-2146
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2146
> Project: Apache Airflow
>  Issue Type: Task
>  Components: contrib, gcp
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 1.10.0
>
>
> `airflow initdb` creates a connection with conn_id='bigquery_default' and 
> conn_type='bigquery'. However, bigquery is not a valid conn_type, according 
> to models.Connection._types, and BigQuery connections should use the 
> google_cloud_platform conn_type.
> Also, as [renanleme|https://github.com/renanleme] mentioned 
> [here|https://github.com/apache/incubator-airflow/pull/3031#issuecomment-368132910],
>  the DAGs he created break when using `get_records()` from 
> BigQueryHook, which extends DbApiHook.
> *Error Log*:
> {code}
> Traceback (most recent call last):
>   File "/src/apache-airflow/airflow/models.py", line 1519, in _run_raw_task
> result = task_copy.execute(context=context)
>   File "/airflow/dags/lib/operators/test_operator.py", line 21, in execute
> records = self._get_db_hook(self.source_conn_id).get_records(self.sql)
>   File "/src/apache-airflow/airflow/hooks/base_hook.py", line 92, in 
> get_records
> raise NotImplementedError()
> {code}
> *Dag*:
> {code:python}
> from datetime import datetime
>
> from airflow import DAG
> from lib.operators.test_operator import TestOperator
>
> default_args = {
>     'depends_on_past': False,
>     'start_date': datetime(2018, 2, 21),
> }
>
> dag = DAG(
>     'test_dag',
>     default_args=default_args,
>     schedule_interval='0 6 * * *'
> )
>
> sql = '''
> SELECT id from YOUR_BIGQUERY_TABLE limit 10
> '''
>
> compare_grouped_event = TestOperator(
>     task_id='test_operator',
>     source_conn_id='gcp_airflow',
>     sql=sql,
>     dag=dag
> )
> {code}
> *Operator*:
> {code:python}
> from airflow.hooks.base_hook import BaseHook
> from airflow.models import BaseOperator
> from airflow.utils.decorators import apply_defaults
>
>
> class TestOperator(BaseOperator):
>     @apply_defaults
>     def __init__(
>             self,
>             sql,
>             source_conn_id=None,
>             *args, **kwargs):
>         super(TestOperator, self).__init__(*args, **kwargs)
>         self.sql = sql
>         self.source_conn_id = source_conn_id
>
>     def execute(self, context=None):
>         records = self._get_db_hook(self.source_conn_id).get_records(self.sql)
>         self.log.info('Fetched records from source')
>
>     @staticmethod
>     def _get_db_hook(conn_id):
>         return BaseHook.get_hook(conn_id=conn_id)
> {code}
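The NotImplementedError in the log follows from dispatch on conn_type: an unrecognized type falls through to the abstract base hook. A toy sketch of that failure mode (the class names and dispatch table below are stand-ins, not Airflow's real code):

```python
# Toy stand-ins for the dispatch described in the issue; not Airflow code.
class StubBaseHook:
    def get_records(self, sql):
        raise NotImplementedError()  # abstract, as in the traceback above

class StubGoogleCloudHook(StubBaseHook):
    def get_records(self, sql):
        return [('row-1',)]  # pretend BigQuery result

HOOKS = {'google_cloud_platform': StubGoogleCloudHook}

def get_hook(conn_type):
    # An unknown conn_type (e.g. the seeded 'bigquery') yields the
    # abstract base hook instead of a working one.
    return HOOKS.get(conn_type, StubBaseHook)()

failed = False
try:
    get_hook('bigquery').get_records('SELECT 1')  # seeded, invalid conn_type
except NotImplementedError:
    failed = True

rows = get_hook('google_cloud_platform').get_records('SELECT 1')  # valid type
print(failed, rows)  # True [('row-1',)]
```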



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-2146) Initialize default Google BigQuery Connection with valid conn_type & Fix broken DBApiHook

2018-02-25 Thread Anonymous (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2146?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anonymous reassigned AIRFLOW-2146:
--

Assignee: (was: Kaxil Naik)

> Initialize default Google BigQuery Connection with valid conn_type & Fix 
> broken DBApiHook
> -
>
> Key: AIRFLOW-2146
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2146
> Project: Apache Airflow
>  Issue Type: Task
>  Components: contrib, gcp
>Reporter: Kaxil Naik
>Priority: Major
> Fix For: 1.10.0
>
>
> `airflow initdb` creates a connection with conn_id='bigquery_default' and 
> conn_type='bigquery'. However, bigquery is not a valid conn_type, according 
> to models.Connection._types, and BigQuery connections should use the 
> google_cloud_platform conn_type.
> Also, as [renanleme|https://github.com/renanleme] mentioned 
> [here|https://github.com/apache/incubator-airflow/pull/3031#issuecomment-368132910],
>  the DAGs he created break when using `get_records()` from 
> BigQueryHook, which extends DbApiHook.
> *Error Log*:
> {code}
> Traceback (most recent call last):
>   File "/src/apache-airflow/airflow/models.py", line 1519, in _run_raw_task
> result = task_copy.execute(context=context)
>   File "/airflow/dags/lib/operators/test_operator.py", line 21, in execute
> records = self._get_db_hook(self.source_conn_id).get_records(self.sql)
>   File "/src/apache-airflow/airflow/hooks/base_hook.py", line 92, in 
> get_records
> raise NotImplementedError()
> {code}
> *Dag*:
> {code:python}
> from datetime import datetime
>
> from airflow import DAG
> from lib.operators.test_operator import TestOperator
>
> default_args = {
>     'depends_on_past': False,
>     'start_date': datetime(2018, 2, 21),
> }
>
> dag = DAG(
>     'test_dag',
>     default_args=default_args,
>     schedule_interval='0 6 * * *'
> )
>
> sql = '''
> SELECT id from YOUR_BIGQUERY_TABLE limit 10
> '''
>
> compare_grouped_event = TestOperator(
>     task_id='test_operator',
>     source_conn_id='gcp_airflow',
>     sql=sql,
>     dag=dag
> )
> {code}
> *Operator*:
> {code:python}
> from airflow.hooks.base_hook import BaseHook
> from airflow.models import BaseOperator
> from airflow.utils.decorators import apply_defaults
>
>
> class TestOperator(BaseOperator):
>     @apply_defaults
>     def __init__(
>             self,
>             sql,
>             source_conn_id=None,
>             *args, **kwargs):
>         super(TestOperator, self).__init__(*args, **kwargs)
>         self.sql = sql
>         self.source_conn_id = source_conn_id
>
>     def execute(self, context=None):
>         records = self._get_db_hook(self.source_conn_id).get_records(self.sql)
>         self.log.info('Fetched records from source')
>
>     @staticmethod
>     def _get_db_hook(conn_id):
>         return BaseHook.get_hook(conn_id=conn_id)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-2034) mixup between %s and {} when using str.format

2018-02-25 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2034?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2034.
-
   Resolution: Fixed
Fix Version/s: (was: 1.9.0)
   1.10.0

Issue resolved by pull request #2976
[https://github.com/apache/incubator-airflow/pull/2976]

> mixup between %s and {} when using str.format
> -
>
> Key: AIRFLOW-2034
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2034
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: 1.9.0
>Reporter: knil-sama
>Assignee: knil-sama
>Priority: Trivial
>  Labels: easyfix
> Fix For: 1.10.0
>
>
> The convention is to use {{.format}} for string formatting outside of logging, 
> and lazy %-formatting inside logging calls.
>  See the comment in the related PR:
>  [https://github.com/apache/incubator-airflow/pull/2823/files]
> But some code didn't implement it correctly.
> Problematic cases can be identified using the following command line:
> {{grep -r '%s' ./* | grep '\.format('}}
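The mixup is easy to see in isolation: str.format leaves %-style placeholders untouched, so the substituted value silently never appears in the message.

```python
rc_error = 'invalid_auth'  # illustrative value

# Buggy: a %-style placeholder given to str.format is left as-is.
broken = "Slack API call failed (%s)".format(rc_error)
# Fixed: {} is the placeholder str.format actually substitutes.
fixed = "Slack API call failed ({})".format(rc_error)

print(broken)  # Slack API call failed (%s)
print(fixed)   # Slack API call failed (invalid_auth)
```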



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2034) mixup between %s and {} when using str.format

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2034?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376056#comment-16376056
 ] 

ASF subversion and git services commented on AIRFLOW-2034:
--

Commit 6efe2e3ce050264b580d11f3cffb109e8aef5bbd in incubator-airflow's branch 
refs/heads/master from [~knil-sama]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=6efe2e3 ]

[AIRFLOW-2034] Fix mixup between %s and {} when using str.format
Convention is to use .format for string formatting outside logging, else use
lazy formatting.
See comment in related PR:
https://github.com/apache/incubator-airflow/pull/2823/files
Identified problematic cases using the following command line:
`grep -r '%s' ./* | grep '\.format('`

Closes #2976 from knil-sama/fix-mixup-format-str


> mixup between %s and {} when using str.format
> -
>
> Key: AIRFLOW-2034
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2034
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: 1.9.0
>Reporter: knil-sama
>Assignee: knil-sama
>Priority: Trivial
>  Labels: easyfix
> Fix For: 1.9.0
>
>
> The convention is to use {{.format}} for string formatting outside of logging, 
> and lazy %-formatting inside logging calls.
>  See the comment in the related PR:
>  [https://github.com/apache/incubator-airflow/pull/2823/files]
> But some code didn't implement it correctly.
> Problematic cases can be identified using the following command line:
> {{grep -r '%s' ./* | grep '\.format('}}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-2034] Fix mixup between %s and {} when using str.format Convention is to use .format for string formating oustide logging, else use lazy format See comment in re

2018-02-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master caa8fd9d3 -> 6efe2e3ce


[AIRFLOW-2034] Fix mixup between %s and {} when using str.format
Convention is to use .format for string formatting outside logging, else use
lazy formatting.
See comment in related PR:
https://github.com/apache/incubator-airflow/pull/2823/files
Identified problematic cases using the following command line:
`grep -r '%s' ./* | grep '\.format('`

Closes #2976 from knil-sama/fix-mixup-format-str


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/6efe2e3c
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/6efe2e3c
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/6efe2e3c

Branch: refs/heads/master
Commit: 6efe2e3ce050264b580d11f3cffb109e8aef5bbd
Parents: caa8fd9
Author: knil-sama 
Authored: Sun Feb 25 12:32:43 2018 +0100
Committer: Bolke de Bruin 
Committed: Sun Feb 25 12:32:43 2018 +0100

--
 airflow/hooks/slack_hook.py  | 2 +-
 airflow/task/task_runner/base_task_runner.py | 6 +++---
 2 files changed, 4 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/6efe2e3c/airflow/hooks/slack_hook.py
--
diff --git a/airflow/hooks/slack_hook.py b/airflow/hooks/slack_hook.py
index cd47573..0672675 100644
--- a/airflow/hooks/slack_hook.py
+++ b/airflow/hooks/slack_hook.py
@@ -52,5 +52,5 @@ class SlackHook(BaseHook):
 rc = sc.api_call(method, **api_params)
 
 if not rc['ok']:
-msg = "Slack API call failed (%s)".format(rc['error'])
+msg = "Slack API call failed ({})".format(rc['error'])
 raise AirflowException(msg)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/6efe2e3c/airflow/task/task_runner/base_task_runner.py
--
diff --git a/airflow/task/task_runner/base_task_runner.py 
b/airflow/task/task_runner/base_task_runner.py
index 9a50390..946baf8 100644
--- a/airflow/task/task_runner/base_task_runner.py
+++ b/airflow/task/task_runner/base_task_runner.py
@@ -96,9 +96,9 @@ class BaseTaskRunner(LoggingMixin):
 line = line.decode('utf-8')
 if len(line) == 0:
 break
-self.log.info(u'Job {}: Subtask {} %s'.format(
-self._task_instance.job_id, self._task_instance.task_id),
-line.rstrip('\n'))
+self.log.info('Job %s: Subtask %s %s',
+  self._task_instance.job_id, 
self._task_instance.task_id,
+  line.rstrip('\n'))
 
 def run_command(self, run_with, join_args=False):
 """



[jira] [Resolved] (AIRFLOW-2102) Add custom_args to Sendgrid personalizations

2018-02-25 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2102?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2102.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3035
[https://github.com/apache/incubator-airflow/pull/3035]

> Add custom_args to Sendgrid personalizations
> 
>
> Key: AIRFLOW-2102
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2102
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib
>Reporter: Marcin Szymanski
>Assignee: Marcin Szymanski
>Priority: Major
> Fix For: 1.10.0
>
>
> Add support for {{custom_args}} in personalizations
> [https://sendgrid.com/docs/Classroom/Send/v3_Mail_Send/personalizations.html]
> {{custom_args}} should be passed in {{kwargs}} as other backends don't 
> support them
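A sketch of what the new kwarg does, using plain dicts in place of sendgrid's Personalization/CustomArg helpers (the function name, structure, and email address below are stand-ins, not the library API):

```python
# Plain-dict stand-ins for sendgrid's Personalization/CustomArg objects.
def build_personalization(to, **kwargs):
    personalization = {'to': [{'email': addr} for addr in to]}
    # custom_args ride along in kwargs, since other email backends
    # don't support them.
    custom_args = kwargs.get('personalization_custom_args', None)
    if isinstance(custom_args, dict):
        personalization['custom_args'] = dict(custom_args)
    return personalization

p = build_personalization(['foo@example.com'],
                          personalization_custom_args={'arg1': 'val1'})
print(p['custom_args'])  # {'arg1': 'val1'}
```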



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2102) Add custom_args to Sendgrid personalizations

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2102?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376051#comment-16376051
 ] 

ASF subversion and git services commented on AIRFLOW-2102:
--

Commit caa8fd9d3c654ffed789370ddb69632f3131ce38 in incubator-airflow's branch 
refs/heads/master from [~ms32035]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=caa8fd9 ]

[AIRFLOW-2102] Add custom_args to Sendgrid personalizations

Closes #3035 from ms32035/sendgrid_personalization


> Add custom_args to Sendgrid personalizations
> 
>
> Key: AIRFLOW-2102
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2102
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib
>Reporter: Marcin Szymanski
>Assignee: Marcin Szymanski
>Priority: Major
> Fix For: 1.10.0
>
>
> Add support for {{custom_args}} in personalizations
> [https://sendgrid.com/docs/Classroom/Send/v3_Mail_Send/personalizations.html]
> {{custom_args}} should be passed in {{kwargs}} as other backends don't 
> support them



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-2102] Add custom_args to Sendgrid personalizations

2018-02-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 4118e71de -> caa8fd9d3


[AIRFLOW-2102] Add custom_args to Sendgrid personalizations

Closes #3035 from ms32035/sendgrid_personalization


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/caa8fd9d
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/caa8fd9d
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/caa8fd9d

Branch: refs/heads/master
Commit: caa8fd9d3c654ffed789370ddb69632f3131ce38
Parents: 4118e71
Author: Marcin Szymanski 
Authored: Sun Feb 25 12:24:56 2018 +0100
Committer: Bolke de Bruin 
Committed: Sun Feb 25 12:24:56 2018 +0100

--
 airflow/contrib/utils/sendgrid.py|  9 -
 tests/contrib/utils/test_sendgrid.py | 20 
 2 files changed, 24 insertions(+), 5 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/caa8fd9d/airflow/contrib/utils/sendgrid.py
--
diff --git a/airflow/contrib/utils/sendgrid.py 
b/airflow/contrib/utils/sendgrid.py
index f7af087..07614a5 100644
--- a/airflow/contrib/utils/sendgrid.py
+++ b/airflow/contrib/utils/sendgrid.py
@@ -24,7 +24,8 @@ import sendgrid
 
 from airflow.utils.email import get_email_address_list
 from airflow.utils.log.logging_mixin import LoggingMixin
-from sendgrid.helpers.mail import Attachment, Content, Email, Mail, 
Personalization
+from sendgrid.helpers.mail import Attachment, Content, Email, Mail, \
+Personalization, CustomArg
 
 
 def send_email(to, subject, html_content, files=None,
@@ -63,6 +64,12 @@ def send_email(to, subject, html_content, files=None,
 mail.add_personalization(personalization)
 mail.add_content(Content('text/html', html_content))
 
+# Add custom_args to personalization if present
+pers_custom_args = kwargs.get('personalization_custom_args', None)
+if isinstance(pers_custom_args, dict):
+for key in pers_custom_args.keys():
+personalization.add_custom_arg(CustomArg(key, 
pers_custom_args[key]))
+
 # Add email attachment.
 for fname in files or []:
 basename = os.path.basename(fname)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/caa8fd9d/tests/contrib/utils/test_sendgrid.py
--
diff --git a/tests/contrib/utils/test_sendgrid.py 
b/tests/contrib/utils/test_sendgrid.py
index 2459e5d..ba039e7 100644
--- a/tests/contrib/utils/test_sendgrid.py
+++ b/tests/contrib/utils/test_sendgrid.py
@@ -13,7 +13,7 @@
 # limitations under the License.
 #
 
-import logging
+import copy
 import unittest
 
 from airflow.contrib.utils.sendgrid import send_email
@@ -26,8 +26,6 @@ except ImportError:
 except ImportError:
 mock = None
 
-from mock import Mock
-from mock import patch
 
 class SendEmailSendGridTest(unittest.TestCase):
 # Unit test for sendgrid.send_email()
@@ -45,11 +43,25 @@ class SendEmailSendGridTest(unittest.TestCase):
  'bcc': [{'email': 'foo-...@foo.com'}, {'email': 
'bar-...@bar.com'}]}],
 'from': {'email': u'f...@bar.com'},
 'subject': 'sendgrid-send-email unit test'}
+self.personalization_custom_args = {'arg1': 'val1', 'arg2': 'val2'}
+self.expected_mail_data_custom_args = 
copy.deepcopy(self.expected_mail_data)
+
self.expected_mail_data_custom_args['personalizations'][0]['custom_args'] = \
+self.personalization_custom_args
+
+# Test the right email is constructed.
 
-# Test the right email is constructed.
 @mock.patch('os.environ.get')
 @mock.patch('airflow.contrib.utils.sendgrid._post_sendgrid_mail')
 def test_send_email_sendgrid_correct_email(self, mock_post, mock_get):
 mock_get.return_value = 'f...@bar.com'
 send_email(self.to, self.subject, self.html_content, cc=self.cc, 
bcc=self.bcc)
 mock_post.assert_called_with(self.expected_mail_data)
+
+# Test the right email is constructed.
+@mock.patch('os.environ.get')
+@mock.patch('airflow.contrib.utils.sendgrid._post_sendgrid_mail')
+def test_send_email_sendgrid_correct_email_custom_args(self, mock_post, 
mock_get):
+mock_get.return_value = 'f...@bar.com'
+send_email(self.to, self.subject, self.html_content, cc=self.cc, 
bcc=self.bcc,
+   
personalization_custom_args=self.personalization_custom_args)
+mock_post.assert_called_with(self.expected_mail_data_custom_args)



[jira] [Commented] (AIRFLOW-1053) HiveOperator: unicode character in HQL query produces "UnicodeEncodeError: 'ascii' codec can't encode character ..."

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1053?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376049#comment-16376049
 ] 

ASF subversion and git services commented on AIRFLOW-1053:
--

Commit 4118e71de3bc4d9802e5b1710cd8f29a488a02fb in incubator-airflow's branch 
refs/heads/master from [~naoya]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=4118e71 ]

[AIRFLOW-1035][AIRFLOW-1053] import unicode_literals to parse Unicode in HQL

Closes #3053 from naoyak/AIRFLOW-1053


> HiveOperator: unicode character in HQL query produces "UnicodeEncodeError: 
> 'ascii' codec can't encode character ..."
> 
>
> Key: AIRFLOW-1053
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1053
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.8.0
>Reporter: Tomas Kafka
>Priority: Minor
>  Labels: utf-8
> Fix For: 1.10.0
>
> Attachments: airflow-hive-utf.py
>
>
> Run an attached DAG, for example as:
> {quote}
> airflow test airflow-hive-sample-utf utf-snowman 2017-01-01
> {quote}
> Important part:
> {quote}
> unicode_snowman = unichr(0x2603)
> op_test_select = HiveOperator(
>     task_id='utf-snowman',
>     hql='select \'' + unicode_snowman + '\' as utf_text;',
>     dag=dag)
> {quote}
> It should return a single row with a unicode snowman, but instead fails with 
> the error:
> {quote}
> UnicodeEncodeError: 'ascii' codec can't encode character u'\u2603' in 
> position 8: ordinal not in range(128)
> {quote}
> The same applies for unicode characters in external .hql files.
> Why is it a problem? Not because of snowmen, but I need to replace some 
> unicode chars in a Hive ETL query.
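The eventual fix (per the commit referenced in the comments on this issue) is importing unicode_literals, so HQL string literals are unicode even on Python 2. A minimal sketch:

```python
# With this future import, all string literals in the module are unicode
# on Python 2 as well, so concatenating the snowman into the HQL never
# triggers an implicit ASCII encode. (On Python 3 this is the default.)
from __future__ import unicode_literals

unicode_snowman = '\u2603'
hql = "select '" + unicode_snowman + "' as utf_text;"
print(hql)
```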



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1053) HiveOperator: unicode character in HQL query produces "UnicodeEncodeError: 'ascii' codec can't encode character ..."

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1053?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376048#comment-16376048
 ] 

ASF subversion and git services commented on AIRFLOW-1053:
--

Commit 4118e71de3bc4d9802e5b1710cd8f29a488a02fb in incubator-airflow's branch 
refs/heads/master from [~naoya]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=4118e71 ]

[AIRFLOW-1035][AIRFLOW-1053] import unicode_literals to parse Unicode in HQL

Closes #3053 from naoyak/AIRFLOW-1053


> HiveOperator: unicode character in HQL query produces "UnicodeEncodeError: 
> 'ascii' codec can't encode character ..."
> 
>
> Key: AIRFLOW-1053
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1053
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.8.0
>Reporter: Tomas Kafka
>Priority: Minor
>  Labels: utf-8
> Fix For: 1.10.0
>
> Attachments: airflow-hive-utf.py
>
>
> Run an attached DAG, for example as:
> {quote}
> airflow test airflow-hive-sample-utf utf-snowman 2017-01-01
> {quote}
> Important part:
> {quote}
> unicode_snowman = unichr(0x2603)
> op_test_select = HiveOperator(
>     task_id='utf-snowman',
>     hql='select \'' + unicode_snowman + '\' as utf_text;',
>     dag=dag)
> {quote}
> It should return a single row with a unicode snowman, but instead fails with 
> the error:
> {quote}
> UnicodeEncodeError: 'ascii' codec can't encode character u'\u2603' in 
> position 8: ordinal not in range(128)
> {quote}
> The same applies for unicode characters in external .hql files.
> Why is it a problem? Not because of snowmen, but I need to replace some 
> unicode chars in a Hive ETL query.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1035) Exponential backoff retry logic should use 2 as base

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376047#comment-16376047
 ] 

ASF subversion and git services commented on AIRFLOW-1035:
--

Commit 4118e71de3bc4d9802e5b1710cd8f29a488a02fb in incubator-airflow's branch 
refs/heads/master from [~naoya]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=4118e71 ]

[AIRFLOW-1035][AIRFLOW-1053] import unicode_literals to parse Unicode in HQL

Closes #3053 from naoyak/AIRFLOW-1053


> Exponential backoff retry logic should use 2 as base
> 
>
> Key: AIRFLOW-1035
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1035
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Alex Guziel
>Assignee: Alex Guziel
>Priority: Major
> Fix For: 1.9.0
>
>
> Right now, the exponential backoff logic computes it as 
> (retry_period) ^ (retry_number) instead of retry_period * 2 ^ retry_number. 
> See https://en.wikipedia.org/wiki/Exponential_backoff
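The difference in numbers, assuming a hypothetical 300-second retry_delay:

```python
# Hypothetical 300s retry_delay, comparing the two formulas from the issue.
retry_delay = 300

wrong = [retry_delay ** n for n in range(1, 4)]      # (retry_period) ^ n
right = [retry_delay * 2 ** n for n in range(1, 4)]  # retry_period * 2 ^ n

print(wrong)  # [300, 90000, 27000000] -- explodes almost immediately
print(right)  # [600, 1200, 2400] -- doubles each retry
```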



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-1053) HiveOperator: unicode character in HQL query produces "UnicodeEncodeError: 'ascii' codec can't encode character ..."

2018-02-25 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-1053.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3053
[https://github.com/apache/incubator-airflow/pull/3053]

> HiveOperator: unicode character in HQL query produces "UnicodeEncodeError: 
> 'ascii' codec can't encode character ..."
> 
>
> Key: AIRFLOW-1053
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1053
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.8.0
>Reporter: Tomas Kafka
>Priority: Minor
>  Labels: utf-8
> Fix For: 1.10.0
>
> Attachments: airflow-hive-utf.py
>
>
> Run an attached DAG, for example as:
> {quote}
> airflow test airflow-hive-sample-utf utf-snowman 2017-01-01
> {quote}
> Important part:
> {quote}
> unicode_snowman = unichr(0x2603)
> op_test_select = HiveOperator(
>     task_id='utf-snowman',
>     hql='select \'' + unicode_snowman + '\' as utf_text;',
>     dag=dag)
> {quote}
> It should return a single row with a unicode snowman, but instead fails with 
> the error:
> {quote}
> UnicodeEncodeError: 'ascii' codec can't encode character u'\u2603' in 
> position 8: ordinal not in range(128)
> {quote}
> The same applies for unicode characters in external .hql files.
> Why is it a problem? Not because of snowmen, but I need to replace some 
> unicode chars in a Hive ETL query.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2127) Airflow's Alembic migrations globally disable logging

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2127?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376042#comment-16376042
 ] 

ASF subversion and git services commented on AIRFLOW-2127:
--

Commit 496d0f4372a2eead616b8074db357db2b71c28eb in incubator-airflow's branch 
refs/heads/master from [~jiffyclub]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=496d0f4 ]

[AIRFLOW-2127] Keep loggers during DB migrations

Python's logging.config.fileConfig function will, by default, disable
all existing loggers when it is called. The fileConfig function is used
with default arguments by Airflow during Alembic migrations and disables
all loggers except those from alembic and sqlalchemy (this includes
disabling Airflow's own loggers). This change sets the
disable_existing_loggers flag of fileConfig to False so that it _does
not_ disable any existing loggers, allowing them to continue working as
normal.

See more in https://issues.apache.org/jira/browse/AIRFLOW-2127.

Closes #3059 from jiffyclub/AIRFLOW-2127/fileconfig-keep-loggers


> Airflow's Alembic migrations globally disable logging
> -
>
> Key: AIRFLOW-2127
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2127
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: db
>Reporter: Matt Davis
>Priority: Major
> Fix For: 1.10.0
>
>
> When running Airflow's 
> {{[upgradedb|https://github.com/apache/incubator-airflow/blob/fc26cade87e181f162ecc8391ae16dccbe6f29c4/airflow/utils/db.py#L295]}},
>  
> {{[resetdb|https://github.com/apache/incubator-airflow/blob/fc26cade87e181f162ecc8391ae16dccbe6f29c4/airflow/utils/db.py#L311]}},
>  and 
> {{[initdb|https://github.com/apache/incubator-airflow/blob/fc26cade87e181f162ecc8391ae16dccbe6f29c4/airflow/utils/db.py#L83]}}
>  functions, logging is disabled thereafter for all but the 
> {{sqlalchemy.engine}} and {{alembic}} loggers. This is caused by [this 
> usage|https://github.com/apache/incubator-airflow/blob/fc26cade87e181f162ecc8391ae16dccbe6f29c4/airflow/migrations/env.py#L28]
>  of Python's {{fileConfig}} function, which by default disables all loggers 
> that aren't part of the supplied configuration. (See [Python 2 
> docs|https://docs.python.org/2/library/logging.config.html#logging.config.fileConfig]
>  and [Python 3 
> docs|https://docs.python.org/3/library/logging.config.html#logging.config.fileConfig].)
>  This can be fixed by adding {{disable_existing_loggers=False}} to the call 
> of {{fileConfig}}.
> This has affected us at Clover Health because we use these database utility 
> functions in some of our tooling, and particularly our _tests_ of the 
> tooling. Having all logging disabled in the midst of our tests makes it more 
> difficult to test our use of logging in completely unrelated parts of our 
> codebase.
> As an example, we were trying to use [pytest's caplog 
> feature|https://docs.pytest.org/en/latest/logging.html#caplog-fixture], but 
> were unable to do so with logging globally disabled by {{fileConfig}}. Here's 
> an example of a test that fails with {{disable_existing_loggers=True}} (the 
> default), but passes with {{disable_existing_loggers=False}}.
> {code}
> import logging
> import pytest
> import airflow.utils.db as af_db
> LOGGER = logging.getLogger(__name__)
> @pytest.fixture(autouse=True)
> def resetdb():
>     af_db.resetdb()
>
> def test_caplog(caplog):
>     LOGGER.info('LINE 1')
>     assert caplog.record_tuples
>     assert 'LINE 1' in caplog.text
> {code}
> I'll submit a pull request shortly to add {{disable_existing_loggers=False}} 
> to Airflow's 
> {{[env.py|https://github.com/apache/incubator-airflow/blob/fc26cade87e181f162ecc8391ae16dccbe6f29c4/airflow/migrations/env.py#L28]}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
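The logger-disabling behavior described in AIRFLOW-2127 above can be reproduced outside Airflow. A minimal sketch (the config file and logger name are illustrative, not Airflow's own):

```python
import logging
import logging.config
import tempfile

# A logger that exists *before* fileConfig runs, like Airflow's own loggers
# existing before the Alembic migration env.py configures logging.
LOGGER = logging.getLogger("airflow.example")

# A tiny fileConfig-style ini; any logger not listed here counts as "existing".
CONFIG = """
[loggers]
keys=root

[handlers]
keys=console

[formatters]
keys=plain

[logger_root]
level=INFO
handlers=console

[handler_console]
class=StreamHandler
level=INFO
formatter=plain
args=(sys.stderr,)

[formatter_plain]
format=%(message)s
"""

with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write(CONFIG)
    path = f.name

logging.config.fileConfig(path)  # default: disable_existing_loggers=True
disabled_by_default = LOGGER.disabled  # the pre-existing logger is silenced

logging.config.fileConfig(path, disable_existing_loggers=False)  # the fix
disabled_after_fix = LOGGER.disabled  # existing loggers keep working
```

Running this shows `disabled_by_default` is True while `disabled_after_fix` is False, which is exactly the difference the one-line patch to env.py makes.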


incubator-airflow git commit: [AIRFLOW-1035][AIRFLOW-1053] import unicode_literals to parse Unicode in HQL

2018-02-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 496d0f437 -> 4118e71de


[AIRFLOW-1035][AIRFLOW-1053] import unicode_literals to parse Unicode in HQL

Closes #3053 from naoyak/AIRFLOW-1053


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/4118e71d
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/4118e71d
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/4118e71d

Branch: refs/heads/master
Commit: 4118e71de3bc4d9802e5b1710cd8f29a488a02fb
Parents: 496d0f4
Author: Naoya Kanai 
Authored: Sun Feb 25 12:23:14 2018 +0100
Committer: Bolke de Bruin 
Committed: Sun Feb 25 12:23:14 2018 +0100

--
 airflow/hooks/hive_hooks.py| 2 +-
 airflow/operators/hive_operator.py | 2 ++
 2 files changed, 3 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/4118e71d/airflow/hooks/hive_hooks.py
--
diff --git a/airflow/hooks/hive_hooks.py b/airflow/hooks/hive_hooks.py
index 47aebc8..cd7319d 100644
--- a/airflow/hooks/hive_hooks.py
+++ b/airflow/hooks/hive_hooks.py
@@ -13,7 +13,7 @@
 # limitations under the License.
 #
 
-from __future__ import print_function
+from __future__ import print_function, unicode_literals
 from six.moves import zip
 from past.builtins import basestring
 

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/4118e71d/airflow/operators/hive_operator.py
--
diff --git a/airflow/operators/hive_operator.py 
b/airflow/operators/hive_operator.py
index ffb98ac..fdd1689 100644
--- a/airflow/operators/hive_operator.py
+++ b/airflow/operators/hive_operator.py
@@ -11,6 +11,8 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+from __future__ import unicode_literals
+
 import re
 
 from airflow.hooks.hive_hooks import HiveCliHook
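The motivation for the import added above: under Python 2 a plain string literal is a byte string, and templating byte strings into HQL that contains non-ASCII characters raises UnicodeDecodeError. A small illustration (the HQL below is hypothetical, not from the patch):

```python
from __future__ import unicode_literals

# With unicode_literals, this literal is a text (unicode) string on both
# Python 2 and Python 3, so non-ASCII HQL no longer trips byte/unicode mixing.
hql = "SELECT '名前' AS name FROM example_table"

assert isinstance(hql, type(u""))  # text type: unicode on Py2, str on Py3
```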



[jira] [Commented] (AIRFLOW-2127) Airflow's Alembic migrations globally disable logging

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2127?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376043#comment-16376043
 ] 

ASF subversion and git services commented on AIRFLOW-2127:
--

Commit 496d0f4372a2eead616b8074db357db2b71c28eb in incubator-airflow's branch 
refs/heads/master from [~jiffyclub]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=496d0f4 ]

[AIRFLOW-2127] Keep loggers during DB migrations

Python's logging.config.fileConfig function will, by default, disable all
existing loggers when it is called. The fileConfig function is used with
default arguments by Airflow during Alembic migrations and disables all
loggers except those from alembic and sqlalchemy (this includes disabling
Airflow's own loggers). This change sets the disable_existing_loggers flag of
fileConfig to False so that it _does not_ disable any existing loggers,
allowing them to continue working as normal.

See more in https://issues.apache.org/jira/browse/AIRFLOW-2127.

Closes #3059 from jiffyclub/AIRFLOW-2127/fileconfig-keep-loggers





--
This message was sent by Atlassian JIRA
(v7.6.3#76005)



[jira] [Resolved] (AIRFLOW-2127) Airflow's Alembic migrations globally disable logging

2018-02-25 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2127.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3059
[https://github.com/apache/incubator-airflow/pull/3059]




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-2127] Keep loggers during DB migrations

2018-02-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master f2d10ef7b -> 496d0f437


[AIRFLOW-2127] Keep loggers during DB migrations

Python's logging.config.fileConfig function will, by default, disable all
existing loggers when it is called. The fileConfig function is used with
default arguments by Airflow during Alembic migrations and disables all
loggers except those from alembic and sqlalchemy (this includes disabling
Airflow's own loggers). This change sets the disable_existing_loggers flag of
fileConfig to False so that it _does not_ disable any existing loggers,
allowing them to continue working as normal.

See more in https://issues.apache.org/jira/browse/AIRFLOW-2127.

Closes #3059 from jiffyclub/AIRFLOW-2127/fileconfig-keep-loggers


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/496d0f43
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/496d0f43
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/496d0f43

Branch: refs/heads/master
Commit: 496d0f4372a2eead616b8074db357db2b71c28eb
Parents: f2d10ef
Author: Matt Davis 
Authored: Sun Feb 25 12:21:51 2018 +0100
Committer: Bolke de Bruin 
Committed: Sun Feb 25 12:21:51 2018 +0100

--
 airflow/migrations/env.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/496d0f43/airflow/migrations/env.py
--
diff --git a/airflow/migrations/env.py b/airflow/migrations/env.py
index 8d5e55e..720275d 100644
--- a/airflow/migrations/env.py
+++ b/airflow/migrations/env.py
@@ -25,7 +25,7 @@ config = context.config
 
 # Interpret the config file for Python logging.
 # This line sets up loggers basically.
-fileConfig(config.config_file_name)
+fileConfig(config.config_file_name, disable_existing_loggers=False)
 
 # add your model's MetaData object here
 # for 'autogenerate' support



[jira] [Resolved] (AIRFLOW-2146) Initialize default Google BigQuery Connection with valid conn_type & Fix broken DBApiHook

2018-02-25 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2146?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2146.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3073
[https://github.com/apache/incubator-airflow/pull/3073]

> Initialize default Google BigQuery Connection with valid conn_type & Fix 
> broken DBApiHook
> -
>
> Key: AIRFLOW-2146
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2146
> Project: Apache Airflow
>  Issue Type: Task
>  Components: contrib, gcp
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 1.10.0
>
>
> `airflow initdb` creates a connection with conn_id='bigquery_default' and 
> conn_type='bigquery'. However, bigquery is not a valid conn_type, according 
> to models.Connection._types, and BigQuery connections should use the 
> google_cloud_platform conn_type.
> Also, as [renanleme|https://github.com/renanleme] mentioned 
> [here|https://github.com/apache/incubator-airflow/pull/3031#issuecomment-368132910], 
> the DAGs he created are broken when using `get_records()` from 
> BigQueryHook, which is extended from DbApiHook.
> *Error Log*:
> {code}
> Traceback (most recent call last):
>   File "/src/apache-airflow/airflow/models.py", line 1519, in _run_raw_task
>     result = task_copy.execute(context=context)
>   File "/airflow/dags/lib/operators/test_operator.py", line 21, in execute
>     records = self._get_db_hook(self.source_conn_id).get_records(self.sql)
>   File "/src/apache-airflow/airflow/hooks/base_hook.py", line 92, in get_records
>     raise NotImplementedError()
> {code}
> *Dag*:
> {code:python}
> from datetime import datetime
> from airflow import DAG
> from lib.operators.test_operator import TestOperator
> default_args = {
>     'depends_on_past': False,
>     'start_date': datetime(2018, 2, 21),
> }
>
> dag = DAG(
>     'test_dag',
>     default_args=default_args,
>     schedule_interval='0 6 * * *'
> )
>
> sql = '''
> SELECT id from YOUR_BIGQUERY_TABLE limit 10
> '''
>
> compare_grouped_event = TestOperator(
>     task_id='test_operator',
>     source_conn_id='gcp_airflow',
>     sql=sql,
>     dag=dag
> )
> {code}
> *Operator*:
> {code:python}
> from airflow.hooks.base_hook import BaseHook
> from airflow.models import BaseOperator
> from airflow.utils.decorators import apply_defaults
> class TestOperator(BaseOperator):
>     @apply_defaults
>     def __init__(
>             self,
>             sql,
>             source_conn_id=None,
>             *args, **kwargs):
>         super(TestOperator, self).__init__(*args, **kwargs)
>         self.sql = sql
>         self.source_conn_id = source_conn_id
>
>     def execute(self, context=None):
>         records = self._get_db_hook(self.source_conn_id).get_records(self.sql)
>         self.log.info('Fetched records from source')
>
>     @staticmethod
>     def _get_db_hook(conn_id):
>         return BaseHook.get_hook(conn_id=conn_id)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2146) Initialize default Google BigQuery Connection with valid conn_type & Fix broken DBApiHook

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376038#comment-16376038
 ] 

ASF subversion and git services commented on AIRFLOW-2146:
--

Commit f2d10ef7b465080584fe96590775b6a2696a38ef in incubator-airflow's branch 
refs/heads/master from [~kaxilnaik]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=f2d10ef ]

[AIRFLOW-2146] Resolve issues with BQ using DbApiHook methods

- Resolves issues with using methods like `get_records()` from
  `BigQueryHook` which is extended from `DbApiHook`.
- Reverting one changed file from
  https://github.com/apache/incubator-airflow/pull/3031 to use BigQueryHook
  again instead of `GoogleCloudBaseHook`
- Fix `conn_type` of `bigquery_default` connection

Closes #3073 from kaxil/AIRFLOW-2146





--
This message was sent by Atlassian JIRA
(v7.6.3#76005)



incubator-airflow git commit: [AIRFLOW-2146] Resolve issues with BQ using DbApiHook methods

2018-02-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master d7b5d09d4 -> f2d10ef7b


[AIRFLOW-2146] Resolve issues with BQ using DbApiHook methods

- Resolves issues with using methods like `get_records()` from
  `BigQueryHook` which is extended from `DbApiHook`.
- Reverting one changed file from
  https://github.com/apache/incubator-airflow/pull/3031 to use BigQueryHook
  again instead of `GoogleCloudBaseHook`
- Fix `conn_type` of `bigquery_default` connection

Closes #3073 from kaxil/AIRFLOW-2146


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/f2d10ef7
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/f2d10ef7
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/f2d10ef7

Branch: refs/heads/master
Commit: f2d10ef7b465080584fe96590775b6a2696a38ef
Parents: d7b5d09
Author: Kaxil Naik 
Authored: Sun Feb 25 12:16:28 2018 +0100
Committer: Bolke de Bruin 
Committed: Sun Feb 25 12:16:28 2018 +0100

--
 airflow/models.py   | 4 ++--
 airflow/utils/db.py | 3 ++-
 tests/core.py   | 1 -
 3 files changed, 4 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/f2d10ef7/airflow/models.py
--
diff --git a/airflow/models.py b/airflow/models.py
index b318ea6..f9f3fbc 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -662,8 +662,8 @@ class Connection(Base, LoggingMixin):
 from airflow.hooks.mysql_hook import MySqlHook
 return MySqlHook(mysql_conn_id=self.conn_id)
 elif self.conn_type == 'google_cloud_platform':
-from airflow.contrib.hooks.gcp_api_base_hook import 
GoogleCloudBaseHook
-return GoogleCloudBaseHook(gcp_conn_id=self.conn_id)
+from airflow.contrib.hooks.bigquery_hook import BigQueryHook
+return BigQueryHook(bigquery_conn_id=self.conn_id)
 elif self.conn_type == 'postgres':
 from airflow.hooks.postgres_hook import PostgresHook
 return PostgresHook(postgres_conn_id=self.conn_id)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/f2d10ef7/airflow/utils/db.py
--
diff --git a/airflow/utils/db.py b/airflow/utils/db.py
index 64ce220..2a38424 100644
--- a/airflow/utils/db.py
+++ b/airflow/utils/db.py
@@ -103,7 +103,8 @@ def initdb():
 schema='default'))
 merge_conn(
 models.Connection(
-conn_id='bigquery_default', conn_type='bigquery'))
+conn_id='bigquery_default', conn_type='google_cloud_platform',
+schema='default'))
 merge_conn(
 models.Connection(
 conn_id='local_mysql', conn_type='mysql',

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/f2d10ef7/tests/core.py
--
diff --git a/tests/core.py b/tests/core.py
index f76f65e..45a6e34 100644
--- a/tests/core.py
+++ b/tests/core.py
@@ -1039,7 +1039,6 @@ class CliTests(unittest.TestCase):
 # expected:
 self.assertIn(['aws_default', 'aws'], conns)
 self.assertIn(['beeline_default', 'beeline'], conns)
-self.assertIn(['bigquery_default', 'bigquery'], conns)
 self.assertIn(['emr_default', 'emr'], conns)
 self.assertIn(['mssql_default', 'mssql'], conns)
 self.assertIn(['mysql_default', 'mysql'], conns)
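The dispatch this patch restores can be sketched in miniature (toy classes, not the real Airflow hooks): a `google_cloud_platform` connection must resolve to a hook that actually implements the DbApiHook interface, which the base GCP hook does not.

```python
class DbApiHook(object):
    def get_records(self, sql):
        # base_hook behavior: subclasses must override this
        raise NotImplementedError()

class GoogleCloudBaseHook(object):
    pass  # no get_records: using it via the DbApi interface fails, as reported

class BigQueryHook(DbApiHook):
    def get_records(self, sql):
        return [("stub-row",)]  # stand-in for running sql against BigQuery

def get_hook(conn_type):
    # the fix: google_cloud_platform maps back to the DbApi-capable hook
    if conn_type == "google_cloud_platform":
        return BigQueryHook()
    raise ValueError("unknown conn_type: %s" % conn_type)

records = get_hook("google_cloud_platform").get_records("SELECT 1")
```

With the pre-patch mapping (returning `GoogleCloudBaseHook`), the same `get_records` call would fall through to the `NotImplementedError` in the base class, matching the traceback in the issue.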



[jira] [Resolved] (AIRFLOW-2087) Scheduler Report shows incorrect "Total task number"

2018-02-25 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2087?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-2087.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

Issue resolved by pull request #3074
[https://github.com/apache/incubator-airflow/pull/3074]

> Scheduler Report shows incorrect "Total task number"
> 
>
> Key: AIRFLOW-2087
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2087
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: Airflow 1.8, 1.9.0
>Reporter: Daniel Lamblin
>Assignee: Tao Feng
>Priority: Trivial
> Fix For: 1.10.0
>
>
> [https://github.com/apache/incubator-airflow/blob/4751abf8acad766cb576ecfe3a333d68cc693b8c/airflow/models.py#L479]
> This line is printing the same "Total task number" as "Number of DAGs" in the 
> cli tool `airflow list_dags -r`.
> E.g. some output:
> {code}
> ---------------------------------------
> DagBag loading stats for /pang/service/airflow/dags
> ---------------------------------------
> Number of DAGs: 1143
> Total task number: 1143
> DagBag parsing time: 24.900074
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2087) Scheduler Report shows incorrect "Total task number"

2018-02-25 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16376036#comment-16376036
 ] 

ASF subversion and git services commented on AIRFLOW-2087:
--

Commit d7b5d09d4709dd090ceff2985dd05976203a9ad3 in incubator-airflow's branch 
refs/heads/master from Tao feng
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=d7b5d09 ]

[AIRFLOW-2087] Scheduler Report shows incorrect Total task number

Closes #3074 from feng-tao/airflow-2087





--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-2087] Scheduler Report shows incorrect Total task number

2018-02-25 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master ad308ea44 -> d7b5d09d4


[AIRFLOW-2087] Scheduler Report shows incorrect Total task number

Closes #3074 from feng-tao/airflow-2087


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/d7b5d09d
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/d7b5d09d
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/d7b5d09d

Branch: refs/heads/master
Commit: d7b5d09d4709dd090ceff2985dd05976203a9ad3
Parents: ad308ea
Author: Tao feng 
Authored: Sun Feb 25 12:13:57 2018 +0100
Committer: Bolke de Bruin 
Committed: Sun Feb 25 12:13:57 2018 +0100

--
 airflow/models.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d7b5d09d/airflow/models.py
--
diff --git a/airflow/models.py b/airflow/models.py
index e436974..b318ea6 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -476,7 +476,7 @@ class DagBag(BaseDagBag, LoggingMixin):
 dag_folder=self.dag_folder,
 duration=sum([o.duration for o in stats]),
 dag_num=sum([o.dag_num for o in stats]),
-task_num=sum([o.dag_num for o in stats]),
+task_num=sum([o.task_num for o in stats]),
 table=pprinttable(stats),
 )
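The one-line fix above can be sketched in isolation: per-file stats carry separate `dag_num` and `task_num` fields, and the report must sum each field rather than summing `dag_num` twice (the `FileLoadStat` namedtuple here is a stand-in for the structure DagBag aggregates, with made-up sample values).

```python
from collections import namedtuple

FileLoadStat = namedtuple("FileLoadStat", ["file", "duration", "dag_num", "task_num"])

stats = [
    FileLoadStat("a.py", 1.2, 1, 3),
    FileLoadStat("b.py", 0.8, 2, 5),
]

dag_num = sum(o.dag_num for o in stats)         # 3
task_num_buggy = sum(o.dag_num for o in stats)  # 3 -- why "Total task number" equaled "Number of DAGs"
task_num_fixed = sum(o.task_num for o in stats) # 8 -- the corrected aggregation
```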