[GitHub] [airflow] baolsen edited a comment on issue #6675: [AIRFLOW-6038] AWS DataSync example_dags added
baolsen edited a comment on issue #6675: [AIRFLOW-6038] AWS DataSync example_dags added URL: https://github.com/apache/airflow/pull/6675#issuecomment-558968294

Hi @potiuk, may I ask for your assistance with this? My build is failing due to a cyclic import (it was failing even before your recent fixes), and I am not sure where to start debugging. I think the root cause is the new documentation files I've added, but I'm not sure why that would cause cyclic-import problems in other files, and I have no idea which of my files is causing the issue. Any advice would be appreciated. Perhaps it is how I am importing "airflow" and "airflow.exceptions" in my example dags? Here is where the build is failing during "static checks":

* Module airflow.example_dags.example_http_operator
  airflow/example_dags/example_http_operator.py:1:0: R0401: Cyclic import (airflow.executors -> airflow.executors.kubernetes_executor -> airflow.kubernetes.pod_generator) (cyclic-import)

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
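The R0401 failure above comes from modules that import each other. A self-contained sketch (the module names are hypothetical, modeled on the ones in the pylint message) shows how such a cycle breaks at import time, even though pylint reports it statically:

```python
import os
import sys
import tempfile
import textwrap

# Hypothetical two-module cycle, shaped like the one pylint reports:
# executors -> kubernetes_executor -> executors.
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "executors.py"), "w") as fh:
    fh.write(textwrap.dedent("""\
        import kubernetes_executor   # first leg of the cycle
        DEFAULT_EXECUTOR = "sequential"
    """))
with open(os.path.join(workdir, "kubernetes_executor.py"), "w") as fh:
    fh.write(textwrap.dedent("""\
        import executors             # second leg: closes the cycle
        # executors is only partially initialized here, so any attribute
        # defined *after* its import statement does not exist yet.
        CHOICE = executors.DEFAULT_EXECUTOR
    """))

sys.path.insert(0, workdir)
try:
    import executors  # noqa: F401
    result = "imported cleanly"
except AttributeError as exc:
    result = "cycle broke the import: {}".format(exc)
print(result)
```

Pylint flags the cycle even when the import happens to succeed at runtime, because whether it breaks depends only on which module is imported first.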
[GitHub] [airflow] baolsen commented on issue #6675: [AIRFLOW-6038] AWS DataSync example_dags added
baolsen commented on issue #6675: [AIRFLOW-6038] AWS DataSync example_dags added URL: https://github.com/apache/airflow/pull/6675#issuecomment-558968294 Hi @potiuk Please may I ask for your assistance with this. My build is failing due to cyclic import (it was failing even before your recent fixes). I am not sure where to start debugging. I think the root cause is the new documentation files I've added, but I'm not sure why this would cause cyclic import problems in other files and I have no idea which one of my files is causing the issue. Any advice would be appreciated. Perhaps it is how I am importing "airflow" and "airflow exceptions" ? Here is where the build is failing during "static checks": * Module airflow.example_dags.example_http_operator airflow/example_dags/example_http_operator.py:1:0: R0401: Cyclic import (airflow.executors -> airflow.executors.kubernetes_executor -> airflow.kubernetes.pod_generator) (cyclic-import)
[GitHub] [airflow] houqp commented on issue #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate
houqp commented on issue #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate URL: https://github.com/apache/airflow/pull/6553#issuecomment-558965123 @ashb type annotation additions are not moved to https://github.com/apache/airflow/pull/6674.
[GitHub] [airflow] TobKed commented on issue #6657: [AIRFLOW-6059] Fix typo in docs build.sh - path providers instead of provider
TobKed commented on issue #6657: [AIRFLOW-6059] Fix typo in docs build.sh - path providers instead of provider URL: https://github.com/apache/airflow/pull/6657#issuecomment-558953888 Hi @feluelle. I've reopened it unchanged, since it is a relatively small change and doesn't collide with the bigger https://github.com/apache/airflow/pull/6599. WDYT?
[jira] [Commented] (AIRFLOW-6055) Exponential Backoff in Sensors
[ https://issues.apache.org/jira/browse/AIRFLOW-6055?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16983204#comment-16983204 ]

ASF GitHub Bot commented on AIRFLOW-6055:

msumit commented on pull request #6654: [AIRFLOW-6055] Option for exponential backoff in Sensors URL: https://github.com/apache/airflow/pull/6654

> Exponential Backoff in Sensors
> ------------------------------
>
> Key: AIRFLOW-6055
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6055
> Project: Apache Airflow
> Issue Type: New Feature
> Components: operators
> Affects Versions: 2.0.0
> Reporter: Sumit Maheshwari
> Assignee: Sumit Maheshwari
> Priority: Major
>
> Like operators, there should be an option in Sensors as well to do exponential backoff.

-- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6055) Exponential Backoff in Sensors
[ https://issues.apache.org/jira/browse/AIRFLOW-6055?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16983206#comment-16983206 ]

ASF subversion and git services commented on AIRFLOW-6055:

Commit 5d08c54f71a03d8cc0db4b5b6bce874454eb984c in airflow's branch refs/heads/master from Sumit Maheshwari [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=5d08c54 ]

[AIRFLOW-6055] Option for exponential backoff in Sensors (#6654)

A new option "exponential_backoff" in Sensors will increase the next poke or next reschedule time for sensors exponentially. Turned off by default.
[GitHub] [airflow] msumit merged pull request #6654: [AIRFLOW-6055] Option for exponential backoff in Sensors
msumit merged pull request #6654: [AIRFLOW-6055] Option for exponential backoff in Sensors URL: https://github.com/apache/airflow/pull/6654
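The behaviour described in the commit message can be pictured with a small standalone sketch. This is an illustration of the idea only, not Airflow's exact formula; the function name, the jitter, and the one-hour cap are assumptions:

```python
import random


def next_poke_interval(poke_interval: float, try_number: int,
                       exponential_backoff: bool = False,
                       max_wait: float = 3600.0) -> float:
    """Double the wait on each unsuccessful poke, add a little jitter so
    many sensors do not wake up in lockstep, and cap the result. With the
    flag off (the default), behave exactly as before."""
    if not exponential_backoff:
        return poke_interval
    delay = poke_interval * (2 ** try_number)
    jitter = random.uniform(0, poke_interval)
    return min(delay + jitter, max_wait)


# Off by default: the interval never changes.
print(next_poke_interval(60, 5))  # 60
# On: the fourth poke waits at least 60 * 2**3 = 480 seconds, capped at an hour.
print(next_poke_interval(60, 3, exponential_backoff=True))
```

The cap matters in practice: without one, a sensor in reschedule mode could back off past its own timeout and never poke again.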
[jira] [Commented] (AIRFLOW-6038) AWS DataSync example dags
[ https://issues.apache.org/jira/browse/AIRFLOW-6038?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16983193#comment-16983193 ]

ASF GitHub Bot commented on AIRFLOW-6038:

baolsen commented on pull request #6675: [AIRFLOW-6038] AWS DataSync example_dags added URL: https://github.com/apache/airflow/pull/6675

Make sure you have checked _all_ steps below.

### Jira

- [X] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
  - https://issues.apache.org/jira/browse/AIRFLOW-XXX
  - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
  - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
  - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).

### Description

- [X] Here are some details about my PR, including screenshots of any UI changes: Added Amazon AWS how-to documentation scaffolding, plus example DAGs for AWS DataSync Operators with their respective how-to guides.

### Tests

- [X] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: Documentation & examples.

### Commits

- [X] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
  1. Subject is separated from body by a blank line
  1. Subject is limited to 50 characters (not including Jira issue reference)
  1. Subject does not end with a period
  1. Subject uses the imperative mood ("add", not "adding")
  1. Body wraps at 72 characters
  1. Body explains "what" and "why", not "how"

### Documentation

- [X] In case of new functionality, my PR adds documentation that describes how to use it.
  - All the public functions and the classes in the PR contain docstrings that explain what it does
  - If you implement backwards incompatible changes, please leave a note in [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release

> AWS DataSync example dags
> -------------------------
>
> Key: AIRFLOW-6038
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6038
> Project: Apache Airflow
> Issue Type: Improvement
> Components: aws, examples
> Affects Versions: 1.10.6
> Reporter: Bjorn Olsen
> Assignee: Bjorn Olsen
> Priority: Minor
>
> Add example_dags for AWS DataSync operators
[GitHub] [airflow] baolsen opened a new pull request #6675: [AIRFLOW-6038] AWS DataSync example_dags added
baolsen opened a new pull request #6675: [AIRFLOW-6038] AWS DataSync example_dags added URL: https://github.com/apache/airflow/pull/6675
[GitHub] [airflow] codecov-io commented on issue #6674: [AIRFLOW-6062] fix type error from new version of mypy
codecov-io commented on issue #6674: [AIRFLOW-6062] fix type error from new version of mypy URL: https://github.com/apache/airflow/pull/6674#issuecomment-558935464

# [Codecov](https://codecov.io/gh/apache/airflow/pull/6674?src=pr=h1) Report

> Merging [#6674](https://codecov.io/gh/apache/airflow/pull/6674?src=pr=desc) into [master](https://codecov.io/gh/apache/airflow/commit/3f3b42883939fa9bbdda57b9c96efbaa76923609?src=pr=desc) will **decrease** coverage by `0.3%`.
> The diff coverage is `100%`.

[![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6674/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6674?src=pr=tree)

```diff
@@            Coverage Diff             @@
##           master    #6674      +/-   ##
==========================================
- Coverage   83.84%   83.54%   -0.31%
  Files         671      671
  Lines       37608    37608
==========================================
- Hits        31534    31418     -116
- Misses       6074     6190     +116
```

| [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6674?src=pr=tree) | Coverage Δ | |
|---|---|---|
| [airflow/gcp/utils/field\_validator.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvdXRpbHMvZmllbGRfdmFsaWRhdG9yLnB5) | `92.3% <ø> (ø)` | :arrow_up: |
| [airflow/contrib/operators/file\_to\_wasb.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9maWxlX3RvX3dhc2IucHk=) | `100% <ø> (ø)` | :arrow_up: |
| [airflow/jobs/backfill\_job.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL2JhY2tmaWxsX2pvYi5weQ==) | `91.43% <ø> (ø)` | :arrow_up: |
| [airflow/operators/dagrun\_operator.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZGFncnVuX29wZXJhdG9yLnB5) | `96% <100%> (ø)` | :arrow_up: |
| [airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5) | `93.46% <100%> (ø)` | :arrow_up: |
| [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
| [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
| [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `45.25% <0%> (-46.72%)` | :arrow_down: |
| [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: |
| [...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==) | `78.2% <0%> (-20.52%)` | :arrow_down: |
| ... and [2 more](https://codecov.io/gh/apache/airflow/pull/6674/diff?src=pr=tree-more) | | |

[Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6674?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`

Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6674?src=pr=footer). Last update [3f3b428...4ca26f1](https://codecov.io/gh/apache/airflow/pull/6674?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [airflow] codecov-io edited a comment on issue #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate
codecov-io edited a comment on issue #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate URL: https://github.com/apache/airflow/pull/6553#issuecomment-553170084

# [Codecov](https://codecov.io/gh/apache/airflow/pull/6553?src=pr=h1) Report

> Merging [#6553](https://codecov.io/gh/apache/airflow/pull/6553?src=pr=desc) into [master](https://codecov.io/gh/apache/airflow/commit/3f3b42883939fa9bbdda57b9c96efbaa76923609?src=pr=desc) will **decrease** coverage by `0.3%`.
> The diff coverage is `100%`.

[![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6553/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6553?src=pr=tree)

```diff
@@            Coverage Diff             @@
##           master    #6553      +/-   ##
==========================================
- Coverage   83.84%   83.54%   -0.31%
  Files         671      671
  Lines       37608    37601       -7
==========================================
- Hits        31534    31414     -120
- Misses       6074     6187     +113
```

| [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6553?src=pr=tree) | Coverage Δ | |
|---|---|---|
| [airflow/jobs/local\_task\_job.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL2xvY2FsX3Rhc2tfam9iLnB5) | `89.33% <ø> (+4.33%)` | :arrow_up: |
| [airflow/jobs/base\_job.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL2Jhc2Vfam9iLnB5) | `92.14% <100%> (+3.41%)` | :arrow_up: |
| [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
| [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
| [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `45.25% <0%> (-46.72%)` | :arrow_down: |
| [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: |
| [...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==) | `78.2% <0%> (-20.52%)` | :arrow_down: |
| [airflow/configuration.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5) | `89.13% <0%> (-3.63%)` | :arrow_down: |
| [airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==) | `57.99% <0%> (-0.5%)` | :arrow_down: |
| [airflow/jobs/backfill\_job.py](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL2JhY2tmaWxsX2pvYi5weQ==) | `91.13% <0%> (-0.31%)` | :arrow_down: |
| ... and [2 more](https://codecov.io/gh/apache/airflow/pull/6553/diff?src=pr=tree-more) | | |

[Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6553?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`

Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6553?src=pr=footer). Last update [3f3b428...b07bbf7](https://codecov.io/gh/apache/airflow/pull/6553?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[jira] [Commented] (AIRFLOW-5920) Add support to execute OpenCypher query against Neo4j
[ https://issues.apache.org/jira/browse/AIRFLOW-5920?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16983132#comment-16983132 ]

Timothy Findlay commented on AIRFLOW-5920:

[~jackjack10], I have made a few more updates; the draft PR is now green for the build. Can you cast your eye over this? If it looks OK, I'll make this a formal PR and look for it to be merged in.

> Add support to execute OpenCypher query against Neo4j
> -----------------------------------------------------
>
> Key: AIRFLOW-5920
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5920
> Project: Apache Airflow
> Issue Type: New Feature
> Components: hooks, operators
> Affects Versions: 1.10.7
> Reporter: Timothy Findlay
> Assignee: Timothy Findlay
> Priority: Minor
> Original Estimate: 48h
> Remaining Estimate: 48h
>
> As a DAG developer
> I want to create DAG tasks to execute OpenCypher queries against a graph database
> So that the output can be used elsewhere in a DAG / business
[GitHub] [airflow] codecov-io edited a comment on issue #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
codecov-io edited a comment on issue #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. URL: https://github.com/apache/airflow/pull/6627#issuecomment-558211492

# [Codecov](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=h1) Report

> Merging [#6627](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=desc) into [master](https://codecov.io/gh/apache/airflow/commit/e51e1c770dad235b3fd8fdc330e44b83df8dcc4a?src=pr=desc) will **decrease** coverage by `0.55%`.
> The diff coverage is `62.96%`.

[![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6627/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=tree)

```diff
@@            Coverage Diff             @@
##           master    #6627      +/-   ##
==========================================
- Coverage   83.82%   83.26%   -0.56%
==========================================
  Files         672      671       -1
  Lines       37594    37664      +70
==========================================
- Hits        31512    31362     -150
- Misses       6082     6302     +220
```

| [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=tree) | Coverage Δ | |
|---|---|---|
| [airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==) | `55.09% <0%> (-3.06%)` | :arrow_down: |
| [airflow/cli/commands/task\_command.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9jbGkvY29tbWFuZHMvdGFza19jb21tYW5kLnB5) | `64.02% <100%> (+0.79%)` | :arrow_up: |
| [airflow/task/task\_runner/standard\_task\_runner.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy90YXNrL3Rhc2tfcnVubmVyL3N0YW5kYXJkX3Rhc2tfcnVubmVyLnB5) | `63.93% <58.49%> (-36.07%)` | :arrow_down: |
| [airflow/utils/helpers.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9oZWxwZXJzLnB5) | `82.75% <70.83%> (+4.78%)` | :arrow_up: |
| [airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=) | `0% <0%> (-100%)` | :arrow_down: |
| [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | :arrow_down: |
| [airflow/executors/sequential\_executor.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvc2VxdWVudGlhbF9leGVjdXRvci5weQ==) | `47.61% <0%> (-52.39%)` | :arrow_down: |
| [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: |
| [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `45.25% <0%> (-46.72%)` | :arrow_down: |
| [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: |
| ... and [33 more](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree-more) | | |

[Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`

Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=footer). Last update [e51e1c7...1379213](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [airflow] tfindlay-tw commented on a change in pull request #6604: [AIRFLOW-5920] *DRAFT* Neo4j operator and hook
tfindlay-tw commented on a change in pull request #6604: [AIRFLOW-5920] *DRAFT* Neo4j operator and hook URL: https://github.com/apache/airflow/pull/6604#discussion_r351080645

## File path: airflow/contrib/hooks/neo4j_hook.py

@@ -0,0 +1,106 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This hook provides a minimal thin wrapper around the neo4j python library to provide query execution"""
+from typing import Optional
+from neo4j import BoltStatementResult, Driver, GraphDatabase, Session
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook import BaseHook
+
+
+class Neo4JHook(BaseHook):
+    """This class enables the neo4j operator to execute queries against a configured neo4j server.
+
+    It requires the configuration name as set in Airflow -> Connections ->
+
+    :param n4j_conn_id:
+    :type n4j_conn_id: str
+    """
+    _n4j_conn_id = None
+
+    template_fields = ['n4j_conn_id']
+
+    def __init__(self, n4j_conn_id: str = 'n4j_default', *args, **kwargs):
+        super().__init__(*args, **kwargs)
+        self._n4j_conn_id = n4j_conn_id
+
+    @staticmethod
+    def get_config(n4j_conn_id: Optional[str]) -> dict:
+        """
+        Obtain the Username + Password from the Airflow connection definition
+        and store them in the _config dictionary as:
+
+        *credentials* -- a tuple of username/password, e.g. ("username", "password")
+        *host* -- string for the Neo4J URI, e.g. "bolt://1.1.1.1:7687"
+
+        :param n4j_conn_id: Name of connection configured in Airflow
+        :type n4j_conn_id: str
+        :return: dictionary with configuration values
+        :rtype: dict
+        """
+        # Initialize with empty dictionary
+        config: dict = {}
+        if n4j_conn_id is not None:
+            connection_object = Neo4JHook.get_connection(n4j_conn_id)
+            if connection_object.login and connection_object.host:
+                config['credentials'] = connection_object.login, connection_object.password
+                config['host'] = "bolt://{0}:{1}".format(connection_object.host, connection_object.port)
+        else:
+            raise AirflowException("No Neo4J connection: {}".format(n4j_conn_id))

Review comment: I have resolved this by making it require a string; `None` is not accepted by the function. As a result, no exception needs to be thrown.
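The connection-to-config mapping in the hook's `get_config` can be exercised in isolation. Below is a standalone sketch of that logic, with a plain namespace standing in for Airflow's `Connection` object; the helper name and the sample values are invented for illustration:

```python
from types import SimpleNamespace


def build_neo4j_config(conn) -> dict:
    """Mirror of the draft hook's config-building logic, minus the Airflow
    plumbing: pick up login/host from a connection-like object and shape
    them into the credentials tuple and bolt:// URI the driver expects."""
    config: dict = {}
    if conn.login and conn.host:
        config["credentials"] = (conn.login, conn.password)
        config["host"] = "bolt://{0}:{1}".format(conn.host, conn.port)
    return config


# Stand-in for an Airflow Connection row (hypothetical values).
conn = SimpleNamespace(login="neo4j", password="s3cret", host="10.0.0.5", port=7687)
config = build_neo4j_config(conn)
print(config["host"])  # bolt://10.0.0.5:7687
```

Returning an empty dict when login or host is missing matches the review thread's conclusion that the function should not raise once a connection id is required.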
[GitHub] [airflow] maxirus commented on issue #6643: [AIRFLOW-6040] Fix KubernetesJobWatcher Read time out error
maxirus commented on issue #6643: [AIRFLOW-6040] Fix KubernetesJobWatcher Read time out error URL: https://github.com/apache/airflow/pull/6643#issuecomment-558903525 The `kube_client_request_args` config parameter is "global" to all of the client requests. Setting `timeout_seconds` there causes other methods, such as `create_namespaced_pod`, to fail because that property is not recognized. If the ask here is to make this configurable, we'd need to do so by adding another parameter to `airflow.cfg` so as not to break current functionality. My approach assumes that `_request_timeout` being set to 60 as the default in the config was deliberate, so I wanted to be under this value by a considerable margin. That said, querying the API should realistically never take more than about 1s. If we don't want to "hard-code" this value here, the other approaches I see are:
1. Create a new config parameter, `watch_timeout_seconds`
2. `if "_request_timeout" in kube_client_request_args then timeout_seconds = kube_client_request_args['_request_timeout'] - 1`
3. Check if there are any events for the label/worker uuid first, before the watch. If so, then watch.
4. Leave it hard-coded
5. Other ideas?
P.S.: The worker UUID seems to not persist and is created at runtime. If I follow correctly, it gets generated each time the scheduler runs. How is this tracked across restarts? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
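Option 2 from the list above can be sketched in plain Python. The helper name `derive_watch_timeout` and the default of 30 are illustrative; the one-second margin under `_request_timeout` is the commenter's suggestion, not an existing Airflow API:

```python
def derive_watch_timeout(kube_client_request_args: dict, default: int = 30) -> int:
    """Stay just under the global _request_timeout so the watch
    returns cleanly before the underlying HTTP request is cut off."""
    if "_request_timeout" in kube_client_request_args:
        return int(kube_client_request_args["_request_timeout"]) - 1
    return default


print(derive_watch_timeout({"_request_timeout": 60}))  # 59
print(derive_watch_timeout({}))                        # 30
```

This keeps `timeout_seconds` scoped to the watcher without leaking it into `kube_client_request_args`, where other client methods such as `create_namespaced_pod` would reject it.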
[GitHub] [airflow] codecov-io commented on issue #6604: [AIRFLOW-5920] *DRAFT* Neo4j operator and hook
codecov-io commented on issue #6604: [AIRFLOW-5920] *DRAFT* Neo4j operator and hook URL: https://github.com/apache/airflow/pull/6604#issuecomment-558902885

# [Codecov](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=h1) Report

> :exclamation: No coverage uploaded for pull request base (`master@03c870a`). [Click here to learn what that means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
> The diff coverage is `97.5%`.

[![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6604/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=tree)

```diff
@@           Coverage Diff            @@
##             master   #6604   +/-  ##
========================================
  Coverage          ?   83.56%
========================================
  Files             ?      674
  Lines             ?    37688
  Branches          ?        0
========================================
  Hits              ?    31494
  Misses            ?     6194
  Partials          ?        0
```

| [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=tree) | Coverage Δ | |
|---|---|---|
| [...low/contrib/example\_dags/example\_neo4j\_operator.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX25lbzRqX29wZXJhdG9yLnB5) | `100% <100%> (ø)` | |
| [airflow/contrib/hooks/neo4j\_hook.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL25lbzRqX2hvb2sucHk=) | `100% <100%> (ø)` | |
| [airflow/contrib/operators/neo4j\_operator.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9uZW80al9vcGVyYXRvci5weQ==) | `94.59% <94.59%> (ø)` | |

[Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=footer). Last update [03c870a...27d4125](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow] XD-DENG commented on issue #6673: Pod aliases
XD-DENG commented on issue #6673: Pod aliases URL: https://github.com/apache/airflow/pull/6673#issuecomment-558897376 Hi @dougblack , please follow the instructions in the PR template. On the other hand, if you would like to further test your code changes, I would suggest doing that in your own fork first before you raise the PR. Repeatedly committing to a PR costs paid CPU time and lengthens the build queue. So if you need more time for testing, I would suggest closing this PR first. Cheers. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow] tyg03485 edited a comment on issue #6665: [AIRFLOW-6068] Make args of KubernetesPodOperator to get dict type
tyg03485 edited a comment on issue #6665: [AIRFLOW-6068] Make args of KubernetesPodOperator to get dict type URL: https://github.com/apache/airflow/pull/6665#issuecomment-55779 Hi @davlum I think there are some awkward points in the args of KubernetesPodOperator. There are classes we optionally use with KubernetesPodOperator: Port, Volume, VolumeMount, Secret, PodRuntimeInfoEnv, Resources. But only Resources is passed as a `dict`. In my experience with 1.10.4, I was confused about which args should be passed as dicts and which as class instances. So, there are two ways to unify them: 1. Resources could be changed to take the Resources type: ``` def __init__(self, ..., resources: Optional[Resources] = None ...) ``` But then, to use KubernetesPodOperator, we need to import every class we use. 2. The other args could be passed as `dict`, like in my PR. But as you say, if `kubernetes.client.models` is better, we could extract only the `attach_to_pod` method from the previous class-based `K8SModel` (e.g. `PodRuntimeInfoEnv` -> `V1EnvVar`) and just pass instances of `kubernetes.client.models` everywhere. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
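Both options above amount to normalizing argument types at the operator boundary. A hypothetical sketch (the simplified `Resources` dataclass and `coerce_resources` helper are illustrative, not the actual Airflow classes) of accepting either a `dict` or a typed object so callers don't have to import the class:

```python
from dataclasses import dataclass
from typing import Union


@dataclass
class Resources:
    """Simplified stand-in for an Airflow/Kubernetes resources class."""
    request_cpu: str = "100m"
    limit_cpu: str = "200m"


def coerce_resources(resources: Union[dict, Resources, None]) -> Resources:
    """Normalize: accept a dict, a Resources instance, or None,
    and always return a Resources instance internally."""
    if resources is None:
        return Resources()
    if isinstance(resources, dict):
        return Resources(**resources)
    return resources


print(coerce_resources({"request_cpu": "250m", "limit_cpu": "500m"}).request_cpu)  # 250m
```

With this pattern the operator's public surface is plain dicts (option 2), while the internals keep the type safety of option 1.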
[GitHub] [airflow] codecov-io commented on issue #6661: [AIRFLOW-5959] AIP-21 Change import paths for "jira" modules
codecov-io commented on issue #6661: [AIRFLOW-5959] AIP-21 Change import paths for "jira" modules URL: https://github.com/apache/airflow/pull/6661#issuecomment-558882971 # [Codecov](https://codecov.io/gh/apache/airflow/pull/6661?src=pr=h1) Report > :exclamation: No coverage uploaded for pull request base (`master@03c870a`). [Click here to learn what that means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit). > The diff coverage is `63.15%`. [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6661/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6661?src=pr=tree) ```diff @@ Coverage Diff@@ ## master #6661 +/- ## Coverage ? 83.5% Files ? 674 Lines ? 37620 Branches ? 0 Hits ? 31416 Misses?6204 Partials ? 0 ``` | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6661?src=pr=tree) | Coverage Δ | | |---|---|---| | [airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/6661/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==) | `65% <0%> (ø)` | | | [airflow/contrib/hooks/jira\_hook.py](https://codecov.io/gh/apache/airflow/pull/6661/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2ppcmFfaG9vay5weQ==) | `0% <0%> (ø)` | | | [airflow/contrib/operators/jira\_operator.py](https://codecov.io/gh/apache/airflow/pull/6661/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9qaXJhX29wZXJhdG9yLnB5) | `0% <0%> (ø)` | | | [airflow/contrib/sensors/jira\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6661/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvamlyYV9zZW5zb3IucHk=) | `0% <0%> (ø)` | | | [airflow/providers/jira/sensors/jira.py](https://codecov.io/gh/apache/airflow/pull/6661/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvamlyYS9zZW5zb3JzL2ppcmEucHk=) | `60% <60%> (ø)` | | | [airflow/providers/jira/operators/jira.py](https://codecov.io/gh/apache/airflow/pull/6661/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvamlyYS9vcGVyYXRvcnMvamlyYS5weQ==) | 
`76.66% <76.66%> (ø)` | | | [airflow/providers/jira/hooks/jira.py](https://codecov.io/gh/apache/airflow/pull/6661/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvamlyYS9ob29rcy9qaXJhLnB5) | `80% <80%> (ø)` | | -- [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6661?src=pr=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6661?src=pr=footer). Last update [03c870a...cda7f65](https://codecov.io/gh/apache/airflow/pull/6661?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow] ashb commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
ashb commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. URL: https://github.com/apache/airflow/pull/6627#discussion_r351053566 ## File path: tests/dags/test_on_kill.py ## @@ -25,6 +25,11 @@ class DummyWithOnKill(DummyOperator): def execute(self, context): +import os Review comment: This wasn't an issue; I was just making doubly sure that the tests created more processes. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
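The test tweak referenced above just spawns an extra process inside `execute` so the fork-based task runner has more children to manage. A minimal standalone illustration of creating and reaping a child with `os.fork` (assumes a POSIX system; the function name and exit code are arbitrary):

```python
import os


def run_in_child() -> int:
    """Fork, let the child exit with a known status, and reap it in the parent."""
    pid = os.fork()
    if pid == 0:
        # Child process: do some work, then exit without running cleanup handlers.
        os._exit(7)
    # Parent process: wait for the child and decode its exit status.
    _, status = os.waitpid(pid, 0)
    return os.WEXITSTATUS(status)


if __name__ == "__main__":
    print(run_in_child())  # 7
```

Using `os._exit` in the child (rather than `sys.exit`) avoids running the parent's atexit handlers twice, which matters in exactly the kind of forked-task scenario this PR is about.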
[GitHub] [airflow] codecov-io edited a comment on issue #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
codecov-io edited a comment on issue #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. URL: https://github.com/apache/airflow/pull/6627#issuecomment-558211492 # [Codecov](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=h1) Report > Merging [#6627](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=desc) into [master](https://codecov.io/gh/apache/airflow/commit/e51e1c770dad235b3fd8fdc330e44b83df8dcc4a?src=pr=desc) will **decrease** coverage by `0.31%`. > The diff coverage is `62.96%`. [![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6627/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=tree) ```diff @@Coverage Diff@@ ## master #6627 +/- ## = - Coverage 83.82% 83.5% -0.32% = Files 672 671 -1 Lines 37594 37665 +71 = - Hits31512 31454 -58 - Misses 60826211 +129 ``` | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=tree) | Coverage Δ | | |---|---|---| | [airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==) | `58.22% <0%> (+0.06%)` | :arrow_up: | | [airflow/cli/commands/task\_command.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9jbGkvY29tbWFuZHMvdGFza19jb21tYW5kLnB5) | `64.02% <100%> (+0.79%)` | :arrow_up: | | [airflow/task/task\_runner/standard\_task\_runner.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy90YXNrL3Rhc2tfcnVubmVyL3N0YW5kYXJkX3Rhc2tfcnVubmVyLnB5) | `63.93% <58.49%> (-36.07%)` | :arrow_down: | | [airflow/utils/helpers.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9oZWxwZXJzLnB5) | `82.75% <70.83%> (+4.78%)` | :arrow_up: | | [airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==) | `44.44% <0%> (-55.56%)` | 
:arrow_down: | | [airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==) | `52.94% <0%> (-47.06%)` | :arrow_down: | | [airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==) | `45.25% <0%> (-46.72%)` | :arrow_down: | | [airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5) | `50.98% <0%> (-23.53%)` | :arrow_down: | | [...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==) | `78.2% <0%> (-20.52%)` | :arrow_down: | | [airflow/configuration.py](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5) | `89.13% <0%> (-3.63%)` | :arrow_down: | | ... and [25 more](https://codecov.io/gh/apache/airflow/pull/6627/diff?src=pr=tree-more) | | -- [Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=footer). Last update [e51e1c7...6aa0f13](https://codecov.io/gh/apache/airflow/pull/6627?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [airflow] potiuk commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
potiuk commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-558865616 Finally :). Thanks @kaxil This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Commented] (AIRFLOW-6062) Scheduler doesn't delete worker pods from pods in different namespaces
[ https://issues.apache.org/jira/browse/AIRFLOW-6062?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16983025#comment-16983025 ] ASF GitHub Bot commented on AIRFLOW-6062: - houqp commented on pull request #6674: [AIRFLOW-6062] fix type error from new version of mypy URL: https://github.com/apache/airflow/pull/6674 Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-XXX - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue. - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)). - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: ### Tests - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. 
Body explains "what" and "why", not "how" ### Documentation - [x] In case of new functionality, my PR adds documentation that describes how to use it. - All the public functions and the classes in the PR contain docstrings that explain what they do - If you implement backwards-incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Scheduler doesn't delete worker pods from pods in different namespaces > -- > > Key: AIRFLOW-6062 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6062 > Project: Apache Airflow > Issue Type: Bug > Components: executor-kubernetes >Affects Versions: 1.10.5 >Reporter: Mihail Petkov >Assignee: Daniel Imberman >Priority: Blocker > > When you run Airflow's task instances as worker pods in different namespaces > in a Kubernetes cluster, the scheduler can delete only the pods that are > living in the same namespace where the scheduler lives. It's trying to delete > all pods that are in the namespace defined in the airflow.cfg file. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [airflow] houqp opened a new pull request #6674: [AIRFLOW-6062] fix type error from new version of mypy
houqp opened a new pull request #6674: [AIRFLOW-6062] fix type error from new version of mypy URL: https://github.com/apache/airflow/pull/6674 Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-XXX - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue. - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)). - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: ### Tests - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [x] In case of new functionality, my PR adds documentation that describes how to use it. 
- All the public functions and the classes in the PR contain docstrings that explain what they do - If you implement backwards-incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Created] (AIRFLOW-6080) New version of mypy catches more type errors
QP Hou created AIRFLOW-6080: --- Summary: New version of mypy catches more type errors Key: AIRFLOW-6080 URL: https://issues.apache.org/jira/browse/AIRFLOW-6080 Project: Apache Airflow Issue Type: Improvement Components: ci Affects Versions: 2.0.0 Reporter: QP Hou Assignee: QP Hou The new version of mypy reports a new list of errors. We should fix them so we can upgrade mypy in our CI/CD pipeline. -- This message was sent by Atlassian Jira (v8.3.4#803005)
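As an illustration only (not one of the actual errors fixed in PR #6674), newer mypy releases are stricter about `Optional` handling; the usual fix is an explicit guard that narrows the type:

```python
from typing import Optional


def greet(name: Optional[str]) -> str:
    # Without the guard below, calling name.upper() directly is the kind of
    # pattern a newer mypy flags: `name` may be None. The explicit check
    # narrows Optional[str] to str for the remainder of the function.
    if name is None:
        return "hello, anonymous"
    return "hello, " + name.upper()


print(greet(None))     # hello, anonymous
print(greet("world"))  # hello, WORLD
```

The code behaves identically at runtime either way; the guard exists so the static checker can prove the `None` case is handled.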
[jira] [Resolved] (AIRFLOW-6070) On the admin dashboard, the recent tasks column has no tooltip for tasks in 'null' state
[ https://issues.apache.org/jira/browse/AIRFLOW-6070?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-6070. Resolution: Duplicate This has already been fixed in Airflow. Please note that this is not a ticket system for Cloud Composer. If you have bugs that are in Cloud Composer then you should report them to Google or test if the problem occurs in other versions of Airflow. > On the admin dashboard, the recent tasks column has no tooltip for tasks in > 'null' state > > > Key: AIRFLOW-6070 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6070 > Project: Apache Airflow > Issue Type: Bug > Components: ui >Affects Versions: 1.10.2 > Environment: GCP Composer composer-1.7.5-airflow-1.10.2 >Reporter: Adam Hopkinson >Priority: Trivial > Attachments: image-2019-11-26-10-03-40-377.png > > > On the DAGS listing template, the circles in the _Recent Tasks_ column all > have a tooltip apart from the second to last - which is for tasks with state > = `null` > !image-2019-11-26-10-03-40-377.png|width=261,height=37! > I believe this is happening in [this line of > code|https://github.com/apache/airflow/blob/0ff9e2307042ba95e69b32e37f2fc767a5fdc36d/airflow/www/templates/airflow/dags.html#L447], > which is: > {{.attr('title', function(d) \{return d.state || 'none'})}} > I'm not sure why it's not falling back to 'none' - I think it's possibly > seeing the value of d.state as the text value 'null' rather than a true null, > but then putting that into the title as true null. > I'm using GCP Composer, so don't have a local instance of Airflow that I can > test. -- This message was sent by Atlassian Jira (v8.3.4#803005)
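The reporter's hypothesis above — that `d.state` holds the *string* `'null'`, which is truthy, so the `|| 'none'` fallback never fires — can be demonstrated with the equivalent truthiness logic in Python (the dashboard code itself is JavaScript; this sketch only mirrors its `||` semantics with `or`):

```python
def tooltip_title(state):
    """Mimic `d.state || 'none'`: fall back only when state is falsy."""
    return state or "none"


print(tooltip_title(None))    # none  -> a true null/None triggers the fallback
print(tooltip_title("null"))  # null  -> the string 'null' is truthy, no fallback
```

So if the serialization layer turns a null state into the literal text `"null"`, the template shows `'null'` in the title instead of `'none'`, exactly as observed.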
[jira] [Commented] (AIRFLOW-6074) Logging to Azure Blob Storage
[ https://issues.apache.org/jira/browse/AIRFLOW-6074?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982999#comment-16982999 ] Kamil Bregula commented on AIRFLOW-6074: Thanks for reporting it. I will try to improve it in my PR. [https://github.com/apache/airflow/pull/6644] > Logging to Azure Blob Storage > - > > Key: AIRFLOW-6074 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6074 > Project: Apache Airflow > Issue Type: Bug > Components: logging >Affects Versions: 1.10.6 >Reporter: david diaz >Priority: Major > Attachments: image-2019-11-26-13-20-13-271.png > > > The template in airflow/airflow/config_templates/airflow_local_setting.py > contains a hard-coded name for the wasb_container attribute in REMOTE_HANDLERS > > !image-2019-11-26-13-20-13-271.png! > The Azure blob hook uses that name to look for the container in which to place the > logs, but if it fails because that container does not exist, it logs that the > container doesn't exist using the name of the environment variable > REMOTE_BASE_LOG_FOLDER. Either the logging or the hard-coded value should be > changed. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [airflow] mik-laj commented on issue #6644: [AIRFLOW-6047] Simplify the logging configuration template
mik-laj commented on issue #6644: [AIRFLOW-6047] Simplify the logging configuration template URL: https://github.com/apache/airflow/pull/6644#issuecomment-558856882 Related jira: https://issues.apache.org/jira/projects/AIRFLOW/issues/AIRFLOW-6074?filter=addedrecently This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Closed] (AIRFLOW-6078) cc
[ https://issues.apache.org/jira/browse/AIRFLOW-6078?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula closed AIRFLOW-6078. -- Resolution: Fixed > cc > -- > > Key: AIRFLOW-6078 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6078 > Project: Apache Airflow > Issue Type: Bug > Components: core >Affects Versions: 1.10.6 >Reporter: Kamil Bregula >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-5544) GCP datastore operation pull should have some retry to avoid false alarm
[ https://issues.apache.org/jira/browse/AIRFLOW-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-5544. Resolution: Fixed The fix is available in the community release. The change is not available in Cloud Composer, but that is a separate project not supported by Apache. > GCP datastore operation pull should have some retry to avoid false alarm > > > Key: AIRFLOW-5544 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5544 > Project: Apache Airflow > Issue Type: Bug > Components: gcp >Affects Versions: 1.10.5 >Reporter: Yang Hu >Priority: Minor > > Hi Airflow, > We noticed that sometimes the Google status URL returns a 500 and Airflow > marks the whole task as failed, although it actually ran fine. > > I think we need some retry before failing the task, at least in this 500 case. > Sample request: > [2019-09-18 21:45:19,543] \{base_task_runner.py:98} INFO - Subtask: > googleapiclient.errors.HttpError: HttpError 500 when requesting > https://datastore.googleapis.com/v1/projects/MASKED/operations/AS_MASKED_RI?alt=json > returned Unknown Error. > Ref code: > [https://github.com/apache/airflow/blob/7e4330cce0b4f333b1658cdc315b06505cf9dd76/airflow/gcp/hooks/datastore.py#L269] -- This message was sent by Atlassian Jira (v8.3.4#803005)
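The retry the reporter asks for can be sketched generically. Everything here is illustrative — `TransientServerError` stands in for the HTTP 500 from the operation-status endpoint, and the attempt count and backoff delays are arbitrary, not the Datastore hook's actual behavior:

```python
import time


class TransientServerError(Exception):
    """Stand-in for a transient HTTP 500 from the status endpoint."""


def with_retries(fn, attempts: int = 3, base_delay: float = 0.01):
    """Call fn, retrying transient errors with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except TransientServerError:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the real failure
            time.sleep(base_delay * (2 ** attempt))


calls = {"n": 0}


def flaky_poll():
    """Fails twice with a fake 500, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientServerError("500 Unknown Error")
    return "SUCCESSFUL"


print(with_retries(flaky_poll))  # SUCCESSFUL
```

Only the transient exception is retried; any other failure still fails the task immediately, so real errors are not masked.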
[jira] [Commented] (AIRFLOW-5544) GCP datastore operation pull should have some retry to avoid false alarm
[ https://issues.apache.org/jira/browse/AIRFLOW-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982996#comment-16982996 ] Kamil Bregula commented on AIRFLOW-5544: There is a plan to make these operators available for Cloud Composer in the near future. [https://lists.apache.org/thread.html/2c9559184045e772acd21cbdd7435f6bf89c76eb9311311d58d16e5f@%3Cdev.airflow.apache.org%3E] You will be able to install the latest operators supported by my team. > GCP datastore operation pull should have some retry to avoid false alarm > > > Key: AIRFLOW-5544 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5544 > Project: Apache Airflow > Issue Type: Bug > Components: gcp >Affects Versions: 1.10.5 >Reporter: Yang Hu >Priority: Minor > > Hi Airflow, > We noticed that sometimes the Google status URL returns a 500 and Airflow > marks the whole task as failed, although it actually ran fine. > > I think we need some retry before failing the task, at least in this 500 case. > Sample request: > [2019-09-18 21:45:19,543] \{base_task_runner.py:98} INFO - Subtask: > googleapiclient.errors.HttpError: HttpError 500 when requesting > https://datastore.googleapis.com/v1/projects/MASKED/operations/AS_MASKED_RI?alt=json > returned Unknown Error. > Ref code: > [https://github.com/apache/airflow/blob/7e4330cce0b4f333b1658cdc315b06505cf9dd76/airflow/gcp/hooks/datastore.py#L269] -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6079) Make consistent use of timezone.parse
[ https://issues.apache.org/jira/browse/AIRFLOW-6079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982992#comment-16982992 ] ASF GitHub Bot commented on AIRFLOW-6079: - kaxil commented on pull request #6672: [AIRFLOW-6079] Make consistent use of timezone.parse URL: https://github.com/apache/airflow/pull/6672 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Make consistent use of timezone.parse > - > > Key: AIRFLOW-6079 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6079 > Project: Apache Airflow > Issue Type: Improvement > Components: core > Affects Versions: 2.0.0 > Reporter: Kaxil Naik > Assignee: Kaxil Naik > Priority: Minor > Fix For: 2.0.0 > > > In www/view.py, the following variations are used to parse time: > * pendulum.parse > * airflow.utils.timezone.parse > * timezone.parse
[GitHub] [airflow] kaxil merged pull request #6672: [AIRFLOW-6079] Make consistent use of timezone.parse
kaxil merged pull request #6672: [AIRFLOW-6079] Make consistent use of timezone.parse URL: https://github.com/apache/airflow/pull/6672
[jira] [Commented] (AIRFLOW-6079) Make consistent use of timezone.parse
[ https://issues.apache.org/jira/browse/AIRFLOW-6079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982993#comment-16982993 ] ASF subversion and git services commented on AIRFLOW-6079: -- Commit 3f3b42883939fa9bbdda57b9c96efbaa76923609 in airflow's branch refs/heads/master from Kaxil Naik [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=3f3b428 ] [AIRFLOW-6079] Make consistent use of timezone.parse (#6672)
[jira] [Resolved] (AIRFLOW-6079) Make consistent use of timezone.parse
[ https://issues.apache.org/jira/browse/AIRFLOW-6079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6079. - Resolution: Fixed
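For illustration, the contract being unified here can be sketched with a toy stand-in for the single shared parser. This is not the real `airflow.utils.timezone.parse` (which delegates to pendulum, not assumed here); it is a minimal sketch of the behaviour the cleanup standardises on: one helper that always returns an aware datetime.

```python
from datetime import datetime, timezone

def parse(value):
    """Toy stand-in for a single shared time parser: accept an ISO-8601
    string and always return a timezone-aware datetime (UTC if naive)."""
    dt = datetime.fromisoformat(value)
    if dt.tzinfo is None:
        # Naive inputs are normalised to UTC so every caller sees
        # comparable, aware values.
        dt = dt.replace(tzinfo=timezone.utc)
    return dt
```

Routing every call site in www/view.py through one such helper means naive and aware inputs are normalised identically everywhere, which is the point of replacing the three variants with a single one.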
[GitHub] [airflow] kaxil commented on issue #6672: [AIRFLOW-6079] Make consistent use of timezone.parse
kaxil commented on issue #6672: [AIRFLOW-6079] Make consistent use of timezone.parse URL: https://github.com/apache/airflow/pull/6672#issuecomment-558854779 CI passed: https://travis-ci.org/apache/airflow/builds/617404566
[GitHub] [airflow] houqp commented on a change in pull request #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate
houqp commented on a change in pull request #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate URL: https://github.com/apache/airflow/pull/6553#discussion_r351022830

File path: airflow/models/taskinstance.py

@@ -1320,9 +1320,9 @@ def set_duration(self):
     def xcom_push(
             self,
-            key,
-            value,
-            execution_date=None):
+            key: str,
+            value: Any,
+            execution_date: Optional[datetime] = None) -> None:

Review comment: I have removed the type annotation.
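For reference, the annotated signature under discussion follows the usual typing pattern. This standalone sketch only mirrors the shape of the diff above; the real method lives on `TaskInstance` and writes the value to the XCom table, which is omitted here.

```python
from datetime import datetime
from typing import Any, Optional

def xcom_push(key: str, value: Any, execution_date: Optional[datetime] = None) -> None:
    """Shape-only sketch of the signature in the diff above.

    key identifies the XCom entry, value may be any serialisable object,
    and execution_date optionally overrides the logical date. No storage
    is performed in this sketch.
    """
    return None
```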
[GitHub] [airflow] dougblack opened a new pull request #6673: Pod aliases
dougblack opened a new pull request #6673: Pod aliases URL: https://github.com/apache/airflow/pull/6673

Make sure you have checked _all_ steps below.

### Jira
- [ ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
  - https://issues.apache.org/jira/browse/AIRFLOW-XXX
  - In case you are fixing a typo in the documentation, you can prepend your commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
  - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
  - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).

### Description
- [ ] Here are some details about my PR, including screenshots of any UI changes:

### Tests
- [ ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:

### Commits
- [ ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
  1. Subject is separated from body by a blank line
  2. Subject is limited to 50 characters (not including Jira issue reference)
  3. Subject does not end with a period
  4. Subject uses the imperative mood ("add", not "adding")
  5. Body wraps at 72 characters
  6. Body explains "what" and "why", not "how"

### Documentation
- [ ] In case of new functionality, my PR adds documentation that describes how to use it.
  - All the public functions and the classes in the PR contain docstrings that explain what it does
  - If you implement backwards incompatible changes, please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release
[GitHub] [airflow] dimberman commented on issue #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
dimberman commented on issue #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. URL: https://github.com/apache/airflow/pull/6627#issuecomment-558849438 @ashb did you ever end up doing a timing comparison with the multiprocessing based solution?
[GitHub] [airflow] mik-laj commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
mik-laj commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. URL: https://github.com/apache/airflow/pull/6627#discussion_r351016268

File path: tests/dags/test_on_kill.py

@@ -25,6 +25,11 @@
 class DummyWithOnKill(DummyOperator):
     def execute(self, context):
+        import os

Review comment: @nuclearpinguin Did you fix this issue in another PR?
[jira] [Commented] (AIRFLOW-5544) GCP datastore operation pull should have some retry to avoid false alarm
[ https://issues.apache.org/jira/browse/AIRFLOW-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982972#comment-16982972 ] Yang Hu commented on AIRFLOW-5544: -- Glad there are fixes there. We are on GCP Cloud Composer, so we have to wait for Google to pick up the version that includes this update, but thanks for informing us!
[GitHub] [airflow] mik-laj commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
mik-laj commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. URL: https://github.com/apache/airflow/pull/6627#discussion_r351015019

File path: airflow/task/task_runner/standard_task_runner.py

@@ -17,28 +17,96 @@
 # specific language governing permissions and limitations
 # under the License.

+import os
+
 import psutil
+from setproctitle import setproctitle

 from airflow.task.task_runner.base_task_runner import BaseTaskRunner
 from airflow.utils.helpers import reap_process_group

+CAN_FORK = hasattr(os, 'fork')
+

 class StandardTaskRunner(BaseTaskRunner):
     """
     Runs the raw Airflow task by invoking through the Bash shell.
     """
     def __init__(self, local_task_job):
         super().__init__(local_task_job)
+        self._rc = None

     def start(self):
-        self.process = self.run_command()
+        if CAN_FORK and not self.run_as_user:
+            self.process = self._start_by_fork()
+        else:
+            self.process = self._start_by_exec()
+
+    def _start_by_exec(self):
+        subprocess = self.run_command()
+        return psutil.Process(subprocess.pid)
+
+    def _start_by_fork(self):
+        pid = os.fork()
+        if pid:
+            self.log.info("Started process %d to run task", pid)
+            return psutil.Process(pid)
+        else:
+            from airflow.bin.cli import get_parser
+            from airflow.logging_config import configure_logging
+            import signal
+            import airflow.settings as settings
+
+            signal.signal(signal.SIGINT, signal.SIG_DFL)
+            signal.signal(signal.SIGTERM, signal.SIG_DFL)
+            # Start a new process group
+            os.setpgid(0, 0)
+
+            configure_logging()
+
+            # Force a new SQLAlchemy session. We can't share open DB handles
+            # between process. The cli code will re-create this as part of its
+            # normal startup
+            settings.engine.pool.dispose()
+            settings.engine.dispose()

-    def return_code(self):
-        return self.process.poll()
+            parser = get_parser()
+            args = parser.parse_args(self._command[1:])

Review comment:
```suggestion
            # [1:] - remove "airflow" from the command
            args = parser.parse_args(self._command[1:])
```
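The fork-based start path in the diff above follows a standard parent/child pattern. A minimal sketch with no Airflow imports (the `run_in_child` helper is hypothetical; the real runner additionally resets signal handlers, starts a new process group, reconfigures logging, and disposes the SQLAlchemy engine before re-entering the CLI):

```python
import os

CAN_FORK = hasattr(os, "fork")  # os.fork is unavailable on e.g. Windows

def run_in_child(target):
    """Fork, run target() in the child process, and return its exit code.

    The parent blocks on waitpid and decodes the status word; the child
    must always leave via os._exit so it never unwinds the parent's stack
    (e.g. running atexit handlers or flushing shared DB connections twice).
    """
    pid = os.fork()
    if pid:
        # Parent: wait for the child and decode its exit status.
        _, status = os.waitpid(pid, 0)
        return os.WEXITSTATUS(status) if os.WIFEXITED(status) else -1
    # Child: run the task, then exit immediately.
    try:
        target()
        code = 0
    except BaseException:
        code = 1
    os._exit(code)
```

Compared with spawning a fresh interpreter via exec, the fork path skips re-importing the whole application, which is where the speed-up in this PR comes from.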
[GitHub] [airflow] mik-laj commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution.
mik-laj commented on a change in pull request #6627: [AIRFLOW-5931] Use os.fork when appropriate to speed up task execution. URL: https://github.com/apache/airflow/pull/6627#discussion_r351014526

File path: airflow/cli/commands/task_command.py

@@ -225,6 +226,11 @@ def task_test(args, dag=None):
             debugger.post_mortem()
         else:
             raise
+    finally:
+        if not already_has_stream_handler:

Review comment: Thanks for it!
[GitHub] [airflow] codecov-io edited a comment on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
codecov-io edited a comment on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-557836097

# [Codecov](https://codecov.io/gh/apache/airflow/pull/6601?src=pr=h1) Report
> Merging [#6601](https://codecov.io/gh/apache/airflow/pull/6601?src=pr=desc) into [master](https://codecov.io/gh/apache/airflow/commit/5eddeac088d7609d7f9a5d81139aa82ee2df616d?src=pr=desc) will **increase** coverage by `0.01%`.
> The diff coverage is `96.66%`.

[![Impacted file tree graph](https://codecov.io/gh/apache/airflow/pull/6601/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6601?src=pr=tree)

```diff
@@            Coverage Diff             @@
##           master    #6601      +/-   ##
==========================================
+ Coverage   83.83%   83.85%   +0.01%
==========================================
  Files         672      671       -1
  Lines       37594    37609      +15
==========================================
+ Hits        31517    31536      +19
+ Misses       6077     6073       -4
```

| [Impacted Files](https://codecov.io/gh/apache/airflow/pull/6601?src=pr=tree) | Coverage Δ | |
|---|---|---|
| [airflow/settings.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy9zZXR0aW5ncy5weQ==) | `88.81% <ø> (-0.16%)` | :arrow_down: |
| [airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==) | `58.48% <0%> (+0.32%)` | :arrow_up: |
| [airflow/configuration.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5) | `92.75% <100%> (ø)` | :arrow_up: |
| [airflow/models/baseoperator.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvYmFzZW9wZXJhdG9yLnB5) | `96.28% <100%> (+0.07%)` | :arrow_up: |
| [airflow/www/utils.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdXRpbHMucHk=) | `80.19% <100%> (ø)` | :arrow_up: |
| [airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==) | `65% <100%> (ø)` | :arrow_up: |
| [airflow/utils/helpers.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9oZWxwZXJzLnB5) | `77.97% <100%> (ø)` | :arrow_up: |
| [airflow/operators/bigquery\_to\_mysql.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvYmlncXVlcnlfdG9fbXlzcWwucHk=) | `74% <100%> (ø)` | :arrow_up: |
| [...e/marketing\_platform/operators/campaign\_manager.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvZ29vZ2xlL21hcmtldGluZ19wbGF0Zm9ybS9vcGVyYXRvcnMvY2FtcGFpZ25fbWFuYWdlci5weQ==) | `91.73% <100%> (ø)` | :arrow_up: |
| [airflow/api/common/experimental/mark\_tasks.py](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY29tbW9uL2V4cGVyaW1lbnRhbC9tYXJrX3Rhc2tzLnB5) | `95.39% <100%> (+0.03%)` | :arrow_up: |
| ... and [23 more](https://codecov.io/gh/apache/airflow/pull/6601/diff?src=pr=tree-more) | | |

[Continue to review full report at Codecov](https://codecov.io/gh/apache/airflow/pull/6601?src=pr=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/airflow/pull/6601?src=pr=footer). Last update [5eddeac...0e8be9e](https://codecov.io/gh/apache/airflow/pull/6601?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[jira] [Commented] (AIRFLOW-5544) GCP datastore operation pull should have some retry to avoid false alarm
[ https://issues.apache.org/jira/browse/AIRFLOW-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982961#comment-16982961 ] Kamil Bregula commented on AIRFLOW-5544: Somebody from the community made this change: [https://github.com/apache/airflow/commit/16e7e6175ae16d73d84c86163686a6c0b9b707ff#diff-d67992de738848a0b4278915ffd9b797]
[GitHub] [airflow] codecov-io edited a comment on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
codecov-io edited a comment on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-557836097
[GitHub] [airflow] codecov-io edited a comment on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
codecov-io edited a comment on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-557836097
[jira] [Commented] (AIRFLOW-6010) Remove cyclic imports and pylint hacks
[ https://issues.apache.org/jira/browse/AIRFLOW-6010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982959#comment-16982959 ] ASF subversion and git services commented on AIRFLOW-6010: -- Commit 03c870a6172ab232af6319a30ad8d46622359b10 in airflow's branch refs/heads/master from Jarek Potiuk [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=03c870a ] [AIRFLOW-6010] Remove cyclic imports and pylint hacks (#6601) > Remove cyclic imports and pylint hacks > -- > > Key: AIRFLOW-6010 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6010 > Project: Apache Airflow > Issue Type: Sub-task > Components: core > Affects Versions: 2.0.0 > Reporter: Jarek Potiuk > Priority: Major > Fix For: 2.0.0 > > > [AIRFLOW-6010] Remove cyclic imports and pylint hacks > There were a number of problems involving cyclic imports in Airflow's core, mainly around settings, DAG context management, base operator imports, and serialisation. > Some of those problems were worked around with "# pylint: disable" comments (for pylint), some were bypassed with TYPE_CHECKING (for mypy), and some were just hidden because the pylint check split file lists during the Travis CI build. > This commit fixes most of the problems (only the executor problem is left) and removes all the workarounds. > The fixes are: > * Context for DAG context management was previously loaded from settings; context management is now moved to DAG, so 'import settings' is no longer needed in baseoperator or subdag_operator. > * Serialized fields are lazily initialised - they were previously initialized while parsing the Python modules, which made it impossible to avoid cycles. > * SerializedDagModel is removed from 'airflow.models' and imported directly from the serialization package. This is an internal class only and does not need to be exposed via models. > * BaseOperator in core is imported from the baseoperator package rather than from 'airflow.models'. This helps in importing the whole __init__ of 'airflow' without having to pay attention to the sequence of imports there. > * BaseOperator, on the other hand, is imported from 'airflow.models' in operators/DAGs/hooks/sensors. This is important for backporting (AIP-21). > * The imports of BaseOperator are enforced with pre-commit. > * All the pylint/mypy hacks related to cyclic imports are removed
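One of the fixes listed in this commit message, lazy initialisation of serialized fields, is the standard way to break an import-time cycle: compute nothing when the module is imported, so importing the module can never trigger the cycle. A minimal sketch of the pattern (the class and field names here are hypothetical, not the actual Airflow implementation):

```python
class SerializedFields:
    """Lazily initialised set of serialized field names.

    The set is computed on first use rather than at module import time,
    so merely importing this module performs no work and cannot pull in
    other modules that might import this one back (a cyclic import).
    """
    _fields = None

    @classmethod
    def get(cls):
        if cls._fields is None:
            # First access: build and cache the value.
            cls._fields = frozenset({"task_id", "owner", "retries"})
        return cls._fields
```

The same idea underlies the function-local imports mentioned elsewhere in this thread: moving an import from module level into the function body defers it until call time, after both modules have finished importing.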
[GitHub] [airflow] kaxil commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
kaxil commented on issue #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601#issuecomment-558842390 Thanks @potiuk
[jira] [Commented] (AIRFLOW-6010) Remove cyclic imports and pylint hacks
[ https://issues.apache.org/jira/browse/AIRFLOW-6010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982958#comment-16982958 ] ASF GitHub Bot commented on AIRFLOW-6010: - kaxil commented on pull request #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601 > Remove cyclic imports and pylint hacks > -- > > Key: AIRFLOW-6010 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6010 > Project: Apache Airflow > Issue Type: Sub-task > Components: core > Affects Versions: 2.0.0 > Reporter: Jarek Potiuk > Priority: Major > > [AIRFLOW-6010] Remove cyclic imports and pylint hacks > There were a number of problems involving cyclic imports in Airflow's core, mainly around settings, DAG context management, BaseOperator imports and serialisation. > Some of those problems were worked around with # pylint: disable comments (for pylint), some were bypassed with TYPE_CHECKING (for mypy), and some were simply hidden because the pylint check split the file lists during the Travis CI build. > This commit fixes most of the problems (only the executor problem is left) and removes all the workarounds. > The fixes are: > * Context for DAG context management used to be loaded from settings; context management is now moved to DAG, so 'import settings' is no longer needed in baseoperator and subdag_operator. > * Serialized fields are lazily initialised - they were previously initialised while parsing the Python modules, which made it impossible to avoid cycles. > * SerializedDagModel is removed from 'airflow.models' and imported directly from the serialization package. This is an internal class and does not need to be exposed via models. > * BaseOperator in core is imported from the baseoperator package rather than from 'airflow.models'. This allows importing the whole __init__ of 'airflow' without having to pay attention to the sequence of imports there. > * BaseOperator, on the other hand, is imported from 'airflow.models' in operators/DAGs/hooks/sensors. This is important for backporting (AIP-21). > * The imports of BaseOperator are enforced with pre-commit. > * All the pylint/mypy hacks related to cyclic imports are removed. -- This message was sent by Atlassian Jira (v8.3.4#803005)
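The lazy initialisation of serialized fields described in the commit message can be sketched as follows. This is a hedged illustration of the general technique, not Airflow's actual code; the class and field names here are stand-ins:

```python
# Sketch: breaking an import-time cycle by deferring initialisation.
# Class and field names are illustrative, not Airflow's real API.

class SerializedDAG:
    _serialized_fields = None  # nothing is computed at module import time

    @classmethod
    def get_serialized_fields(cls):
        # Any heavy or potentially cyclic work happens here, on first use,
        # rather than at import time when other modules may be half-loaded.
        if cls._serialized_fields is None:
            cls._serialized_fields = frozenset(
                {"dag_id", "tasks", "schedule_interval"}
            )
        return cls._serialized_fields


fields = SerializedDAG.get_serialized_fields()
print("dag_id" in fields)  # the cached set is built exactly once
```

Because nothing runs at import time, two modules that import each other can both finish loading before the first call touches the class attribute.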
[GitHub] [airflow] kaxil merged pull request #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables
kaxil merged pull request #6601: [AIRFLOW-6010] Remove cyclic imports and pylint disables URL: https://github.com/apache/airflow/pull/6601
[jira] [Resolved] (AIRFLOW-6010) Remove cyclic imports and pylint hacks
[ https://issues.apache.org/jira/browse/AIRFLOW-6010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6010. - Fix Version/s: 2.0.0 Resolution: Fixed > Remove cyclic imports and pylint hacks > -- > > Key: AIRFLOW-6010 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6010 > Project: Apache Airflow > Issue Type: Sub-task > Components: core > Affects Versions: 2.0.0 > Reporter: Jarek Potiuk > Priority: Major > Fix For: 2.0.0 -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-5544) GCP datastore operation pull should have some retry to avoid false alarm
[ https://issues.apache.org/jira/browse/AIRFLOW-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982955#comment-16982955 ] Yang Hu commented on AIRFLOW-5544: -- Hi [~kamil.bregula] What improvements are there? Do you have a link or version that we can test? > GCP datastore operation pull should have some retry to avoid false alarm > > > Key: AIRFLOW-5544 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5544 > Project: Apache Airflow > Issue Type: Bug > Components: gcp > Affects Versions: 1.10.5 > Reporter: Yang Hu > Priority: Minor > > Hi Airflow, > We noticed that sometimes the Google status URL returns a 500 and Airflow marks the whole task failed, but it actually runs fine. > > I think we need some retry before failing the task, at least in this 500 case. > Sample request: > [2019-09-18 21:45:19,543] \{base_task_runner.py:98} INFO - Subtask: > googleapiclient.errors.HttpError: HttpError 500 when requesting > https://datastore.googleapis.com/v1/projects/MASKED/operations/AS_MASKED_RI?alt=json > returned Unknown Error. > Ref code: > [https://github.com/apache/airflow/blob/7e4330cce0b4f333b1658cdc315b06505cf9dd76/airflow/gcp/hooks/datastore.py#L269] -- This message was sent by Atlassian Jira (v8.3.4#803005)
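The retry the ticket asks for could look roughly like the following. This is a hedged, generic sketch, not the actual fix shipped in Airflow; `HttpStatusError` stands in for `googleapiclient.errors.HttpError`, and `flaky_poll` is an invented example:

```python
import time


class HttpStatusError(Exception):
    """Stand-in for googleapiclient.errors.HttpError (illustrative only)."""
    def __init__(self, status):
        super().__init__(f"HTTP {status}")
        self.status = status


def call_with_retry(fn, retries=3, backoff=0.5, retryable=(500, 502, 503)):
    """Call fn(), retrying with exponential backoff on transient HTTP errors."""
    for attempt in range(retries):
        try:
            return fn()
        except HttpStatusError as err:
            # Re-raise non-transient codes, or if this was the last attempt.
            if err.status not in retryable or attempt == retries - 1:
                raise
            time.sleep(backoff * (2 ** attempt))


# An operation-status poll that fails twice with 500, then succeeds:
calls = {"n": 0}

def flaky_poll():
    calls["n"] += 1
    if calls["n"] < 3:
        raise HttpStatusError(500)
    return "done"


print(call_with_retry(flaky_poll, backoff=0))  # prints "done" after 3 attempts
```

A 404 or 403 would propagate immediately, so only transient server-side errors are absorbed.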
[GitHub] [airflow-site] ashb commented on issue #198: Publish new website - AIP-11
ashb commented on issue #198: Publish new website - AIP-11 URL: https://github.com/apache/airflow-site/pull/198#issuecomment-558839747 Yes, we have mod_rewrite enabled - for example https://git-wip-us.apache.org/repos/asf?p=incubator.git;a=blob;f=assets/.htaccess;h=c1c53fe407a149c52ed853ca45dc806d36858fb1;hb=HEAD I'd really like this to be done before we merge this PR too - I'm a very strong believer that we shouldn't have URLs suddenly 404 -- it gives a bad experience to users (and I think impacts search results from Google etc.)
[GitHub] [airflow] ashb commented on a change in pull request #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate
ashb commented on a change in pull request #6553: [AIRFLOW-5902] avoid unnecessary sleep to maintain local task job heart rate URL: https://github.com/apache/airflow/pull/6553#discussion_r351004248 ## File path: airflow/models/taskinstance.py ##
@@ -1320,9 +1320,9 @@ def set_duration(self):
     def xcom_push(
             self,
-            key,
-            value,
-            execution_date=None):
+            key: str,
+            value: Any,
+            execution_date: Optional[datetime] = None) -> None:
Review comment: Are these changes required to make this PR pass tests? If not, it is preferred not to change files that you don't have to for the subject of the PR.
[GitHub] [airflow] mik-laj commented on a change in pull request #6644: [AIRFLOW-6047] Simplify the logging configuration template
mik-laj commented on a change in pull request #6644: [AIRFLOW-6047] Simplify the logging configuration template URL: https://github.com/apache/airflow/pull/6644#discussion_r350992546 ## File path: airflow/config_templates/airflow_local_settings.py ##
@@ -191,32 +215,6 @@
         'json_format': ELASTICSEARCH_JSON_FORMAT,
         'json_fields': ELASTICSEARCH_JSON_FIELDS
     },
-},
-}
-
-REMOTE_LOGGING = conf.getboolean('core', 'remote_logging')
-
-# Only update the handlers and loggers when CONFIG_PROCESSOR_MANAGER_LOGGER is set.
-# This is to avoid exceptions when initializing RotatingFileHandler multiple times
-# in multiple processes.
-if os.environ.get('CONFIG_PROCESSOR_MANAGER_LOGGER') == 'True':
-    DEFAULT_LOGGING_CONFIG['handlers'] \
-        .update(DEFAULT_DAG_PARSING_LOGGING_CONFIG['handlers'])
-    DEFAULT_LOGGING_CONFIG['loggers'] \
-        .update(DEFAULT_DAG_PARSING_LOGGING_CONFIG['loggers'])
-
-    # Manually create log directory for processor_manager handler as RotatingFileHandler
-    # will only create file but not the directory.
-    processor_manager_handler_config = DEFAULT_DAG_PARSING_LOGGING_CONFIG['handlers'][
-        'processor_manager']
-    directory = os.path.dirname(processor_manager_handler_config['filename'])
-    mkdirs(directory, 0o755)
+}
-
-if REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('s3://'):
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['s3'])
-elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('gs://'):
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['gcs'])
-elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('wasb'):
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['wasb'])
-elif REMOTE_LOGGING and ELASTICSEARCH_HOST:
-    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['elasticsearch'])
+DEFAULT_LOGGING_CONFIG['handlers'].update(ELASTIC_REMOTE_HANDLERS)
Review comment: I didn't want to increase the level of indentation, but it will be clearer to understand, so I made the change.
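The if/elif chain being removed in the diff above picks a remote-logging handler by the URL scheme of the base log folder. A standalone sketch of that selection logic, with dummy handler dicts standing in for Airflow's real handler configs:

```python
def select_remote_handlers(remote_logging, base_log_folder,
                           elasticsearch_host, handlers):
    """Pick the remote logging handler config matching the folder's scheme.

    Mirrors the scheme-prefix dispatch shown in the diff; the handler
    contents here are placeholders, not Airflow's actual configuration.
    """
    if not remote_logging:
        return {}
    for prefix, name in (("s3://", "s3"), ("gs://", "gcs"), ("wasb", "wasb")):
        if base_log_folder.startswith(prefix):
            return handlers[name]
    if elasticsearch_host:
        return handlers["elasticsearch"]
    return {}


HANDLERS = {
    "s3": {"task": "s3_task_handler"},
    "gcs": {"task": "gcs_task_handler"},
    "wasb": {"task": "wasb_task_handler"},
    "elasticsearch": {"task": "es_task_handler"},
}

print(select_remote_handlers(True, "gs://my-bucket/logs", None, HANDLERS))
```

The result would then be merged into `DEFAULT_LOGGING_CONFIG['handlers']` via `dict.update()`, as in the original module.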
[jira] [Resolved] (AIRFLOW-3722) Improve BiqQueryHook test coverage
[ https://issues.apache.org/jira/browse/AIRFLOW-3722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-3722. Resolution: Won't Do > Improve BiqQueryHook test coverage > -- > > Key: AIRFLOW-3722 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3722 > Project: Apache Airflow > Issue Type: Test > Components: gcp >Reporter: Felix Uellendall >Priority: Major > > There are currently many lines not being tested. > > This Ticket will improve the overall test coverage of this Hook. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [airflow] feluelle commented on issue #6672: [AIRFLOW-6079] Make consistent use of timezone.parse
feluelle commented on issue #6672: [AIRFLOW-6079] Make consistent use of timezone.parse URL: https://github.com/apache/airflow/pull/6672#issuecomment-558824727 That's correct. But let's see what the tests have to say ;)
[jira] [Resolved] (AIRFLOW-5422) Add type annotations to GCP operators
[ https://issues.apache.org/jira/browse/AIRFLOW-5422?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-5422. Resolution: Fixed > Add type annotations to GCP operators > - > > Key: AIRFLOW-5422 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5422 > Project: Apache Airflow > Issue Type: Improvement > Components: gcp >Affects Versions: 1.10.5 >Reporter: Tobiasz Kedzierski >Assignee: Tobiasz Kedzierski >Priority: Minor > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-5544) GCP datastore operation pull should have some retry to avoid false alarm
[ https://issues.apache.org/jira/browse/AIRFLOW-5544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982926#comment-16982926 ] Kamil Bregula commented on AIRFLOW-5544: [~coolcute] Is it still valid? My team has made improvements in this integration. > GCP datastore operation pull should have some retry to avoid false alarm > > > Key: AIRFLOW-5544 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5544 > Project: Apache Airflow > Issue Type: Bug > Components: gcp > Affects Versions: 1.10.5 > Reporter: Yang Hu > Priority: Minor -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6041) Add user agent to the Discovery API client
[ https://issues.apache.org/jira/browse/AIRFLOW-6041?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982925#comment-16982925 ] ASF subversion and git services commented on AIRFLOW-6041: -- Commit 8e1ce8de24fda0187e098b02b629c116e3b678e5 in airflow's branch refs/heads/master from Kamil Breguła [ https://gitbox.apache.org/repos/asf?p=airflow.git;h=8e1ce8d ] [AIRFLOW-6041] Add user agent to the Discovery API client (#6636) > Add user agent to the Discovery API client > -- > > Key: AIRFLOW-6041 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6041 > Project: Apache Airflow > Issue Type: New Feature > Components: gcp >Affects Versions: 1.10.6 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6041) Add user agent to the Discovery API client
[ https://issues.apache.org/jira/browse/AIRFLOW-6041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6041. - Fix Version/s: 2.0.0 Resolution: Fixed > Add user agent to the Discovery API client > -- > > Key: AIRFLOW-6041 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6041 > Project: Apache Airflow > Issue Type: New Feature > Components: gcp >Affects Versions: 1.10.6 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6041) Add user agent to the Discovery API client
[ https://issues.apache.org/jira/browse/AIRFLOW-6041?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982924#comment-16982924 ] ASF GitHub Bot commented on AIRFLOW-6041: - kaxil commented on pull request #6636: [AIRFLOW-6041] Add user agent to the Discovery API client URL: https://github.com/apache/airflow/pull/6636 > Add user agent to the Discovery API client > -- > > Key: AIRFLOW-6041 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6041 > Project: Apache Airflow > Issue Type: New Feature > Components: gcp > Affects Versions: 1.10.6 > Reporter: Kamil Bregula > Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-5635) Add expiration_time option to BigQueryCreateEmptyTableOperator and BigQueryCreateExternalTableOperator
[ https://issues.apache.org/jira/browse/AIRFLOW-5635?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982922#comment-16982922 ] Kamil Bregula commented on AIRFLOW-5635: [~zope.manish83] Do you plan to work on it? > Add expiration_time option to BigQueryCreateEmptyTableOperator and > BigQueryCreateExternalTableOperator > -- > > Key: AIRFLOW-5635 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5635 > Project: Apache Airflow > Issue Type: Improvement > Components: gcp >Affects Versions: 1.10.5 >Reporter: Joel Croteau >Assignee: Manish Zope >Priority: Major > > The [BigQuery REST > API|https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#Table] > includes the ability to set an {{expirationTime}} for a table when using the > {{tables.insert}} method, which both {{BigQueryCreateEmptyTable}} and > {{BigQueryCreateExternalTable}} use. We should include options in these > operators to set this if desired. -- This message was sent by Atlassian Jira (v8.3.4#803005)
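Per the BigQuery REST API linked in the ticket, `expirationTime` in a Table resource is an epoch timestamp in milliseconds, serialised as a string. A hedged sketch of building a `tables.insert` body with it; the helper function name is invented for illustration:

```python
from datetime import datetime, timedelta, timezone


def table_resource_with_expiry(project_id, dataset_id, table_id, ttl_days):
    """Build a `tables.insert` request body whose table expires after ttl_days.

    The helper name is hypothetical; the body shape follows the BigQuery
    REST API's Table resource, where `expirationTime` is epoch milliseconds
    carried as a string.
    """
    expires_at = datetime.now(timezone.utc) + timedelta(days=ttl_days)
    return {
        "tableReference": {
            "projectId": project_id,
            "datasetId": dataset_id,
            "tableId": table_id,
        },
        "expirationTime": str(int(expires_at.timestamp() * 1000)),
    }


body = table_resource_with_expiry("my-project", "staging", "events_tmp", ttl_days=7)
print(body["expirationTime"])
```

An operator exposing this would presumably just thread a `ttl`/`expiration_time` argument through to the body it already builds.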
[GitHub] [airflow] kaxil merged pull request #6636: [AIRFLOW-6041] Add user agent to the Discovery API client
kaxil merged pull request #6636: [AIRFLOW-6041] Add user agent to the Discovery API client URL: https://github.com/apache/airflow/pull/6636
[jira] [Commented] (AIRFLOW-3722) Improve BiqQueryHook test coverage
[ https://issues.apache.org/jira/browse/AIRFLOW-3722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982923#comment-16982923 ] Felix Uellendall commented on AIRFLOW-3722: --- No, don't think so. At least not in the near future. > Improve BiqQueryHook test coverage > -- > > Key: AIRFLOW-3722 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3722 > Project: Apache Airflow > Issue Type: Test > Components: gcp >Reporter: Felix Uellendall >Priority: Major > > There are currently many lines not being tested. > > This Ticket will improve the overall test coverage of this Hook. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6041) Add user agent to the Discovery API client
[ https://issues.apache.org/jira/browse/AIRFLOW-6041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula updated AIRFLOW-6041: --- Issue Type: New Feature (was: Bug) > Add user agent to the Discovery API client > -- > > Key: AIRFLOW-6041 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6041 > Project: Apache Airflow > Issue Type: New Feature > Components: gcp >Affects Versions: 1.10.6 >Reporter: Kamil Bregula >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Closed] (AIRFLOW-2056) Integrate Google Cloud Storage (GCS) operators into 1 file
[ https://issues.apache.org/jira/browse/AIRFLOW-2056?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik closed AIRFLOW-2056. --- Resolution: Not A Problem Not Valid Anymore. As a part of refactoring and moving files (part of AIP-21) this was solved > Integrate Google Cloud Storage (GCS) operators into 1 file > -- > > Key: AIRFLOW-2056 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2056 > Project: Apache Airflow > Issue Type: Improvement > Components: contrib, gcp >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > There are currently 5 operators: > * GoogleCloudStorageCopyOperator > * GoogleCloudStorageDownloadOperator > * GoogleCloudStorageListOperator > * GoogleCloudStorageToBigQueryOperator > * GoogleCloudStorageToGoogleCloudStorageOperator > It would be ideal to have 1 file *gcs_operator.py* similar to > *dataproc_operator.py* containing all the operators related to Google Cloud > Storage. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-2056) Integrate Google Cloud Storage (GCS) operators into 1 file
[ https://issues.apache.org/jira/browse/AIRFLOW-2056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982918#comment-16982918 ] Kamil Bregula commented on AIRFLOW-2056: [~kaxilnaik] What is the progress of this ticket? Was everything done in it? > Integrate Google Cloud Storage (GCS) operators into 1 file > -- > > Key: AIRFLOW-2056 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2056 > Project: Apache Airflow > Issue Type: Improvement > Components: contrib, gcp >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > There are currently 5 operators: > * GoogleCloudStorageCopyOperator > * GoogleCloudStorageDownloadOperator > * GoogleCloudStorageListOperator > * GoogleCloudStorageToBigQueryOperator > * GoogleCloudStorageToGoogleCloudStorageOperator > It would be ideal to have 1 file *gcs_operator.py* similar to > *dataproc_operator.py* containing all the operators related to Google Cloud > Storage. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-2155) Lack of consistency between project vs. project_id
[ https://issues.apache.org/jira/browse/AIRFLOW-2155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982916#comment-16982916 ] Kamil Bregula commented on AIRFLOW-2155: [~maximilianroos] Is it still valid? My team has made changes in this area. > Lack of consistency between project vs. project_id > -- > > Key: AIRFLOW-2155 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2155 > Project: Apache Airflow > Issue Type: Bug > Components: gcp >Affects Versions: 1.9.0 >Reporter: Maximilian Roos >Priority: Minor > > Some Operators use `project` to refer to the GCP Project, others use > `project_id`. > > As of Feb 27, there are [58 results for > project|https://github.com/apache/incubator-airflow/search?utf8=%E2%9C%93=project=] > and [22 results for > project_id|https://github.com/apache/incubator-airflow/search?utf8=%E2%9C%93=project_id=] > > While there are worse problems in our world, this does make passing default > args through error prone and duplicative -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-2540) Create an example dag using Google Cloud ML Engine operators
[ https://issues.apache.org/jira/browse/AIRFLOW-2540?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982912#comment-16982912 ] Kamil Bregula commented on AIRFLOW-2540: [~younghee] Do you plan to work on it? > Create an example dag using Google Cloud ML Engine operators > > > Key: AIRFLOW-2540 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2540 > Project: Apache Airflow > Issue Type: Task > Components: contrib, gcp >Reporter: Younghee Kwon >Priority: Minor > > Create an example dag to show how to use ML Engine operators. > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-3401) Properly encode templated fields in Cloud Pub/Sub example DAG
[ https://issues.apache.org/jira/browse/AIRFLOW-3401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula updated AIRFLOW-3401: --- Description: Context: [https://groups.google.com/d/msg/cloud-composer-discuss/McHHu582G7o/7N66GrwsBAAJ|https://groups.google.com/d/msg/cloud-composer-discuss/McHHu582G7o/7N66GrwsBAAJ] (was: Context: [https://groups.google.com/d/msg/cloud-composer-discuss/McHHu582G7o/7N66GrwsBAAJ|http://example.com]) > Properly encode templated fields in Cloud Pub/Sub example DAG > - > > Key: AIRFLOW-3401 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3401 > Project: Apache Airflow > Issue Type: Bug > Components: contrib, documentation, gcp >Reporter: Wilson Lian >Priority: Trivial > Labels: examples > > Context: > [https://groups.google.com/d/msg/cloud-composer-discuss/McHHu582G7o/7N66GrwsBAAJ|https://groups.google.com/d/msg/cloud-composer-discuss/McHHu582G7o/7N66GrwsBAAJ] -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-3722) Improve BiqQueryHook test coverage
[ https://issues.apache.org/jira/browse/AIRFLOW-3722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982906#comment-16982906 ] Kamil Bregula commented on AIRFLOW-3722: [~feluelle] Gentle ping. Do you plan to work on it? My team started working this week. > Improve BiqQueryHook test coverage > -- > > Key: AIRFLOW-3722 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3722 > Project: Apache Airflow > Issue Type: Test > Components: gcp >Reporter: Felix Uellendall >Priority: Major > > There are currently many lines not being tested. > > This Ticket will improve the overall test coverage of this Hook. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-3776) Use maintained google client library for bigquery
[ https://issues.apache.org/jira/browse/AIRFLOW-3776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982902#comment-16982902 ] Kamil Bregula commented on AIRFLOW-3776: [~jmcarp] My team started working on this change. Do you still want to continue this? > Use maintained google client library for bigquery > - > > Key: AIRFLOW-3776 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3776 > Project: Apache Airflow > Issue Type: Improvement > Components: gcp > Reporter: Joshua Carp > Assignee: Joshua Carp > Priority: Minor > Labels: bigquery > > We're currently using > [https://github.com/googleapis/google-api-python-client], which has reached > end-of-life and isn't getting updates other than bug fixes. Google recommends > using [https://github.com/googleapis/google-cloud-python] instead. This > ticket tracks upgrading client libraries for bigquery. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-3759) Add kms_key_name parameter to BigQuery operators to support CMEK
[ https://issues.apache.org/jira/browse/AIRFLOW-3759?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982904#comment-16982904 ] Kamil Bregula commented on AIRFLOW-3759: [~jasper.smith] Is it still valid? > Add kms_key_name parameter to BigQuery operators to support CMEK > > > Key: AIRFLOW-3759 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3759 > Project: Apache Airflow > Issue Type: Wish > Components: gcp, operators > Affects Versions: 1.10.2 > Reporter: Jasper Smith > Priority: Trivial > > To support customer managed encryption keys in BigQuery, add a kms_key_name > parameter to relevant BigQuery operators (those that can write to tables). > This can be used to populate the required > configuration.query.destinationEncryptionConfiguration.kmsKeyName property in > the job configuration. > > [API > Link|https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.destinationEncryptionConfiguration] -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-4068) Add GoogleCloudStorageFileTransformOperator
[ https://issues.apache.org/jira/browse/AIRFLOW-4068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-4068. Resolution: Fixed > Add GoogleCloudStorageFileTransformOperator > --- > > Key: AIRFLOW-4068 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4068 > Project: Apache Airflow > Issue Type: Wish > Components: gcp, operators >Affects Versions: 1.10.3 >Reporter: jack >Priority: Major > Fix For: 2.0.0 > > > This can be added to the [gcs_operator > |https://github.com/apache/airflow/blob/master/airflow/contrib/operators/gcs_operator.py]as > another class. > The behavior should be similar to the > [S3FileTransformOperator|https://github.com/apache/airflow/blob/master/airflow/operators/s3_file_transform_operator.py] > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-4226) DataProcPySparkOperator gets project ID from gcp_conn_id rather than project_id
[ https://issues.apache.org/jira/browse/AIRFLOW-4226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982900#comment-16982900 ] Kaxil Naik commented on AIRFLOW-4226: - Yes in 1.10.6 it is still the case. > DataProcPySparkOperator gets project ID from gcp_conn_id rather than > project_id > --- > > Key: AIRFLOW-4226 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4226 > Project: Apache Airflow > Issue Type: Improvement > Components: gcp >Affects Versions: 1.10.3, 1.10.4, 1.10.5 >Reporter: Aaron Liblong >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.7 > > > DataProcPySparkOperator gets [the project ID for the cluster it > creates|https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py#L1334] > from [the hook instantiated from > gcp_conn_id|https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py#L1324] > rather than from its own project_id property. This is contrary to how every > other operator in this module works. Not sure if it's a bug or intended. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Assigned] (AIRFLOW-4226) DataProcPySparkOperator gets project ID from gcp_conn_id rather than project_id
[ https://issues.apache.org/jira/browse/AIRFLOW-4226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik reassigned AIRFLOW-4226: --- Assignee: Kamil Bregula (was: Kaxil Naik) > DataProcPySparkOperator gets project ID from gcp_conn_id rather than > project_id > --- > > Key: AIRFLOW-4226 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4226 > Project: Apache Airflow > Issue Type: Improvement > Components: gcp >Affects Versions: 1.10.3, 1.10.4, 1.10.5 >Reporter: Aaron Liblong >Assignee: Kamil Bregula >Priority: Minor > Fix For: 1.10.7 > > > DataProcPySparkOperator gets [the project ID for the cluster it > creates|https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py#L1334] > from [the hook instantiated from > gcp_conn_id|https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py#L1324] > rather than from its own project_id property. This is contrary to how every > other operator in this module works. Not sure if it's a bug or intended. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-4157) FileToGoogleCloudStorageOperator should expose "multipart" uploading supported in GoogleCloudStorageHook.upload()
[ https://issues.apache.org/jira/browse/AIRFLOW-4157?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-4157. Resolution: Cannot Reproduce We now have the Python API library, so this ticket is no longer valid. > FileToGoogleCloudStorageOperator should expose "multipart" uploading > supported in GoogleCloudStorageHook.upload() > - > > Key: AIRFLOW-4157 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4157 > Project: Apache Airflow > Issue Type: Improvement > Components: gcp > Affects Versions: 1.10.2 > Reporter: Evgeny Podlepaev > Priority: Major > > FileToGoogleCloudStorageOperator should allow "multipart" flag to be > specified and passed to GoogleCloudStorageHook.upload(). Uploading large > files (gigabytes) in a single shot results in an out of memory error. -- This message was sent by Atlassian Jira (v8.3.4#803005)
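The out-of-memory problem the ticket describes comes from reading the whole file before sending it; streaming in fixed-size chunks keeps memory bounded. A generic sketch, where the `send_chunk` callback stands in for whatever multipart/resumable upload API the hook would call (not the hook's actual interface):

```python
import io


def upload_in_chunks(fileobj, send_chunk, chunk_size=8 * 1024 * 1024):
    """Stream fileobj to send_chunk() in chunk_size pieces; return bytes sent.

    Peak memory stays around chunk_size no matter how large the file is,
    unlike a single fileobj.read() of a multi-gigabyte file.
    """
    total = 0
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:  # b"" signals end of file
            break
        send_chunk(chunk)
        total += len(chunk)
    return total


# Demonstrate with a tiny in-memory "file" and a 4-byte chunk size:
received = []
sent = upload_in_chunks(io.BytesIO(b"x" * 10), received.append, chunk_size=4)
print(sent, [len(c) for c in received])  # 10 [4, 4, 2]
```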
[jira] [Commented] (AIRFLOW-4226) DataProcPySparkOperator gets project ID from gcp_conn_id rather than project_id
[ https://issues.apache.org/jira/browse/AIRFLOW-4226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982895#comment-16982895 ] Kamil Bregula commented on AIRFLOW-4226: [~kaxilnaik] Is this ticket still valid? > DataProcPySparkOperator gets project ID from gcp_conn_id rather than > project_id > --- > > Key: AIRFLOW-4226 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4226 > Project: Apache Airflow > Issue Type: Improvement > Components: gcp >Affects Versions: 1.10.3, 1.10.4, 1.10.5 >Reporter: Aaron Liblong >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.7 > > > DataProcPySparkOperator gets [the project ID for the cluster it > creates|https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py#L1334] > from [the hook instantiated from > gcp_conn_id|https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py#L1324] > rather than from its own project_id property. This is contrary to how every > other operator in this module works. Not sure if it's a bug or intended. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-4779) Implement GCP Stackdriver' Hook and Operators
[ https://issues.apache.org/jira/browse/AIRFLOW-4779?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982891#comment-16982891 ] Kamil Bregula commented on AIRFLOW-4779: Hello [~ryan.yuan] Can you tell me what's the progress of this task? > Implement GCP Stackdriver' Hook and Operators > - > > Key: AIRFLOW-4779 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4779 > Project: Apache Airflow > Issue Type: New Feature > Components: gcp >Affects Versions: 2.0.0 >Reporter: Ryan Yuan >Assignee: Ryan Yuan >Priority: Major > > Implement GCP Stackdriver Logging' Hook and Operators. > - Read, write, delete logs > - Export logs to GCS, BQ or Pub/Sub -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-4783) Make GCP operators Pylint compatible
[ https://issues.apache.org/jira/browse/AIRFLOW-4783?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982889#comment-16982889 ] Kamil Bregula commented on AIRFLOW-4783:
[~Urbaszek] Do you have any suggestions in this area?
> Make GCP operators Pylint compatible
>
> Key: AIRFLOW-4783
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4783
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 2.0.0
> Reporter: Tomasz Urbaszek
> Priority: Major
[jira] [Commented] (AIRFLOW-4808) Error when getting an already deleted dataproc cluster status
[ https://issues.apache.org/jira/browse/AIRFLOW-4808?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982888#comment-16982888 ] Kamil Bregula commented on AIRFLOW-4808:
[~london_su] Do you plan to work on it?
> Error when getting an already deleted dataproc cluster status
>
> Key: AIRFLOW-4808
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4808
> Project: Apache Airflow
> Issue Type: Bug
> Components: gcp
> Affects Versions: 1.10.0
> Reporter: London Su
> Assignee: London Su
> Priority: Trivial
>
> When a Dataproc cluster is manually deleted, a "NoneType is not iterable" error is thrown when the *DataprocClusterCreateOperator* tries to get the cluster status.
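The failure mode quoted above, iterating over a `None` API response when the cluster no longer exists, can be guarded with a simple `None` check. The sketch below is illustrative only; the `get_cluster_state` helper and the response shape are assumptions for demonstration, not Airflow's actual Dataproc operator code.

```python
def get_cluster_state(clusters, cluster_name):
    """Return the state of the named cluster, or None if it is gone.

    `clusters` stands in for the API's cluster-list response, which may
    be None when the cluster was deleted out-of-band; iterating over it
    directly raises "TypeError: 'NoneType' object is not iterable".
    """
    if clusters is None:  # cluster was deleted manually; nothing to report
        return None
    for cluster in clusters:
        if cluster.get("clusterName") == cluster_name:
            return cluster.get("status", {}).get("state")
    return None
```

A caller can then treat a `None` state as "cluster absent" instead of crashing mid-poll.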
[jira] [Commented] (AIRFLOW-4886) Add routines methods to BigQuery Hooks and Operators
[ https://issues.apache.org/jira/browse/AIRFLOW-4886?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982887#comment-16982887 ] Kamil Bregula commented on AIRFLOW-4886:
Hello [~ryan.yuan] What's up? Is it warm in Australia? Can you tell me what the progress of this task is? Regards
> Add routines methods to BigQuery Hooks and Operators
>
> Key: AIRFLOW-4886
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4886
> Project: Apache Airflow
> Issue Type: New Feature
> Components: gcp
> Affects Versions: 2.0.0
> Reporter: Ryan Yuan
> Assignee: Ryan Yuan
> Priority: Major
>
> Add routines methods to BigQueryHook
> Add related operators
[jira] [Resolved] (AIRFLOW-4894) Add hook and operator for GCP Data Loss Prevention API
[ https://issues.apache.org/jira/browse/AIRFLOW-4894?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-4894.
Fix Version/s: 2.0.0
Resolution: Fixed
> Add hook and operator for GCP Data Loss Prevention API
>
> Key: AIRFLOW-4894
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4894
> Project: Apache Airflow
> Issue Type: New Feature
> Components: api, gcp, hooks, operators, tests
> Affects Versions: 1.10.3
> Reporter: Zhengliang Zhu
> Assignee: Zhengliang Zhu
> Priority: Minor
> Fix For: 2.0.0
>
> Add a hook and operator to manipulate and use the Google Cloud Data Loss Prevention (DLP) API. The DLP API allows users to inspect or redact sensitive data in text content or GCP storage locations.
> The hook includes the following APIs, implemented with the Google service discovery API:
> * inspect/deidentify/reidentify for content: [https://cloud.google.com/dlp/docs/reference/rest/v2/projects.content]
> * create/delete/get/list/patch for inspectTemplates: [https://cloud.google.com/dlp/docs/reference/rest/v2/organizations.inspectTemplates], [https://cloud.google.com/dlp/docs/reference/rest/v2/projects.inspectTemplates]
> * create/delete/get/list/patch for storedInfoTypes: [https://cloud.google.com/dlp/docs/reference/rest/v2/organizations.storedInfoTypes], [https://cloud.google.com/dlp/docs/reference/rest/v2/projects.storedInfoTypes]
> * create/list/get/delete/cancel for dlpJobs: [https://cloud.google.com/dlp/docs/reference/rest/v2/projects.dlpJobs]
> The operator creates a long-running DLP job (for storage inspection or risk analysis), keeps polling its status, and waits for it to be done or canceled/deleted.
> Apart from unit tests, also tested locally at the DAG level (not included in PR).
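The polling behaviour described in the issue (create a long-running DLP job, then poll until it is done or canceled) follows a generic wait loop. The sketch below shows that loop in isolation; `get_job`, the `"state"` field, and the terminal-state names are stand-ins for the DLP jobs.get response, not the actual hook's API.

```python
import time


def wait_for_dlp_job(get_job, poll_interval=30, timeout=3600):
    """Poll a long-running job until it reaches a terminal state.

    `get_job` is any callable returning a dict with a "state" field
    (a hypothetical stand-in for the DLP projects.dlpJobs.get call).
    Raises TimeoutError if the job does not finish within `timeout` seconds.
    """
    terminal_states = {"DONE", "CANCELED", "FAILED"}
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = get_job()
        if job["state"] in terminal_states:
            return job
        time.sleep(poll_interval)  # back off between status checks
    raise TimeoutError("DLP job did not reach a terminal state in time")
```

In an operator, `execute()` would create the job, then hand a bound getter to a loop like this and inspect the returned state.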
[jira] [Resolved] (AIRFLOW-4931) Add KMS Encryption Configuration to BigQuery Hook and Operators
[ https://issues.apache.org/jira/browse/AIRFLOW-4931?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-4931.
Resolution: Fixed
> Add KMS Encryption Configuration to BigQuery Hook and Operators
>
> Key: AIRFLOW-4931
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4931
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Ryan Yuan
> Assignee: Ryan Yuan
> Priority: Critical
> Fix For: 2.0.0
>
> One of the clients requires adding KMS encryption on BigQuery tables.
> Reference: [https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#ExternalDataConfiguration]
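In the BigQuery REST API's table resource, customer-managed KMS encryption is expressed via the `encryptionConfiguration.kmsKeyName` field. The helper below is a minimal sketch of building such a table body; the function itself is hypothetical and not part of Airflow's BigQuery hook.

```python
def table_body_with_kms(project_id, dataset_id, table_id, kms_key_name):
    """Build a minimal BigQuery table-resource body that enables CMEK.

    `encryptionConfiguration.kmsKeyName` is the table-resource field that
    points at the Cloud KMS key used to encrypt the table's data.
    """
    return {
        "tableReference": {
            "projectId": project_id,
            "datasetId": dataset_id,
            "tableId": table_id,
        },
        "encryptionConfiguration": {"kmsKeyName": kms_key_name},
    }
```

A hook or operator would pass a body like this to the tables.insert call (or merge the `encryptionConfiguration` block into an existing table definition).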
[jira] [Resolved] (AIRFLOW-4964) Add BigQuery Data Transfer Hook and Operator
[ https://issues.apache.org/jira/browse/AIRFLOW-4964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-4964.
Fix Version/s: 2.0.0
Resolution: Fixed
> Add BigQuery Data Transfer Hook and Operator
>
> Key: AIRFLOW-4964
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4964
> Project: Apache Airflow
> Issue Type: New Feature
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Ryan Yuan
> Assignee: Kamil Bregula
> Priority: Major
> Fix For: 2.0.0
>
> Add BigQuery Data Transfer Hook and Operator to allow users to transfer data from partner SaaS applications to Google BigQuery on a scheduled, managed basis.
[jira] [Resolved] (AIRFLOW-4971) Add Google Display & Video 360 integration
[ https://issues.apache.org/jira/browse/AIRFLOW-4971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-4971.
Fix Version/s: 2.0.0
Resolution: Fixed
> Add Google Display & Video 360 integration
>
> Key: AIRFLOW-4971
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4971
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Kamil Bregula
> Priority: Major
> Fix For: 2.0.0
>
> Hi
> This project lacks integration with the Google Display & Video 360 service. I would be happy if Airflow had proper operators and hooks that integrate with this service.
> Product Documentation: https://developers.google.com/bid-manager/guides/getting-started-api
> API Documentation: https://developers.google.com/bid-manager/guides/getting-started-api
> Lots of love
[jira] [Commented] (AIRFLOW-4967) Add Cloud AutoML NL Entity Extraction integration
[ https://issues.apache.org/jira/browse/AIRFLOW-4967?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982883#comment-16982883 ] Kamil Bregula commented on AIRFLOW-4967:
[~Urbaszek] Do you have other suggestions in this area?
> Add Cloud AutoML NL Entity Extraction integration
>
> Key: AIRFLOW-4967
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4967
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Kamil Bregula
> Priority: Major
>
> Hi
> This project lacks integration with the Cloud AutoML NL Entity Extraction service. I would be happy if Airflow had proper operators and hooks that integrate with this service.
> Product Documentation: https://cloud.google.com/natural-language/automl/entity-analysis/docs/
> API Documentation: https://googleapis.github.io/google-cloud-python/latest/automl/index.html
> Love
[jira] [Commented] (AIRFLOW-4966) Add Cloud AutoML NL Classification integration
[ https://issues.apache.org/jira/browse/AIRFLOW-4966?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982885#comment-16982885 ] Kamil Bregula commented on AIRFLOW-4966:
[~Urbaszek] Do you have other suggestions in this area?
> Add Cloud AutoML NL Classification integration
>
> Key: AIRFLOW-4966
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4966
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Kamil Bregula
> Priority: Major
>
> Hi
> This project lacks integration with the Cloud AutoML NL Classification service. I would be happy if Airflow had proper operators and hooks that integrate with this service.
> Product Documentation: https://cloud.google.com/natural-language/automl/docs/
> API Documentation: https://googleapis.github.io/google-cloud-python/latest/automl/index.html
> Love
[jira] [Commented] (AIRFLOW-4968) Add Cloud AutoML NL Sentiment integration
[ https://issues.apache.org/jira/browse/AIRFLOW-4968?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982881#comment-16982881 ] Kamil Bregula commented on AIRFLOW-4968:
[~Urbaszek] Do you have other suggestions in this area?
> Add Cloud AutoML NL Sentiment integration
>
> Key: AIRFLOW-4968
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4968
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Kamil Bregula
> Priority: Major
>
> Hi
> This project lacks integration with the Cloud AutoML NL Sentiment service. I would be happy if Airflow had proper operators and hooks that integrate with this service.
> Product Documentation: https://cloud.google.com/natural-language/automl/sentiment/docs/
> API Documentation: https://googleapis.github.io/google-cloud-python/latest/automl/index.html
> Lots of love
[jira] [Commented] (AIRFLOW-4969) Add Cloud AutoML Tables integration
[ https://issues.apache.org/jira/browse/AIRFLOW-4969?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982880#comment-16982880 ] Kamil Bregula commented on AIRFLOW-4969:
[~Urbaszek] Do you have other suggestions in this area?
> Add Cloud AutoML Tables integration
>
> Key: AIRFLOW-4969
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4969
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Kamil Bregula
> Priority: Major
>
> Hi
> This project lacks integration with the Cloud AutoML Tables service. I would be happy if Airflow had proper operators and hooks that integrate with this service.
> Product Documentation: https://cloud.google.com/automl-tables/docs/
> API Documentation: https://googleapis.github.io/google-cloud-python/latest/automl/index.html
> Love
[jira] [Commented] (AIRFLOW-4972) Add Google Search Ads 360 integration
[ https://issues.apache.org/jira/browse/AIRFLOW-4972?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982878#comment-16982878 ] Kamil Bregula commented on AIRFLOW-4972:
[~Urbaszek] Do you have other suggestions in this area?
> Add Google Search Ads 360 integration
>
> Key: AIRFLOW-4972
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4972
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Kamil Bregula
> Priority: Major
>
> Hi
> This project lacks integration with the Google Search Ads 360 service. I would be happy if Airflow had proper operators and hooks that integrate with this service.
> Product Documentation: https://developers.google.com/search-ads/
> API Documentation: https://developers.google.com/resources/api-libraries/documentation/dfareporting/v3.3/python/latest/
> Lots of love
[jira] [Commented] (AIRFLOW-5013) Add GCP Data Catalog Hook and Operators
[ https://issues.apache.org/jira/browse/AIRFLOW-5013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982876#comment-16982876 ] Kamil Bregula commented on AIRFLOW-5013:
Hi [~ryan.yuan] Do you plan to work on it? What is the progress of this task? Regards
> Add GCP Data Catalog Hook and Operators
>
> Key: AIRFLOW-5013
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5013
> Project: Apache Airflow
> Issue Type: New Feature
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Ryan Yuan
> Assignee: Ryan Yuan
> Priority: Major
>
> Add GCP Data Catalog services to Airflow
> [https://cloud.google.com/data-catalog/docs/reference]
[jira] [Commented] (AIRFLOW-5099) Implement Google Cloud AutoML operators
[ https://issues.apache.org/jira/browse/AIRFLOW-5099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16982871#comment-16982871 ] Kamil Bregula commented on AIRFLOW-5099:
Hi [~Urbaszek] Is it done? Do you have any other plans?
> Implement Google Cloud AutoML operators
>
> Key: AIRFLOW-5099
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5099
> Project: Apache Airflow
> Issue Type: New Feature
> Components: gcp
> Affects Versions: 1.10.4
> Reporter: Tomasz Urbaszek
> Assignee: Tomasz Urbaszek
> Priority: Major
>
> Implement Google Cloud AutoML operators.
[jira] [Resolved] (AIRFLOW-5129) Add typehint to gcp_dlp_hook.py
[ https://issues.apache.org/jira/browse/AIRFLOW-5129?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kamil Bregula resolved AIRFLOW-5129.
Fix Version/s: 2.0.0
Resolution: Fixed
> Add typehint to gcp_dlp_hook.py
>
> Key: AIRFLOW-5129
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5129
> Project: Apache Airflow
> Issue Type: Improvement
> Components: gcp
> Affects Versions: 1.10.3
> Reporter: Ryan Yuan
> Assignee: Ryan Yuan
> Priority: Minor
> Fix For: 2.0.0