[GitHub] codecov-io edited a comment on issue #4279: [AIRFLOW-3444] Explicitly set transfer operator description.
codecov-io edited a comment on issue #4279: [AIRFLOW-3444] Explicitly set transfer operator description. URL: https://github.com/apache/incubator-airflow/pull/4279#issuecomment-444370208

# [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4279?src=pr=h1) Report

> Merging [#4279](https://codecov.io/gh/apache/incubator-airflow/pull/4279?src=pr=desc) into [master](https://codecov.io/gh/apache/incubator-airflow/commit/9c04e8f339a6d84b2fff983e6584af2b81249652?src=pr=desc) will **not change** coverage.
> The diff coverage is `n/a`.

[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-airflow/pull/4279/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4279?src=pr=tree)

```diff
@@           Coverage Diff           @@
##           master    #4279   +/-   ##
=======================================
  Coverage   78.08%   78.08%
=======================================
  Files         201      201
  Lines       16458    16458
=======================================
  Hits        12851    12851
  Misses       3607     3607
```

[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4279?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`

> Powered by [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4279?src=pr=footer). Last update [9c04e8f...3844805](https://codecov.io/gh/apache/incubator-airflow/pull/4279?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services
[jira] [Commented] (AIRFLOW-3444) Expand templated fields in gcs transfer service operator
[ https://issues.apache.org/jira/browse/AIRFLOW-3444?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709616#comment-16709616 ]

ASF GitHub Bot commented on AIRFLOW-3444:
-----------------------------------------

jmcarp opened a new pull request #4279: [AIRFLOW-3444] Explicitly set transfer operator description. URL: https://github.com/apache/incubator-airflow/pull/4279

Instead of passing transfer operator descriptions through generic kwargs, accept the description as a named parameter; the API accepts no other parameters, so generic kwargs aren't useful. Also, add the description and object conditions to the templated fields.

Make sure you have checked _all_ steps below.

### Jira

- [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
  - https://issues.apache.org/jira/browse/AIRFLOW-3444
  - In case you are fixing a typo in the documentation, you can prepend your commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.

### Description

- [x] Here are some details about my PR, including screenshots of any UI changes:
  The `S3ToGoogleCloudStorageTransferOperator` should support an explicit `description` parameter and allow that parameter to be templated. This will make it easier to set job descriptions, and we'll be able to drop the `job_kwargs` attribute, which doesn't have any other use.

### Tests

- [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: Updates existing tests

### Commits

- [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
  1. Subject is separated from body by a blank line
  1. Subject is limited to 50 characters (not including the Jira issue reference)
  1. Subject does not end with a period
  1. Subject uses the imperative mood ("add", not "adding")
  1. Body wraps at 72 characters
  1. Body explains "what" and "why", not "how"

### Documentation

- [x] In case of new functionality, my PR adds documentation that describes how to use it.
  - When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added.

### Code Quality

- [x] Passes `flake8`

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

> Expand templated fields in gcs transfer service operator
> --------------------------------------------------------
>
>                 Key: AIRFLOW-3444
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3444
>             Project: Apache Airflow
>          Issue Type: Improvement
>            Reporter: Josh Carp
>            Assignee: Josh Carp
>            Priority: Trivial
>
> The `S3ToGoogleCloudStorageTransferOperator` should support an explicit `description` parameter and allow that parameter to be templated. This will make it easier to set job descriptions, and we'll be able to drop the `job_kwargs` attribute, which doesn't have any other use.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
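To illustrate the idea behind templated operator fields, here is a minimal, self-contained sketch. It is not Airflow's actual templating engine (Airflow renders `template_fields` with Jinja against the task context); the class and context keys below are purely illustrative stand-ins for what the PR describes.

```python
# Simplified sketch of an operator exposing attributes as templated
# fields. Airflow's real engine uses Jinja; here a plain str.replace
# stands in for template rendering so the mechanism is visible.

class TransferOperatorSketch:
    # Attributes listed here get rendered against the task context
    # before the operator's execute() runs.
    template_fields = ('description', 'object_conditions')

    def __init__(self, description, object_conditions=None):
        self.description = description
        self.object_conditions = object_conditions or {}

    def render_templates(self, context):
        # Substitute "{{ key }}" placeholders in each string field
        # listed in template_fields.
        for field in self.template_fields:
            value = getattr(self, field)
            if isinstance(value, str):
                for key, val in context.items():
                    value = value.replace('{{ %s }}' % key, str(val))
                setattr(self, field, value)


op = TransferOperatorSketch(description='s3-to-gcs transfer for {{ ds }}')
op.render_templates({'ds': '2018-12-05'})
print(op.description)  # s3-to-gcs transfer for 2018-12-05
```

This is why making `description` a named, templated parameter matters: a date-stamped job description can be written once in the DAG and rendered per run, instead of being smuggled through the generic `job_kwargs` dict, which templating does not reach.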
[jira] [Created] (AIRFLOW-3444) Expand templated fields in gcs transfer service operator
Josh Carp created AIRFLOW-3444:
----------------------------------

             Summary: Expand templated fields in gcs transfer service operator
                 Key: AIRFLOW-3444
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3444
             Project: Apache Airflow
          Issue Type: Improvement
            Reporter: Josh Carp
            Assignee: Josh Carp

The `S3ToGoogleCloudStorageTransferOperator` should support an explicit `description` parameter and allow that parameter to be templated. This will make it easier to set job descriptions, and we'll be able to drop the `job_kwargs` attribute, which doesn't have any other use.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] codecov-io edited a comment on issue #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
codecov-io edited a comment on issue #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#issuecomment-444328155

# [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4265?src=pr=h1) Report

> Merging [#4265](https://codecov.io/gh/apache/incubator-airflow/pull/4265?src=pr=desc) into [master](https://codecov.io/gh/apache/incubator-airflow/commit/9c04e8f339a6d84b2fff983e6584af2b81249652?src=pr=desc) will **decrease** coverage by `<.01%`.
> The diff coverage is `75%`.

[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-airflow/pull/4265/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4265?src=pr=tree)

```diff
@@            Coverage Diff             @@
##           master    #4265      +/-   ##
==========================================
- Coverage   78.08%   78.08%   -0.01%
==========================================
  Files         201      201
  Lines       16458    16462       +4
==========================================
+ Hits        12851    12854       +3
- Misses       3607     3608       +1
```

| [Impacted Files](https://codecov.io/gh/apache/incubator-airflow/pull/4265?src=pr=tree) | Coverage Δ | |
|---|---|---|
| [airflow/utils/db.py](https://codecov.io/gh/apache/incubator-airflow/pull/4265/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYi5weQ==) | `33.33% <0%> (-0.27%)` | :arrow_down: |
| [airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4265/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=) | `92.34% <100%> (ø)` | :arrow_up: |

[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4265?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`

> Powered by [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4265?src=pr=footer). Last update [9c04e8f...42f8942](https://codecov.io/gh/apache/incubator-airflow/pull/4265?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services
[jira] [Commented] (AIRFLOW-3406) Implement an Azure CosmosDB operator
[ https://issues.apache.org/jira/browse/AIRFLOW-3406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709543#comment-16709543 ]

ASF GitHub Bot commented on AIRFLOW-3406:
-----------------------------------------

tmiller-msft opened a new pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265

Add an operator and hook to manipulate and use Azure CosmosDB documents, including creating, deleting, and updating documents and collections. Includes a sensor to detect documents being added to a collection.

Make sure you have checked _all_ steps below.

### Jira

- [X] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
  - https://issues.apache.org/jira/browse/AIRFLOW-3406
  - In case you are fixing a typo in the documentation, you can prepend your commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.

### Description

- [X] Here are some details about my PR, including screenshots of any UI changes: An Azure CosmosDB hook, operator, and sensor for manipulating documents and collections

### Tests

- [X] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:

### Commits

- [X] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
  1. Subject is separated from body by a blank line
  1. Subject is limited to 50 characters (not including the Jira issue reference)
  1. Subject does not end with a period
  1. Subject uses the imperative mood ("add", not "adding")
  1. Body wraps at 72 characters
  1. Body explains "what" and "why", not "how"

### Documentation

- [X] In case of new functionality, my PR adds documentation that describes how to use it.
  - When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added.

### Code Quality

- [X] Passes `flake8`

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

> Implement an Azure CosmosDB operator
> ------------------------------------
>
>                 Key: AIRFLOW-3406
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3406
>             Project: Apache Airflow
>          Issue Type: New Feature
>            Reporter: Tom Miller
>            Assignee: Tom Miller
>            Priority: Major

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
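The PR description above implies a hook with document create/get/delete operations. A rough, self-contained sketch of that interface follows; the method names are assumptions inferred from the description, and an in-memory dict stands in for the real azure-cosmos client, so this is an illustration of the shape of the API rather than the PR's actual implementation.

```python
# In-memory stand-in for a CosmosDB document hook. The real hook would
# wrap the azure-cosmos client; method names here are illustrative
# guesses based on the PR description (create/get/delete documents).

class InMemoryCosmosHook:
    def __init__(self):
        # Maps (database, collection) -> {document_id: document}
        self._store = {}

    def upsert_document(self, document, database, collection):
        # Insert or replace a document, keyed by its 'id' field.
        docs = self._store.setdefault((database, collection), {})
        docs[document['id']] = document

    def get_document(self, doc_id, database, collection):
        # Return the document, or None if it does not exist.
        return self._store.get((database, collection), {}).get(doc_id)

    def delete_document(self, doc_id, database, collection):
        # Remove the document if present; deleting a missing id is a no-op.
        self._store.get((database, collection), {}).pop(doc_id, None)


hook = InMemoryCosmosHook()
hook.upsert_document({'id': 'doc1', 'temp': 74}, 'db', 'readings')
print(hook.get_document('doc1', 'db', 'readings'))  # {'id': 'doc1', 'temp': 74}
```

An operator built on such a hook would typically take a connection id plus database/collection names in its constructor and call one of these methods in `execute()`.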
[jira] [Commented] (AIRFLOW-3406) Implement an Azure CosmosDB operator
[ https://issues.apache.org/jira/browse/AIRFLOW-3406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709526#comment-16709526 ]

ASF GitHub Bot commented on AIRFLOW-3406:
-----------------------------------------

tmiller-msft closed pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265

This is a PR merged from a forked repository. Because GitHub hides the original diff of a foreign (forked) pull request on merge, the diff is supplied below for the sake of provenance.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

> Implement an Azure CosmosDB operator
> ------------------------------------
>
>                 Key: AIRFLOW-3406
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3406
>             Project: Apache Airflow
>          Issue Type: New Feature
>            Reporter: Tom Miller
>            Assignee: Tom Miller
>            Priority: Major

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] XD-DENG commented on issue #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
XD-DENG commented on issue #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#issuecomment-444327014 Thanks @tmiller-msft. Kindly take your time :-)
[GitHub] tmiller-msft commented on issue #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
tmiller-msft commented on issue #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#issuecomment-444326544 Ah yes, I see what you mean. I will update to do that instead. And yeah, fixing the rebasing as well; give me an hour or three.
[GitHub] tmiller-msft commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
tmiller-msft commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#discussion_r238889159

## File path: airflow/contrib/sensors/azure_cosmos_sensor.py

```diff
@@ -0,0 +1,68 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from airflow.contrib.hooks.azure_cosmos_hook import AzureCosmosDBHook
+from airflow.sensors.base_sensor_operator import BaseSensorOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureCosmosDocumentSensor(BaseSensorOperator):
+    """
+    Checks for the existence of a document which
+    matches the given query in CosmosDB.
+
+    Example:
+
+    >>> azure_cosmos_sensor = AzureCosmosDocumentSensor(database_name="somedatabase_name",
+    ...                                                 collection_name="somecollection_name",
+    ...                                                 document_id="unique-doc-id",
+    ...                                                 azure_cosmos_conn_id="azure_cosmos_default",
+    ...                                                 task_id="azure_cosmos_sensor")
+    """
+    template_fields = ('database_name', 'collection_name', 'document_id')
+
+    @apply_defaults
+    def __init__(
+            self,
+            database_name,
+            collection_name,
+            document_id,
+            azure_cosmos_conn_id="azure_cosmos_default",
+            *args,
+            **kwargs):
+        """
+        Create a new AzureCosmosDocumentSensor
+
+        :param database_name: Target CosmosDB database_name.
+        :type database_name: str
+        :param collection_name: Target CosmosDB collection_name.
+        :type collection_name: str
+        :param document_id: The ID of the target document.
+        :type document_id: str
+        :param azure_cosmos_conn_id: The connection ID to use
+            when connecting to CosmosDB.
```

Review comment: Thanks, updated to mirror this parameter on the other classes, which also addresses this since it no longer wraps at all.
[GitHub] tmiller-msft commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
tmiller-msft commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#discussion_r238889041

## File path: airflow/contrib/operators/azure_cosmos_insertdocument_operator.py

```diff
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_cosmos_hook import AzureCosmosDBHook
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureCosmosInsertDocumentOperator(BaseOperator):
+    """
+    Inserts a new document into the specified Cosmos database and collection.
+    It will create both the database and collection if they do not already exist.
+
+    :param database_name: The name of the database. (templated)
+    :type database_name: str
+    :param collection_name: The name of the collection. (templated)
+    :type collection_name: str
+    :param document: The document to insert
+    :type document: json
+    :param azure_cosmos_conn_id: reference to a CosmosDB connection.
+    :type azure_cosmos_conn_id: str
+    """
+    template_fields = ('database_name', 'collection_name')
+    ui_color = '#e4f0e8'
+
+    @apply_defaults
+    def __init__(self,
+                 database_name,
+                 collection_name,
+                 document,
+                 azure_cosmos_conn_id='azure_cosmos_default',
+                 *args,
+                 **kwargs):
+        super(AzureCosmosInsertDocumentOperator, self).__init__(*args, **kwargs)
+        self.database_name = database_name
+        self.collection_name = collection_name
+        self.document = document
+        self.azure_cosmos_conn_id = azure_cosmos_conn_id
+
+    def execute(self, context):
+        # Create the hook
+        hook = AzureCosmosDBHook(azure_cosmos_conn_id=self.azure_cosmos_conn_id)
+
+        # Create the DB if it doesn't already exist
+        hook.create_database(self.database_name)
+
+        # Create the collection as well
+        hook.create_collection(self.collection_name, self.database_name)
```

Review comment: Thanks! Updated to query for existing database and only creating when not existing rather than using exception flow.
[GitHub] tmiller-msft commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
tmiller-msft commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#discussion_r23991

## File path: airflow/contrib/operators/azure_cosmos_insertdocument_operator.py

```diff
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_cosmos_hook import AzureCosmosDBHook
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureCosmosInsertDocumentOperator(BaseOperator):
+    """
+    Inserts a new document into the specified Cosmos database and collection.
+    It will create both the database and collection if they do not already exist.
+
+    :param database_name: The name of the database. (templated)
+    :type database_name: str
+    :param collection_name: The name of the collection. (templated)
+    :type collection_name: str
+    :param document: The document to insert
+    :type document: json
+    :param azure_cosmos_conn_id: reference to a CosmosDB connection.
+    :type azure_cosmos_conn_id: str
+    """
+    template_fields = ('database_name', 'collection_name')
+    ui_color = '#e4f0e8'
+
+    @apply_defaults
+    def __init__(self,
+                 database_name,
+                 collection_name,
+                 document,
+                 azure_cosmos_conn_id='azure_cosmos_default',
+                 *args,
+                 **kwargs):
+        super(AzureCosmosInsertDocumentOperator, self).__init__(*args, **kwargs)
+        self.database_name = database_name
+        self.collection_name = collection_name
+        self.document = document
+        self.azure_cosmos_conn_id = azure_cosmos_conn_id
+
+    def execute(self, context):
+        # Create the hook
+        hook = AzureCosmosDBHook(azure_cosmos_conn_id=self.azure_cosmos_conn_id)
+
+        # Create the DB if it doesn't already exist
+        hook.create_database(self.database_name)
```

Review comment: Thanks! Updated to query for existing database and only creating when not existing rather than using exception flow.
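The resolution described in the review comment above (query for the database first, and create it only when absent, instead of relying on exception flow) could be sketched like this; the hook class here is a hypothetical stand-in for illustration, not the real AzureCosmosDBHook API:

```python
class FakeCosmosHook:
    """Hypothetical stand-in for a Cosmos hook; illustration only."""

    def __init__(self):
        self.databases = set()

    def does_database_exist(self, database_name):
        return database_name in self.databases

    def create_database(self, database_name):
        self.databases.add(database_name)


def ensure_database(hook, database_name):
    # Check-then-create: query for an existing database and create it
    # only when it is absent, rather than attempting the create and
    # catching a "conflict" exception.
    if not hook.does_database_exist(database_name):
        hook.create_database(database_name)


hook = FakeCosmosHook()
ensure_database(hook, "mydb")
ensure_database(hook, "mydb")  # second call is a no-op
```

The same check-then-create shape applies to the collection as well.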
[jira] [Closed] (AIRFLOW-3443) KubernetesPodOperator image_pull_secrets must be a valid parameter
[ https://issues.apache.org/jira/browse/AIRFLOW-3443?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Maxime Rauer closed AIRFLOW-3443. - Resolution: Duplicate > KubernetesPodOperator image_pull_secrets must be a valid parameter > -- > > Key: AIRFLOW-3443 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3443 > Project: Apache Airflow > Issue Type: Bug > Components: kubernetes >Affects Versions: 1.10.1 >Reporter: Maxime Rauer >Assignee: Maxime Rauer >Priority: Blocker > Original Estimate: 24h > Remaining Estimate: 24h > > We've been successfully using the KubernetesPodOperator in our company with a > local Docker registry, but when switching to a private repository such as > Amazon ECR, Airflow wasn't able to pull the secrets from the cluster. > I have made a change in the *make_pod()* function on > *kubernetes_pod_operator.py* to support a new *image_pull_secrets* field. > This works great on our end, and the community could benefit from it for the > version 10.0.1 > -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] feluelle commented on issue #4121: [AIRFLOW-2568] Azure Container Instances operator
feluelle commented on issue #4121: [AIRFLOW-2568] Azure Container Instances operator URL: https://github.com/apache/incubator-airflow/pull/4121#issuecomment-444283991 Could you please add tests for the hook, too? :)
[jira] [Created] (AIRFLOW-3443) KubernetesPodOperator image_pull_secrets must be a valid parameter
Maxime Rauer created AIRFLOW-3443: - Summary: KubernetesPodOperator image_pull_secrets must be a valid parameter Key: AIRFLOW-3443 URL: https://issues.apache.org/jira/browse/AIRFLOW-3443 Project: Apache Airflow Issue Type: Bug Components: kubernetes Affects Versions: 1.10.1 Reporter: Maxime Rauer Assignee: Maxime Rauer We've been successfully using the KubernetesPodOperator in our company with a local Docker registry, but when switching to a private repository such as Amazon ECR, Airflow wasn't able to pull the secrets from the cluster. I have made a change in the *make_pod()* function on *kubernetes_pod_operator.py* to support a new *image_pull_secrets* field. This works great on our end, and the community could benefit from it for the version 10.0.1
[jira] [Commented] (AIRFLOW-3430) Document how to become a commiter
[ https://issues.apache.org/jira/browse/AIRFLOW-3430?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709294#comment-16709294 ] Fokko Driesprong commented on AIRFLOW-3430: --- Updated the existing documentation: https://cwiki.apache.org/confluence/display/AIRFLOW/Committers > Document how to become a commiter > - > > Key: AIRFLOW-3430 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3430 > Project: Apache Airflow > Issue Type: Improvement >Reporter: Ash Berlin-Taylor >Assignee: Fokko Driesprong >Priority: Major > > Add to our documents what the process is to become a committer (CO50): > {quote}The way in which contributors can be granted more rights such as > commit access or decision power is clearly documented and is the same for all > contributors. > {quote}
[GitHub] ashb commented on issue #3546: AIRFLOW-2664: Support filtering dag runs by id prefix in API.
ashb commented on issue #3546: AIRFLOW-2664: Support filtering dag runs by id prefix in API. URL: https://github.com/apache/incubator-airflow/pull/3546#issuecomment-444268426 Shouldn't our test setup already create those? If not, how aren't all the other RBAC tests failing?
[GitHub] codecov-io commented on issue #4278: [AIRFLOW-2524] Add SageMaker doc to AWS integration section
codecov-io commented on issue #4278: [AIRFLOW-2524] Add SageMaker doc to AWS integration section URL: https://github.com/apache/incubator-airflow/pull/4278#issuecomment-444268064

# [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4278?src=pr=h1) Report

> Merging [#4278](https://codecov.io/gh/apache/incubator-airflow/pull/4278?src=pr=desc) into [master](https://codecov.io/gh/apache/incubator-airflow/commit/9c04e8f339a6d84b2fff983e6584af2b81249652?src=pr=desc) will **not change** coverage.
> The diff coverage is `n/a`.

[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-airflow/pull/4278/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4278?src=pr=tree)

```diff
@@           Coverage Diff           @@
##           master    #4278   +/-   ##
=======================================
  Coverage   78.08%   78.08%
=======================================
  Files         201      201
  Lines       16458    16458
=======================================
  Hits        12851    12851
  Misses       3607     3607
```

[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4278?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4278?src=pr=footer). Last update [9c04e8f...7f292c6](https://codecov.io/gh/apache/incubator-airflow/pull/4278?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] shayan90 edited a comment on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow
shayan90 edited a comment on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow URL: https://github.com/apache/incubator-airflow/pull/4068#issuecomment-444252494 This is great, when are you guys planning to merge this? @oelesinsc24 it might be useful to fully leverage the Glue API; two useful arguments are `AllocatedCapacity` for the number of DPUs per job run, and `SecurityConfiguration`, which would allow encryption at rest using KMS. https://github.com/apache/incubator-airflow/blob/71cb37eaedae50e2abe9a13591e31a56f2e3659e/airflow/contrib/hooks/aws_glue_job_hook.py#L67-L70
[GitHub] Fokko commented on issue #3546: AIRFLOW-2664: Support filtering dag runs by id prefix in API.
Fokko commented on issue #3546: AIRFLOW-2664: Support filtering dag runs by id prefix in API. URL: https://github.com/apache/incubator-airflow/pull/3546#issuecomment-444262224 It looks like the `ab_` tables aren't properly initialized. ab is short for Flask App Builder: https://github.com/dpgaspar/Flask-AppBuilder These tables need to be initialized as well if you want to use the RBAC UI.
[GitHub] Fokko commented on issue #4207: [AIRFLOW-3367] Run celery integration test with redis broker.
Fokko commented on issue #4207: [AIRFLOW-3367] Run celery integration test with redis broker. URL: https://github.com/apache/incubator-airflow/pull/4207#issuecomment-444261366 LGTM, sorry for the delayed reply. I'm currently a bit short on time :(
[GitHub] Fokko closed pull request #2362: Make FileSensor detect file or folder
Fokko closed pull request #2362: Make FileSensor detect file or folder URL: https://github.com/apache/incubator-airflow/pull/2362

This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance:

```diff
diff --git a/airflow/contrib/operators/fs_operator.py b/airflow/contrib/operators/fs_operator.py
index 259648709d..b654682268 100644
--- a/airflow/contrib/operators/fs_operator.py
+++ b/airflow/contrib/operators/fs_operator.py
@@ -13,7 +13,7 @@
 # limitations under the License.
 #
-from os import walk
+import os
 import logging
 
 from airflow.operators.sensors import BaseSensorOperator
@@ -51,8 +51,5 @@ def poke(self, context):
         full_path = "/".join([basepath, self.filepath])
         logging.info(
             'Poking for file {full_path} '.format(**locals()))
-        try:
-            files = [f for f in walk(full_path)]
-        except:
-            return False
-        return True
+
+        return os.path.exists(full_path)
```
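The replacement poke logic in the diff above reduces to a single `os.path.exists` check, which succeeds for regular files and directories alike; a small standalone illustration (the paths are made up):

```python
import os
import tempfile

# os.path.exists is true for regular files and for directories alike,
# which is what lets the sensor "detect file or folder".
with tempfile.TemporaryDirectory() as base_dir:
    file_path = os.path.join(base_dir, "data.csv")
    open(file_path, "w").close()          # create an empty file
    file_found = os.path.exists(file_path)  # a regular file
    dir_found = os.path.exists(base_dir)    # a directory
```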
[GitHub] Fokko commented on issue #2362: Make FileSensor detect file or folder
Fokko commented on issue #2362: Make FileSensor detect file or folder URL: https://github.com/apache/incubator-airflow/pull/2362#issuecomment-444255503 Please open a new PR @colinbreame if you still want to improve the behaviour of the FileSensor
[jira] [Commented] (AIRFLOW-2524) Airflow integration with AWS Sagemaker
[ https://issues.apache.org/jira/browse/AIRFLOW-2524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709238#comment-16709238 ] ASF GitHub Bot commented on AIRFLOW-2524: - yangaws opened a new pull request #4278: [AIRFLOW-2524] Add SageMaker doc to AWS integration section URL: https://github.com/apache/incubator-airflow/pull/4278 Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-XXX - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue. ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: Add SageMaker doc to AWS integration section ### Tests - [ ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [ ] In case of new functionality, my PR adds documentation that describes how to use it. - When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added. ### Code Quality - [x] Passes `flake8` This is an automated message from the Apache Git Service. 
To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Airflow integration with AWS Sagemaker > -- > > Key: AIRFLOW-2524 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2524 > Project: Apache Airflow > Issue Type: Improvement > Components: aws, contrib >Reporter: Rajeev Srinivasan >Assignee: Yang Yu >Priority: Major > Labels: AWS > Fix For: 1.10.1 > > Time Spent: 10m > Remaining Estimate: 0h > > Would it be possible to orchestrate an end to end AWS Sagemaker job using > Airflow. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] yangaws opened a new pull request #4278: [AIRFLOW-2524] Add SageMaker doc to AWS integration section
yangaws opened a new pull request #4278: [AIRFLOW-2524] Add SageMaker doc to AWS integration section URL: https://github.com/apache/incubator-airflow/pull/4278 Make sure you have checked _all_ steps below. ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR" - https://issues.apache.org/jira/browse/AIRFLOW-XXX - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue. ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: Add SageMaker doc to AWS integration section ### Tests - [ ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Documentation - [ ] In case of new functionality, my PR adds documentation that describes how to use it. - When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added. ### Code Quality - [x] Passes `flake8` This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] shayan90 edited a comment on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow
shayan90 edited a comment on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow URL: https://github.com/apache/incubator-airflow/pull/4068#issuecomment-444252494 This is great, when are you guys planning to merge this? Also, @oelesinsc24 it might be useful to fully leverage the Glue API; two useful arguments are `AllocatedCapacity` for the number of DPUs per job run, and `SecurityConfiguration`, which would allow encryption at rest using KMS. https://github.com/apache/incubator-airflow/blob/71cb37eaedae50e2abe9a13591e31a56f2e3659e/airflow/contrib/hooks/aws_glue_job_hook.py#L67-L70
[GitHub] shayan90 edited a comment on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow
shayan90 edited a comment on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow URL: https://github.com/apache/incubator-airflow/pull/4068#issuecomment-444252494 This is great, when are you guys planning to merge this? Also, @oelesinsc24 it might be useful to fully leverage the Glue API; two useful arguments are `AllocatedCapacity`, which allocates capacity as a number of DPUs per job run, and `SecurityConfiguration`, which would allow encryption at rest using KMS. https://github.com/apache/incubator-airflow/blob/71cb37eaedae50e2abe9a13591e31a56f2e3659e/airflow/contrib/hooks/aws_glue_job_hook.py#L67-L70
[GitHub] shayan90 commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow
shayan90 commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow URL: https://github.com/apache/incubator-airflow/pull/4068#issuecomment-444252494 This is great, when are you guys planning to merge this? Also, @oelesinsc24 it might be useful to fully leverage the Glue API; two useful arguments are `AllocatedCapacity`, which allocates capacity as a number of DPUs per job run, and `SecurityConfiguration`, which would allow encryption at rest using KMS.
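For context, the two arguments discussed in the comments above map onto parameters of the Glue `create_job` call in boto3; the job name, role ARN, script location, and security configuration name below are illustrative assumptions, not values from the PR:

```python
# Illustrative keyword arguments for boto3's Glue create_job call;
# every concrete value here is an assumption made for this sketch.
glue_job_kwargs = {
    "Name": "example-glue-job",
    "Role": "arn:aws:iam::123456789012:role/example-glue-role",
    "Command": {
        "Name": "glueetl",
        "ScriptLocation": "s3://example-bucket/scripts/job.py",
    },
    # Number of DPUs allocated to each job run.
    "AllocatedCapacity": 10,
    # Name of a Glue security configuration enabling encryption at rest via KMS.
    "SecurityConfiguration": "example-kms-security-config",
}

# With real AWS credentials this would be submitted as:
#   import boto3
#   boto3.client("glue").create_job(**glue_job_kwargs)
```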
[GitHub] codecov-io commented on issue #4277: [AIRFLOW-3217] Airflow log view and code view should wrap
codecov-io commented on issue #4277: [AIRFLOW-3217] Airflow log view and code view should wrap URL: https://github.com/apache/incubator-airflow/pull/4277#issuecomment-444199709

# [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4277?src=pr=h1) Report

> Merging [#4277](https://codecov.io/gh/apache/incubator-airflow/pull/4277?src=pr=desc) into [master](https://codecov.io/gh/apache/incubator-airflow/commit/9c04e8f339a6d84b2fff983e6584af2b81249652?src=pr=desc) will **not change** coverage.
> The diff coverage is `n/a`.

[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-airflow/pull/4277/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4277?src=pr=tree)

```diff
@@           Coverage Diff           @@
##           master    #4277   +/-   ##
=======================================
  Coverage   78.08%   78.08%
=======================================
  Files         201      201
  Lines       16458    16458
=======================================
  Hits        12851    12851
  Misses       3607     3607
```

[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4277?src=pr=continue).

> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4277?src=pr=footer). Last update [9c04e8f...d4fe8a7](https://codecov.io/gh/apache/incubator-airflow/pull/4277?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] nmcalabroso commented on issue #2906: [AIRFLOW-1956] Add parameter whether the navbar clock time is UTC
nmcalabroso commented on issue #2906: [AIRFLOW-1956] Add parameter whether the navbar clock time is UTC URL: https://github.com/apache/incubator-airflow/pull/2906#issuecomment-444184660 With the introduction of the `default_timezone` config, I think this PR is now relevant.
[jira] [Commented] (AIRFLOW-3217) Airflow log view and code view should wrap
[ https://issues.apache.org/jira/browse/AIRFLOW-3217?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709011#comment-16709011 ] ASF GitHub Bot commented on AIRFLOW-3217: - Eronarn opened a new pull request #4277: [AIRFLOW-3217] Airflow log view and code view should wrap URL: https://github.com/apache/incubator-airflow/pull/4277 ### Jira - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues: https://issues.apache.org/jira/browse/AIRFLOW-3217 ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: DAG code view before/after: ![screenshot 2018-12-04 12 19 46](https://user-images.githubusercontent.com/536435/49460335-12248e80-f7bf-11e8-9bb1-f730f91d0e47.png) ![screenshot 2018-12-04 12 19 23](https://user-images.githubusercontent.com/536435/49460327-0d5fda80-f7bf-11e8-8aef-2bcebfadba61.png) Log view before/after: ![screenshot 2018-12-04 12 20 21](https://user-images.githubusercontent.com/536435/49460336-12248e80-f7bf-11e8-887b-17302bc8c000.png) ![screenshot 2018-12-04 12 20 43](https://user-images.githubusercontent.com/536435/49460337-12bd2500-f7bf-11e8-8124-f05226804300.png) ### Tests - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason: it's css ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" This is an automated message from the Apache Git Service. 
To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Airflow log view and code view should wrap > -- > > Key: AIRFLOW-3217 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3217 > Project: Apache Airflow > Issue Type: Bug > Components: logging >Affects Versions: 1.10.0 >Reporter: James Meickle >Priority: Minor > > Being able to look at logs and code in Airflow is great, but the Kubernetes > operator in particular makes very long loglines. For example, a pod log might > have something like this: > > `{{[2018-10-16 10:32:38,195] {{logging_mixin.py:95}} INFO - [2018-10-16 > 10:32:38,195] {{pod_launcher.py:95}} INFO - b'[2018-10-16 10:32:38.190104] > INFO:`}} > > ...and that's before any log text! Since there is no wrapping or navigation > on this screen, it's very challenging to make use of these logs. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#discussion_r238722905 ## File path: airflow/contrib/auth/backends/password_auth.py ## @@ -106,8 +102,8 @@ def load_user(userid, session=None): if not userid or userid == 'None': return None -user = session.query(models.User).filter(models.User.id == int(userid)).first() -return PasswordUser(user) +user = session.query(PasswordUser).filter(PasswordUser.id == int(userid)).first() Review comment: No timing on 2.0.0 (q1 or q2 next year) but the "RBAC" based UI is available behind a config flag on 1.10.0 so you can start experimenting with it already https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#new-webserver-ui-with-role-based-access-control This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] thomasbrockmeier commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
thomasbrockmeier commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#discussion_r238721561 ## File path: airflow/contrib/auth/backends/password_auth.py ## @@ -106,8 +102,8 @@ def load_user(userid, session=None): if not userid or userid == 'None': return None -user = session.query(models.User).filter(models.User.id == int(userid)).first() -return PasswordUser(user) +user = session.query(PasswordUser).filter(PasswordUser.id == int(userid)).first() Review comment: If it keeps the API from breaking, I can see if I can fix it in [views.py:2070](https://github.com/apache/incubator-airflow/blob/master/airflow/www/views.py#L2070) with something like ``` do_filter = FILTER_BY_OWNER and (not current_user.user.is_superuser()) ``` Not the nicest way to handle this, I guess, but perhaps worth considering if it enables filtering DAGs by owner for the time being? Do you have any indication when 2.0.0 is scheduled for release? Depending on the time frame, I may be able to invest a couple of hours to look into this further as this feature would make my life a lot easier :) This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#discussion_r238710034 ## File path: airflow/contrib/auth/backends/password_auth.py ## @@ -106,8 +102,8 @@ def load_user(userid, session=None): if not userid or userid == 'None': return None -user = session.query(models.User).filter(models.User.id == int(userid)).first() -return PasswordUser(user) +user = session.query(PasswordUser).filter(PasswordUser.id == int(userid)).first() Review comment: I suspect this will fail tests as https://github.com/apache/incubator-airflow/blob/b7a7fd66d693dbfbc471a6d08bc274441ee4841c/airflow/www/utils.py#L299 won't work anymore. And this (admittedly silly) API is part of the external API that people have written custom auth backends against so we can't change it. If we were keeping these I'd say it's worth "fixing" this (as you have here) but since we're removing the custom auth backends in Favour of Flask-AppBuilder in 2.0.0 it's probably worth just keeping the sillyness. This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
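If the wrapper API must stay intact, one illustrative way (not code from this PR) to query `PasswordUser` directly while keeping the legacy `user.user` access path alive is a small compatibility property. The stand-in classes below are hypothetical sketches, assuming callers still write `current_user.user.is_superuser()`:

```python
class User:
    """Stand-in for airflow.models.User (illustrative only)."""
    def __init__(self, id, superuser=False):
        self.id = id
        self.superuser = superuser


class PasswordUser(User):
    """Stand-in for the contrib PasswordUser after the refactor."""

    @property
    def user(self):
        # Backwards-compatibility shim: legacy auth backends wrote
        # `current_user.user.is_superuser()`, expecting a wrapped model.
        # Returning self keeps that call path working even though the
        # object is no longer a wrapper around a separate models.User.
        return self

    def is_superuser(self):
        return self.superuser


u = PasswordUser(id=1, superuser=False)
assert u.user.is_superuser() is False  # legacy access path still works
assert u.is_superuser() is False       # direct access also works
```

Whether such a shim is worth carrying depends on how long the custom auth backends survive before the Flask-AppBuilder migration in 2.0.0.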
[GitHub] ashb commented on issue #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
ashb commented on issue #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#issuecomment-444139550 @thomasbrockmeier FYI: This auth mechanism is probably going away in Airflow 2.0.0 @kaxil one for 1.10.2 perhaps?
[GitHub] EamonKeane commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
EamonKeane commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#discussion_r238708915 ## File path: tests/contrib/minikube/test_kubernetes_pod_operator.py ## @@ -111,6 +111,21 @@ def test_delete_operator_pod(): ) k.execute(None) +def test_keep_failed_pod(self): +k = KubernetesPodOperator( +namespace='default', +image="ubuntu:16.04", +cmds=["bash", "-cx"], +arguments=["exit 1"], +labels={"foo": "bar"}, +name="test", +task_id="task", +is_delete_operator_pod=True, +keep_failed_pod=True +) +with self.assertRaises(AirflowException): +k.execute(None) Review comment: The pod will enter a failed state instantly. We just need to confirm it is still there after this the `Kubernetes Pod Operator` has gotten past the line below (which currently deletes all pods regardless of exit code): https://github.com/apache/incubator-airflow/blob/a8203aa3263656c32ebc2e8090322b08be4e1ded/airflow/contrib/operators/kubernetes_pod_operator.py#L136 In practice waiting a second or two should suffice, or there may be another way to structure it if you have any ideas? This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#discussion_r238708599 ## File path: airflow/contrib/auth/backends/password_auth.py ## @@ -106,8 +102,8 @@ def load_user(userid, session=None): if not userid or userid == 'None': return None -user = session.query(models.User).filter(models.User.id == int(userid)).first() -return PasswordUser(user) +user = session.query(PasswordUser).filter(PasswordUser.id == int(userid)).first() Review comment: This change looks better, but I need to check it doesn't actually break other things - parts of the code may expect user.user to exist (as bonkers as that sounds) This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] codecov-io commented on issue #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
codecov-io commented on issue #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#issuecomment-444138570 # [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4276?src=pr=h1) Report > Merging [#4276](https://codecov.io/gh/apache/incubator-airflow/pull/4276?src=pr=desc) into [master](https://codecov.io/gh/apache/incubator-airflow/commit/9c04e8f339a6d84b2fff983e6584af2b81249652?src=pr=desc) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/apache/incubator-airflow/pull/4276/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4276?src=pr=tree)

```diff
@@           Coverage Diff           @@
##           master    #4276   +/-   ##
=======================================
  Coverage   78.08%   78.08%
  Files         201      201
  Lines       16458    16458
=======================================
  Hits        12851    12851
  Misses       3607     3607
```

[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4276?src=pr=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4276?src=pr=footer). Last update [9c04e8f...b7a7fd6](https://codecov.io/gh/apache/incubator-airflow/pull/4276?src=pr=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] ashb commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
ashb commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#discussion_r238703635 ## File path: tests/contrib/minikube/test_kubernetes_pod_operator.py ## @@ -111,6 +111,21 @@ def test_delete_operator_pod(): ) k.execute(None) +def test_keep_failed_pod(self): +k = KubernetesPodOperator( +namespace='default', +image="ubuntu:16.04", +cmds=["bash", "-cx"], +arguments=["exit 1"], +labels={"foo": "bar"}, +name="test", +task_id="task", +is_delete_operator_pod=True, +keep_failed_pod=True +) +with self.assertRaises(AirflowException): +k.execute(None) Review comment: I'd like to avoid sleep in tests if we can (but it might not be avoidable) - does the pod enter a "deleting" state instantly? If so could we test that `not pod_exists or pod_in_deleteing_state`? This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
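One way to test this without a fixed sleep is to poll the pod's phase with a deadline. The sketch below is illustrative, not code from this PR: `fetch_phase` stands in for whatever call the test uses to read the pod's status (e.g. the Kubernetes API), returning `None` once the pod no longer exists:

```python
import time


def wait_for_pod_condition(fetch_phase, predicate, timeout=30.0, interval=0.5):
    """Poll `fetch_phase()` until `predicate(phase)` holds or `timeout` expires.

    `fetch_phase` should return the pod's phase string (e.g. "Failed"),
    or None if the pod no longer exists.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        phase = fetch_phase()
        if predicate(phase):
            return phase
        time.sleep(interval)
    raise TimeoutError("pod never reached the expected condition")


# Illustrative fake fetcher standing in for a real Kubernetes API call:
phases = iter(["Running", "Failed", "Failed"])
result = wait_for_pod_condition(lambda: next(phases),
                                lambda p: p is None or p == "Failed",
                                timeout=5.0, interval=0.01)
assert result == "Failed"
```

A predicate like `lambda p: p is None or p == "Failed"` would also cover ashb's `not pod_exists or pod_in_deleting_state` case without ever sleeping longer than the pod actually takes.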
[GitHub] EamonKeane commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
EamonKeane commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#discussion_r238701520 ## File path: tests/contrib/minikube/test_kubernetes_pod_operator.py ## @@ -111,6 +111,21 @@ def test_delete_operator_pod(): ) k.execute(None) +def test_keep_failed_pod(self): +k = KubernetesPodOperator( +namespace='default', +image="ubuntu:16.04", +cmds=["bash", "-cx"], +arguments=["exit 1"], +labels={"foo": "bar"}, +name="test", +task_id="task", +is_delete_operator_pod=True, +keep_failed_pod=True +) +with self.assertRaises(AirflowException): +k.execute(None) Review comment: many thanks fo reviewing the PR, those comments all make sense. That's a fair comment about this test. Would sleeping for e.g. 10 seconds and then asserting the pod is still in the namespace and in a failed state be sufficient to test it? This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#discussion_r238694426 ## File path: airflow/contrib/auth/backends/password_auth.py ## @@ -94,10 +94,6 @@ def data_profiling(self): """Provides access to data profiling tools""" return True -def is_superuser(self): -"""Access all the things""" -return True Review comment: Oh, just read the PR message, NM.
[GitHub] ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth
ashb commented on a change in pull request #4276: [AIRFLOW-1552] Airflow Filter_by_owner not working with password_auth URL: https://github.com/apache/incubator-airflow/pull/4276#discussion_r238694141 ## File path: airflow/contrib/auth/backends/password_auth.py ## @@ -94,10 +94,6 @@ def data_profiling(self): """Provides access to data profiling tools""" return True -def is_superuser(self): -"""Access all the things""" -return True Review comment: Why was this removed?
[GitHub] XD-DENG commented on issue #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
XD-DENG commented on issue #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#issuecomment-444127262 Hi @EamonKeane, please find a few comments below for your reference. In addition, you may need to rebase your branch to resolve the conflicting files. Cheers
[GitHub] XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#discussion_r238693290 ## File path: airflow/contrib/operators/kubernetes_pod_operator.py ## @@ -123,7 +130,10 @@ def execute(self, context): get_logs=self.get_logs) if self.is_delete_operator_pod: -launcher.delete_pod(pod) +if final_state != State.SUCCESS and self.keep_failed_pod: +pass +else: +launcher.delete_pod(pod) Review comment: Another very, very minor one: would

```python
if not self.keep_failed_pod or final_state == State.SUCCESS:
    launcher.delete_pod(pod)
```

be more concise?
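As a quick sanity check (illustrative only, not part of the PR), the proposed one-liner deletes the pod in exactly the same cases as the nested branch in the diff, which can be verified over all input combinations:

```python
from itertools import product

# Stand-in for airflow.utils.state.State.SUCCESS in this sketch:
SUCCESS = "success"


def original(final_state, keep_failed_pod):
    # The nested form from the diff above: `pass` keeps the pod,
    # otherwise it is deleted.
    if final_state != SUCCESS and keep_failed_pod:
        return False  # keep the pod
    return True       # delete_pod


def suggested(final_state, keep_failed_pod):
    # XD-DENG's proposed single condition.
    return not keep_failed_pod or final_state == SUCCESS


# Exhaustive check: both forms agree in every case.
for state, keep in product([SUCCESS, "failed"], [True, False]):
    assert original(state, keep) == suggested(state, keep)
```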
[GitHub] thomasbrockmeier opened a new pull request #4276: [AIRFLOW-1552] Query PasswordUser on password login
thomasbrockmeier opened a new pull request #4276: [AIRFLOW-1552] Query PasswordUser on password login URL: https://github.com/apache/incubator-airflow/pull/4276 Make sure you have checked _all_ steps below. ### Jira - [x] Fix [AIRFLOW-1552](https://issues.apache.org/jira/browse/AIRFLOW-1552), Airflow filter_by_owner not working with password_auth ### Description - [x] Here are some details about my PR, including screenshots of any UI changes: `airflow.contrib.auth.backends.password_auth.load_user` would query `models.User` and cast the result to `PasswordUser` instead of querying `PasswordUser` directly ### Tests ### Commits - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)": 1. Subject is separated from body by a blank line 1. Subject is limited to 50 characters (not including Jira issue reference) 1. Subject does not end with a period 1. Subject uses the imperative mood ("add", not "adding") 1. Body wraps at 72 characters 1. Body explains "what" and "why", not "how" ### Code Quality - [x] Passes `flake8` This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
For queries about this service, please contact Infrastructure at: us...@infra.apache.org > Airflow Filter_by_owner not working with password_auth > -- > > Key: AIRFLOW-1552 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1552 > Project: Apache Airflow > Issue Type: Bug > Components: configuration >Affects Versions: 1.8.0 > Environment: CentOS , python 2.7 >Reporter: raghu ram reddy >Priority: Major > > Airflow Filter_by_owner parameter is not working with password_auth. > I created sample user using the below code from airflow documentation and > enabled password_auth. > I'm able to login as the user created but by default this user is superuser > and there is noway to modify it, default all users created by PasswordUser > are superusers. > import airflow > from airflow import models, settings > from airflow.contrib.auth.backends.password_auth import PasswordUser > user = PasswordUser(models.User()) > user.username = 'test1' > user.password = 'test1' > user.is_superuser() > session = settings.Session() > session.add(user) > session.commit() > session.close() > exit() -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#discussion_r238688263 ## File path: tests/contrib/minikube/test_kubernetes_pod_operator.py ## @@ -111,6 +111,21 @@ def test_delete_operator_pod(): ) k.execute(None) +def test_keep_failed_pod(self): +k = KubernetesPodOperator( +namespace='default', +image="ubuntu:16.04", +cmds=["bash", "-cx"], +arguments=["exit 1"], +labels={"foo": "bar"}, +name="test", +task_id="task", +is_delete_operator_pod=True, +keep_failed_pod=True +) +with self.assertRaises(AirflowException): +k.execute(None) Review comment: Seems this test can't really test what happens when `keep_failed_pod=True`? It's only testing `arguments`. This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#discussion_r238686376 ## File path: airflow/contrib/operators/kubernetes_pod_operator.py ## @@ -78,6 +78,13 @@ class KubernetesPodOperator(BaseOperator): /airflow/xcom/return.json in the container will also be pushed to an XCom when the container completes. :type xcom_push: bool +:param is_delete_operator_pod: Delete the pod after it completes +:type is_delete_operator_pod: bool +:param keep_failed_pod: Keep pods that exit with non zero error code. This allows +for easy log inspection of failed pods - assuming the +node is still in the cluster +(only works when is_delete_operator_pod is True) +:type keep_failed_pod: bool :param tolerations: Kubernetes tolerations :type list of tolerations Review comment: Actually this line `:type list of tolerations` was incorrect. Would be good if you can fix it alongside. This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods
XD-DENG commented on a change in pull request #4160: [AIRFLOW-3311] Allow pod operator to keep failed pods URL: https://github.com/apache/incubator-airflow/pull/4160#discussion_r238685951 ## File path: airflow/contrib/operators/kubernetes_pod_operator.py ## @@ -78,6 +78,13 @@ class KubernetesPodOperator(BaseOperator): /airflow/xcom/return.json in the container will also be pushed to an XCom when the container completes. :type xcom_push: bool +:param is_delete_operator_pod: Delete the pod after it completes +:type is_delete_operator_pod: bool +:param keep_failed_pod: Keep pods that exit with non zero error code. This allows +for easy log inspection of failed pods - assuming the +node is still in the cluster +(only works when is_delete_operator_pod is True) Review comment: These 3 lines are not indented correctly, which will fail Sphinx rendering. This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
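For reference, a docstring fragment that Sphinx renders cleanly keeps every continuation line of a field description at one consistent indent, and names the parameter in the `:type:` field. The wording below is a hypothetical rewrite of that fragment, not text from the PR:

```python
# Hypothetical, correctly formatted version of the docstring fragment
# discussed in the two review comments above:
DOC = """
:param keep_failed_pod: Keep pods that exit with a non-zero exit code.
    This allows easy log inspection of failed pods, assuming the node
    is still in the cluster. Only takes effect when
    ``is_delete_operator_pod`` is True.
:type keep_failed_pod: bool
:param tolerations: Kubernetes tolerations
:type tolerations: list of tolerations
"""

# Every continuation line of a field description shares one indent level,
# so Sphinx attaches it to the preceding field instead of failing to render:
for line in DOC.splitlines():
    if line and not line.startswith(":"):
        assert line.startswith("    ")
```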
[GitHub] ashb commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow
ashb commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow URL: https://github.com/apache/incubator-airflow/pull/4068#issuecomment-444115552 > I would recommend submitting a glue job and then using the sensor to poll for updates. Could you explain your thinking? If this is what 90% of people will want, then does it make more sense having it be the default and built as a single operator?
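A single submit-and-poll operator of the kind discussed here might be sketched as below. The hook methods (`submit_job`, `get_job_state`) and the state names are placeholders for illustration, not the actual API of this PR or of AWS Glue:

```python
import time


class GlueJobOperatorSketch:
    """Illustrative only: submit a job, then poll until it reaches a
    terminal state, instead of splitting submit and sensing across two
    tasks."""

    TERMINAL_STATES = {"SUCCEEDED", "FAILED", "STOPPED"}

    def __init__(self, hook, job_name, poll_interval=10):
        self.hook = hook                  # hypothetical Glue hook
        self.job_name = job_name
        self.poll_interval = poll_interval

    def execute(self, context=None):
        run_id = self.hook.submit_job(self.job_name)  # hypothetical method
        while True:
            state = self.hook.get_job_state(self.job_name, run_id)  # hypothetical
            if state in self.TERMINAL_STATES:
                break
            time.sleep(self.poll_interval)
        if state != "SUCCEEDED":
            raise RuntimeError("Glue run %s ended in state %s" % (run_id, state))
        return run_id


# Usage with a fake hook standing in for AWS calls:
class FakeHook:
    def __init__(self):
        self.calls = 0

    def submit_job(self, name):
        return "run-1"

    def get_job_state(self, name, run_id):
        self.calls += 1
        return "RUNNING" if self.calls < 3 else "SUCCEEDED"


op = GlueJobOperatorSketch(FakeHook(), "my-job", poll_interval=0)
assert op.execute() == "run-1"
```

Keeping a separate sensor still makes sense for long-running jobs where blocking a worker slot on polling is undesirable; the single-operator form trades that for a simpler DAG.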
[GitHub] XD-DENG commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
XD-DENG commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#discussion_r238669144

## File path: airflow/contrib/operators/azure_cosmos_insertdocument_operator.py

```diff
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_cosmos_hook import AzureCosmosDBHook
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureCosmosInsertDocumentOperator(BaseOperator):
+    """
+    Inserts a new document into the specified Cosmos database and collection
+    It will create both the database and collection if they do not already exist
+
+    :param database_name: The name of the database. (templated)
+    :type database_name: str
+    :param collection_name: The name of the collection. (templated)
+    :type collection_name: str
+    :param document: The document to insert
+    :type document: json
+    :param azure_cosmos_conn_id: reference to a CosmosDB connection.
+    :type azure_cosmos_conn_id: str
+    """
+    template_fields = ('database_name', 'collection_name')
+    ui_color = '#e4f0e8'
+
+    @apply_defaults
+    def __init__(self,
+                 database_name,
+                 collection_name,
+                 document,
+                 azure_cosmos_conn_id='azure_cosmos_default',
+                 *args,
+                 **kwargs):
+        super(AzureCosmosInsertDocumentOperator, self).__init__(*args, **kwargs)
+        self.database_name = database_name
+        self.collection_name = collection_name
+        self.document = document
+        self.azure_cosmos_conn_id = azure_cosmos_conn_id
+
+    def execute(self, context):
+        # Create the hook
+        hook = AzureCosmosDBHook(azure_cosmos_conn_id=self.azure_cosmos_conn_id)
+
+        # Create the DB if it doesn't already exist
+        hook.create_database(self.database_name)
+
+        # Create the collection as well
+        hook.create_collection(self.collection_name, self.database_name)
```

Review comment: The same question as above. You're using a `try-except` to handle the case where the collection already exists. Personally I think it may be better to handle it in a more explicit fashion.
[GitHub] XD-DENG commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
XD-DENG commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#discussion_r238669024

## File path: airflow/contrib/operators/azure_cosmos_insertdocument_operator.py

```diff
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.hooks.azure_cosmos_hook import AzureCosmosDBHook
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureCosmosInsertDocumentOperator(BaseOperator):
+    """
+    Inserts a new document into the specified Cosmos database and collection
+    It will create both the database and collection if they do not already exist
+
+    :param database_name: The name of the database. (templated)
+    :type database_name: str
+    :param collection_name: The name of the collection. (templated)
+    :type collection_name: str
+    :param document: The document to insert
+    :type document: json
+    :param azure_cosmos_conn_id: reference to a CosmosDB connection.
+    :type azure_cosmos_conn_id: str
+    """
+    template_fields = ('database_name', 'collection_name')
+    ui_color = '#e4f0e8'
+
+    @apply_defaults
+    def __init__(self,
+                 database_name,
+                 collection_name,
+                 document,
+                 azure_cosmos_conn_id='azure_cosmos_default',
+                 *args,
+                 **kwargs):
+        super(AzureCosmosInsertDocumentOperator, self).__init__(*args, **kwargs)
+        self.database_name = database_name
+        self.collection_name = collection_name
+        self.document = document
+        self.azure_cosmos_conn_id = azure_cosmos_conn_id
+
+    def execute(self, context):
+        # Create the hook
+        hook = AzureCosmosDBHook(azure_cosmos_conn_id=self.azure_cosmos_conn_id)
+
+        # Create the DB if it doesn't already exist
+        hook.create_database(self.database_name)
```

Review comment: Seems you're using a `try-except` to handle the case in which the DB already exists. Is there any reason it cannot be handled in a more explicit way? Possibly better to be `Check if the DB already exists -> if yes, skip creating DB; if no, create`?
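The explicit check-then-create pattern the reviewer suggests can be sketched as follows. This is only an illustration: `FakeCosmosHook` and its `does_database_exist` method are stand-in names for the sketch, not the API under review.

```python
# Sketch of the explicit check-then-create pattern suggested in the review,
# instead of relying on try-except around an unconditional create.
# FakeCosmosHook and does_database_exist are illustrative stand-ins.
class FakeCosmosHook:
    def __init__(self):
        self.databases = set()

    def does_database_exist(self, name):
        return name in self.databases

    def create_database(self, name):
        if name in self.databases:
            raise ValueError("database already exists: %s" % name)
        self.databases.add(name)


def ensure_database(hook, name):
    """Create the database only if it does not already exist."""
    if not hook.does_database_exist(name):
        hook.create_database(name)
    return hook.does_database_exist(name)
```

Called twice with the same name, `ensure_database` is a no-op the second time, which keeps the operator idempotent without catching exceptions.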
[GitHub] XD-DENG commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator
XD-DENG commented on a change in pull request #4265: [AIRFLOW-3406] Implement an Azure CosmosDB operator URL: https://github.com/apache/incubator-airflow/pull/4265#discussion_r238669611

## File path: airflow/contrib/sensors/azure_cosmos_sensor.py

```diff
@@ -0,0 +1,68 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from airflow.contrib.hooks.azure_cosmos_hook import AzureCosmosDBHook
+from airflow.sensors.base_sensor_operator import BaseSensorOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class AzureCosmosDocumentSensor(BaseSensorOperator):
+    """
+    Checks for the existence of a document which
+    matches the given query in CosmosDB. Example:
+
+    >>> azure_cosmos_sensor = AzureCosmosDocumentSensor(database_name="somedatabase_name",
+    ...                                                 collection_name="somecollection_name",
+    ...                                                 document_id="unique-doc-id",
+    ...                                                 azure_cosmos_conn_id="azure_cosmos_default",
+    ...                                                 task_id="azure_cosmos_sensor")
+    """
+    template_fields = ('database_name', 'collection_name', 'document_id')
+
+    @apply_defaults
+    def __init__(
+            self,
+            database_name,
+            collection_name,
+            document_id,
+            azure_cosmos_conn_id="azure_cosmos_default",
+            *args,
+            **kwargs):
+        """
+        Create a new AzureCosmosDocumentSensor
+
+        :param database_name: Target CosmosDB database_name.
+        :type database_name: str
+        :param collection_name: Target CosmosDB collection_name.
+        :type collection_name: str
+        :param document_id: The ID of the target document.
+        :type query: str
+        :param azure_cosmos_conn_id: The connection ID to use
+        when connecting to CosmosDB.
```

Review comment: This line is not indented correctly. It will fail Sphinx rendering.
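For reference, a continuation line of a `:param:` field has to be indented past the field marker for Sphinx/docutils to attach it to the same field. A minimal sketch of the corrected shape (the function name and body are illustrative, not code from the PR):

```python
def lookup_document(azure_cosmos_conn_id="azure_cosmos_default"):
    """
    Illustrative docstring only: the continuation line below is indented
    relative to the field marker, so Sphinx keeps it in the same field.

    :param azure_cosmos_conn_id: The connection ID to use
        when connecting to CosmosDB.
    :type azure_cosmos_conn_id: str
    """
    return azure_cosmos_conn_id
```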
[GitHub] oelesinsc24 commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow
oelesinsc24 commented on issue #4068: [AIRFLOW-2310]: Add AWS Glue Job Compatibility to Airflow URL: https://github.com/apache/incubator-airflow/pull/4068#issuecomment-444106684 @ashb, great question. I think there may be cases for both; however, I would recommend submitting a Glue job and then using the sensor to poll for updates.
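The submit-then-poll split being discussed could look roughly like the sketch below. The state names follow AWS Glue's documented JobRun states; `poke_result` is a hypothetical distillation of what a sensor's `poke` method would compute, not code from the PR.

```python
# Hedged sketch of the sensor side of "submit a Glue job, then poll".
# A real sensor would call boto3's glue.get_job_run and read
# response["JobRun"]["JobRunState"]; here we only model the decision.
SUCCESS_STATES = {"SUCCEEDED"}
FAILURE_STATES = {"FAILED", "STOPPED", "TIMEOUT"}


def poke_result(job_run_state):
    """Return True when done, False to keep poking; raise on terminal failure."""
    if job_run_state in FAILURE_STATES:
        raise RuntimeError("Glue job finished in state %s" % job_run_state)
    return job_run_state in SUCCESS_STATES
```

Bundling submission and this polling loop into one operator, as suggested above, would just call the same decision function on a schedule after `submit_job`.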
[jira] [Updated] (AIRFLOW-2670) SSHOperator's timeout parameter doesn't affect SSHHook timeout
[ https://issues.apache.org/jira/browse/AIRFLOW-2670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-2670: Fix Version/s: (was: 2.0.0) 1.10.2 > SSHOperator's timeout parameter doesn't affect SSHHook timeout > - > > Key: AIRFLOW-2670 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2670 > Project: Apache Airflow > Issue Type: Improvement > Components: contrib >Affects Versions: 2.0.0 >Reporter: jin zhang >Priority: Major > Fix For: 1.10.2 > > > When I use SSHOperator, its timeout parameter is not passed to the SSHHook; it only affects exec_command. > Old version: > self.ssh_hook = SSHHook(ssh_conn_id=self.ssh_conn_id) > I changed it to: > self.ssh_hook = SSHHook(ssh_conn_id=self.ssh_conn_id, timeout=self.timeout) -- This message was sent by Atlassian JIRA (v7.6.3#76005)
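The one-line fix described in the issue (forwarding the operator's `timeout` into the hook) can be sketched with stub classes. These stand-ins only model the attribute plumbing, not the real paramiko-backed hook.

```python
# Stub classes modelling the AIRFLOW-2670 fix: the operator must forward
# its timeout to the hook, otherwise the hook falls back to its default.
class SSHHook:
    def __init__(self, ssh_conn_id=None, timeout=10):
        self.ssh_conn_id = ssh_conn_id
        self.timeout = timeout


class SSHOperator:
    def __init__(self, ssh_conn_id=None, timeout=10):
        self.ssh_conn_id = ssh_conn_id
        self.timeout = timeout
        # Before the fix: SSHHook(ssh_conn_id=self.ssh_conn_id), dropping timeout.
        self.ssh_hook = SSHHook(ssh_conn_id=self.ssh_conn_id,
                                timeout=self.timeout)
```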
[GitHub] ashb commented on issue #4032: [AIRFLOW-2636] - fix task duration tab NoneType exception
ashb commented on issue #4032: [AIRFLOW-2636] - fix task duration tab NoneType exception URL: https://github.com/apache/incubator-airflow/pull/4032#issuecomment-444087972 And unit tests would be good too - as simple as checking that the page renders okay would be enough (along with making sure there is some TaskInstance data in the DB for the test case)
[GitHub] tal181 commented on issue #4267: [AIRFLOW-3411] create openfaas hook
tal181 commented on issue #4267: [AIRFLOW-3411] create openfaas hook URL: https://github.com/apache/incubator-airflow/pull/4267#issuecomment-444082973 ping @kaxil
[jira] [Commented] (AIRFLOW-3066) Add job parameters to AWSbatch Operator
[ https://issues.apache.org/jira/browse/AIRFLOW-3066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16708608#comment-16708608 ] ASF GitHub Bot commented on AIRFLOW-3066: - hugoprudente closed pull request #4231: [AIRFLOW-3066] Adding support for AWS Batch parameters URL: https://github.com/apache/incubator-airflow/pull/4231 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance: As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic): > Add job parameters to AWSbatch Operator > --- > > Key: AIRFLOW-3066 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3066 > Project: Apache Airflow > Issue Type: Improvement > Components: contrib, operators >Reporter: Raphael Norman-Tenazas >Assignee: Hugo Prudente >Priority: Minor > Labels: AWS, easyfix, features, newbie > Original Estimate: 5m > Remaining Estimate: 5m > > Sometimes it is necessary to add parameters at runtime to AWS batch jobs in a > workflow. Currently, the AWSbatchOperator does not support this, and will use > the default parameters defined in the AWS job description. > This can be implemented by adding a job_description={} parameter to the > AWSBatchOperator's __init__ and pass that into the client.submit_job call > with the keyword parameters. -- This message was sent by Atlassian JIRA (v7.6.3#76005)
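The proposal amounts to threading an optional `parameters` dict through to `submit_job`. A hedged sketch of the kwargs assembly follows; the helper name is made up for illustration, though boto3's `batch.submit_job` does accept a `parameters` mapping that overrides defaults in the job definition.

```python
# Build the kwargs for boto3's batch.submit_job, optionally overriding the
# job definition's default parameters, as AIRFLOW-3066 proposes.
def build_submit_job_kwargs(job_name, job_queue, job_definition, parameters=None):
    kwargs = {
        "jobName": job_name,
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
    }
    if parameters:
        kwargs["parameters"] = parameters
    return kwargs
```

An operator would then call `client.submit_job(**kwargs)` with a real boto3 Batch client.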
[GitHub] hugoprudente closed pull request #4231: [AIRFLOW-3066] Adding support for AWS Batch parameters
hugoprudente closed pull request #4231: [AIRFLOW-3066] Adding support for AWS Batch parameters URL: https://github.com/apache/incubator-airflow/pull/4231 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance: As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic):
[jira] [Updated] (AIRFLOW-3442) Show first and second name of user in audit logs
[ https://issues.apache.org/jira/browse/AIRFLOW-3442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dmitry Lukyanchikov updated AIRFLOW-3442: - Description: When i start to use new RBAC feature with google oauth i notice that owner in audit logs shows as google_[some number]. Its pretty hard to understand how is actually make some changes or run dag. But if there will be first name second name or at least email it will be pretty easy without checking who is using this google_ username. Maybe it can be configured somehow, please advise was: When i start to use new RBAC feature with google oauth i notice that owner in audit logs shows as google_[some number]. Its pretty hard to understand how is actually make some changes or run dag. But if there will be first name second name or at least email it will be pretty easy without checking who is using this google_ username. Maybe it can be configured somehow, please advise !Screen Shot 2018-12-04 at 12.41.50 PM.png! > Show first and second name of user in audit logs > > > Key: AIRFLOW-3442 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3442 > Project: Apache Airflow > Issue Type: Improvement > Components: logging >Reporter: Dmitry Lukyanchikov >Priority: Minor > Fix For: 1.10.1 > > Attachments: Screen Shot 2018-12-04 at 12.41.50 PM.png > > > When i start to use new RBAC feature with google oauth i notice that owner in > audit logs shows as google_[some number]. Its pretty hard to understand how > is actually make some changes or run dag. But if there will be first name > second name or at least email it will be pretty easy without checking who is > using this google_ username. Maybe it can be configured somehow, please advise -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Updated] (AIRFLOW-3442) Show first and second name of user in audit logs
[ https://issues.apache.org/jira/browse/AIRFLOW-3442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dmitry Lukyanchikov updated AIRFLOW-3442: - Issue Type: Bug (was: Improvement) -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Updated] (AIRFLOW-3442) Show first and second name of user in audit logs
[ https://issues.apache.org/jira/browse/AIRFLOW-3442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dmitry Lukyanchikov updated AIRFLOW-3442: - Priority: Major (was: Minor) -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Updated] (AIRFLOW-3442) Show first and second name of user in audit logs
[ https://issues.apache.org/jira/browse/AIRFLOW-3442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dmitry Lukyanchikov updated AIRFLOW-3442: - Description: When i start to use new RBAC feature with google oauth i notice that owner in audit logs shows as google_[some number]. Its pretty hard to understand how is actually make some changes or run dag. But if there will be first name second name or at least email it will be pretty easy without checking who is using this google_ username. Maybe it can be configured somehow, please advise !Screen Shot 2018-12-04 at 12.41.50 PM.png|thumbnail! was: When i start to use new RBAC feature with google oauth i notice that owner in audit logs shows as google_[some number]. Its pretty hard to understand how is actually make some changes or run dag. But if there will be first name second name or at least email it will be pretty easy without checking who is using this google_ username. Maybe it can be configured somehow, please advise !Screen Shot 2018-12-04 at 12.41.50 PM.png! -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Updated] (AIRFLOW-3442) Show first and second name of user in audit logs
[ https://issues.apache.org/jira/browse/AIRFLOW-3442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dmitry Lukyanchikov updated AIRFLOW-3442: - Description: When i start to use new RBAC feature with google oauth i notice that owner in audit logs shows as google_[some number]. Its pretty hard to understand how is actually make some changes or run dag. But if there will be first name second name or at least email it will be pretty easy without checking who is using this google_ username. Maybe it can be configured somehow, please advise !Screen Shot 2018-12-04 at 12.41.50 PM.png! was: When i start to use new RBAC feature with google oauth i notice that owner in audit logs shows as google_[some number]. Its pretty hard to understand how is actually make some changes or run dag. But if there will be first name second name or at least email it will be pretty easy without checking who is using this google_ username. Maybe it can be configured somehow, please advise !Screen Shot 2018-12-04 at 12.41.50 PM.png|thumbnail! -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (AIRFLOW-3442) Show first and second name of user in audit logs
Dmitry Lukyanchikov created AIRFLOW-3442: Summary: Show first and second name of user in audit logs Key: AIRFLOW-3442 URL: https://issues.apache.org/jira/browse/AIRFLOW-3442 Project: Apache Airflow Issue Type: Improvement Components: logging Reporter: Dmitry Lukyanchikov Fix For: 1.10.1 Attachments: Screen Shot 2018-12-04 at 12.41.50 PM.png When i start to use new RBAC feature with google oauth i notice that owner in audit logs shows as google_[some number]. Its pretty hard to understand how is actually make some changes or run dag. But if there will be first name second name or at least email it will be pretty easy without checking who is using this google_ username. Maybe it can be configured somehow, please advise !Screen Shot 2018-12-04 at 12.41.50 PM.png! -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] ashb closed pull request #2530: [WIP] fix non imported tests
ashb closed pull request #2530: [WIP] fix non imported tests URL: https://github.com/apache/incubator-airflow/pull/2530 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance: As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic):

```diff
diff --git a/tests/__init__.py b/tests/__init__.py
index 20f8c48d2a..c6ba0297d5 100644
--- a/tests/__init__.py
+++ b/tests/__init__.py
@@ -22,6 +22,7 @@
 from .jobs import *
 from .impersonation import *
 from .models import *
+from .plugins_manager import *
 from .operators import *
 from .security import *
-from .utils import *
+from .test_utils import *
diff --git a/tests/api/common/experimental/mark_tasks.py b/tests/api/common/experimental/test_mark_tasks.py
similarity index 100%
rename from tests/api/common/experimental/mark_tasks.py
rename to tests/api/common/experimental/test_mark_tasks.py
diff --git a/tests/contrib/hooks/gcp_pubsub_hook.py b/tests/contrib/hooks/test_gcp_pubsub_hook.py
similarity index 100%
rename from tests/contrib/hooks/gcp_pubsub_hook.py
rename to tests/contrib/hooks/test_gcp_pubsub_hook.py
diff --git a/tests/contrib/operators/pubsub_operator.py b/tests/contrib/operators/test_pubsub_operator.py
similarity index 100%
rename from tests/contrib/operators/pubsub_operator.py
rename to tests/contrib/operators/test_pubsub_operator.py
diff --git a/tests/utils.py b/tests/test_utils.py
similarity index 100%
rename from tests/utils.py
rename to tests/test_utils.py
```
[GitHub] ashb commented on issue #2530: [WIP] fix non imported tests
ashb commented on issue #2530: [WIP] fix non imported tests URL: https://github.com/apache/incubator-airflow/pull/2530#issuecomment-444051145 Yup, changes have been done in other PRs.
[jira] [Created] (AIRFLOW-3441) Dag View Customization
Yuvaraj created AIRFLOW-3441: Summary: Dag View Customization Key: AIRFLOW-3441 URL: https://issues.apache.org/jira/browse/AIRFLOW-3441 Project: Apache Airflow Issue Type: Improvement Components: api, webapp Reporter: Yuvaraj Attachments: Screen Shot 2018-12-04 at 3.53.37 PM.png I have a series of questions about the Airflow UI. Experts, please guide. # Is there a possibility to view DAGs in a folder structure? At present all the DAGs are listed together and we have to filter to get the DAGs for a specific project. If DAGs were grouped by subfolder name (say, project name) under the main DAG folder, it would be easy to navigate and would make for better operations. # How do we filter *DAG/TASK* instances using a *contains* filter? *Eg:* Dag ID contains (how are multiple values specified? Comma/colon/space separated?) # Is there an option to *hold/pause* only an individual task instance in a DAG? By doing so, the next DAG run should also not come into the queue for execution, say for activities like fixing an issue or maintenance, and then the task instance can be released to execute. # How can we monitor the user activities performed on a DAG (like operating/changing the DAG)? -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] ashb commented on issue #2530: [WIP] fix non imported tests
ashb commented on issue #2530: [WIP] fix non imported tests URL: https://github.com/apache/incubator-airflow/pull/2530#issuecomment-444047654 I think most of these changes have been merged in other PRs.
[jira] [Commented] (AIRFLOW-3405) Task instance fail intermittently due to MySQL error
[ https://issues.apache.org/jira/browse/AIRFLOW-3405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16708512#comment-16708512 ] Yuvaraj commented on AIRFLOW-3405: -- We are in the process of upgrading to 1.10.1. Will keep you posted if this gets fixed in the latest version.
> Task instance fails intermittently due to MySQL error
>
> Key: AIRFLOW-3405
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3405
> Project: Apache Airflow
> Issue Type: Improvement
> Environment: MySQL, Redhat Linux
> Reporter: Yuvaraj
> Priority: Major
> Labels: performance, usability
>
> DAGs are failing intermittently due to the error below:
> OperationalError: (_mysql_exceptions.OperationalError) (1040, 'Too many connections')
> [2018-11-25 12:24:16,952] - Heartbeat time limited exceeded!
> We have max_connections defined as 2000 in the DB. Below are the settings in the cfg:
> sql_alchemy_pool_size = 1980
> sql_alchemy_pool_recycle = 3600
> As per our DBA, the Airflow scheduler keeps opening connections to the database; these connections are mostly idle and get reset whenever the scheduler restarts, but with max_connections at 2000 and the scheduler holding on to 1600 of them, other apps trying to connect might start running out of connections.
> How do we remediate these idle connections? What are the optimal values for these configs, and what max_connections should be set on the DB? Consider that we need to build a large environment serving 500+ definitions with 1+ runs per day. Need suggestions...
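The arithmetic behind the DBA's concern can be sketched quickly; a minimal illustration using the numbers quoted in the thread (illustrative only, not a recommendation for any particular values):

```python
# Connection budget using the figures from the report above.
max_connections = 2000   # MySQL server-side limit
pool_size = 1980         # sql_alchemy_pool_size in airflow.cfg

# Headroom left for every other client once the Airflow pool is full.
headroom = max_connections - pool_size
print(headroom)  # → 20
```

A pool sized this close to the server limit leaves almost nothing for other applications, which matches the "Too many connections" (error 1040) failures described above.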
[GitHub] ron819 commented on issue #3210: [AIRFLOW-2307] Add uuid attribute to TaskInstance, return uuid in Executor
ron819 commented on issue #3210: [AIRFLOW-2307] Add uuid attribute to TaskInstance, return uuid in Executor URL: https://github.com/apache/incubator-airflow/pull/3210#issuecomment-444044949 @johnarnold are you still working on this PR?
[GitHub] ron819 commented on issue #2530: [WIP] fix non imported tests
ron819 commented on issue #2530: [WIP] fix non imported tests URL: https://github.com/apache/incubator-airflow/pull/2530#issuecomment-444044476 Is this PR still needed?
[GitHub] SaumyaRackspace commented on issue #4275: AIRFLOW-3437 Adding better json
SaumyaRackspace commented on issue #4275: AIRFLOW-3437 Adding better json URL: https://github.com/apache/incubator-airflow/pull/4275#issuecomment-444040011 Under PR https://github.com/apache/incubator-airflow/pull/4271 I got the comment below: "Additionally you need tests - I think you end up with "start_date": True in your output". I have tested this well; start_date won't be True but rather empty in case it is not fetched or the format is not correct.
[jira] [Commented] (AIRFLOW-3437) Formatted json should be returned when dag_run is triggered with experimental api
[ https://issues.apache.org/jira/browse/AIRFLOW-3437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16708488#comment-16708488 ] ASF GitHub Bot commented on AIRFLOW-3437: - SaumyaRackspace opened a new pull request #4275: AIRFLOW-3437 Adding better json URL: https://github.com/apache/incubator-airflow/pull/4275

Make sure you have checked _all_ steps below.

### Jira
- [ ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW-3437) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
  - https://issues.apache.org/jira/browse/AIRFLOW-3437
  - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-3437\]; code changes always need a Jira issue.

### Description
- [x] Here are some details about my PR, including screenshots of any UI changes: The current REST API "/api/experimental/dags/dag_runs" returns a message like the one below, which makes it difficult for the developer to figure out the execution date/run_id, so extraction logic has to be written: { "message": "Created " }

### Tests
- [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:
  - test_trigger_dag_detailed_output

### Commits
- [ ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
  1. Subject is separated from body by a blank line
  1. Subject is limited to 50 characters (not including Jira issue reference)
  1. Subject does not end with a period
  1. Subject uses the imperative mood ("add", not "adding")
  1. Body wraps at 72 characters
  1. Body explains "what" and "why", not "how"

### Documentation
- [ ] In case of new functionality, my PR adds documentation that describes how to use it.
  - When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added.

### Code Quality
- [x] Passes `flake8`

> Formatted json should be returned when dag_run is triggered with experimental api
>
> Key: AIRFLOW-3437
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3437
> Project: Apache Airflow
> Issue Type: Improvement
> Components: api
> Affects Versions: 1.10.0, 2.0.0
> Reporter: Saumya Saxena Gupta
> Assignee: Saumya Saxena Gupta
> Priority: Major
>
> *Scenario* -> Developer wants to trigger DAG_RUN through API
> *Issue* -> The current REST API "/api/experimental/dags/dag_runs" returns a message like the one below, which makes it difficult for the developer to figure out the execution date/run_id, so extraction logic has to be written:
> { "message": "Created <DagRun example_bash_operator @ 2018-12-03 11:16:36+00:00: manual__2018-12-03T11:16:36+00:00, externally triggered: True>" }
> *Improvement Suggestion* -> The REST API "/api/experimental/dags/dag_runs" should return json representing the dag_run object, something like below:
> {
> "dag_id": "example_bash_operator",
> "dag_run_url": "/admin/airflow/graph?execution_date=2018-12-03+11%3A11%3A18%2B00%3A00&dag_id=example_bash_operator",
> "execution_date": "2018-12-03T11:11:18+00:00",
> "id": 142,
> "run_id": "manual__2018-12-03T11:11:18+00:00",
> "start_date": "2018-12-03T11:11:18.267197+00:00",
> "state": "running"
> }
> With the JSON returned as shown above, picking dag_run details becomes easy.
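The pain point the issue describes can be seen in a short sketch: with the old plain-text message the caller has to scrape the run_id out of a prose string, while the proposed JSON body exposes it directly. Sample values are taken from the Jira example; the regex is one hypothetical way a caller might parse the old message:

```python
import re

# Old-style response body: run_id must be scraped out of a prose message.
old = {"message": "Created <DagRun example_bash_operator @ "
                  "2018-12-03 11:16:36+00:00: manual__2018-12-03T11:16:36+00:00, "
                  "externally triggered: True>"}
run_id_old = re.search(r"(manual__\S+?),", old["message"]).group(1)

# Proposed response body: run_id is a first-class key, no parsing needed.
new = {
    "dag_id": "example_bash_operator",
    "run_id": "manual__2018-12-03T11:11:18+00:00",
    "state": "running",
}
run_id_new = new["run_id"]

print(run_id_old)  # → manual__2018-12-03T11:16:36+00:00
print(run_id_new)  # → manual__2018-12-03T11:11:18+00:00
```

The regex approach is brittle (it depends on the exact repr of DagRun), which is precisely the argument for returning structured JSON.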
[GitHub] SaumyaRackspace opened a new pull request #4275: AIRFLOW-3437 Adding better json
SaumyaRackspace opened a new pull request #4275: AIRFLOW-3437 Adding better json URL: https://github.com/apache/incubator-airflow/pull/4275

Make sure you have checked _all_ steps below.

### Jira
- [ ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW-3437) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
  - https://issues.apache.org/jira/browse/AIRFLOW-3437
  - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-3437\]; code changes always need a Jira issue.

### Description
- [x] Here are some details about my PR, including screenshots of any UI changes: The current REST API "/api/experimental/dags/dag_runs" returns a message like the one below, which makes it difficult for the developer to figure out the execution date/run_id, so extraction logic has to be written: { "message": "Created " }

### Tests
- [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:
  - test_trigger_dag_detailed_output

### Commits
- [ ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
  1. Subject is separated from body by a blank line
  1. Subject is limited to 50 characters (not including Jira issue reference)
  1. Subject does not end with a period
  1. Subject uses the imperative mood ("add", not "adding")
  1. Body wraps at 72 characters
  1. Body explains "what" and "why", not "how"

### Documentation
- [ ] In case of new functionality, my PR adds documentation that describes how to use it.
  - When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added.

### Code Quality
- [x] Passes `flake8`
[jira] [Updated] (AIRFLOW-3309) Missing Mongo DB connection type
[ https://issues.apache.org/jira/browse/AIRFLOW-3309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-3309: Fix Version/s: (was: 2.0.0) 1.10.2 > Missing Mongo DB connection type > > > Key: AIRFLOW-3309 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3309 > Project: Apache Airflow > Issue Type: Bug > Components: database > Affects Versions: 1.10.0 > Reporter: John Cheng > Assignee: John Cheng > Priority: Minor > Fix For: 1.10.2 > > > Unable to choose Mongo DB on the admin console connection page.
[GitHub] SaumyaRackspace closed pull request #4271: Adding better json return
SaumyaRackspace closed pull request #4271: Adding better json return URL: https://github.com/apache/incubator-airflow/pull/4271 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance:

```diff
diff --git a/airflow/api/common/experimental/get_dag_runs.py b/airflow/api/common/experimental/get_dag_runs.py
index 63b1f993d3..4dd1306039 100644
--- a/airflow/api/common/experimental/get_dag_runs.py
+++ b/airflow/api/common/experimental/get_dag_runs.py
@@ -53,3 +53,16 @@ def get_dag_runs(dag_id, state=None):
         })
 
     return dag_runs
+
+def format_dag_run(run):
+    return {
+        'id': run.id,
+        'run_id': run.run_id,
+        'state': run.state,
+        'dag_id': run.dag_id,
+        'execution_date': run.execution_date.isoformat(),
+        'start_date': ((run.start_date or '') and
+                       run.start_date.isoformat()),
+        'dag_run_url': url_for('airflow.graph', dag_id=run.dag_id,
+                               execution_date=run.execution_date)
+    }
diff --git a/airflow/www/api/experimental/endpoints.py b/airflow/www/api/experimental/endpoints.py
index f0bc319eb6..675ff6c062 100644
--- a/airflow/www/api/experimental/endpoints.py
+++ b/airflow/www/api/experimental/endpoints.py
@@ -26,6 +26,7 @@
 from airflow.api.common.experimental import trigger_dag as trigger
 from airflow.api.common.experimental.get_task import get_task
 from airflow.api.common.experimental.get_task_instance import get_task_instance
+from airflow.api.common.experimental.get_dag_runs import format_dag_run
 from airflow.exceptions import AirflowException
 from airflow.utils import timezone
 from airflow.utils.log.logging_mixin import LoggingMixin
@@ -52,6 +53,10 @@ def trigger_dag(dag_id):
     if 'run_id' in data:
         run_id = data['run_id']
 
+    detailed_output = None
+    if 'detailed_output' in data:
+        detailed_output = data['detailed_output']
+
     conf = None
     if 'conf' in data:
         conf = data['conf']
@@ -84,8 +89,11 @@ def trigger_dag(dag_id):
     if getattr(g, 'user', None):
         _log.info("User {} created {}".format(g.user, dr))
-
-    response = jsonify(message="Created {}".format(dr))
+    if detailed_output :
+        dr = format_dag_run(dr)
+        response = jsonify(**dr)
+    else:
+        response = jsonify(message="Created {}".format(dr))
     return response
```
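For clarity, here is a standalone sketch of the format_dag_run helper from the diff above, with the Flask url_for call omitted so it runs without an app context, and the `(run.start_date or '') and ...` idiom rewritten as a plain conditional. FakeRun is a hypothetical test double; its field values are borrowed from the Jira example:

```python
from datetime import datetime, timezone

def format_dag_run(run):
    """Serialize a DagRun-like object to a plain dict (dag_run_url omitted)."""
    return {
        "id": run.id,
        "run_id": run.run_id,
        "state": run.state,
        "dag_id": run.dag_id,
        "execution_date": run.execution_date.isoformat(),
        # Empty string when the run has no start time yet.
        "start_date": run.start_date.isoformat() if run.start_date else "",
    }

class FakeRun:
    """Minimal stand-in for airflow.models.DagRun (hypothetical)."""
    id = 142
    run_id = "manual__2018-12-03T11:11:18+00:00"
    state = "running"
    dag_id = "example_bash_operator"
    execution_date = datetime(2018, 12, 3, 11, 11, 18, tzinfo=timezone.utc)
    start_date = None  # a queued run that has not started yet

print(format_dag_run(FakeRun())["run_id"])  # → manual__2018-12-03T11:11:18+00:00
```

The conditional form avoids the subtle behavior of `(x or '') and x.isoformat()`, which relies on `and` short-circuiting to an empty string, while producing the same result.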
[GitHub] SaumyaRackspace commented on issue #4271: Adding better json return
SaumyaRackspace commented on issue #4271: Adding better json return URL: https://github.com/apache/incubator-airflow/pull/4271#issuecomment-444029380 Will reopen PR in master
[GitHub] ashb commented on issue #3684: [AIRFLOW-2840] - add update connections cli option
ashb commented on issue #3684: [AIRFLOW-2840] - add update connections cli option URL: https://github.com/apache/incubator-airflow/pull/3684#issuecomment-444026646 Great! Sorry I didn’t respond - paid work has been busy :)
[GitHub] ron819 commented on issue #3367: [AIRFLOW-2475] Reference triggered dag when using TriggerDagRunOperator
ron819 commented on issue #3367: [AIRFLOW-2475] Reference triggered dag when using TriggerDagRunOperator URL: https://github.com/apache/incubator-airflow/pull/3367#issuecomment-444011654 @BasPH are you still working on this PR?