[GitHub] [airflow] stale[bot] commented on issue #6501: [AIRFLOW-5831] Prod image support. Depends on [AIRFLOW-5704] [AIRFLOW-5842] [AIRFLOW-5828]

2020-01-02 Thread GitBox
stale[bot] commented on issue #6501: [AIRFLOW-5831] Prod image support. Depends on [AIRFLOW-5704] [AIRFLOW-5842] [AIRFLOW-5828]
URL: https://github.com/apache/airflow/pull/6501#issuecomment-570488487
 
 
   This issue has been automatically marked as stale because it has not had 
recent activity. It will be closed if no further activity occurs. Thank you for 
your contributions.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow-site] chandulal opened a new pull request #231: Add blogs for airflow DAGs testing

2020-01-02 Thread GitBox
chandulal opened a new pull request #231: Add blogs for airflow DAGs testing
URL: https://github.com/apache/airflow-site/pull/231
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #7015: [AIRFLOW-6436] Create & Automate docs on Airflow Configs

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #7015: [AIRFLOW-6436] Create & Automate 
docs on Airflow Configs
URL: https://github.com/apache/airflow/pull/7015#issuecomment-570480731
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=h1) 
Report
   > Merging 
[#7015](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/0f777f74b776b45e94615f423f61a8eb0025db83?src=pr=desc)
 will **decrease** coverage by `0.76%`.
   > The diff coverage is `60.45%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7015/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7015      +/-   ##
   ==========================================
   - Coverage   84.85%   84.08%   -0.77%
   ==========================================
     Files         679      680       +1
     Lines       38536    38590      +54
   ==========================================
   - Hits        32698    32447     -251
   - Misses       5838     6143     +305
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/utils/state.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zdGF0ZS5weQ==)
 | `96.29% <ø> (ø)` | :arrow_up: |
   | 
[airflow/gcp/example\_dags/example\_gcs.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvZXhhbXBsZV9kYWdzL2V4YW1wbGVfZ2NzLnB5)
 | `100% <ø> (+11.42%)` | :arrow_up: |
   | 
[airflow/utils/operator\_resources.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9vcGVyYXRvcl9yZXNvdXJjZXMucHk=)
 | `84.78% <ø> (ø)` | :arrow_up: |
   | 
[airflow/models/dag.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvZGFnLnB5)
 | `90.95% <ø> (ø)` | :arrow_up: |
   | 
[airflow/cli/commands/task\_command.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9jbGkvY29tbWFuZHMvdGFza19jb21tYW5kLnB5)
 | `71.81% <ø> (ø)` | :arrow_up: |
   | 
[airflow/utils/weight\_rule.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy93ZWlnaHRfcnVsZS5weQ==)
 | `100% <ø> (ø)` | :arrow_up: |
   | 
[airflow/utils/config\_yaml\_to\_cfg.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9jb25maWdfeWFtbF90b19jZmcucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/utils/log/wasb\_task\_handler.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvd2FzYl90YXNrX2hhbmRsZXIucHk=)
 | `42.46% <0%> (ø)` | :arrow_up: |
   | 
[airflow/utils/log/gcs\_task\_handler.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvZ2NzX3Rhc2tfaGFuZGxlci5weQ==)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `85% <100%> (-11.62%)` | :arrow_down: |
   | ... and [44 
more](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=footer). 
Last update 
[0f777f7...ef62156](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #7015: [AIRFLOW-6436] Create & Automate docs on Airflow Configs

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #7015: [AIRFLOW-6436] Create & Automate 
docs on Airflow Configs
URL: https://github.com/apache/airflow/pull/7015#issuecomment-570480731
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=h1) 
Report
   > Merging 
[#7015](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/f391039be9cb5a767f4c66771ba70031210d3e76?src=pr=desc)
 will **decrease** coverage by `0.94%`.
   > The diff coverage is `4.44%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7015/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7015      +/-   ##
   ==========================================
   - Coverage   85.03%   84.08%   -0.95%
   ==========================================
     Files         679      680       +1
     Lines       38545    38590      +45
   ==========================================
   - Hits        32775    32447     -328
   - Misses       5770     6143     +373
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/utils/config\_yaml\_to\_cfg.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9jb25maWdfeWFtbF90b19jZmcucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `91.43% <33.33%> (-1.22%)` | :arrow_down: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_operator.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/generic\_transfer.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZ2VuZXJpY190cmFuc2Zlci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/executors/celery\_executor.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvY2VsZXJ5X2V4ZWN1dG9yLnB5)
 | `49.65% <0%> (-38.78%)` | :arrow_down: |
   | ... and [13 
more](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=footer). 
Last update 
[f391039...ef62156](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


   




[GitHub] [airflow] codecov-io commented on issue #7015: [AIRFLOW-6436] Create & Automate docs on Airflow Configs

2020-01-02 Thread GitBox
codecov-io commented on issue #7015: [AIRFLOW-6436] Create & Automate docs on 
Airflow Configs
URL: https://github.com/apache/airflow/pull/7015#issuecomment-570480731
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=h1) 
Report
   > Merging 
[#7015](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/f391039be9cb5a767f4c66771ba70031210d3e76?src=pr=desc)
 will **decrease** coverage by `0.94%`.
   > The diff coverage is `4.44%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7015/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7015      +/-   ##
   ==========================================
   - Coverage   85.03%   84.08%   -0.95%
   ==========================================
     Files         679      680       +1
     Lines       38545    38590      +45
   ==========================================
   - Hits        32775    32447     -328
   - Misses       5770     6143     +373
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/utils/config\_yaml\_to\_cfg.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9jb25maWdfeWFtbF90b19jZmcucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `91.43% <33.33%> (-1.22%)` | :arrow_down: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_operator.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/generic\_transfer.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZ2VuZXJpY190cmFuc2Zlci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/executors/celery\_executor.py](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvY2VsZXJ5X2V4ZWN1dG9yLnB5)
 | `49.65% <0%> (-38.78%)` | :arrow_down: |
   | ... and [13 
more](https://codecov.io/gh/apache/airflow/pull/7015/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=footer). 
Last update 
[f391039...ef62156](https://codecov.io/gh/apache/airflow/pull/7015?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil commented on issue #7007: [AIRFLOW-6428] Fix import path for airflow.utils.dates.days_ago in Example DAGs

2020-01-02 Thread GitBox
kaxil commented on issue #7007: [AIRFLOW-6428] Fix import path for 
airflow.utils.dates.days_ago in Example DAGs
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570477304
 
 
   > Just wanted to discuss the general import approach in this case.
   
   I agree the 2nd option sounds a lot better. I have changed the PR and updated 
all the example DAGs to import just the function from its module instead of 
going through __init__.py.
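   For context, a minimal sketch of what `days_ago` computes (midnight UTC, n days back), using only the standard library; the function body here is an illustrative stand-in, not Airflow's actual implementation:

   ```python
from datetime import datetime, timedelta, timezone

def days_ago(n: int) -> datetime:
    # Midnight (UTC) n days before now: an illustrative stand-in for
    # what airflow.utils.dates.days_ago is used for in example DAGs.
    midnight = datetime.now(timezone.utc).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return midnight - timedelta(days=n)
   ```

   With the direct style, example DAGs would write `from airflow.utils.dates import days_ago` rather than reaching the function through the package `__init__`.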




[jira] [Updated] (AIRFLOW-6428) Fix import path for airflow.utils.dates.days_ago

2020-01-02 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-6428:

Summary: Fix import path for airflow.utils.dates.days_ago  (was: Add dates 
module to airflow/utils/__init__.py)

> Fix import path for airflow.utils.dates.days_ago
> 
>
> Key: AIRFLOW-6428
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6428
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: utils
>Affects Versions: 1.10.7
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 1.10.8
>
> Attachments: image-2020-01-02-15-33-55-601.png
>
>
> Currently, without an entry in __init__.py, IDEs cannot find the reference to 
> *dates*; as a result, looking up the reference for the *days_ago* function or 
> the *dates* module fails.
> Check the attachment.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6428) Fix import path for airflow.utils.dates.days_ago in Example DAGs

2020-01-02 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-6428:

Summary: Fix import path for airflow.utils.dates.days_ago in Example DAGs  
(was: Fix import path for airflow.utils.dates.days_ago)

> Fix import path for airflow.utils.dates.days_ago in Example DAGs
> 
>
> Key: AIRFLOW-6428
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6428
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: utils
>Affects Versions: 1.10.7
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 1.10.8
>
> Attachments: image-2020-01-02-15-33-55-601.png
>
>
> Currently, without an entry in __init__.py, IDEs cannot find the reference to 
> *dates*; as a result, looking up the reference for the *days_ago* function or 
> the *dates* module fails.
> Check the attachment.





[jira] [Commented] (AIRFLOW-6427) Fix broken example_qubole_operator dag

2020-01-02 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6427?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007250#comment-17007250
 ] 

ASF subversion and git services commented on AIRFLOW-6427:
--

Commit 654f581bf382322d2c4b2467751383bba8ea36b3 in airflow's branch 
refs/heads/master from Kaxil Naik
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=654f581 ]

[AIRFLOW-6427] Fix broken example_qubole_operator dag (#7005)



> Fix broken example_qubole_operator dag
> --
>
> Key: AIRFLOW-6427
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6427
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib, examples
>Affects Versions: 1.10.7
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Minor
> Fix For: 1.10.8
>
>
> With the current example 
> (https://github.com/apache/airflow/blob/aa90753cf5f64bf435044cf2e6b81a02fdcf6b33/airflow/contrib/example_dags/example_qubole_operator.py#L191),
>  I get the following error:
> {noformat}
> airflow.exceptions.AirflowException: Invalid arguments were passed to 
> QuboleOperator (task_id: db_import). Invalid arguments were:
> *args: ()
> **kwargs: {'db_parallelism': 2}
> {noformat}
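The error quoted above is Airflow rejecting keyword arguments the operator does not declare. As an illustration of that fail-fast pattern (the `StrictOperator` class and its `_allowed` set are hypothetical, not Airflow code):

```python
class StrictOperator:
    """Reject unexpected keyword arguments, in the spirit of the
    'Invalid arguments were passed' error quoted above."""

    _allowed = {"task_id", "sql", "retries"}  # hypothetical whitelist

    def __init__(self, **kwargs):
        invalid = sorted(set(kwargs) - self._allowed)
        if invalid:
            raise TypeError(
                f"Invalid arguments were passed to {type(self).__name__} "
                f"(task_id: {kwargs.get('task_id')}): {invalid}"
            )
        for key, value in kwargs.items():
            setattr(self, key, value)
```

An unsupported option such as `db_parallelism` then raises immediately instead of being silently dropped.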





[jira] [Resolved] (AIRFLOW-6427) Fix broken example_qubole_operator dag

2020-01-02 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6427?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-6427.
-
Resolution: Fixed

> Fix broken example_qubole_operator dag
> --
>
> Key: AIRFLOW-6427
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6427
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib, examples
>Affects Versions: 1.10.7
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Minor
> Fix For: 1.10.8
>
>
> With the current example 
> (https://github.com/apache/airflow/blob/aa90753cf5f64bf435044cf2e6b81a02fdcf6b33/airflow/contrib/example_dags/example_qubole_operator.py#L191),
>  I get the following error:
> {noformat}
> airflow.exceptions.AirflowException: Invalid arguments were passed to 
> QuboleOperator (task_id: db_import). Invalid arguments were:
> *args: ()
> **kwargs: {'db_parallelism': 2}
> {noformat}





[GitHub] [airflow] kaxil merged pull request #7005: [AIRFLOW-6427] Fix error in example_qubole_operator dag

2020-01-02 Thread GitBox
kaxil merged pull request #7005: [AIRFLOW-6427] Fix error in 
example_qubole_operator dag
URL: https://github.com/apache/airflow/pull/7005
 
 
   




[jira] [Commented] (AIRFLOW-6427) Fix broken example_qubole_operator dag

2020-01-02 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6427?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007249#comment-17007249
 ] 

ASF GitHub Bot commented on AIRFLOW-6427:
-

kaxil commented on pull request #7005: [AIRFLOW-6427] Fix error in 
example_qubole_operator dag
URL: https://github.com/apache/airflow/pull/7005
 
 
   
 



> Fix broken example_qubole_operator dag
> --
>
> Key: AIRFLOW-6427
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6427
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib, examples
>Affects Versions: 1.10.7
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Minor
> Fix For: 1.10.8
>
>
> With the current example 
> (https://github.com/apache/airflow/blob/aa90753cf5f64bf435044cf2e6b81a02fdcf6b33/airflow/contrib/example_dags/example_qubole_operator.py#L191),
>  I get the following error:
> {noformat}
> airflow.exceptions.AirflowException: Invalid arguments were passed to 
> QuboleOperator (task_id: db_import). Invalid arguments were:
> *args: ()
> **kwargs: {'db_parallelism': 2}
> {noformat}





[jira] [Comment Edited] (AIRFLOW-6435) Mount secret files from secrets config

2020-01-02 Thread Brandon Clark (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007248#comment-17007248
 ] 

Brandon Clark edited comment on AIRFLOW-6435 at 1/3/20 6:15 AM:


I have created an implementation that requires the following entry in 
airflow.cfg; it references a Kubernetes secret configuration for keys matching 
each filename and places each file in the corresponding folder:
{code:java}
secret_file_secret = airflow
secret_file_dir = /root/.ssh
secret_file_filenames = airflow.pem,client.key{code}
 

The generated output results in the following additions to a pod configuration:
{code:java}
apiVersion: v1
kind: Pod
metadata:
  name: example-dag
spec:
  containers:
volumeMounts:
- mountPath: /root/.ssh/airflow.pem
  name: airflow-secret-files
  subPath: airflow.pem
- mountPath: /root/.ssh/client.key
  name: airflow-secret-files
  subPath: client.key
  volumes:
  - name: airflow-secret-files
secret:
  defaultMode: 256
  items:
  - key: airflow.pem
path: airflow.pem
  - key: client.key
path: client.key
  secretName: airflow{code}
 

If this seems acceptable, I can apply the change to the 1.10.x and 2.0.x 
codebases. Any suggestions before I open a pull request?


was (Author: webmind):
I have created an implementation that requires the following entry in 
airflow.cfg; it references a Kubernetes secret configuration for keys matching 
each filename and places each file in the corresponding folder:
{code:java}
secret_file_filenames = airflow.pem,client.key
secret_file_secret = airflow
secret_file_dir = /root/.ssh {code}
 

The generated output results in the following additions to a pod configuration:
{code:java}
apiVersion: v1
kind: Pod
metadata:
  name: example-dag
spec:
  containers:
volumeMounts:
- mountPath: /root/.ssh/airflow.pem
  name: airflow-secret-files
  subPath: airflow.pem
- mountPath: /root/.ssh/client.key
  name: airflow-secret-files
  subPath: client.key
  volumes:
  - name: airflow-secret-files
secret:
  defaultMode: 256
  items:
  - key: airflow.pem
path: airflow.pem
  - key: client.key
path: client.key
  secretName: airflow{code}
 

If this seems acceptable, I can apply the change to the 1.10.x and 2.0.x 
codebases. Any suggestions before I open a pull request?

> Mount secret files from secrets config
> --
>
> Key: AIRFLOW-6435
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6435
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 2.0.0, 1.10.8
>Reporter: Brandon Clark
>Assignee: Brandon Clark
>Priority: Major
>  Labels: Kubernetes, executor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> There should be a dynamic way to add protected files to a pod. Just as 
> git-sync requires an ssh key to be mounted, so do other software suites and 
> processes that can be run from Airflow.





[jira] [Comment Edited] (AIRFLOW-6435) Mount secret files from secrets config

2020-01-02 Thread Brandon Clark (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007248#comment-17007248
 ] 

Brandon Clark edited comment on AIRFLOW-6435 at 1/3/20 6:15 AM:


I have created an implementation that requires the following entry in 
airflow.cfg; it references a Kubernetes secret configuration for keys matching 
each filename and places each file in the corresponding folder:
{code:java}
[kubernetes]
secret_file_secret = airflow
secret_file_dir = /root/.ssh
secret_file_filenames = airflow.pem,client.key{code}
 

The generated output results in the following additions to a pod configuration:
{code:java}
apiVersion: v1
kind: Pod
metadata:
  name: example-dag
spec:
  containers:
volumeMounts:
- mountPath: /root/.ssh/airflow.pem
  name: airflow-secret-files
  subPath: airflow.pem
- mountPath: /root/.ssh/client.key
  name: airflow-secret-files
  subPath: client.key
  volumes:
  - name: airflow-secret-files
secret:
  defaultMode: 256
  items:
  - key: airflow.pem
path: airflow.pem
  - key: client.key
path: client.key
  secretName: airflow{code}
 

If this seems acceptable, I can apply the change to the 1.10.x and 2.0.x 
codebases. Any suggestions before I open a pull request?


was (Author: webmind):
I have created an implementation that requires the following entry in 
airflow.cfg; it references a Kubernetes secret configuration for keys matching 
each filename and places each file in the corresponding folder:
{code:java}
secret_file_secret = airflow
secret_file_dir = /root/.ssh
secret_file_filenames = airflow.pem,client.key{code}
 

The generated output results in the following additions to a pod configuration:
{code:java}
apiVersion: v1
kind: Pod
metadata:
  name: example-dag
spec:
  containers:
volumeMounts:
- mountPath: /root/.ssh/airflow.pem
  name: airflow-secret-files
  subPath: airflow.pem
- mountPath: /root/.ssh/client.key
  name: airflow-secret-files
  subPath: client.key
  volumes:
  - name: airflow-secret-files
secret:
  defaultMode: 256
  items:
  - key: airflow.pem
path: airflow.pem
  - key: client.key
path: client.key
  secretName: airflow{code}
 

If this seems acceptable, I can apply the change to the 1.10.x and 2.0.x 
codebases. Any suggestions before I open a pull request?

> Mount secret files from secrets config
> --
>
> Key: AIRFLOW-6435
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6435
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 2.0.0, 1.10.8
>Reporter: Brandon Clark
>Assignee: Brandon Clark
>Priority: Major
>  Labels: Kubernetes, executor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> There should be a dynamic way to add protected files to a pod. Just as 
> git-sync requires an ssh key to be mounted, so do other software suites and 
> processes that can be run from Airflow.





[jira] [Commented] (AIRFLOW-6435) Mount secret files from secrets config

2020-01-02 Thread Brandon Clark (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007248#comment-17007248
 ] 

Brandon Clark commented on AIRFLOW-6435:


I have created an implementation that requires the following entry in 
airflow.cfg; it references a Kubernetes secret configuration for keys matching 
each filename and places each file in the corresponding folder:
{code:java}
secret_file_filenames = airflow.pem,client.key
secret_file_secret = airflow
secret_file_dir = /root/.ssh {code}
 

The generated output results in the following additions to a pod configuration:
{code:java}
apiVersion: v1
kind: Pod
metadata:
  name: example-dag
spec:
  containers:
volumeMounts:
- mountPath: /root/.ssh/airflow.pem
  name: airflow-secret-files
  subPath: airflow.pem
- mountPath: /root/.ssh/client.key
  name: airflow-secret-files
  subPath: client.key
  volumes:
  - name: airflow-secret-files
secret:
  defaultMode: 256
  items:
  - key: airflow.pem
path: airflow.pem
  - key: client.key
path: client.key
  secretName: airflow{code}
 

If this seems acceptable, I can apply the change to the 1.10.x and 2.0.x 
codebases. Any suggestions before I open a pull request?
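To make the mapping concrete, here is a sketch of how the three proposed airflow.cfg keys could expand into the volumeMounts and volume shown above. It builds plain dicts rather than using the Kubernetes client, and the helper name `build_secret_file_config` is assumed for illustration:

```python
def build_secret_file_config(secret: str, directory: str, filenames: str):
    """Expand the proposed secret_file_* settings into pod-spec fragments."""
    names = [f.strip() for f in filenames.split(",")]
    volume_name = "airflow-secret-files"
    # One mount per secret file, placed in the target directory via subPath.
    mounts = [
        {"mountPath": f"{directory}/{name}", "name": volume_name, "subPath": name}
        for name in names
    ]
    # A single secret-backed volume exposing each key as a file (0400 = 256).
    volume = {
        "name": volume_name,
        "secret": {
            "defaultMode": 256,
            "items": [{"key": name, "path": name} for name in names],
            "secretName": secret,
        },
    }
    return mounts, volume
```

Feeding it the example settings (`airflow`, `/root/.ssh`, `airflow.pem,client.key`) reproduces the pod-spec additions shown in the {code} block above.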

> Mount secret files from secrets config
> --
>
> Key: AIRFLOW-6435
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6435
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 2.0.0, 1.10.8
>Reporter: Brandon Clark
>Assignee: Brandon Clark
>Priority: Major
>  Labels: Kubernetes, executor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> There should be a dynamic way to add protected files to a pod. Just as 
> git-sync requires an ssh key to be mounted, so do other software suites and 
> processes that can be run from Airflow.





[GitHub] [airflow] codecov-io edited a comment on issue #6604: [AIRFLOW-5920] Neo4j operator and hook

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #6604: [AIRFLOW-5920] Neo4j operator and 
hook
URL: https://github.com/apache/airflow/pull/6604#issuecomment-558902885
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=h1) 
Report
   > Merging 
[#6604](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/f391039be9cb5a767f4c66771ba70031210d3e76?src=pr=desc)
 will **decrease** coverage by `0.26%`.
   > The diff coverage is `95.71%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6604/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #6604      +/-   ##
   ==========================================
   - Coverage   85.03%   84.76%   -0.27%
   ==========================================
     Files         679      681       +2
     Lines       38545    38615      +70
   ==========================================
   - Hits        32775    32732      -43
   - Misses       5770     5883     +113
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/providers/neo4j/hook.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvbmVvNGovaG9vay5weQ==)
 | `100% <100%> (ø)` | |
   | 
[airflow/providers/neo4j/operators/neo4j.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvbmVvNGovb3BlcmF0b3JzL25lbzRqLnB5)
 | `89.65% <89.65%> (ø)` | |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5)
 | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6604/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `78.75% <0%> (-20%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=footer). 
Last update 
[f391039...2a31a35](https://codecov.io/gh/apache/airflow/pull/6604?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services




[jira] [Work started] (AIRFLOW-6436) Create & Automate docs on Airflow Configs

2020-01-02 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6436?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-6436 started by Kaxil Naik.
---
> Create & Automate docs on Airflow Configs
> -
>
> Key: AIRFLOW-6436
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6436
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: documentation
>Affects Versions: 2.0.0, 1.10.7
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Critical
> Fix For: 2.0.0, 1.10.8
>
>
> Follow Up of https://issues.apache.org/jira/browse/AIRFLOW-6414
> This PR aims to automate the creation of Config docs and add pre-commit hooks 
> that would also serve as tests.
> Also, add a structure to the documentation by adding the following sections:
> * Description
> * Example
> * Type
> * Default
> for each config
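As an illustration of that proposed structure, a generator along these lines could emit one block per config option. The function, field names, and output layout here are hypothetical, not the actual script in the PR:

```python
def render_config_docs(sections):
    """Render config sections into per-option doc blocks carrying the
    Description / Example / Type / Default fields proposed above."""
    lines = []
    for section, options in sections.items():
        lines.append(f"[{section}]")
        for name, meta in options.items():
            lines.append(f"  {name}")
            # One labeled line per documented field, blank if missing.
            for field in ("description", "example", "type", "default"):
                lines.append(f"    {field.capitalize()}: {meta.get(field, '')}")
    return "\n".join(lines)


docs = render_config_docs({
    "core": {
        "parallelism": {
            "description": "Max task instances running concurrently.",
            "example": "32",
            "type": "int",
            "default": "32",
        }
    }
})
```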



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil commented on issue #7015: [AIRFLOW-6436] Create & Automate docs on Airflow Configs

2020-01-02 Thread GitBox
kaxil commented on issue #7015: [AIRFLOW-6436] Create & Automate docs on 
Airflow Configs
URL: https://github.com/apache/airflow/pull/7015#issuecomment-570472178
 
 
   This PR needs a bit of polishing, but its overall logic and structure would remain the same.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-6436) Create & Automate docs on Airflow Configs

2020-01-02 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6436?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007245#comment-17007245
 ] 

ASF GitHub Bot commented on AIRFLOW-6436:
-

kaxil commented on pull request #7015: [AIRFLOW-6436] Create & Automate docs on 
Airflow Configs
URL: https://github.com/apache/airflow/pull/7015
 
 
   This PR aims to automate the creation of Config docs and add pre-commit 
hooks that would also serve as tests.
   
   **Preview**:
   - **Layout 1**: https://near-cherry.surge.sh/configurations-ref.html
   - **Layout 2**: https://flowery-ray.surge.sh/configurations-ref.html 
(Similar to our CLI doc)
   
   Also, add a structure to the documentation by adding the following sections:
   
   - Description
   - Example
   - Type
   - Default
   - for each config
   
   ---
   Link to JIRA issue: https://issues.apache.org/jira/browse/AIRFLOW-6436
   
   - [x] Description above provides context of the change
   - [x] Commit message starts with `[AIRFLOW-XXXX]`, where AIRFLOW-XXXX = JIRA 
ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   (*) For document-only changes, no JIRA issue is needed. Commit message 
starts `[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Create & Automate docs on Airflow Configs
> -
>
> Key: AIRFLOW-6436
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6436
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: documentation
>Affects Versions: 2.0.0, 1.10.7
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Critical
> Fix For: 2.0.0, 1.10.8
>
>
> Follow Up of https://issues.apache.org/jira/browse/AIRFLOW-6414
> This PR aims to automate the creation of Config docs and add pre-commit hooks 
> that would also serve as tests.
> Also, add a structure to the documentation by adding the following sections:
> * Description
> * Example
> * Type
> * Default
> for each config



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil opened a new pull request #7015: [AIRFLOW-6436] Create & Automate docs on Airflow Configs

2020-01-02 Thread GitBox
kaxil opened a new pull request #7015: [AIRFLOW-6436] Create & Automate docs on 
Airflow Configs
URL: https://github.com/apache/airflow/pull/7015
 
 
   This PR aims to automate the creation of Config docs and add pre-commit 
hooks that would also serve as tests.
   
   **Preview**:
   - **Layout 1**: https://near-cherry.surge.sh/configurations-ref.html
   - **Layout 2**: https://flowery-ray.surge.sh/configurations-ref.html 
(Similar to our CLI doc)
   
   Also, add a structure to the documentation by adding the following sections:
   
   - Description
   - Example
   - Type
   - Default
   - for each config
   
   ---
   Link to JIRA issue: https://issues.apache.org/jira/browse/AIRFLOW-6436
   
   - [x] Description above provides context of the change
   - [x] Commit message starts with `[AIRFLOW-XXXX]`, where AIRFLOW-XXXX = JIRA 
ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   (*) For document-only changes, no JIRA issue is needed. Commit message 
starts `[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-6436) Create & Automate docs on Airflow Configs

2020-01-02 Thread Kaxil Naik (Jira)
Kaxil Naik created AIRFLOW-6436:
---

 Summary: Create & Automate docs on Airflow Configs
 Key: AIRFLOW-6436
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6436
 Project: Apache Airflow
  Issue Type: Improvement
  Components: documentation
Affects Versions: 1.10.7, 2.0.0
Reporter: Kaxil Naik
Assignee: Kaxil Naik
 Fix For: 2.0.0, 1.10.8


Follow Up of https://issues.apache.org/jira/browse/AIRFLOW-6414

This PR aims to automate the creation of Config docs and add pre-commit hooks 
that would also serve as tests.

Also, add a structure to the documentation by adding the following sections:
* Description
* Example
* Type
* Default
for each config



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] tfindlay-au commented on a change in pull request #6604: [AIRFLOW-5920] Neo4j operator and hook

2020-01-02 Thread GitBox
tfindlay-au commented on a change in pull request #6604: [AIRFLOW-5920] Neo4j 
operator and hook
URL: https://github.com/apache/airflow/pull/6604#discussion_r362707463
 
 

 ##
 File path: airflow/providers/neo4j/operators/neo4j.py
 ##
 @@ -0,0 +1,90 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+Neo4JOperator to interact and perform action on Neo4J graph database.
+This operator is designed to use Neo4J Hook and the
+Python driver: https://neo4j.com/docs/api/python-driver/current/
+"""
+
+from os.path import isfile
+
+from neo4j import BoltStatementResult
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.providers.neo4j.hook import Neo4JHook
+from airflow.utils.decorators import apply_defaults
+
+
+class Neo4JOperator(BaseOperator):
+"""
+This operator provides Airflow DAGs the ability to execute a cypher query
+and save the results of the query to a CSV file.
+
+:param cypher_query: required cypher query to be executed on the Neo4J 
database
+:type cypher_query: str
+:param output_filename: required filename to produce with output from the 
query
+:type output_filename: str
+:param n4j_conn_id: reference to a pre-defined Neo4J Connection
+:type n4j_conn_id: str
+:param soft_fail: set True to fail when the query returns no result
+:type soft_fail: bool
+"""
+cypher_query = None
+output_filename = None
+n4j_conn_id = None
+soft_fail = None
+
+template_fields = ['cypher_query', 'output_filename', 'n4j_conn_id', 
'soft_fail']
+
+@apply_defaults
+def __init__(self,
+ cypher_query,
+ output_filename,
+ n4j_conn_id,
+ soft_fail=False,
+ *args,
+ **kwargs):
+super().__init__(*args, **kwargs)
+
+self.output_filename = output_filename
+self.cypher_query = cypher_query
+self.n4j_conn_id = n4j_conn_id
+self.soft_fail = soft_fail
+
+def execute(self, context):
+"""
+Executes the supplied query and saves the results as a CSV file on disk
+
+:param context:
+:return: Row Count
+:rtype: int
+"""
 
 Review comment:
   Done.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] tfindlay-au commented on a change in pull request #6604: [AIRFLOW-5920] Neo4j operator and hook

2020-01-02 Thread GitBox
tfindlay-au commented on a change in pull request #6604: [AIRFLOW-5920] Neo4j 
operator and hook
URL: https://github.com/apache/airflow/pull/6604#discussion_r362707277
 
 

 ##
 File path: airflow/providers/neo4j/hook.py
 ##
 @@ -0,0 +1,133 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+This hook provides a minimal, thin wrapper around the neo4j python library for query execution
+"""
+import csv
+
+from neo4j import BoltStatementResult, Driver, GraphDatabase, Session
+
+from airflow.hooks.base_hook import BaseHook
+
+
+class Neo4JHook(BaseHook):
+"""
+This class enables the neo4j operator to execute queries against a 
configured neo4j server.
+It requires the configuration name as set in Airflow -> Admin -> 
Connections
+
+:param n4j_conn_id: Name of connection configured in Airflow
+:type n4j_conn_id: str
+"""
+n4j_conn_id: str
+
+def __init__(self, n4j_conn_id: str = 'n4j_default', *args, **kwargs):
+self.n4j_conn_id = n4j_conn_id
+
+@staticmethod
+def get_config(n4j_conn_id: str) -> dict:
+"""
+Obtain the Username + Password from the Airflow connection definition
+Store them in _config dictionary as:
+*credentials* -- a tuple of username/password eg. ("username", 
"password")
+*host* -- String for Neo4J URI eg. "bolt://1.1.1.1:7687"
+
+:param n4j_conn_id: Name of connection configured in Airflow
+:type n4j_conn_id: str
+:return: dictionary with configuration values
+:rtype: dict
+"""
+config: dict = {}
+connection_object = Neo4JHook.get_connection(n4j_conn_id)
+if connection_object.login and connection_object.host:
+config['credentials'] = connection_object.login, 
connection_object.password
+config['host'] = "bolt://{0}:{1}".format(connection_object.host, 
connection_object.port)
 
 Review comment:
   Used an f-string to join the host and port, but used `urllib.parse` to build 
a valid URL; `os.path.join` didn't really feel right for a URL. Let me know what 
you think; I can simplify to just an f-string if you like.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #6999: [AIRFLOW-XXXX] Clarify wait_for_downstream and execution_date

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #6999: [AIRFLOW-XXXX] Clarify 
wait_for_downstream and execution_date
URL: https://github.com/apache/airflow/pull/6999#issuecomment-570196566
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=h1) 
Report
   > Merging 
[#6999](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/4bfde026d14441a9fcb26e5e7f465def3f13eaa5?src=pr=desc)
 will **decrease** coverage by `0.32%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6999/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=tree)
   
   ```diff
@@            Coverage Diff             @@
##           master    #6999      +/-   ##
==========================================
- Coverage   84.81%   84.48%   -0.33%     
==========================================
  Files         679      679              
  Lines       38495    38536      +41     
==========================================
- Hits        32649    32558      -91     
- Misses       5846     5978     +132
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/ti\_deps/deps/prev\_dagrun\_dep.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcHJldl9kYWdydW5fZGVwLnB5)
 | `80.55% <ø> (ø)` | :arrow_up: |
   | 
[airflow/models/baseoperator.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvYmFzZW9wZXJhdG9yLnB5)
 | `96.25% <ø> (+0.17%)` | :arrow_up: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5)
 | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `78.75% <0%> (-20%)` | :arrow_down: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `94.91% <0%> (-1.7%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `90.08% <0%> (-1.66%)` | :arrow_down: |
   | ... and [15 
more](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=footer). 
Last update 
[4bfde02...4c3da95](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] tfindlay-au commented on a change in pull request #6604: [AIRFLOW-5920] Neo4j operator and hook

2020-01-02 Thread GitBox
tfindlay-au commented on a change in pull request #6604: [AIRFLOW-5920] Neo4j 
operator and hook
URL: https://github.com/apache/airflow/pull/6604#discussion_r362704832
 
 

 ##
 File path: airflow/providers/neo4j/hook.py
 ##
 @@ -0,0 +1,133 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+This hook provides a minimal, thin wrapper around the neo4j python library for query execution
+"""
+import csv
+
+from neo4j import BoltStatementResult, Driver, GraphDatabase, Session
+
+from airflow.hooks.base_hook import BaseHook
+
+
+class Neo4JHook(BaseHook):
+"""
+This class enables the neo4j operator to execute queries against a 
configured neo4j server.
+It requires the configuration name as set in Airflow -> Admin -> 
Connections
+
+:param n4j_conn_id: Name of connection configured in Airflow
+:type n4j_conn_id: str
+"""
+n4j_conn_id: str
+
+def __init__(self, n4j_conn_id: str = 'n4j_default', *args, **kwargs):
+self.n4j_conn_id = n4j_conn_id
+
+@staticmethod
+def get_config(n4j_conn_id: str) -> dict:
+"""
+Obtain the Username + Password from the Airflow connection definition
+Store them in _config dictionary as:
+*credentials* -- a tuple of username/password eg. ("username", 
"password")
+*host* -- String for Neo4J URI eg. "bolt://1.1.1.1:7687"
+
+:param n4j_conn_id: Name of connection configured in Airflow
+:type n4j_conn_id: str
+:return: dictionary with configuration values
+:rtype: dict
+"""
+config: dict = {}
 
 Review comment:
   Have changed to a generic Dict, but had to type the value with Any as it's a 
mix of a tuple and a string. Let me know what you think.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] baolsen commented on a change in pull request #6999: [AIRFLOW-XXXX] Clarify wait_for_downstream and execution_date

2020-01-02 Thread GitBox
baolsen commented on a change in pull request #6999: [AIRFLOW-XXXX] Clarify 
wait_for_downstream and execution_date
URL: https://github.com/apache/airflow/pull/6999#discussion_r362704684
 
 

 ##
 File path: docs/tutorial.rst
 ##
 @@ -313,8 +315,16 @@ to track the progress. ``airflow webserver`` will start a 
web server if you
 are interested in tracking the progress visually as your backfill progresses.
 
 Note that if you use ``depends_on_past=True``, individual task instances
-will depend on the success of the preceding task instance, except for the
-start_date specified itself, for which this dependency is disregarded.
+will depend on the success of their previous task instance (that is, previous
 
 Review comment:
   Thanks for the feedback. 
   
   So I wasn't 100% decided between "preceding" vs "previous", but decided to 
change it to previous because "previous" is the word used elsewhere in the 
documentation and code to refer to "past instances". 
   I learned yesterday that the two words have very similar definitions and can 
be used interchangeably :). 
   
   It can be confusing to a developer. "previous + next" usually go together as 
concepts and refer to objects that are linked somehow, yet in this context the 
tasks that are linked are actually called "upstream/downstream". 
   
   However, across the code we have used "previous" to refer to "past 
instances" of tasks and dag runs, meaning perhaps it is more natural to refer 
to "previous" things rather than "past"/"preceding"/"prior" things.
   
   I've just found the Concepts section and I think it would be good to add 
some definitions here. As a new user it can be really confusing to understand 
upstream/downstream vs previous/next execution. Execution_date isn't even 
mentioned in Concepts at all; I'll see if I can add something.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #6756: [AIRFLOW-6198] Add types to core classes.

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6756: [AIRFLOW-6198] Add types 
to core classes.
URL: https://github.com/apache/airflow/pull/6756#discussion_r362701017
 
 

 ##
 File path: airflow/cli/commands/dag_command.py
 ##
 @@ -41,7 +41,7 @@ def _tabulate_dag_runs(dag_runs: List[DagRun], 
tablefmt="fancy_grid"):
 'Run ID': dag_run.run_id,
 'State': dag_run.state,
 'DAG ID': dag_run.dag_id,
-'Execution date': dag_run.execution_date.isoformat(),
+'Execution date': dag_run.execution_date.isoformat() if 
dag_run.execution_date else "",
 
 Review comment:
   ```suggestion
   'Execution date': dag_run.execution_date.isoformat() if 
dag_run.execution_date else None,
   ```
   JSON supports null type, so we can use it instead of empty text.
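A quick sketch of the difference the suggestion makes once a row is serialized, using only the standard library:

```python
import json

execution_date = None  # e.g. a DagRun that has not started yet

# Original form: missing date becomes an empty string.
row_empty = {"Execution date": execution_date.isoformat() if execution_date else ""}
# Suggested form: missing date becomes None, i.e. JSON null.
row_null = {"Execution date": execution_date.isoformat() if execution_date else None}

# JSON distinguishes a missing value (null) from an empty string.
print(json.dumps(row_empty))  # {"Execution date": ""}
print(json.dumps(row_null))   # {"Execution date": null}
```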


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4453) none_failed trigger rule cascading skipped state to downstream tasks

2020-01-02 Thread Joel Croteau (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4453?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007203#comment-17007203
 ] 

Joel Croteau commented on AIRFLOW-4453:
---

I can actually reproduce this on 1.10.3 with a much simpler DAG:
{code:java}
import datetime

from airflow.models import SkipMixin, BaseOperator, DAG
from airflow.operators.dummy_operator import DummyOperator


class SkipOperator(BaseOperator, SkipMixin):
def execute(self, context=None):
# Skip all downstream tasks
ti = context['ti']
dag_run = ti.get_dagrun()
task = ti.task

downstream_tasks = task.downstream_list

if downstream_tasks:
self.skip(dag_run, ti.execution_date, downstream_tasks)


with DAG(
schedule_interval=None,
dag_id='simple_skip',
start_date=datetime.datetime(2019, 12, 18),
) as dag:
skip_operator = SkipOperator(task_id='skip_operator')
step_1 = DummyOperator(task_id='step_1')
step_2 = DummyOperator(task_id='step_2', trigger_rule='none_failed')
skip_operator >> step_1 >> step_2
{code}
This also skips on 1.10.3. I haven't tried it in newer versions.

!simple_skip.png!

> none_failed trigger rule cascading skipped state to downstream tasks
> 
>
> Key: AIRFLOW-4453
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4453
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG
>Affects Versions: 1.10.3
>Reporter: Dmytro Kulyk
>Priority: Major
>  Labels: skipped
> Attachments: cube_update.py, image-2019-05-02-18-11-28-307.png, 
> simple_skip.png
>
>
> A task with trigger_rule = 'none_failed' cascades *skipped* status to the 
> downstream task:
>  * the task has multiple upstream tasks
>  * trigger_rule is set to 'none_failed'
>  * some of the upstream tasks can be skipped due to *latest only*
> Based on the documentation, this shouldn't happen
>  !image-2019-05-02-18-11-28-307.png|width=655,height=372! 
>  DAG attached



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-4453) none_failed trigger rule cascading skipped state to downstream tasks

2020-01-02 Thread Joel Croteau (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4453?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joel Croteau updated AIRFLOW-4453:
--
Attachment: simple_skip.png

> none_failed trigger rule cascading skipped state to downstream tasks
> 
>
> Key: AIRFLOW-4453
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4453
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG
>Affects Versions: 1.10.3
>Reporter: Dmytro Kulyk
>Priority: Major
>  Labels: skipped
> Attachments: cube_update.py, image-2019-05-02-18-11-28-307.png, 
> simple_skip.png
>
>
> A task with trigger_rule = 'none_failed' cascades *skipped* status to the 
> downstream task:
>  * the task has multiple upstream tasks
>  * trigger_rule is set to 'none_failed'
>  * some of the upstream tasks can be skipped due to *latest only*
> Based on the documentation, this shouldn't happen
>  !image-2019-05-02-18-11-28-307.png|width=655,height=372! 
>  DAG attached





[jira] [Updated] (AIRFLOW-6435) Mount secret files from secrets config

2020-01-02 Thread Brandon Clark (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brandon Clark updated AIRFLOW-6435:
---
Description: There should be a dynamic way to add protected files to a pod.  
Just as git-sync requires an ssh key to be mounted, so do other software suites 
and processes that can be run from Airflow.  (was: There should be a dynamic way 
to add protected files to a pod.  Just as git-sync requires an ssh key to be 
mounted, so do other software suites and processes that can be run from Airflow.

I have created an implementation which requires the following entry in 
airflow.cfg:
{code}
secret_file_filenames = airflow.pem,client.key
secret_file_secret = airflow
secret_file_dir = /root/.ssh {code})

> Mount secret files from secrets config
> --
>
> Key: AIRFLOW-6435
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6435
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 2.0.0, 1.10.8
>Reporter: Brandon Clark
>Assignee: Brandon Clark
>Priority: Major
>  Labels: Kubernetes, executor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> There should be a dynamic way to add protected files to a pod.  Just as 
> git-sync requires an ssh key to be mounted, so do other software suites and 
> processes that can be run from Airflow.





[jira] [Updated] (AIRFLOW-6435) Mount secret files from secrets config

2020-01-02 Thread Brandon Clark (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brandon Clark updated AIRFLOW-6435:
---
Description: 
There should be a dynamic way to add protected files to a pod.  Just as git-sync 
requires an ssh key to be mounted, so do other software suites and processes 
that can be run from Airflow.

I have created an implementation which requires the following entry in 
airflow.cfg:
{code}
secret_file_filenames = airflow.pem,client.key
secret_file_secret = airflow
secret_file_dir = /root/.ssh {code}

  was:
There should be a dynamic way to add protected files to a pod.  Just as git-sync 
requires an ssh key to be mounted, so do other software suites and processes 
that can be run from Airflow.

I have created an implementation which requires the following entry in 
airflow.cfg:
{code}

secret_file_filenames = airflow.pem,client.key
secret_file_name = airflow
secret_file_dir = /root/.ssh {code}


> Mount secret files from secrets config
> --
>
> Key: AIRFLOW-6435
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6435
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 2.0.0, 1.10.8
>Reporter: Brandon Clark
>Assignee: Brandon Clark
>Priority: Major
>  Labels: Kubernetes, executor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> There should be a dynamic way to add protected files to a pod.  Just as 
> git-sync requires an ssh key to be mounted, so do other software suites and 
> processes that can be run from Airflow.
> I have created an implementation which requires the following entry in 
> airflow.cfg:
> {code}
> secret_file_filenames = airflow.pem,client.key
> secret_file_secret = airflow
> secret_file_dir = /root/.ssh {code}
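As an illustration of the proposal, the three quoted settings could map onto Kubernetes secret volume mounts roughly like this (a sketch only; the function and spec shapes are hypothetical, not the implementation from the issue):

```python
# Hypothetical translation of the proposed airflow.cfg entries into
# Kubernetes-style volume/volumeMount specs for the worker pod.
def secret_file_mounts(cfg):
    filenames = [name.strip() for name in cfg["secret_file_filenames"].split(",")]
    return {
        "volume": {
            "name": "secret-files",
            "secret": {"secretName": cfg["secret_file_secret"]},
        },
        "volumeMounts": [
            {
                "name": "secret-files",
                "mountPath": "{}/{}".format(cfg["secret_file_dir"], name),
                "subPath": name,
                "readOnly": True,
            }
            for name in filenames
        ],
    }

spec = secret_file_mounts({
    "secret_file_filenames": "airflow.pem,client.key",
    "secret_file_secret": "airflow",
    "secret_file_dir": "/root/.ssh",
})
assert spec["volumeMounts"][0]["mountPath"] == "/root/.ssh/airflow.pem"
assert spec["volumeMounts"][1]["subPath"] == "client.key"
```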





[jira] [Updated] (AIRFLOW-6435) Mount secret files from secrets config

2020-01-02 Thread Brandon Clark (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brandon Clark updated AIRFLOW-6435:
---
Description: 
There should be a dynamic way to add protected files to a pod.  Just as git-sync 
requires an ssh key to be mounted, so do other software suites and processes 
that can be run from Airflow.

I have created an implementation which requires the following entry in 
airflow.cfg:
{code}

secret_file_filenames = airflow.pem,client.key
secret_file_name = airflow
secret_file_dir = /root/.ssh {code}

  was:There should be a dynamic way to add protected files to a pod.  Just as 
git-sync requires an ssh key to be mounted, so do other software suites and 
processes that can be run from Airflow.


> Mount secret files from secrets config
> --
>
> Key: AIRFLOW-6435
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6435
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 2.0.0, 1.10.8
>Reporter: Brandon Clark
>Assignee: Brandon Clark
>Priority: Major
>  Labels: Kubernetes, executor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> There should be a dynamic way to add protected files to a pod.  Just as 
> git-sync requires an ssh key to be mounted, so do other software suites and 
> processes that can be run from Airflow.
> I have created an implementation which requires the following entry in 
> airflow.cfg:
> {code}
> secret_file_filenames = airflow.pem,client.key
> secret_file_name = airflow
> secret_file_dir = /root/.ssh {code}





[jira] [Created] (AIRFLOW-6435) Mount secret files from secrets config

2020-01-02 Thread Brandon Clark (Jira)
Brandon Clark created AIRFLOW-6435:
--

 Summary: Mount secret files from secrets config
 Key: AIRFLOW-6435
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6435
 Project: Apache Airflow
  Issue Type: New Feature
  Components: executor-kubernetes
Affects Versions: 2.0.0, 1.10.8
Reporter: Brandon Clark
Assignee: Brandon Clark


There should be a dynamic way to add protected files to a pod.  Just as git-sync 
requires an ssh key to be mounted, so do other software suites and processes 
that can be run from Airflow.





[GitHub] [airflow] codecov-io edited a comment on issue #6999: [AIRFLOW-XXXX] Clarify wait_for_downstream and execution_date

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #6999: [AIRFLOW-XXXX] Clarify 
wait_for_downstream and execution_date
URL: https://github.com/apache/airflow/pull/6999#issuecomment-570196566
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=h1) 
Report
   > Merging 
[#6999](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/be812bd660fac4621a25f3269fded8d4e03b0023?src=pr=desc)
 will **decrease** coverage by `0.36%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6999/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#6999  +/-   ##
   ==
   - Coverage   84.85%   84.48%   -0.37% 
   ==
 Files 679  679  
 Lines   3853638536  
   ==
   - Hits3269832558 -140 
   - Misses   5838 5978 +140
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/ti\_deps/deps/prev\_dagrun\_dep.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcHJldl9kYWdydW5fZGVwLnB5)
 | `80.55% <ø> (ø)` | :arrow_up: |
   | 
[airflow/models/baseoperator.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvYmFzZW9wZXJhdG9yLnB5)
 | `96.25% <ø> (ø)` | :arrow_up: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5)
 | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `78.75% <0%> (-20%)` | :arrow_down: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `94.91% <0%> (-1.7%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `90.08% <0%> (-1.66%)` | :arrow_down: |
   | ... and [2 
more](https://codecov.io/gh/apache/airflow/pull/6999/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=footer). 
Last update 
[be812bd...4c3da95](https://codecov.io/gh/apache/airflow/pull/6999?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] mik-laj opened a new pull request #7014: [AIRFLOW-XXXX] Add `airflow dags show` command guide

2020-01-02 Thread GitBox
mik-laj opened a new pull request #7014: [AIRFLOW-XXXX] Add `airflow dags show` 
command guide
URL: https://github.com/apache/airflow/pull/7014
 
 
   Add documentation about this little-known feature.
   
   ---
   Link to JIRA issue: https://issues.apache.org/jira/browse/AIRFLOW-XXXX
   
   - [X] Description above provides context of the change
   - [X] Commit message starts with `[AIRFLOW-XXXX]`, where AIRFLOW-XXXX = JIRA 
ID*
   - [X] Unit tests coverage for changes (not needed for documentation changes)
   - [X] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [X] Relevant documentation is updated including usage instructions.
   - [X] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   (*) For document-only changes, no JIRA issue is needed. Commit message 
starts `[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[GitHub] [airflow] bphillips-exos opened a new pull request #7013: [AIRFLOW-6434] Add return back to DockerOperator.execute

2020-01-02 Thread GitBox
bphillips-exos opened a new pull request #7013: [AIRFLOW-6434] Add return back 
to DockerOperator.execute
URL: https://github.com/apache/airflow/pull/7013
 
 
   https://issues.apache.org/jira/browse/AIRFLOW-6434
   
   This change 
(https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)
 introduced a slight (and I believe unintended) change to the Docker Operator 
xcom behavior. Even if xcom_push is True, DockerOperator.execute will not 
return a value and thus will not push an xcom value.
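A minimal sketch of the mechanism at stake (simplified; `TinyOperator` is hypothetical, not Airflow's BaseOperator): the XCom `return_value` comes from whatever `execute()` returns, so dropping the `return` silently disables the push.

```python
# Simplified stand-in for the operator/XCom interaction described above.
class TinyOperator:
    def __init__(self):
        self.xcoms = {}

    def run(self, context=None):
        result = self.execute(context)
        if result is not None:          # nothing returned -> nothing pushed
            self.xcoms["return_value"] = result

class BrokenDocker(TinyOperator):
    def execute(self, context=None):
        logs = "container output"
        # Bug analogue: missing `return logs`, so no XCom is pushed.

class FixedDocker(TinyOperator):
    def execute(self, context=None):
        return "container output"       # returned value becomes the XCom

broken, fixed = BrokenDocker(), FixedDocker()
broken.run()
fixed.run()
assert "return_value" not in broken.xcoms
assert fixed.xcoms["return_value"] == "container output"
```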
   
   




[jira] [Commented] (AIRFLOW-6434) Docker Operator No Longer XComs Result in 1.10.7

2020-01-02 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007172#comment-17007172
 ] 

ASF GitHub Bot commented on AIRFLOW-6434:
-

bphillips-exos commented on pull request #7013: [AIRFLOW-6434] Add return back 
to DockerOperator.execute
URL: https://github.com/apache/airflow/pull/7013
 
 
   https://issues.apache.org/jira/browse/AIRFLOW-6434
   
   This change 
(https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)
 introduced a slight (and I believe unintended) change to the Docker Operator 
xcom behavior. Even if xcom_push is True, DockerOperator.execute will not 
return a value and thus will not push an xcom value.
   
   
 



> Docker Operator No Longer XComs Result in 1.10.7
> 
>
> Key: AIRFLOW-6434
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6434
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators, xcom
>Affects Versions: 1.10.7
>Reporter: Brian Phillips
>Priority: Trivial
>
> This change 
> ([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]
>  introduced a slight (and I believe unintended) change to the Docker Operator 
> xcom behavior.
> Even if xcom_push is True, DockerOperator.execute will not return a value and 
> thus will not push an xcom value.





[jira] [Commented] (AIRFLOW-4453) none_failed trigger rule cascading skipped state to downstream tasks

2020-01-02 Thread Joel Croteau (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4453?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007170#comment-17007170
 ] 

Joel Croteau commented on AIRFLOW-4453:
---

[https://xkcd.com/979/] Definitely seeing this in 1.10.3 (what Cloud Composer 
is using). I don't know if it's been fixed in newer versions.

> none_failed trigger rule cascading skipped state to downstream tasks
> 
>
> Key: AIRFLOW-4453
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4453
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG
>Affects Versions: 1.10.3
>Reporter: Dmytro Kulyk
>Priority: Major
>  Labels: skipped
> Attachments: cube_update.py, image-2019-05-02-18-11-28-307.png
>
>
> A task with trigger_rule = 'none_failed' cascades *skipped* status to the 
> downstream task:
>  * the task has multiple upstream tasks
>  * trigger_rule is set to 'none_failed'
>  * some of the upstream tasks can be skipped due to *latest only*
> Based on the documentation, this shouldn't happen
>  !image-2019-05-02-18-11-28-307.png|width=655,height=372! 
>  DAG attached





[GitHub] [airflow] codecov-io edited a comment on issue #7006: [AIRFLOW-6112] [AIP-21] Rename GCP SQL operators and hooks

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #7006: [AIRFLOW-6112] [AIP-21] Rename GCP 
SQL operators and hooks
URL: https://github.com/apache/airflow/pull/7006#issuecomment-570254220
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7006?src=pr=h1) 
Report
   > Merging 
[#7006](https://codecov.io/gh/apache/airflow/pull/7006?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/aa90753cf5f64bf435044cf2e6b81a02fdcf6b33?src=pr=desc)
 will **decrease** coverage by `0.36%`.
   > The diff coverage is `56.56%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7006/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7006?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#7006  +/-   ##
   ==
   - Coverage   84.85%   84.48%   -0.37% 
   ==
 Files 679  679  
 Lines   3854238590  +48 
   ==
   - Hits3270332601 -102 
   - Misses   5839 5989 +150
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7006?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `68.78% <0%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/operators/gcp\_sql\_operator.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9nY3Bfc3FsX29wZXJhdG9yLnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...irflow/gcp/example\_dags/example\_cloud\_sql\_query.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvZXhhbXBsZV9kYWdzL2V4YW1wbGVfY2xvdWRfc3FsX3F1ZXJ5LnB5)
 | `98.33% <100%> (ø)` | :arrow_up: |
   | 
[airflow/gcp/hooks/cloud\_sql.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvaG9va3MvY2xvdWRfc3FsLnB5)
 | `65.94% <100%> (ø)` | :arrow_up: |
   | 
[airflow/gcp/operators/cloud\_sql.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9nY3Avb3BlcmF0b3JzL2Nsb3VkX3NxbC5weQ==)
 | `84.61% <100%> (ø)` | :arrow_up: |
   | 
[airflow/gcp/example\_dags/example\_cloud\_sql.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvZXhhbXBsZV9kYWdzL2V4YW1wbGVfY2xvdWRfc3FsLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/hooks/gcp\_sql\_hook.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2djcF9zcWxfaG9vay5weQ==)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | ... and [2 
more](https://codecov.io/gh/apache/airflow/pull/7006/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7006?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7006?src=pr=footer). 
Last update 
[aa90753...1a51cbe](https://codecov.io/gh/apache/airflow/pull/7006?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Updated] (AIRFLOW-6434) Docker Operator No Longer XComs Result in 1.10.7

2020-01-02 Thread Brian Phillips (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brian Phillips updated AIRFLOW-6434:

Description: 
This change 
([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]
 introduced a slight (and I believe unintended) change to the Docker Operator 
xcom behavior.

Even if xcom_push is True, DockerOperator.execute will not return a value and 
thus will not push an xcom value.

  was:
This 
[change]([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]
 introduced a slight (and I believe unintended) change to the Docker Operator 
xcom behavior.

Even if xcom_push is True, DockerOperator.execute will not return a value and 
thus will not push an xcom value.


> Docker Operator No Longer XComs Result in 1.10.7
> 
>
> Key: AIRFLOW-6434
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6434
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators, xcom
>Affects Versions: 1.10.7
>Reporter: Brian Phillips
>Priority: Trivial
>
> This change 
> ([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]
>  introduced a slight (and I believe unintended) change to the Docker Operator 
> xcom behavior.
> Even if xcom_push is True, DockerOperator.execute will not return a value and 
> thus will not push an xcom value.





[jira] [Updated] (AIRFLOW-6434) Docker Operator No Longer XComs Result in 1.10.7

2020-01-02 Thread Brian Phillips (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brian Phillips updated AIRFLOW-6434:

Description: 
This 
[change]([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]
 introduced a slight (and I believe unintended) change to the Docker Operator 
xcom behavior.

Even if xcom_push is True, DockerOperator.execute will not return a value and 
thus will not push an xcom value.

  was:
This 
[change|[https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]]
 introduced a slight (and I believe unintended) change to the Docker Operator 
xcom behavior.

Even if xcom_push is True, DockerOperator.execute will not return a value and 
thus will not push an xcom value.


> Docker Operator No Longer XComs Result in 1.10.7
> 
>
> Key: AIRFLOW-6434
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6434
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators, xcom
>Affects Versions: 1.10.7
>Reporter: Brian Phillips
>Priority: Trivial
>
> This 
> [change]([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]
>  introduced a slight (and I believe unintended) change to the Docker Operator 
> xcom behavior.
> Even if xcom_push is True, DockerOperator.execute will not return a value and 
> thus will not push an xcom value.





[jira] [Created] (AIRFLOW-6434) Docker Operator No Longer XComs Result in 1.10.7

2020-01-02 Thread Brian Phillips (Jira)
Brian Phillips created AIRFLOW-6434:
---

 Summary: Docker Operator No Longer XComs Result in 1.10.7
 Key: AIRFLOW-6434
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6434
 Project: Apache Airflow
  Issue Type: Bug
  Components: operators, xcom
Affects Versions: 1.10.7
Reporter: Brian Phillips


This 
[change|[https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212]]
 introduced a slight (and I believe unintended) change to the Docker Operator 
xcom behavior.

Even if xcom_push is True, DockerOperator.execute will not return a value and 
thus will not push an xcom value.





[GitHub] [airflow] houqp commented on issue #6342: [AIRFLOW-5662] fix incorrect naming for scheduler used slot metric

2020-01-02 Thread GitBox
houqp commented on issue #6342: [AIRFLOW-5662] fix incorrect naming for 
scheduler used slot metric
URL: https://github.com/apache/airflow/pull/6342#issuecomment-570424113
 
 
   @ashb ready for another round of review :) Pool slot metrics collection has 
now been reduced to 2 DB queries regardless of the number of pools in the system.
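The idea here — one aggregate query covering every pool instead of a query per pool — can be sketched with an in-memory table (illustrative schema and states, not Airflow's actual metadata tables):

```python
import sqlite3

# Toy task_instance table: one aggregate query yields per-pool slot usage.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE task_instance (pool TEXT, state TEXT, pool_slots INT)")
con.executemany(
    "INSERT INTO task_instance VALUES (?, ?, ?)",
    [("default", "running", 1), ("default", "queued", 2), ("etl", "running", 3)],
)
rows = con.execute(
    "SELECT pool, state, SUM(pool_slots) FROM task_instance "
    "WHERE state IN ('running', 'queued') GROUP BY pool, state"
).fetchall()
usage = {(pool, state): slots for pool, state, slots in rows}
assert usage[("default", "running")] == 1
assert usage[("default", "queued")] == 2
assert usage[("etl", "running")] == 3
```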




[GitHub] [airflow] codecov-io edited a comment on issue #6342: [AIRFLOW-5662] fix incorrect naming for scheduler used slot metric

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #6342: [AIRFLOW-5662] fix incorrect naming 
for scheduler used slot metric
URL: https://github.com/apache/airflow/pull/6342#issuecomment-547121627
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=h1) 
Report
   > Merging 
[#6342](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/13c21c14b2f22e0c9f0b429ad4564b4b61c17f76?src=pr=desc)
 will **increase** coverage by `0.11%`.
   > The diff coverage is `95.23%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6342/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#6342  +/-   ##
   ==
   + Coverage   84.63%   84.75%   +0.11% 
   ==
 Files 679  679  
 Lines   3854538572  +27 
   ==
   + Hits3262332691  +68 
   + Misses   5922 5881  -41
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/ti\_deps/deps/pool\_slots\_available\_dep.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcG9vbF9zbG90c19hdmFpbGFibGVfZGVwLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models/pool.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvcG9vbC5weQ==)
 | `96.55% <95%> (-0.82%)` | :arrow_down: |
   | 
[airflow/jobs/scheduler\_job.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL3NjaGVkdWxlcl9qb2IucHk=)
 | `89.38% <95.23%> (+0.1%)` | :arrow_up: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5)
 | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `78.75% <0%> (-20%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `91.73% <0%> (+0.82%)` | :arrow_up: |
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `68.78% <0%> (+0.97%)` | :arrow_up: |
   | ... and [5 
more](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=footer). 
Last update 
[13c21c1...07f2628](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #6342: [AIRFLOW-5662] fix incorrect naming for scheduler used slot metric

2020-01-02 Thread GitBox
codecov-io edited a comment on issue #6342: [AIRFLOW-5662] fix incorrect naming 
for scheduler used slot metric
URL: https://github.com/apache/airflow/pull/6342#issuecomment-547121627
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=h1) 
Report
   > Merging 
[#6342](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/f391039be9cb5a767f4c66771ba70031210d3e76?src=pr=desc)
 will **decrease** coverage by `0.34%`.
   > The diff coverage is `95.23%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6342/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #6342      +/-   ##
   ==========================================
   - Coverage   85.03%   84.68%   -0.35%
   ==========================================
     Files         679      679
     Lines       38545    38572      +27
   ==========================================
   - Hits        32775    32665     -110
   - Misses       5770     5907     +137
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/ti\_deps/deps/pool\_slots\_available\_dep.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcG9vbF9zbG90c19hdmFpbGFibGVfZGVwLnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models/pool.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvcG9vbC5weQ==)
 | `96.55% <95%> (-0.82%)` | :arrow_down: |
   | 
[airflow/jobs/scheduler\_job.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzL3NjaGVkdWxlcl9qb2IucHk=)
 | `89.38% <95.23%> (+0.1%)` | :arrow_up: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/refresh\_config.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3JlZnJlc2hfY29uZmlnLnB5)
 | `50.98% <0%> (-23.53%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `78.75% <0%> (-20%)` | :arrow_down: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `95% <0%> (-1.67%)` | :arrow_down: |
   | ... and [2 more](https://codecov.io/gh/apache/airflow/pull/6342/diff?src=pr=tree-more) | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=footer). 
Last update 
[f391039...07f2628](https://codecov.io/gh/apache/airflow/pull/6342?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Commented] (AIRFLOW-6433) reduce conf.get lookups in scheduler_job.py loops

2020-01-02 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007142#comment-17007142
 ] 

ASF GitHub Bot commented on AIRFLOW-6433:
-

tooptoop4 commented on pull request #7012: [AIRFLOW-6433] reduce conf.get 
lookups in scheduler_job.py loops
URL: https://github.com/apache/airflow/pull/7012
 
 
   ---
   Link to JIRA issue: https://issues.apache.org/jira/browse/AIRFLOW-6433
   
   - [x] Description above provides context of the change
   - [x] Commit message starts with `[AIRFLOW-6433]`, where AIRFLOW- = JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   (*) For document-only changes, no JIRA issue is needed. Commit message 
starts `[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 



> reduce conf.get lookups in scheduler_job.py loops
> -
>
> Key: AIRFLOW-6433
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6433
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: t oo
>Priority: Trivial
>
> trivial perf benefit available
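The hoisting the ticket asks for can be sketched as follows. This is a self-contained illustration with a counting stand-in for `conf.get`; the real scheduler would call `airflow.configuration.conf` instead, and the function names here are hypothetical.

```python
# Stand-in for airflow.configuration.conf so the sketch is self-contained;
# it counts lookups to make the difference measurable.
lookups = {"count": 0}

def conf_get(section, key):
    lookups["count"] += 1
    return "True"

def process_naive(items):
    # Re-reads the config value on every loop iteration.
    return sum(1 for _ in items if conf_get("core", "flag") == "True")

def process_hoisted(items):
    # Reads the config value once, before the loop.
    flag = conf_get("core", "flag") == "True"
    return sum(1 for _ in items if flag)

items = range(1000)
lookups["count"] = 0
process_naive(items)
naive_calls = lookups["count"]    # 1000 lookups

lookups["count"] = 0
process_hoisted(items)
hoisted_calls = lookups["count"]  # 1 lookup
```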



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work started] (AIRFLOW-6433) reduce conf.get lookups in scheduler_job.py loops

2020-01-02 Thread t oo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-6433 started by t oo.
-
> reduce conf.get lookups in scheduler_job.py loops
> -
>
> Key: AIRFLOW-6433
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6433
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: t oo
>Assignee: t oo
>Priority: Trivial
>
> trivial perf benefit available





[GitHub] [airflow] tooptoop4 opened a new pull request #7012: [AIRFLOW-6433] reduce conf.get lookups in scheduler_job.py loops

2020-01-02 Thread GitBox
tooptoop4 opened a new pull request #7012: [AIRFLOW-6433] reduce conf.get 
lookups in scheduler_job.py loops
URL: https://github.com/apache/airflow/pull/7012
 
 
   ---
   Link to JIRA issue: https://issues.apache.org/jira/browse/AIRFLOW-6433
   
   - [x] Description above provides context of the change
   - [x] Commit message starts with `[AIRFLOW-6433]`, where AIRFLOW- = JIRA ID*
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   (*) For document-only changes, no JIRA issue is needed. Commit message 
starts `[AIRFLOW-]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[jira] [Updated] (AIRFLOW-6433) reduce conf.get lookups in scheduler_job.py loops

2020-01-02 Thread t oo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

t oo updated AIRFLOW-6433:
--
Description: trivial perf benefit available

> reduce conf.get lookups in scheduler_job.py loops
> -
>
> Key: AIRFLOW-6433
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6433
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: t oo
>Priority: Trivial
>
> trivial perf benefit available





[jira] [Created] (AIRFLOW-6433) reduce conf.get lookups in scheduler_job.py loops

2020-01-02 Thread t oo (Jira)
t oo created AIRFLOW-6433:
-

 Summary: reduce conf.get lookups in scheduler_job.py loops
 Key: AIRFLOW-6433
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6433
 Project: Apache Airflow
  Issue Type: Improvement
  Components: scheduler
Affects Versions: 1.10.7
Reporter: t oo








[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362663382
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+===
+
+.. _config-ref/core:
+
+``[core]``
+^^
+
+dags_folder
 
 Review comment:
   Here is  doc with sections: https://pastebin.com/aL7qbni1 (take it or throw 
it away ;-))




[GitHub] [airflow] dstandish edited a comment on issue #6850: [AIRFLOW-6296] add mssql odbc hook

2020-01-02 Thread GitBox
dstandish edited a comment on issue #6850: [AIRFLOW-6296] add mssql odbc hook
URL: https://github.com/apache/airflow/pull/6850#issuecomment-570388793
 
 
   @baolsen 
   > I really feel that SQL Alchemy might be useful here. Using different 
dialects allows us to be backwards compatible and let the user decide whether 
to use pymssql or pyodbc (with the extra effort that it requires).
   
   It's an interesting idea, but I think it actually increases complexity to go 
that way: then your hook logic, and the design of your connection object 
parsing, has to be able to handle arbitrary mssql libraries. 
   E.g. with pyodbc we can supply arbitrary odbc connection properties as 
key-value pairs in conn.extra. What would you do with these in pymssql? You could 
do it, but I think it is cleaner and safer to keep things separate.
   And where in the conn object would you put the dialect selector?
   You'd still have to have an extra option at install for odbc support, to 
enable pyodbc.
   And while this could potentially be used for switching between pyodbc and 
pymssql, it could not be used for turbodbc or bcp.
   
   And I think that might be why, conventionally in airflow, it's the other way 
around -- hooks are defined in accordance with the idiosyncrasies of their 
connectors, and they can optionally generate a sqlalchemy connection. And this 
PR does provide support for that here: 
(https://github.com/apache/airflow/blob/1b3b907ed6e8a45affa269b624dfe420d74424ed/airflow/providers/mssql/hooks/mssql_odbc.py#L156)
   (the `get_uri` method is used by the dbapi hook to produce a sqlalchemy 
engine; I also provide a get_sqlalchemy connection method)
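The pattern described above -- arbitrary ODBC properties from conn.extra folded into a connection string, then exposed as a SQLAlchemy URI -- can be sketched like this. This is a simplified stand-alone sketch, not the PR's actual hook; the function name, driver string, and parameters are illustrative.

```python
from urllib.parse import quote_plus

def build_odbc_uri(host, database, login, password, extras):
    """Fold arbitrary key-value ODBC properties (as they might arrive in
    conn.extra) into an ODBC connection string, then wrap it as a
    'mssql+pyodbc' SQLAlchemy URI via the odbc_connect query parameter."""
    pairs = {
        "DRIVER": "{ODBC Driver 17 for SQL Server}",
        "SERVER": host,
        "DATABASE": database,
        "UID": login,
        "PWD": password,
    }
    # Any extra ODBC property simply becomes another KEY=value pair.
    pairs.update(extras)
    conn_str = ";".join(f"{k}={v}" for k, v in pairs.items())
    return "mssql+pyodbc:///?odbc_connect=" + quote_plus(conn_str)

uri = build_odbc_uri("db.example.com", "mydb", "user", "secret",
                     {"Encrypt": "yes", "TrustServerCertificate": "no"})
```

With pymssql there is no natural place for such pass-through key-value pairs, which is the separation argument being made here.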
   
   
   






[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362656314
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+===
+
+.. _config-ref/core:
+
+``[core]``
+^^
+
+dags_folder
 
 Review comment:
   Yes. I mean creating a label for each key so that we can include a link to 
each key in the documentation. If you want I can do it very quickly using a 
regular expression. ;-D
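The regular-expression approach mentioned here could look roughly like this (a sketch assuming each config key is an RST heading underlined with dashes; the label format `.. _config:section:key:` is illustrative, not the one actually merged):

```python
import re

# Toy excerpt of the reference page; the real file is
# docs/howto/configurations-ref.rst.
rst = """``[core]``
^^^^^^^^^^

dags_folder
-----------

hostname_callable
-----------------
"""

section = "core"
# Prepend a label to every "key" heading (a word-line underlined with
# dashes) so other pages can cross-reference individual options.
labeled = re.sub(
    r"(?m)^(\w+)\n(-+)$",
    lambda m: f".. _config:{section}:{m.group(1)}:\n\n{m.group(1)}\n{m.group(2)}",
    rst,
)
```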






[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "differently"... 
   
   I am not sure why we are using those imports in this form. Is there anything 
holding us back from changing all "days_ago" imports to the form that is much 
more pythonic (IMHO)? 
   
   Adding anything to `__init__.py` inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency on "airflow.utils.dates" if we add dates to `__init__.py` - even if 
it is not used directly. This adds unnecessary dependencies (and leads to 
circular dependencies).
   
   I think most of our `__init__.py` files should be empty (or removed, if we go 
to implicit python3 packages). I believe adding anything to `__init__.py` only 
makes sense if we provide a reusable library with one package structure - where, 
if you import it, you should have access to all exposed functions. I am happy to 
discuss it though, as we might have different understandings - and maybe we 
should expose all "exposable" classes from utils in this way as part of the 
"official airflow interface" (but I still think the import should be `from 
airflow.utils.xxx import yyy` anyway).
   
   To summarize - we have two options:
   
   1. Import airflow and then rely on the `__init__` packages:
   ```
   import airflow
   ```
   and then use
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO):
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1 (importing the whole 'airflow'):
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png
   
   Option 2 (importing only the function we need):
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png
   
   WDYT @kaxil?
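The implicit-dependency point can be demonstrated with a toy package (hypothetical names, not Airflow's real layout): once `__init__.py` re-exports a submodule, importing any sibling submodule also executes the re-exported one.

```python
import os
import sys
import tempfile

# Build a toy package on disk: myutils/__init__.py re-exports a submodule,
# just as adding "from . import dates" to airflow/utils/__init__.py would.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "myutils")
os.makedirs(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from . import dates\n")   # the re-export under discussion
with open(os.path.join(pkg, "dates.py"), "w") as f:
    f.write("LOADED = True\n")
with open(os.path.join(pkg, "other.py"), "w") as f:
    f.write("VALUE = 42\n")

sys.path.insert(0, root)
import myutils.other                    # the user only asked for 'other'...

# ...but 'dates' was imported too, pulled in by __init__.py.
dates_loaded = "myutils.dates" in sys.modules
```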
   
   




[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "differently" ... 
   
   I am not sure why we are using those imports in this form? Do we have 
something that holds us back from changing all "days_ago" imports to the form 
that is much more pythonic (IMHO)? 
   
   Adding anything to `__init__.py` inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency to "airflow.utils.dates" if we add dates to `__init__.py` - even if 
it is not used directly. This adds unnecessary dependencies (and leads to 
circular dependencies).
   
   I think most of our `__init__.py `should be empty (or removed  if we go to 
implicit python3 packages). I believe adding anything to `__init__.py` makes 
only sense if we provide a reusable library with one package - when if you 
import it, you should have access to all exposed functions. 
   
   I am happy to discuss it though, as we might have different understanding - 
and maybe we should expose all "exposable" classes from unit in this way as 
part of the "official airflow interface" (but I still think import should be 
`from airflow.utils.xxx import yyy` anyway).
   
   Just to summary - we have two options:
   
   1. Import airflow and then rely on the __init__ packages
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO).
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1 (with importing the whole 'airflow').
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png;>
   
   Option 2: (wiht importing only the function we need)
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png;>
   
   WDYT @kaxil?
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "differently" ... 
   
   I am not sure why we are using those imports in this form? Do we have 
something that holds us back from changing all "days_ago" imports to the form 
that is much more pythonic (IMHO)? 
   
   Adding anything to `__init__.py` inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency to "airflow.utils.dates" if we add dates to `__init__.py` - even if 
it is not used directly. This adds unnecessary dependencies (and leads to 
circular dependencies).
   
   I think most of our `__init__.py `should be empty (or removed  if we go to 
implicit python3 packages). I believe adding anything to `__init__.py` makes 
only sense if we provide a reusable library. with one package structure - where 
if you import it, you should have access to all exposed functions. I am happy 
to discuss it though, as we might have different understanding - and maybe we 
should expose all "exposable" classes from unit in this way as part of the 
"official airflow interface" (but I still think import should be `from 
airflow.utils.xxx import yyy` anyway).
   
   Just to summary - we have two options:
   
   1. Import airflow and then rely on the __init__ packages
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO).
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1 (with importing the whole 'airflow').
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png;>
   
   Option 2: (wiht importing only the function we need)
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png;>
   
   WDYT @kaxil?
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "differently" ... 
   
   I am not sure why we are using those imports in this form? Do we have 
something that holds us back from changing all "days_ago" imports to the form 
that is much more pythonic (IMHO)? 
   
   Adding anything to `__init__.py` inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency to "airflow.utils.dates" if we add dates to `__init__.py` even if it 
is not used . This adds unnecessary dependencies (and leads to circular 
dependencies).
   
   I think most of our __init__.py should be empty (or removed  if we go to 
implicit python3 packages). I believe adding anything to __init__.py makes only 
sense if we provide a reusable library. with one package structure - where if 
you import it, you should have access to all exposed functions. I am happy to 
discuss it though, as we might have different understanding - and maybe we 
should expose all "exposable" classes from unit in this way as part of the 
"official airflow interface" (but I still think import should be `from 
airflow.utils.xxx import yyy` anyway).
   
   Just to summary - we have two options:
   
   1. Import airflow and then rely on the __init__ packages
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO).
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1 (with importing the whole 'airflow').
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png;>
   
   Option 2: (wiht importing only the function we need)
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png;>
   
   WDYT @kaxil?
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "differently" ... 
   
   I am not sure why we are using those imports in this form? Do we have 
something that holds us back from changing all "days_ago" imports to the form 
that is much more pythonic (IMHO)? 
   
   Adding anything to __init__.py inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency to "airflow.utils.dates" if we add dates to __init__.py even tit . 
This adds unnecessary dependencies (and leads to circular dependencies).
   
   I think most of our __init__.py should be empty (or removed  if we go to 
implicit python3 packages). I believe adding anything to __init__.py makes only 
sense if we provide a reusable library. with one package structure - where if 
you import it, you should have access to all exposed functions. I am happy to 
discuss it though, as we might have different understanding - and maybe we 
should expose all "exposable" classes from unit in this way as part of the 
"official airflow interface" (but I still think import should be `from 
airflow.utils.xxx import yyy` anyway).
   
   Just to summary - we have two options:
   
   1. Import airflow and then rely on the __init__ packages
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO).
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1 (with importing the whole 'airflow').
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png;>
   
   Option 2: (wiht importing only the function we need)
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png;>
   
   WDYT @kaxil?
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "properly" ... I am not sure why we are using those 
imports in this form? Do we have something that holds us back from changing all 
"days_ago" imports to the form that is much more pythonic (IMHO)? 
   
   Adding anything to __init__.py inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency on "airflow.utils.dates" if we add dates to __init__.py, even if 
they do not use it. This adds unnecessary dependencies (and leads to circular 
dependencies).
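   The implicit-dependency effect can be illustrated with the standard 
library alone; here `os`/`os.path` stand in for 
`airflow.utils`/`airflow.utils.dates` (an analogy, not Airflow code): because 
`os/__init__` re-exports `path`, importing the package silently imports the 
submodule too.

   ```python
   import sys

   # `os/__init__.py` re-exports its `path` submodule, so merely importing
   # the package also pulls `os.path` into sys.modules -- the same implicit
   # dependency a populated airflow/utils/__init__.py would create.
   import os

   assert "os.path" in sys.modules  # imported implicitly by the package __init__

   # The direct form states the dependency explicitly and binds only the one
   # name the caller needs:
   from os.path import join

   print(join("a", "b"))  # "a/b" on POSIX
   ```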
   
   I think most of our __init__.py should be empty (or removed if we go to 
implicit python3 packages). I believe adding anything to __init__.py only makes 
sense if we provide a reusable library with one package structure, where if 
you import it, you should have access to all exposed functions. I am happy to 
discuss it though, as we might have different understanding - and maybe we 
should expose all "exposable" classes from utils in this way as part of the 
"official airflow interface" (but I still think the import should be `from 
airflow.utils.xxx import yyy` anyway).
   
   Just to summarize - we have two options:
   
   1. Import airflow and then rely on the __init__ packages
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO).
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1 (with importing the whole 'airflow').
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png
   
   Option 2 (with importing only the function we need):
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png
   
   WDYT @kaxil?
   
   




[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "properly" ... I am not sure why we are using those 
imports in this form? Do we have something that holds us back from changing all 
"days_ago" imports to the form that is much more pythonic (IMHO)? 
   
   Adding anything to __init__.py inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency on "airflow.utils.dates" if we add dates to __init__.py, even if 
they do not use it. This adds unnecessary dependencies (and leads to circular 
dependencies).
   
   I think most of our __init__.py should be empty (or removed if we go to 
implicit python3 packages). I believe adding anything to __init__.py only makes 
sense if we provide a reusable library with one package structure, where if 
you import it, you should have access to all exposed functions.
   
   Just to summarize - we have two options:
   
   1. Import airflow and then rely on the __init__ packages
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO).
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1 (with importing the whole 'airflow').
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png
   
   Option 2 (with importing only the function we need):
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png
   
   WDYT @kaxil?
   
   




[jira] [Created] (AIRFLOW-6432) Dataproc cluster creation

2020-01-02 Thread David Rabinowitz (Jira)
David Rabinowitz created AIRFLOW-6432:
-

 Summary: Dataproc cluster creation
 Key: AIRFLOW-6432
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6432
 Project: Apache Airflow
  Issue Type: Bug
  Components: operators
Affects Versions: 1.10.7
Reporter: David Rabinowitz


The Dataproc ClusterGenerator does not support the 
endpointConfig.enableHttpPortAccess flag of the clusters.create API call. 
Therefore, when creating a cluster with a significant web UI (such as Jupyter 
or the Spark history server), the UI is not available by default from the GCP 
console.
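For reference, the flag in question lives under the cluster's config in the 
clusters.create request body; a minimal sketch of the JSON (field names per 
the Dataproc v1 API; the cluster name is illustrative):

```json
{
  "clusterName": "example-cluster",
  "config": {
    "endpointConfig": {
      "enableHttpPortAccess": true
    }
  }
}
```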



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "properly" ... I am not sure why we are using those 
imports in this form? Do we have something that holds us back from changing all 
"days_ago" imports to the form that is much more pythonic (IMHO)? 
   
   Adding anything to __init__.py inside the application creates unnecessary 
dependencies. Anybody using "airflow.utils.somethingelse" will add an implicit 
dependency on "airflow.utils.dates" if we add dates to __init__.py, even if 
they do not use it. This adds unnecessary dependencies (and leads to circular 
dependencies).
   
   I think most of our __init__.py should be empty (or removed if we go to 
implicit python3 packages). I believe adding anything to __init__.py only makes 
sense if we provide a reusable library with one package structure, where if 
you import it, you should have access to all exposed functions.
   
   Just to summarize - we have two options:
   
   1. Import airflow and then rely on the __init__ packages
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   2. Import the function directly (much better IMHO).
   
   ```
   from airflow.utils.dates import days_ago
   ```
   
   Option 1:
   
   https://user-images.githubusercontent.com/595491/71696745-13efc580-2db6-11ea-855c-25ca7901ff49.png
   
   Option 2: 
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png
   
   WDYT @kaxil?
   
   




[GitHub] [airflow] potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk edited a comment on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "properly" ... I am not sure why we are using those 
imports in this form? Do we have something that holds us back from changing all 
"days_ago" imports to the form that is much more pythonic (IMHO)? 
   
   I believe 
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   is much worse than 
   ```
   from airflow.utils.dates import days_ago
   ```
   See here:
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png
   
   WDYT @kaxil?
   
   




[GitHub] [airflow] potiuk commented on issue #7007: [AIRFLOW-6428] Add dates module to airflow/utils/__init__.py

2020-01-02 Thread GitBox
potiuk commented on issue #7007: [AIRFLOW-6428] Add dates module to 
airflow/utils/__init__.py
URL: https://github.com/apache/airflow/pull/7007#issuecomment-570373212
 
 
   Why don't we do it "properly" ... I am not sure why we are using those 
imports in this form? Do we have something that holds us back from changing all 
"days_ago" imports to the form that is much more pythonic? I believe 
   ```
   import airflow
   ```
   and then using
   ```
   airflow.utils.dates.days_ago(2)
   ```
   
   is much worse than 
   ```
   from airflow.utils.dates import days_ago
   ```
   See here:
   
   https://user-images.githubusercontent.com/595491/71696484-5d8be080-2db5-11ea-8403-49a845a25510.png
   
   WDYT @kaxil?
   
   




[GitHub] [airflow] potiuk commented on issue #5788: [POC] multi-threading using asyncio

2020-01-02 Thread GitBox
potiuk commented on issue #5788: [POC] multi-threading using asyncio
URL: https://github.com/apache/airflow/pull/5788#issuecomment-570369090
 
 
   Just a comment. I think we might want to discuss and rethink the approach we 
have for different executors including the Local Executor as well.
   
   The #6750 concept of the Native Executor from @nuclearpinguin might 
theoretically be used with any deployment option. It could pretty much replace 
(and simplify) all other executors - CeleryExecutor, KubernetesExecutor, 
LocalExecutor - by choosing the right deployment option (Celery, Kubernetes, 
simple process pool) + the right Kombu transport for communication 
(Redis/RabbitMQ, in-memory, ...).
   
   Something we should discuss at the next sig-scalability meeting.




[GitHub] [airflow] tfindlay-au commented on issue #6604: [AIRFLOW-5920] Neo4j operator and hook

2020-01-02 Thread GitBox
tfindlay-au commented on issue #6604: [AIRFLOW-5920] Neo4j operator and hook
URL: https://github.com/apache/airflow/pull/6604#issuecomment-570364751
 
 
   > I really like the PR, it is neat and clean. Just a few nits and a 
question: would you mind adding an example DAG that uses the new operator? It 
allows users to see how they can use it :)
   
   See feedback here:
   
https://github.com/apache/airflow/pull/6604/files/d7f0127db93f9aec4325e0b25b1402bcb173d326#r360318691
   
   Followed by commit `5668b9d` here:
   
https://github.com/apache/airflow/pull/6604/commits/5668b9deac6916ef65867b48238f021adceb35ae
   
   I can put it back, but feel like we're going around in circles here.




[GitHub] [airflow] potiuk commented on issue #5788: [POC] multi-threading using asyncio

2020-01-02 Thread GitBox
potiuk commented on issue #5788: [POC] multi-threading using asyncio
URL: https://github.com/apache/airflow/pull/5788#issuecomment-570364547
 
 
   @nuclearpinguin -> I think this might also be interesting w/regards to 
NativeExecutor #6750 




[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642651
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+===
+
+.. _config-ref/core:
+
+``[core]``
+^^
+
+dags_folder
+***
+The folder where your airflow pipelines live, most likely a subfolder in a 
code repository. This path must be absolute
+
+hostname_callable
+*
+
+Hostname by providing a path to a callable, which will resolve the hostname. 
The format is "package:function". For example, the default value 
"socket:getfqdn" means that the result of getfqdn() from the "socket" package 
will be used as the hostname. The function specified should require no 
arguments. If using an IP address as the hostname is preferred, use the value 
"airflow.utils.net:get_host_ip_address"
+
+default_timezone
+
+
+Default timezone in case supplied date times are naive. Can be utc (default), 
system, or any IANA timezone string (e.g. Europe/Amsterdam)
+
+executor
+*
+
+The executor class that airflow should use. Choices include 
SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor, 
KubernetesExecutor
+
+
+sql_alchemy_conn
+
+
+The SqlAlchemy connection string to the metadata database. SqlAlchemy supports 
many different database engines; more information on their website
+
+sql_engine_encoding
+***
+
+The encoding for the databases
+
+sql_alchemy_pool_enabled
+
+
+If SqlAlchemy should pool database connections.
+
+sql_alchemy_pool_size
+*
+The SqlAlchemy pool size is the maximum number of database connections in the 
pool. 0 indicates no limit.
+
+sql_alchemy_max_overflow
+
+The maximum overflow size of the pool.  When the number of checked-out 
connections reaches the size set in pool_size, additional connections will be 
returned up to this limit.  When those additional connections are returned to 
the pool, they are disconnected and discarded.  It follows then that the total 
number of simultaneous connections the pool will allow is pool_size + 
max_overflow, and the total number of "sleeping" connections the pool will 
allow is pool_size.  max_overflow can be set to -1 to indicate no overflow 
limit; no limit will be placed on the total number of concurrent connections. 
Defaults to 10.
+
+sql_alchemy_pool_recycle
+
+The SqlAlchemy pool recycle is the number of seconds a connection can be idle 
in the pool before it is invalidated. This config does not apply to sqlite. If 
the number of DB connections is ever exceeded, a lower config value will allow 
the system to recover faster.
+
+sql_alchemy_pool_pre_ping
+*
+Check connection at the start of each connection pool checkout.  Typically, 
this is a simple statement like "SELECT 1".  More information here: 
https://docs.sqlalchemy.org/en/13/core/pooling.html#disconnect-handling-pessimistic
+sql_alchemy_schema
+**
+The schema to use for the metadata database. SqlAlchemy supports databases 
with the concept of multiple schemas.
+
+sql_alchemy_connect_args
+
+
+Import path for connect args in SqlAlchemy. Default to an empty dict.  This is 
useful when you want to configure db engine args that SqlAlchemy won't parse in 
connection string.  See 
https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine.params.connect_args
+
+parallelism
+***
+
+The amount of parallelism as a setting to the executor. This defines the max 
number of task instances that should run simultaneously on this airflow 
installation
+
+dag_concurrency
+***
+
+The number of task instances allowed to run concurrently by the scheduler
+
+dags_are_paused_at_creation
+***
+
+Are DAGs paused by default at creation
+
+max_active_runs_per_dag
+***
+
+The maximum number of active DAG runs per DAG
+
+load_examples
+*
+
+Whether to load 
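   Taken together, the `[core]` pool settings documented above might look like 
this in `airflow.cfg` (illustrative values only, not recommendations; the 
connection string is a placeholder):

   ```ini
   [core]
   sql_alchemy_conn = postgresql+psycopg2://user:pass@localhost:5432/airflow
   sql_alchemy_pool_enabled = True
   sql_alchemy_pool_size = 5
   # up to pool_size + max_overflow = 15 concurrent connections
   sql_alchemy_max_overflow = 10
   # recycle idle connections after 30 minutes
   sql_alchemy_pool_recycle = 1800
   # issue a "SELECT 1"-style liveness check at each checkout
   sql_alchemy_pool_pre_ping = True
   ```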

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642481
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+===
+
+.. _config-ref/core:
+
+``[core]``
+^^
+
+dags_folder
+***
+The folder where your airflow pipelines live, most likely a subfolder in a 
code repository. This path must be absolute
+
+hostname_callable
+*
+
+Hostname by providing a path to a callable, which will resolve the hostname. 
The format is "package:function". For example, the default value 
"socket:getfqdn" means that the result of getfqdn() from the "socket" package 
will be used as the hostname. The function specified should require no 
arguments. If using an IP address as the hostname is preferred, use the value 
"airflow.utils.net:get_host_ip_address"
+
+default_timezone
+
+
+Default timezone in case supplied date times are naive. Can be utc (default), 
system, or any IANA timezone string (e.g. Europe/Amsterdam)
+
+executor
+*
+
+The executor class that airflow should use. Choices include 
SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor, 
KubernetesExecutor
+
+
+sql_alchemy_conn
+
+
+The SqlAlchemy connection string to the metadata database. SqlAlchemy supports 
many different database engines; more information on their website
+
+sql_engine_encoding
+***
+
+The encoding for the databases
+
+sql_alchemy_pool_enabled
+
+
+If SqlAlchemy should pool database connections.
+
+sql_alchemy_pool_size
+*
+The SqlAlchemy pool size is the maximum number of database connections in the 
pool. 0 indicates no limit.
+
+sql_alchemy_max_overflow
+
+The maximum overflow size of the pool.  When the number of checked-out 
connections reaches the size set in pool_size, additional connections will be 
returned up to this limit.  When those additional connections are returned to 
the pool, they are disconnected and discarded.  It follows then that the total 
number of simultaneous connections the pool will allow is pool_size + 
max_overflow, and the total number of "sleeping" connections the pool will 
allow is pool_size.  max_overflow can be set to -1 to indicate no overflow 
limit; no limit will be placed on the total number of concurrent connections. 
Defaults to 10.
+
+sql_alchemy_pool_recycle
+
+The SqlAlchemy pool recycle is the number of seconds a connection can be idle 
in the pool before it is invalidated. This config does not apply to sqlite. If 
the number of DB connections is ever exceeded, a lower config value will allow 
the system to recover faster.
+
+sql_alchemy_pool_pre_ping
+*
+Check connection at the start of each connection pool checkout.  Typically, 
this is a simple statement like "SELECT 1".  More information here: 
https://docs.sqlalchemy.org/en/13/core/pooling.html#disconnect-handling-pessimistic
+sql_alchemy_schema
+**
+The schema to use for the metadata database. SqlAlchemy supports databases 
with the concept of multiple schemas.
+
+sql_alchemy_connect_args
+
+
+Import path for connect args in SqlAlchemy. Default to an empty dict.  This is 
useful when you want to configure db engine args that SqlAlchemy won't parse in 
connection string.  See 
https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine.params.connect_args
+
+parallelism
+***
+
+The amount of parallelism as a setting to the executor. This defines the max 
number of task instances that should run simultaneously on this airflow 
installation
+
+dag_concurrency
+***
+
+The number of task instances allowed to run concurrently by the scheduler
+
+dags_are_paused_at_creation
+***
+
+Are DAGs paused by default at creation
+
+max_active_runs_per_dag
+***
+
+The maximum number of active DAG runs per DAG
+
+load_examples
+*
+
+Whether to load 

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642398
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+===
+
+.. _config-ref/core:
+
+``[core]``
+^^
+
+dags_folder
+***
+The folder where your airflow pipelines live, most likely a subfolder in a 
code repository. This path must be absolute
+
+hostname_callable
+*
+
+Hostname by providing a path to a callable, which will resolve the hostname. 
The format is "package:function". For example, the default value 
"socket:getfqdn" means that the result of getfqdn() from the "socket" package 
will be used as the hostname. The function specified should require no 
arguments. If using an IP address as the hostname is preferred, use the value 
"airflow.utils.net:get_host_ip_address"
+
+default_timezone
+
+
+Default timezone in case supplied date times are naive. Can be utc (default), 
system, or any IANA timezone string (e.g. Europe/Amsterdam)
+
+executor
+*
+
+The executor class that airflow should use. Choices include 
SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor, 
KubernetesExecutor
+
+
+sql_alchemy_conn
+
+
+The SqlAlchemy connection string to the metadata database. SqlAlchemy supports 
many different database engines; more information on their website
+
+sql_engine_encoding
+***
+
+The encoding for the databases
+
+sql_alchemy_pool_enabled
+
+
+If SqlAlchemy should pool database connections.
+
+sql_alchemy_pool_size
+*
+The SqlAlchemy pool size is the maximum number of database connections in the 
pool. 0 indicates no limit.
+
+sql_alchemy_max_overflow
+
+The maximum overflow size of the pool.  When the number of checked-out 
connections reaches the size set in pool_size, additional connections will be 
returned up to this limit.  When those additional connections are returned to 
the pool, they are disconnected and discarded.  It follows then that the total 
number of simultaneous connections the pool will allow is pool_size + 
max_overflow, and the total number of "sleeping" connections the pool will 
allow is pool_size.  max_overflow can be set to -1 to indicate no overflow 
limit; no limit will be placed on the total number of concurrent connections. 
Defaults to 10.
+
+sql_alchemy_pool_recycle
+
+The SqlAlchemy pool recycle is the number of seconds a connection can be idle 
in the pool before it is invalidated. This config does not apply to sqlite. If 
the number of DB connections is ever exceeded, a lower config value will allow 
the system to recover faster.
+
+sql_alchemy_pool_pre_ping
+*
+Check connection at the start of each connection pool checkout.  Typically, 
this is a simple statement like "SELECT 1".  More information here: 
https://docs.sqlalchemy.org/en/13/core/pooling.html#disconnect-handling-pessimistic
+sql_alchemy_schema
+**
+The schema to use for the metadata database. SqlAlchemy supports databases 
with the concept of multiple schemas.
+
+sql_alchemy_connect_args
+
+
+Import path for connect args in SqlAlchemy. Default to an empty dict.  This is 
useful when you want to configure db engine args that SqlAlchemy won't parse in 
connection string.  See 
https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine.params.connect_args
+
+parallelism
+***
+
+The amount of parallelism as a setting to the executor. This defines the max 
number of task instances that should run simultaneously on this airflow 
installation
+
+dag_concurrency
+***
+
+The number of task instances allowed to run concurrently by the scheduler
+
+dags_are_paused_at_creation
+***
+
+Are DAGs paused by default at creation
+
+max_active_runs_per_dag
+***
+
+The maximum number of active DAG runs per DAG
+
+load_examples
+*
+
+Whether to load 

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642448
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+=======================
+
+.. _config-ref/core:
+
+``[core]``
+^^^^^^^^^^
+
+dags_folder
+***********
+
+The folder where your airflow pipelines live, most likely a subfolder in a 
code repository. This path must be absolute.
+
+hostname_callable
+*****************
+
+Hostname by providing a path to a callable, which will resolve the hostname. 
The format is "package:function". For example, the default value "socket:getfqdn" 
means that the result of getfqdn() from the "socket" package will be used as the 
hostname. The function specified should require no arguments. If using an IP 
address as the hostname is preferred, use the value "airflow.utils.net:get_host_ip_address"
+
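The "package:function" contract above can be sketched as follows; this is an illustrative example only, and the module and function names (``my_hostname``, ``get_hostname``) are hypothetical, not part of Airflow:

```python
# Hypothetical module (my_hostname.py on the PYTHONPATH) providing a custom
# hostname callable in the "package:function" format described above.
import socket


def get_hostname():
    # A zero-argument callable: Airflow calls it with no arguments and uses
    # the returned string as the hostname.
    return socket.getfqdn()

# In airflow.cfg this would be referenced as (illustrative):
#   hostname_callable = my_hostname:get_hostname
```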
+default_timezone
+****************
+
+Default timezone in case supplied date times are naive. Can be utc (default), 
system, or any IANA timezone string (e.g. Europe/Amsterdam).
+
+executor
+********
+
+The executor class that airflow should use. Choices include 
SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor, and 
KubernetesExecutor.
+
+sql_alchemy_conn
+****************
+
+The SqlAlchemy connection string to the metadata database. SqlAlchemy supports 
many different database engines; more information is available on their website.
+
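A few connection strings of the shape SqlAlchemy accepts, as a sketch; the hosts, credentials, and database names below are placeholders, not defaults:

```python
# Illustrative SqlAlchemy URLs for the metadata database.  Every URL follows
# the dialect[+driver]://user:pass@host:port/dbname shape; values here are
# placeholders only.
examples = {
    "sqlite": "sqlite:////home/airflow/airflow.db",
    "postgres": "postgresql+psycopg2://user:pass@localhost:5432/airflow",
    "mysql": "mysql+mysqldb://user:pass@localhost:3306/airflow",
}

# The dialect is the part before any "+driver" suffix and the "://".
for name, url in examples.items():
    dialect = url.split(":", 1)[0].split("+", 1)[0]
    print(name, dialect)
```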
+sql_engine_encoding
+*******************
+
+The encoding for the databases.
+
+sql_alchemy_pool_enabled
+************************
+
+Whether SqlAlchemy should pool database connections.
+
+sql_alchemy_pool_size
+*********************
+
+The SqlAlchemy pool size is the maximum number of database connections in the 
pool. 0 indicates no limit.
+
+sql_alchemy_max_overflow
+************************
+
+The maximum overflow size of the pool. When the number of checked-out 
connections reaches the size set in pool_size, additional connections will be 
returned up to this limit. When those additional connections are returned to 
the pool, they are disconnected and discarded. It follows then that the total 
number of simultaneous connections the pool will allow is pool_size + 
max_overflow, and the total number of "sleeping" connections the pool will 
allow is pool_size. max_overflow can be set to -1 to indicate no overflow 
limit; no limit will be placed on the total number of concurrent connections. 
Defaults to 10.
+
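The arithmetic above can be made concrete with a small sketch (the pool_size value of 5 is an arbitrary example, not a documented default):

```python
# How pool_size and max_overflow interact: the pool keeps at most pool_size
# idle connections, and may open up to max_overflow extra ones under load,
# which are discarded when returned rather than kept idle.
pool_size = 5       # example value
max_overflow = 10   # the documented default

max_simultaneous = pool_size + max_overflow  # hard cap on open connections
max_sleeping = pool_size                     # cap on idle ("sleeping") ones

print(max_simultaneous, max_sleeping)  # → 15 5
```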
+sql_alchemy_pool_recycle
+************************
+
+The SqlAlchemy pool recycle is the number of seconds a connection can be idle 
in the pool before it is invalidated. This config does not apply to sqlite. If 
the number of DB connections is ever exceeded, a lower config value will allow 
the system to recover faster.
+
+sql_alchemy_pool_pre_ping
+*************************
+
+Check connection at the start of each connection pool checkout. Typically, 
this is a simple statement like "SELECT 1". More information here: 
https://docs.sqlalchemy.org/en/13/core/pooling.html#disconnect-handling-pessimistic
+
+sql_alchemy_schema
+******************
+
+The schema to use for the metadata database. SqlAlchemy supports databases 
with the concept of multiple schemas.
+
+sql_alchemy_connect_args
+************************
+
+Import path for connect args in SqlAlchemy. Defaults to an empty dict. This is 
useful when you want to configure db engine args that SqlAlchemy won't parse in 
the connection string. See 
https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine.params.connect_args
+
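A minimal sketch of what such an importable connect-args object might look like; the module name (``my_settings``), dict name (``CONNECT_ARGS``), and the ``connect_timeout`` key are hypothetical examples, not Airflow defaults:

```python
# Hypothetical module (my_settings.py on the PYTHONPATH) exposing a dict of
# extra engine arguments that cannot be expressed in the connection string.
CONNECT_ARGS = {
    "connect_timeout": 30,  # example driver-level option, in seconds
}

# In airflow.cfg this would be referenced by its import path (illustrative):
#   sql_alchemy_connect_args = my_settings.CONNECT_ARGS
```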
+parallelism
+***********
+
+The amount of parallelism as a setting to the executor. This defines the max 
number of task instances that should run simultaneously on this airflow 
installation.
+
+dag_concurrency
+***************
+
+The number of task instances allowed to run concurrently by the scheduler.
+
+dags_are_paused_at_creation
+***************************
+
+Are DAGs paused by default at creation
+
+max_active_runs_per_dag
+***********************
+
+The maximum number of active DAG runs per DAG.
+
+load_examples
+*************
+
+Whether to load 

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642513
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642040
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642362
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362642317
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362641997
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362641891
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+===
+
+.. _config-ref/core:
+
+``[core]``
+^^
+
+dags_folder
+***
+The folder where your airflow pipelines live, most likely a subfolder in a 
code repository. This path must be absolute
+
+hostname_callable
+*
+
+Hostname by providing a path to a callable, which will resolve the hostname. 
The format is "package:function". For example, default value "socket:getfqdn" 
means that result from getfqdn() of "socket" package will be used as hostname. 
No argument should be required in the function specified. gIf using IP address 
as hostname is preferred, use value "airflow.utils.net:get_host_ip_address"
+
+default_timezone
+
+
+Default timezone in case supplied date times are naive. Can be utc (default), 
system, or any IANA timezone string (e.g. Europe/Amsterdam)
+
+executor
+*
+
+The executor class that airflow should use. Choices include 
SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor, 
KubernetesExecutor
+
+
+sql_alchemy_conn
+
+
+The SqlAlchemy connection string to the metadata database. SqlAlchemy supports 
many different database engine, more information their website
+
+sql_engine_encoding
+***
+
+The encoding for the databases
+
+sql_alchemy_pool_enabled
+
+
+If SqlAlchemy should pool database connections.
+
+sql_alchemy_pool_size
+*
+The SqlAlchemy pool size is the maximum number of database connections in the 
pool. 0 indicates no limit.
+
+sql_alchemy_max_overflow
+
+The maximum overflow size of the pool.  When the number of checked-out 
connections reaches the size set in pool_size, additional connections will be 
returned up to this limit.  When those additional connections are returned to 
the pool, they are disconnected and discarded.  It follows then that the total 
number of simultaneous connections the pool will allow is pool_size + 
max_overflow, and the total number of "sleeping" connections the pool will 
allow is pool_size.  max_overflow can be set to -1 to indicate no overflow 
limit; no limit will be placed on the total number of concurrent connections. 
Defaults to 10.
+
+sql_alchemy_pool_recycle
+
+The SqlAlchemy pool recycle is the number of seconds a connection can be idle 
in the pool before it is invalidated. This config does not apply to sqlite. If 
the number of DB connections is ever exceeded, a lower config value will allow 
the system to recover faster.
+
+sql_alchemy_pool_pre_ping
+*
+Check connection at the start of each connection pool checkout.  Typically, 
this is a simple statement like "SELECT 1".  More information here: 
https://docs.sqlalchemy.org/en/13/core/pooling.html#disconnect-handling-pessimistic
+sql_alchemy_schema
+**
+The schema to use for the metadata database. SqlAlchemy supports databases 
with the concept of multiple schemas.
+
+sql_alchemy_connect_args
+
+
+Import path for connect args in SqlAlchemy. Default to an empty dict.  This is 
useful when you want to configure db engine args that SqlAlchemy won't parse in 
connection string.  See 
https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine.params.connect_args
+
+parallelism
+***
+
+The amount of parallelism as a setting to the executor. This defines the max 
number of task instances that should run simultaneously on this airflow 
installation
+
+dag_concurrency
+***************
+
+The number of task instances allowed to run concurrently by the scheduler.
+
+dags_are_paused_at_creation
+***************************
+
+Are DAGs paused by default at creation.
+
+max_active_runs_per_dag
+***********************
+
+The maximum number of active DAG runs per DAG.
+
+load_examples
+*************
+
+Whether to load 
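The pool_size/max_overflow arithmetic described in the quoted reference (at most pool_size + max_overflow concurrent connections, with -1 disabling the overflow cap, and at most pool_size idle "sleeping" connections) can be modeled in a few lines of plain Python. This is an illustrative sketch of the accounting only — `PoolModel` is a hypothetical name, not SQLAlchemy's actual QueuePool implementation:

```python
class PoolModel:
    """Toy model of connection-pool checkout accounting (illustrative only)."""

    def __init__(self, pool_size=5, max_overflow=10):
        self.pool_size = pool_size
        self.max_overflow = max_overflow
        self.checked_out = 0

    def max_connections(self):
        # max_overflow = -1 means no limit on concurrent connections
        if self.max_overflow == -1:
            return float("inf")
        return self.pool_size + self.max_overflow

    def checkout(self):
        if self.checked_out >= self.max_connections():
            raise RuntimeError("connection limit reached")
        self.checked_out += 1

    def checkin(self):
        # Connections beyond pool_size are discarded on return rather than
        # kept idle, so at most pool_size connections ever sleep in the pool.
        self.checked_out -= 1
```

With the defaults sketched here (pool_size=5, max_overflow=10), a sixteenth concurrent checkout is refused, matching the "pool_size + max_overflow" total described above.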

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362641928
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+=======================
+
+.. _config-ref/core:
+
+``[core]``
+^^^^^^^^^^
+
+dags_folder
+***********
+The folder where your airflow pipelines live, most likely a subfolder in a 
code repository. This path must be absolute.
+
+hostname_callable
+*****************
+
+Resolve the hostname by providing a path to a callable. The format is 
"package:function". For example, the default value "socket:getfqdn" means that 
the result of getfqdn() from the "socket" package will be used as the hostname. 
The specified function should require no arguments. If using an IP address 
as hostname is preferred, use value "airflow.utils.net:get_host_ip_address"
+
+default_timezone
+****************
+
+Default timezone in case supplied date times are naive. Can be utc (default), 
system, or any IANA timezone string (e.g. Europe/Amsterdam).
+
+executor
+********
+
+The executor class that airflow should use. Choices include 
SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor, 
KubernetesExecutor.
+
+
+sql_alchemy_conn
+****************
+
+The SqlAlchemy connection string to the metadata database. SqlAlchemy supports 
many different database engines; more information is available on their website.
+
+sql_engine_encoding
+*******************
+
+The encoding for the databases.
+
+sql_alchemy_pool_enabled
+************************
+
+If SqlAlchemy should pool database connections.
+
+sql_alchemy_pool_size
+*********************
+The SqlAlchemy pool size is the maximum number of database connections in the 
pool. 0 indicates no limit.
+
+sql_alchemy_max_overflow
+************************
+The maximum overflow size of the pool.  When the number of checked-out 
connections reaches the size set in pool_size, additional connections will be 
returned up to this limit.  When those additional connections are returned to 
the pool, they are disconnected and discarded.  It follows then that the total 
number of simultaneous connections the pool will allow is pool_size + 
max_overflow, and the total number of "sleeping" connections the pool will 
allow is pool_size.  max_overflow can be set to -1 to indicate no overflow 
limit; no limit will be placed on the total number of concurrent connections. 
Defaults to 10.
+
+sql_alchemy_pool_recycle
+************************
+The SqlAlchemy pool recycle is the number of seconds a connection can be idle 
in the pool before it is invalidated. This config does not apply to sqlite. If 
the number of DB connections is ever exceeded, a lower config value will allow 
the system to recover faster.
+
+sql_alchemy_pool_pre_ping
+*************************
+Check connection at the start of each connection pool checkout.  Typically, 
this is a simple statement like "SELECT 1".  More information here: 
https://docs.sqlalchemy.org/en/13/core/pooling.html#disconnect-handling-pessimistic
+
+sql_alchemy_schema
+******************
+The schema to use for the metadata database. SqlAlchemy supports databases 
with the concept of multiple schemas.
+
+sql_alchemy_connect_args
+************************
+
+Import path for connect args in SqlAlchemy. Defaults to an empty dict.  This is 
useful when you want to configure db engine args that SqlAlchemy won't parse in 
connection string.  See 
https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine.params.connect_args
+
+parallelism
+***********
+
+The amount of parallelism as a setting to the executor. This defines the max 
number of task instances that should run simultaneously on this airflow 
installation.
+
+dag_concurrency
+***************
+
+The number of task instances allowed to run concurrently by the scheduler.
+
+dags_are_paused_at_creation
+***************************
+
+Are DAGs paused by default at creation.
+
+max_active_runs_per_dag
+***********************
+
+The maximum number of active DAG runs per DAG.
+
+load_examples
+*************
+
+Whether to load 
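The "package:function" format that hostname_callable expects can be resolved with a few lines of stdlib Python. This is a minimal sketch of the idea; `resolve_callable` is a hypothetical helper, and Airflow's own loader may differ in detail:

```python
import importlib


def resolve_callable(spec: str):
    """Resolve a "package:function" string such as "socket:getfqdn"."""
    module_name, _, attr = spec.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, attr)


# The default value resolves to socket.getfqdn, called with no arguments,
# matching the "no argument should be required" rule quoted above.
hostname = resolve_callable("socket:getfqdn")()
```

Swapping the spec for "airflow.utils.net:get_host_ip_address" would resolve the IP-address variant in the same way, assuming Airflow is importable.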

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362641970
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[jira] [Commented] (AIRFLOW-6430) BigQuery hook - add tests for BigQueryBaseCursor

2020-01-02 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6430?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007097#comment-17007097
 ] 

ASF GitHub Bot commented on AIRFLOW-6430:
-

potiuk commented on pull request #7010: [AIRFLOW-6430] - BigQuery hook - add 
tests for BigQueryBaseCursor
URL: https://github.com/apache/airflow/pull/7010
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> BigQuery hook - add tests for BigQueryBaseCursor
> 
>
> Key: AIRFLOW-6430
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6430
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, hooks, tests
>Affects Versions: 1.10.7
>Reporter: Tobiasz Kedzierski
>Assignee: Tobiasz Kedzierski
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk commented on issue #7010: [AIRFLOW-6430] - BigQuery hook - add tests for BigQueryBaseCursor

2020-01-02 Thread GitBox
potiuk commented on issue #7010: [AIRFLOW-6430] - BigQuery hook - add tests for 
BigQueryBaseCursor
URL: https://github.com/apache/airflow/pull/7010#issuecomment-570361840
 
 
   Thanks @TobKed !


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
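The pool_pre_ping behavior described in the configuration reference quoted above (issue a cheap statement like "SELECT 1" on checkout and replace the connection if it fails) can be sketched against the stdlib sqlite3 module. This only illustrates the pessimistic-disconnect idea; `checkout_with_ping` is a hypothetical helper, and SQLAlchemy's real implementation lives in its pool module:

```python
import sqlite3


def checkout_with_ping(conn, reconnect):
    """Return conn if it answers a ping, otherwise a fresh connection."""
    try:
        conn.execute("SELECT 1")
        return conn
    except sqlite3.Error:
        # Stale or broken connection: discard it and reconnect transparently,
        # so the caller never sees the dead connection.
        return reconnect()


reconnect = lambda: sqlite3.connect(":memory:")
conn = reconnect()
conn.close()  # simulate a connection dropped while idle in the pool
conn = checkout_with_ping(conn, reconnect)
assert conn.execute("SELECT 1").fetchone() == (1,)
```

The trade-off, as the linked SQLAlchemy docs note, is one extra round trip per checkout in exchange for never handing a dead connection to application code.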


[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362641080
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[jira] [Resolved] (AIRFLOW-6430) BigQuery hook - add tests for BigQueryBaseCursor

2020-01-02 Thread Jarek Potiuk (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6430?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jarek Potiuk resolved AIRFLOW-6430.
---
Fix Version/s: 2.0.0
   Resolution: Fixed

> BigQuery hook - add tests for BigQueryBaseCursor
> 
>
> Key: AIRFLOW-6430
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6430
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, hooks, tests
>Affects Versions: 1.10.7
>Reporter: Tobiasz Kedzierski
>Assignee: Tobiasz Kedzierski
>Priority: Minor
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6430) BigQuery hook - add tests for BigQueryBaseCursor

2020-01-02 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6430?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17007099#comment-17007099
 ] 

ASF subversion and git services commented on AIRFLOW-6430:
--

Commit f391039be9cb5a767f4c66771ba70031210d3e76 in airflow's branch 
refs/heads/master from Tobiasz Kedzierski
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=f391039 ]

[AIRFLOW-6430] - BigQuery hook - add tests for BigQueryBaseCursor (#7010)




> BigQuery hook - add tests for BigQueryBaseCursor
> 
>
> Key: AIRFLOW-6430
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6430
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp, hooks, tests
>Affects Versions: 1.10.7
>Reporter: Tobiasz Kedzierski
>Assignee: Tobiasz Kedzierski
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362640800
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362641027
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362640991
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@

[GitHub] [airflow] mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] configuration docs

2020-01-02 Thread GitBox
mik-laj commented on a change in pull request #6983: [AIRFLOW-6414] 
configuration docs
URL: https://github.com/apache/airflow/pull/6983#discussion_r362640959
 
 

 ##
 File path: docs/howto/configurations-ref.rst
 ##
 @@ -0,0 +1,1141 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Configuration Reference
+=======================
+
+.. _config-ref/core:
+
+``[core]``
+^^^^^^^^^^
+
+dags_folder
+***********
+The folder where your airflow pipelines live, most likely a subfolder in a 
code repository. This path must be absolute
+
+hostname_callable
+*****************
+
+Hostname by providing a path to a callable, which will resolve the hostname. 
The format is "package:function". For example, the default value 
"socket:getfqdn" means that the result of getfqdn() from the "socket" package 
will be used as the hostname. The function specified should require no 
arguments. If using an IP address as the hostname is preferred, use the value 
"airflow.utils.net:get_host_ip_address"
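The "package:function" lookup described above can be sketched with the standard library's importlib. This is an illustrative helper under assumed semantics, not Airflow's actual implementation; `resolve_callable` is a hypothetical name.

```python
import importlib
import socket  # imported only so the default target can be checked against

def resolve_callable(path="socket:getfqdn"):
    # Split the "package:function" string, import the module,
    # and return the named attribute as a callable.
    module_name, func_name = path.split(":")
    module = importlib.import_module(module_name)
    return getattr(module, func_name)

# The resolved callable takes no arguments and returns the hostname.
hostname = resolve_callable("socket:getfqdn")()
```

The same mechanism would resolve "airflow.utils.net:get_host_ip_address" when an IP address is preferred.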
+
+default_timezone
+****************
+
+Default timezone in case supplied date times are naive. Can be utc (default), 
system, or any IANA timezone string (e.g. Europe/Amsterdam)
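To illustrate what a "naive" date time is, here is a small standard-library sketch (zoneinfo, Python 3.9+); this is not Airflow code:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib access to the IANA timezone database

# A "naive" datetime carries no timezone information at all.
naive = datetime(2020, 1, 2, 12, 0)

# Making it aware: interpret it as UTC, or in any IANA zone string.
in_utc = naive.replace(tzinfo=timezone.utc)
in_ams = naive.replace(tzinfo=ZoneInfo("Europe/Amsterdam"))
```

The config option decides which of these interpretations is applied when a naive value is supplied.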
+
+executor
+********
+
+The executor class that airflow should use. Choices include 
SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor, 
KubernetesExecutor
+
+
+sql_alchemy_conn
+****************
+
+The SqlAlchemy connection string to the metadata database. SqlAlchemy supports 
many different database engines; more information is available on their website
+
+sql_engine_encoding
+*******************
+
+The encoding for the databases
+
+sql_alchemy_pool_enabled
+************************
+
+If SqlAlchemy should pool database connections.
+
+sql_alchemy_pool_size
+*********************
+The SqlAlchemy pool size is the maximum number of database connections in the 
pool. 0 indicates no limit.
+
+sql_alchemy_max_overflow
+************************
+The maximum overflow size of the pool.  When the number of checked-out 
connections reaches the size set in pool_size, additional connections will be 
returned up to this limit.  When those additional connections are returned to 
the pool, they are disconnected and discarded.  It follows then that the total 
number of simultaneous connections the pool will allow is pool_size + 
max_overflow, and the total number of "sleeping" connections the pool will 
allow is pool_size.  max_overflow can be set to -1 to indicate no overflow 
limit; no limit will be placed on the total number of concurrent connections. 
Defaults to 10.
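The arithmetic in the paragraph above can be written out directly. This is a plain-Python illustration of the described limits, not SQLAlchemy code:

```python
# Sketch of the pool limits described above.
pool_size = 5        # "sleeping" connections kept open in the pool
max_overflow = 10    # the default overflow allowance

# Total simultaneous connections the pool will allow:
if max_overflow == -1:
    max_concurrent = None  # -1 means no limit on concurrent connections
else:
    max_concurrent = pool_size + max_overflow

print(max_concurrent)  # 15
```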
+
+sql_alchemy_pool_recycle
+************************
+The SqlAlchemy pool recycle is the number of seconds a connection can be idle 
in the pool before it is invalidated. This config does not apply to sqlite. If 
the number of DB connections is ever exceeded, a lower config value will allow 
the system to recover faster.
+
+sql_alchemy_pool_pre_ping
+*************************
+Check connection at the start of each connection pool checkout.  Typically, 
this is a simple statement like "SELECT 1".  More information here: 
https://docs.sqlalchemy.org/en/13/core/pooling.html#disconnect-handling-pessimistic
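The pessimistic check described above can be sketched with the stdlib sqlite3 driver; `is_alive` is a hypothetical helper showing the "SELECT 1"-style probe, not SQLAlchemy's internal pre-ping:

```python
import sqlite3

def is_alive(conn):
    # Pessimistic pre-ping: run a trivial statement before handing the
    # connection out; on failure the pool would discard and replace it.
    try:
        conn.execute("SELECT 1")
        return True
    except sqlite3.Error:
        return False

conn = sqlite3.connect(":memory:")
print(is_alive(conn))   # True while the connection is usable
conn.close()
print(is_alive(conn))   # False once the connection has been closed
```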
+sql_alchemy_schema
+******************
+The schema to use for the metadata database. SqlAlchemy supports databases 
with the concept of multiple schemas.
+
+sql_alchemy_connect_args
+************************
+
+Import path for connect args in SqlAlchemy. Defaults to an empty dict. This is 
useful when you want to configure db engine args that SqlAlchemy won't parse in 
the connection string. See 
https://docs.sqlalchemy.org/en/13/core/engines.html#sqlalchemy.create_engine.params.connect_args
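To show the idea of connect args, here is a sketch using the stdlib sqlite3 driver directly; the same kind of dict is what SQLAlchemy forwards to the DBAPI driver. The specific arg chosen ("timeout") is just an example:

```python
import sqlite3

# Driver options a URL cannot express are passed as a plain dict.
connect_args = {"timeout": 30}  # seconds to wait on a locked database file

conn = sqlite3.connect(":memory:", **connect_args)
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (1)")
rows = conn.execute("SELECT x FROM t").fetchall()
conn.close()
print(rows)  # [(1,)]
```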
+
+parallelism
+***********
+
+The amount of parallelism as a setting to the executor. This defines the max 
number of task instances that should run simultaneously on this airflow 
installation
+
+dag_concurrency
+***************
+
+The number of task instances allowed to run concurrently by the scheduler
+
+dags_are_paused_at_creation
+***************************
+
+Are DAGs paused by default at creation
+
+max_active_runs_per_dag
+***********************
+
+The maximum number of active DAG runs per DAG
+
+load_examples
+*************
+
+Whether to load the example DAGs that ship with Airflow.
