[GitHub] [airflow] bhavaniravi commented on issue #9625: Git-sync retry for worker pods

2020-07-01 Thread GitBox


bhavaniravi commented on issue #9625:
URL: https://github.com/apache/airflow/issues/9625#issuecomment-652794305


   Once this is approved as a valid feature, I would love to work on it :)



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] bhavaniravi opened a new issue #9625: Git-sync retry for worker pods

2020-07-01 Thread GitBox


bhavaniravi opened a new issue #9625:
URL: https://github.com/apache/airflow/issues/9625


   **Description**
   
   When using the KubernetesExecutor with `git-mode`, Airflow needs a 
`GIT_SYNC_MAX_SYNC_FAILURES` option. 
   
   **Use case / motivation**
   
   When running a task with `git-mode` and the KubernetesExecutor against 
self-hosted Bitbucket or GitLab, the pod fails when it cannot connect to the 
code-hosting service, thereby marking the task as failed. 
   
   While we can set `GIT_SYNC_MAX_SYNC_FAILURES` as suggested by the [Kubernetes 
git-sync repo](https://github.com/kubernetes/git-sync), there is currently no 
way to set it on Airflow workers via config.
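
   As a hedged sketch of what the requested option could enable (the helper
   and config key below are hypothetical illustrations, not actual Airflow
   code), a worker-level setting could be forwarded to the git-sync sidecar
   as the `GIT_SYNC_MAX_SYNC_FAILURES` environment variable:

```python
# Hypothetical sketch only: the config key and helper name are assumptions,
# not real Airflow API. It shows how a worker-level setting could surface
# as the GIT_SYNC_MAX_SYNC_FAILURES env var on the git-sync container.

def build_git_sync_env(config: dict) -> list:
    """Translate git-sync settings into container environment variables."""
    env = [{"name": "GIT_SYNC_REPO", "value": config["git_repo"]}]
    max_failures = config.get("git_sync_max_sync_failures")
    if max_failures is not None:
        # git-sync gives up only after this many consecutive failed syncs,
        # giving transient outages of the code-hosting service time to recover.
        env.append({"name": "GIT_SYNC_MAX_SYNC_FAILURES",
                    "value": str(max_failures)})
    return env

env = build_git_sync_env({"git_repo": "https://example.com/dags.git",
                          "git_sync_max_sync_failures": 3})
print(env[-1])  # {'name': 'GIT_SYNC_MAX_SYNC_FAILURES', 'value': '3'}
```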
   







[airflow] tag nightly-master updated (87d83a1 -> 63a8c79)

2020-07-01 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to tag nightly-master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


*** WARNING: tag nightly-master was modified! ***

from 87d83a1  (commit)
  to 63a8c79  (commit)
from 87d83a1  Fix regression in SQLThresholdCheckOperator (#9312)
 add 58edc38  Fix typo in the word 'available' (#9599)
 add 7a54418  Move XCom tests to tests/models/test_xcom.py (#9601)
 add 2d3677f  Fix typo in tutorial.rst (#9605)
 add 8bd15ef  Switches to Helm Chart for Kubernetes tests (#9468)
 add f3e1f9a  Update Breeze documentation (#9608)
 add 48a8316  Fix quarantined tests - TestCliWebServer (#9598)
 add 65855e5  Add docs to change Colors on the Webserver (#9607)
 add 9e305d6  Change default auth for experimental backend to deny_all 
(#9611)
 add a3a52c7  Removes importlib usage - it's not needed (fails on Airflow 
1.10) (#9613)
 add 1655fa9  Restrict changing XCom values from the Webserver (#9614)
 add 7ef7f58  Update docs about the change to default auth for experimental 
API (#9617)
 add bc3f48c  Change 'initiate' to 'initialize' in installation.rst (#9619)
 add 05c88cb  Replace old Variables View Screenshot with new (#9620)
 add 63a8c79  Replace old SubDag zoom screenshot with new (#9621)

No new revisions were added by this update.

Summary of changes:
 .github/workflows/ci.yml   |  27 +-
 BREEZE.rst | 401 -
 CI.rst |   2 +-
 Dockerfile |   4 +
 IMAGES.rst |   3 +
 TESTING.rst|  67 ++--
 UPDATING.md|  21 ++
 airflow/cli/commands/webserver_command.py  |  26 +-
 airflow/config_templates/config.yml|   6 +-
 airflow/config_templates/default_airflow.cfg   |   6 +-
 airflow/kubernetes/pod_launcher.py |   2 +-
 .../cncf/kubernetes/operators/kubernetes_pod.py|   5 +-
 airflow/www/views.py   |   4 +-
 breeze |  51 ++-
 breeze-complete|  14 +-
 chart/README.md|   5 +-
 chart/requirements.lock|   4 +-
 chart/templates/_helpers.yaml  |   4 +-
 chart/templates/configmap.yaml |   2 +
 chart/templates/rbac/pod-launcher-role.yaml|   2 +-
 chart/templates/rbac/pod-launcher-rolebinding.yaml |   4 +-
 docs/howto/customize-state-colors-ui.rst   |  70 
 docs/howto/index.rst   |   1 +
 docs/img/change-ui-colors/dags-page-new.png| Bin 0 -> 483599 bytes
 docs/img/change-ui-colors/dags-page-old.png| Bin 0 -> 493009 bytes
 docs/img/change-ui-colors/graph-view-new.png   | Bin 0 -> 56973 bytes
 docs/img/change-ui-colors/graph-view-old.png   | Bin 0 -> 54884 bytes
 docs/img/change-ui-colors/tree-view-new.png| Bin 0 -> 36934 bytes
 docs/img/change-ui-colors/tree-view-old.png| Bin 0 -> 21601 bytes
 docs/img/subdag_zoom.png   | Bin 150185 -> 255915 bytes
 docs/img/variable_hidden.png   | Bin 154299 -> 121301 bytes
 docs/installation.rst  |   6 +-
 docs/security.rst  |  18 +-
 docs/tutorial.rst  |   2 +-
 kubernetes_tests/test_kubernetes_executor.py   |  40 +-
 requirements/requirements-python3.6.txt|  16 +-
 requirements/requirements-python3.7.txt|  16 +-
 requirements/requirements-python3.8.txt|  16 +-
 requirements/setup-3.6.md5 |   2 +-
 requirements/setup-3.7.md5 |   2 +-
 requirements/setup-3.8.md5 |   2 +-
 scripts/ci/ci_build_production_images.sh   |  25 --
 scripts/ci/ci_count_changed_files.sh   |   2 +-
 scripts/ci/ci_deploy_app_to_kubernetes.sh  |  16 +-
 scripts/ci/ci_docs.sh  |   2 +-
 scripts/ci/ci_flake8.sh|   2 +-
 scripts/ci/ci_generate_requirements.sh |   2 +-
 scripts/ci/ci_load_image_to_kind.sh|   7 +-
 scripts/ci/ci_mypy.sh  |   2 +-
 scripts/ci/ci_perform_kind_cluster_operation.sh|   6 +-
 scripts/ci/ci_prepare_backport_packages.sh |   2 +-
 scripts/ci/ci_prepare_backport_readme.sh   |   2 +-
 scripts/ci/ci_pylint_main.sh   |   2 +-
 scripts/ci/ci_pylint_tests.sh  |   2 +-
 scripts/ci/ci_refresh_pylint_todo.sh   |   2 +-
 scripts/ci/ci_run_airflow_testing.sh   |   2 +-
 

[GitHub] [airflow] houqp commented on pull request #9502: generate go client from openapi spec

2020-07-01 Thread GitBox


houqp commented on pull request #9502:
URL: https://github.com/apache/airflow/pull/9502#issuecomment-652739173


   @potiuk @mik-laj GitHub Actions are passing now; ready for another round of 
review.







[GitHub] [airflow] houqp commented on a change in pull request #9502: generate go client from openapi spec

2020-07-01 Thread GitBox


houqp commented on a change in pull request #9502:
URL: https://github.com/apache/airflow/pull/9502#discussion_r448700155



##
File path: chart/values.yaml
##
@@ -223,12 +223,12 @@ webserver:
   extraNetworkPolicies: []
 
   resources: {}
-# limits:
-#   cpu: 100m
-#   memory: 128Mi
-# requests:
-#   cpu: 100m
-#   memory: 128Mi
+  #   limits:

Review comment:
   Fixed the YAML lint warning reported in the CI run.









[GitHub] [airflow-client-go] houqp commented on a change in pull request #1: Add generated go client

2020-07-01 Thread GitBox


houqp commented on a change in pull request #1:
URL: https://github.com/apache/airflow-client-go/pull/1#discussion_r448693291



##
File path: README.md
##
@@ -1 +1,62 @@
-# Airflow API Client for Go
+
+Airflow Go client
+=
+
+Go Airflow OpenAPI client generated from [openapi 
spec](https://github.com/apache/airflow/tree/master/clients).

Review comment:
   @turbaszek I updated the link to point to 
`https://github.com/apache/airflow/blob/master/clients/gen/go.sh`, which will 
be a valid link once https://github.com/apache/airflow/pull/9502 gets merged. I 
think linking to the actual automation script is better here, since there is 
no guarantee that future versions of the client will be generated from the same 
v1.yaml spec. It also gives users more context on how this client is actually 
generated, making it easier for them to contribute changes back to the 
Airflow repo.









[GitHub] [airflow] ephraimbuddy opened a new pull request #9624: Move StackdriverTaskHandler to the provider package

2020-07-01 Thread GitBox


ephraimbuddy opened a new pull request #9624:
URL: https://github.com/apache/airflow/pull/9624


   ---
   This PR fixes one of the issues listed in #9386
   
   The `StackdriverTaskHandler` class from 
`airflow.utils.log.stackdriver_task_handler` was moved to
   `airflow.providers.google.cloud.log.stackdriver_task_handler`.
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Target Github ISSUE in description if exists
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   







[GitHub] [airflow] ephraimbuddy opened a new pull request #9623: Move ElasticsearchTaskHandler to the provider package

2020-07-01 Thread GitBox


ephraimbuddy opened a new pull request #9623:
URL: https://github.com/apache/airflow/pull/9623


   ---
   This PR fixes one of the issues listed in #9386
   
   The `ElasticsearchTaskHandler` class from 
`airflow.utils.log.es_task_handler` was moved to
   `airflow.providers.elasticsearch.log.es_task_handler`.
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [ ] Description above provides context of the change
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Target Github ISSUE in description if exists
   - [ ] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   







[GitHub] [airflow] kaxil opened a new pull request #9622: Fix docstrings in exceptions.py

2020-07-01 Thread GitBox


kaxil opened a new pull request #9622:
URL: https://github.com/apache/airflow/pull/9622


   Fix incorrect docstrings
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   







[airflow] 09/10: Add __repr__ for DagTag so tags display properly in /dagmodel/show (#8719)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c99194a6d4bc4327061257aa249e9d50aebfaf1e
Author: Xiaodong DENG 
AuthorDate: Tue May 5 20:32:18 2020 +0200

Add __repr__ for DagTag so tags display properly in /dagmodel/show (#8719)

(cherry picked from commit c717d12f47c604082afc106b7a4a1f71d91f73e2)
---
 airflow/models/dag.py|  3 +++
 tests/models/test_dag.py | 12 +++-
 2 files changed, 14 insertions(+), 1 deletion(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index e24c164..94c6d2e 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1711,6 +1711,9 @@ class DagTag(Base):
 name = Column(String(100), primary_key=True)
 dag_id = Column(String(ID_LEN), ForeignKey('dag.dag_id'), primary_key=True)
 
+def __repr__(self):
+return self.name
+
 
 class DagModel(Base):
 
diff --git a/tests/models/test_dag.py b/tests/models/test_dag.py
index 5d9d05d..b711a8b 100644
--- a/tests/models/test_dag.py
+++ b/tests/models/test_dag.py
@@ -36,7 +36,7 @@ from mock import patch
 from airflow import models, settings
 from airflow.configuration import conf
 from airflow.exceptions import AirflowException, AirflowDagCycleException
-from airflow.models import DAG, DagModel, TaskInstance as TI
+from airflow.models import DAG, DagModel, DagTag, TaskInstance as TI
 from airflow.operators.bash_operator import BashOperator
 from airflow.operators.dummy_operator import DummyOperator
 from airflow.operators.subdag_operator import SubDagOperator
@@ -46,6 +46,7 @@ from airflow.utils.db import create_session
 from airflow.utils.state import State
 from airflow.utils.weight_rule import WeightRule
 from tests.models import DEFAULT_DATE
+from tests.test_utils.db import clear_db_dags
 
 
 class DagTest(unittest.TestCase):
@@ -657,6 +658,15 @@ class DagTest(unittest.TestCase):
 self.assertEqual(prev_local.isoformat(), "2018-03-24T03:00:00+01:00")
 self.assertEqual(prev.isoformat(), "2018-03-24T02:00:00+00:00")
 
+def test_dagtag_repr(self):
+clear_db_dags()
+dag = DAG('dag-test-dagtag', start_date=DEFAULT_DATE, tags=['tag-1', 
'tag-2'])
+dag.sync_to_db()
+with create_session() as session:
+self.assertEqual({'tag-1', 'tag-2'},
+ {repr(t) for t in session.query(DagTag).filter(
+ DagTag.dag_id == 'dag-test-dagtag').all()})
+
 @patch('airflow.models.dag.timezone.utcnow')
 def test_sync_to_db(self, mock_now):
 dag = DAG(

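The pattern in the commit above can be shown with a minimal standalone sketch
(the `DagTag` class here is a plain stand-in, not the real SQLAlchemy model):
defining `__repr__` is what makes collections of tags render readably in
admin views instead of as default object addresses.

```python
# Minimal stand-in for the DagTag model in the diff above, illustrating why
# the commit adds __repr__: lists of tags then print as their names.

class DagTag:
    def __init__(self, name, dag_id):
        self.name = name
        self.dag_id = dag_id

    def __repr__(self):
        # Without this, a list of tags renders as <DagTag object at 0x...>.
        return self.name

tags = [DagTag("tag-1", "demo"), DagTag("tag-2", "demo")]
print(tags)  # [tag-1, tag-2]
```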


[airflow] 08/10: Update version_added of configs added in 1.10.11

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 84fc6b482e4fe563f42f60914fe35cc744283715
Author: Kaxil Naik 
AuthorDate: Thu Jul 2 00:33:42 2020 +0100

Update version_added of configs added in 1.10.11
---
 airflow/config_templates/config.yml | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/airflow/config_templates/config.yml 
b/airflow/config_templates/config.yml
index 0d52426..3dd0a58 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -705,7 +705,7 @@
   description: |
 If set to True, Airflow will track files in plugins_folder directory. 
When it detects changes,
 then reload the gunicorn.
-  version_added: ~
+  version_added: 1.10.11
   type: boolean
   example: ~
   default: "False"
@@ -1749,7 +1749,7 @@
 - name: pod_template_file
   description: |
 Path to the YAML pod file. If set, all other kubernetes-related fields 
are ignored.
-  version_added: ~
+  version_added: 1.10.11
   type: string
   example: ~
   default: ""
@@ -1776,7 +1776,7 @@
   description: |
 If False (and delete_worker_pods is True),
 failed worker pods will not be deleted so users can investigate them.
-  version_added: ~
+  version_added: 1.10.11
   type: string
   example: ~
   default: "False"
@@ -1847,7 +1847,7 @@
 - name: dags_volume_mount_point
   description: |
 For either git sync or volume mounted DAGs, the worker will mount the 
volume in this path
-  version_added: ~
+  version_added: 1.10.11
   type: string
   example: ~
   default: ""



[airflow] 04/10: Replace old Variables View Screenshot with new (#9620)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0610df17ecb4ab6d740d09f5ed25d364798acffd
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 23:51:18 2020 +0100

Replace old Variables View Screenshot with new (#9620)

(cherry picked from commit 05c88cb392e63e9f12f05915b94d216f22b652dd)
---
 docs/img/variable_hidden.png | Bin 154299 -> 121301 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)

diff --git a/docs/img/variable_hidden.png b/docs/img/variable_hidden.png
index e081ca3..d982b92 100644
Binary files a/docs/img/variable_hidden.png and b/docs/img/variable_hidden.png 
differ



[airflow] 05/10: Restrict changing XCom values from the Webserver (#9614)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 30325f8e87beb19148f29aa9d3aafc7439c9958a
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 22:13:10 2020 +0100

Restrict changing XCom values from the Webserver (#9614)

(cherry-picked from 1655fa9253ba8f61ccda77780a9e94766c15f565)
---
 UPDATING.md   | 6 ++
 airflow/www/views.py  | 2 ++
 airflow/www_rbac/views.py | 4 +---
 3 files changed, 9 insertions(+), 3 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index ec193f9..61734bb 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -89,6 +89,12 @@ the previous behaviour on a new install by setting this in 
your airflow.cfg:
 auth_backend = airflow.api.auth.backend.default
 ```
 
+### XCom Values can no longer be added or changed from the Webserver
+
+Since XCom values can contain pickled data, we would no longer allow adding or
+changing XCom values from the UI.
+
+
 ## Airflow 1.10.10
 
 ### Setting Empty string to a Airflow Variable will return an empty string
diff --git a/airflow/www/views.py b/airflow/www/views.py
index a3293c8..abd1b9e 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -2754,6 +2754,8 @@ class VariableView(wwwutils.DataProfilingMixin, 
AirflowModelView):
 
 
 class XComView(wwwutils.SuperUserMixin, AirflowModelView):
+can_create = False
+can_edit = False
 verbose_name = "XCom"
 verbose_name_plural = "XComs"
 
diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index 67a7493..96d4079 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -2233,12 +2233,10 @@ class XComModelView(AirflowModelView):
 
 datamodel = AirflowModelView.CustomSQLAInterface(XCom)
 
-base_permissions = ['can_add', 'can_list', 'can_edit', 'can_delete']
+base_permissions = ['can_list', 'can_delete']
 
 search_columns = ['key', 'value', 'timestamp', 'execution_date', 
'task_id', 'dag_id']
 list_columns = ['key', 'value', 'timestamp', 'execution_date', 'task_id', 
'dag_id']
-add_columns = ['key', 'value', 'execution_date', 'task_id', 'dag_id']
-edit_columns = ['key', 'value', 'execution_date', 'task_id', 'dag_id']
 base_order = ('execution_date', 'desc')
 
 base_filters = [['dag_id', DagFilter, lambda: []]]
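
The diff above follows the Flask-AppBuilder convention of gating CRUD actions
through class-level permission flags and a `base_permissions` list. A toy
sketch of that pattern (the `ModelView` base here is a stand-in, not the real
FAB class):

```python
# Toy stand-in for a Flask-AppBuilder model view: class-level permission
# lists decide which CRUD actions the view exposes.

class ModelView:
    base_permissions = ["can_add", "can_list", "can_edit", "can_delete"]

    @classmethod
    def allowed_actions(cls):
        return sorted(cls.base_permissions)

class XComModelView(ModelView):
    # Dropping can_add/can_edit: XCom rows (which may carry pickled data)
    # can still be listed and deleted, but not created or edited in the UI.
    base_permissions = ["can_list", "can_delete"]

print(XComModelView.allowed_actions())  # ['can_delete', 'can_list']
```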



[airflow] 03/10: Change 'initiate' to 'initialize' in installation.rst (#9619)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e5662b96466408f5ed9472cee41e737a150471df
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 23:34:25 2020 +0100

Change 'initiate' to 'initialize' in installation.rst (#9619)

`Initiating Airflow Database` -> `Initializing Airflow Database`

(cherry picked from commit bc3f48c96603dbd2a94a33f05dd816097ccab3f1)
---
 docs/installation.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/installation.rst b/docs/installation.rst
index e7c8970..d5652cb 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -180,10 +180,10 @@ Here's the list of the subpackages and what they enable:
 | vertica | ``pip install 'apache-airflow[vertica]'``   | 
Vertica hook support as an Airflow backend   |
 
+-+-+--+
 
-Initiating Airflow Database
-'''
+Initializing Airflow Database
+'
 
-Airflow requires a database to be initiated before you can run tasks. If
+Airflow requires a database to be initialized before you can run tasks. If
 you're just experimenting and learning Airflow, you can stick with the
 default SQLite option. If you don't want to use SQLite, then take a look at
 :doc:`howto/initialize-database` to setup a different database.



[airflow] 02/10: Add docs to change Colors on the Webserver (#9607)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 127104b7e4388be2d0e0b64e4316980d9b6928a9
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 16:17:29 2020 +0100

Add docs to change Colors on the Webserver (#9607)

Feature was added in https://github.com/apache/airflow/pull/9520

(cherry picked from commit 65855e558f9b070004ba817e8cf94fd834d2a67a)
---
 docs/howto/customize-state-colors-ui.rst |  70 +++
 docs/howto/index.rst |   1 +
 docs/img/change-ui-colors/dags-page-new.png  | Bin 0 -> 483599 bytes
 docs/img/change-ui-colors/dags-page-old.png  | Bin 0 -> 493009 bytes
 docs/img/change-ui-colors/graph-view-new.png | Bin 0 -> 56973 bytes
 docs/img/change-ui-colors/graph-view-old.png | Bin 0 -> 54884 bytes
 docs/img/change-ui-colors/tree-view-new.png  | Bin 0 -> 36934 bytes
 docs/img/change-ui-colors/tree-view-old.png  | Bin 0 -> 21601 bytes
 8 files changed, 71 insertions(+)

diff --git a/docs/howto/customize-state-colors-ui.rst 
b/docs/howto/customize-state-colors-ui.rst
new file mode 100644
index 000..c856950
--- /dev/null
+++ b/docs/howto/customize-state-colors-ui.rst
@@ -0,0 +1,70 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Customizing state colours in UI
+===
+
+.. versionadded:: 1.10.11
+
+To change the colors for TaskInstance/DagRun State in the Airflow Webserver, 
perform the
+following steps:
+
+1.  Create an ``airflow_local_settings.py`` file and put it on ``$PYTHONPATH`` 
+or in the ``$AIRFLOW_HOME/config`` folder. (Airflow adds 
+``$AIRFLOW_HOME/config`` to ``PYTHONPATH`` when Airflow is initialized.)
+
+2.  Add the following contents to ``airflow_local_settings.py`` file. Change 
the colors to whatever you
+would like.
+
+.. code-block:: python
+
+  STATE_COLORS = {
+"queued": 'darkgray',
+"running": '#01FF70',
+"success": '#2ECC40',
+"failed": 'firebrick',
+"up_for_retry": 'yellow',
+"up_for_reschedule": 'turquoise',
+"upstream_failed": 'orange',
+"skipped": 'darkorchid',
+"scheduled": 'tan',
+  }
+
+
+
+3.  Restart Airflow Webserver.
+
+Screenshots
+---
+
+Before
+^^
+
+.. image:: ../img/change-ui-colors/dags-page-old.png
+
+.. image:: ../img/change-ui-colors/graph-view-old.png
+
+.. image:: ../img/change-ui-colors/tree-view-old.png
+
+After
+^^
+
+.. image:: ../img/change-ui-colors/dags-page-new.png
+
+.. image:: ../img/change-ui-colors/graph-view-new.png
+
+.. image:: ../img/change-ui-colors/tree-view-new.png
diff --git a/docs/howto/index.rst b/docs/howto/index.rst
index ae20c91..2b91c77 100644
--- a/docs/howto/index.rst
+++ b/docs/howto/index.rst
@@ -32,6 +32,7 @@ configuring an Airflow environment.
 set-config
 initialize-database
 operator/index
+customize-state-colors-ui
 custom-operator
 connection/index
 secure-connections
diff --git a/docs/img/change-ui-colors/dags-page-new.png 
b/docs/img/change-ui-colors/dags-page-new.png
new file mode 100644
index 000..d2ffe1f
Binary files /dev/null and b/docs/img/change-ui-colors/dags-page-new.png differ
diff --git a/docs/img/change-ui-colors/dags-page-old.png 
b/docs/img/change-ui-colors/dags-page-old.png
new file mode 100644
index 000..5078d01
Binary files /dev/null and b/docs/img/change-ui-colors/dags-page-old.png differ
diff --git a/docs/img/change-ui-colors/graph-view-new.png 
b/docs/img/change-ui-colors/graph-view-new.png
new file mode 100644
index 000..b367461
Binary files /dev/null and b/docs/img/change-ui-colors/graph-view-new.png differ
diff --git a/docs/img/change-ui-colors/graph-view-old.png 
b/docs/img/change-ui-colors/graph-view-old.png
new file mode 100644
index 000..ceaf8d4
Binary files /dev/null and b/docs/img/change-ui-colors/graph-view-old.png differ
diff --git a/docs/img/change-ui-colors/tree-view-new.png 
b/docs/img/change-ui-colors/tree-view-new.png
new file mode 100644
index 000..6a5b2d7
Binary files /dev/null and b/docs/img/change-ui-colors/tree-view-new.png differ
diff 
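
As a rough sketch of how such a palette might be consulted (the default
colors and the lookup helper below are illustrative assumptions, not
Airflow's actual implementation): user overrides from
`airflow_local_settings.py` take precedence over built-in defaults.

```python
# Illustrative sketch only: the default palette and lookup helper are
# assumptions, not Airflow's real code. User-supplied STATE_COLORS entries
# override defaults; unknown states fall back to a neutral color.

DEFAULT_STATE_COLORS = {
    "queued": "gray",
    "running": "lime",
    "success": "green",
    "failed": "red",
}

# Values a user might place in airflow_local_settings.py, as in the doc above.
USER_STATE_COLORS = {"running": "#01FF70", "failed": "firebrick"}

def state_color(state: str) -> str:
    """User overrides win; unknown states fall back to white."""
    merged = {**DEFAULT_STATE_COLORS, **USER_STATE_COLORS}
    return merged.get(state, "white")

print(state_color("failed"))   # firebrick (user override)
print(state_color("success"))  # green (default)
```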

[airflow] 06/10: Replace old SubDag zoom screenshot with new (#9621)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0663b4ecce46b5ea95febca345f5605cb232786a
Author: Kaxil Naik 
AuthorDate: Thu Jul 2 00:18:08 2020 +0100

Replace old SubDag zoom screenshot with new (#9621)

(cherry picked from commit 63a8c79aa9f2dfc3f0802e19b98b0c0b0f6b7858)
---
 docs/img/subdag_zoom.png | Bin 150185 -> 255915 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)

diff --git a/docs/img/subdag_zoom.png b/docs/img/subdag_zoom.png
index 08fcf5c..fe5ce5a 100644
Binary files a/docs/img/subdag_zoom.png and b/docs/img/subdag_zoom.png differ



[airflow] 10/10: Add Changelog for 1.10.11

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9a212d11eb4f96c38fac8d6bfc7c0ded0493fa42
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 19:40:29 2020 +0100

Add Changelog for 1.10.11
---
 .pre-commit-config.yaml |   3 +-
 CHANGELOG.txt   | 198 
 2 files changed, 200 insertions(+), 1 deletion(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 469c606..1b7d88a 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -245,7 +245,8 @@ repos:
   (?x)
   ^airflow/contrib/hooks/cassandra_hook.py$|
   ^airflow/operators/hive_stats_operator.py$|
-  ^tests/contrib/hooks/test_cassandra_hook.py
+  ^tests/contrib/hooks/test_cassandra_hook.py|
+  ^CHANGELOG.txt
   - id: dont-use-safe-filter
 language: pygrep
 name: Don't use safe in templates
diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index a8aa353..81be084 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -1,3 +1,201 @@
+Airflow 1.10.11, 2020-07-05
+-
+
+New Features
+
+
+- Add task instance mutation hook (#8852)
+- Allow changing Task States Colors (#9520)
+- Add support for AWS Secrets Manager as Secrets Backend (#8186)
+- Add airflow info command to the CLI (#8704)
+- Add Local Filesystem Secret Backend (#8596)
+- Add Airflow config CLI command (#8694)
+- Add Support for Python 3.8 (#8836)(#8823)
+- Allow K8S worker pod to be configured from JSON/YAML file (#6230)
+- Add quarterly to crontab presets (#6873)
+- Add support for ephemeral storage on KubernetesPodOperator (#6337)
+- Add AirflowFailException to fail without any retry (#7133)
+- Add SQL Branch Operator (#8942)
+
+Bug Fixes
+"
+
+- Use NULL as dag.description default value (#7593)
+- BugFix: DAG trigger via UI error in RBAC UI (#8411)
+- Fix logging issue when running tasks (#9363)
+- Fix JSON encoding error in DockerOperator (#8287)
+- Fix alembic crash due to typing import (#6547)
+- Correctly restore upstream_task_ids when deserializing Operators (#8775)
+- Correctly store non-default Nones in serialized tasks/dags (#8772)
+- Correctly deserialize dagrun_timeout field on DAGs (#8735)
+- Fix tree view if config contains " (#9250)
+- Fix Dag Run UI execution date with timezone cannot be saved issue (#8902)
+- Fix Migration for MSSQL (#8385)
+- RBAC ui: Fix missing Y-axis labels with units in plots (#8252)
+- RBAC ui: Fix missing task runs being rendered as circles instead (#8253)
+- Fix: DagRuns page renders the state column with artifacts in old UI (#9612)
+- Fix task and dag stats on home page (#8865)
+- Fix the trigger_dag api in the case of nested subdags (#8081)
+- UX Fix: Prevent undesired text selection with DAG title selection in Chrome 
(#8912)
+- Fix connection add/edit for spark (#8685)
+- Fix retries causing constraint violation on MySQL with DAG Serialization 
(#9336)
+- [AIRFLOW-4472] Use json.dumps/loads for templating lineage data (#5253)
+- Restrict google-cloud-texttospeach to 
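
The pre-commit ``exclude`` change earlier in this commit uses a verbose
regex; a quick sketch confirms the new alternation now skips
``CHANGELOG.txt``. Here ``re.X`` stands in for the inline ``(?x)`` flag, and
the dots are escaped for strictness, which the original pattern omits:

```python
# Check of the verbose-regex exclude pattern added to .pre-commit-config.yaml:
# re.X ignores whitespace/newlines in the pattern, so the alternation can be
# written one path per line, as pre-commit configs commonly do.
import re

exclude = re.compile(r"""
    ^airflow/contrib/hooks/cassandra_hook\.py$|
    ^airflow/operators/hive_stats_operator\.py$|
    ^tests/contrib/hooks/test_cassandra_hook\.py|
    ^CHANGELOG\.txt
""", re.X)

for path in ["CHANGELOG.txt", "airflow/models/dag.py"]:
    print(path, bool(exclude.match(path)))
# CHANGELOG.txt True
# airflow/models/dag.py False
```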

[airflow] branch v1-10-test updated (0e4c3a2 -> 9a212d1)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard 0e4c3a2  Update version_added of configs added in 1.10.11
 discard 011282e  fixup! fixup! Add Changelog for 1.10.11
 discard efe86e5  Update docs about the change to default auth for experimental API (#9617)
 discard 3c8fb4a  Replace old SubDag zoom screenshot with new (#9621)
 discard ccf47ae  Restrict changing XCom values from the Webserver (#9614)
 discard a177e5d  Replace old Variables View Screenshot with new (#9620)
 discard cc3f09f  Change 'initiate' to 'initialize' in installation.rst (#9619)
 discard 670c7d4  Add docs to change Colors on the Webserver (#9607)
 discard d56fdb4  fixup! Add Changelog for 1.10.11
 discard 89329c4  Change default auth for experimental backend to deny_all (#9611)
 omit fae2e63  Add Changelog for 1.10.11
 new 92c5f46  Change default auth for experimental backend to deny_all (#9611)
 new 127104b  Add docs to change Colors on the Webserver (#9607)
 new e5662b9  Change 'initiate' to 'initialize' in installation.rst (#9619)
 new 0610df1  Replace old Variables View Screenshot with new (#9620)
 new 30325f8  Restrict changing XCom values from the Webserver (#9614)
 new 0663b4e  Replace old SubDag zoom screenshot with new (#9621)
 new 0d520f2  Update docs about the change to default auth for experimental API (#9617)
 new 84fc6b4  Update version_added of configs added in 1.10.11
 new c99194a  Add __repr__ for DagTag so tags display properly in /dagmodel/show (#8719)
 new 9a212d1  Add Changelog for 1.10.11

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (0e4c3a2)
            \
             N -- N -- N   refs/heads/v1-10-test (9a212d1)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 10 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 CHANGELOG.txt|  1 +
 airflow/models/dag.py|  3 +++
 tests/models/test_dag.py | 12 +++-
 3 files changed, 15 insertions(+), 1 deletion(-)
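The `__repr__` addition for DagTag (commit c99194a above) follows a common pattern for SQLAlchemy models shown in admin list views. A minimal sketch, not the actual Airflow model (the class body here is purely illustrative):

```python
# Illustrative sketch only: the real DagTag is a SQLAlchemy model in
# airflow/models/dag.py. Without a __repr__, related rows render as
# opaque "<DagTag object at 0x...>" strings in list views.
class DagTag:
    def __init__(self, name, dag_id):
        self.name = name
        self.dag_id = dag_id

    def __repr__(self):
        # Admin views call repr() when displaying related objects,
        # so returning the tag name makes tags display readably.
        return self.name


tag = DagTag("example", "example_dag")
print(repr(tag))  # -> example
```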



[airflow] 01/10: Change default auth for experimental backend to deny_all (#9611)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 92c5f46405497f4d9d9ce6d5cd3544b998be3714
Author: Ash Berlin-Taylor 
AuthorDate: Wed Jul 1 17:04:35 2020 +0100

Change default auth for experimental backend to deny_all (#9611)

In a move that should surprise no one, a number of users do not read,
and leave the API wide open by default. Safe is better than powned

(cherry picked from commit 9e305d6b810a2a21e2591a80a80ec41acb3afed0)
---
 UPDATING.md  | 16 
 airflow/config_templates/config.yml  |  6 --
 airflow/config_templates/default_airflow.cfg |  6 --
 3 files changed, 24 insertions(+), 4 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 3dfda58..ec193f9 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -73,6 +73,22 @@ Before 1.10.11 it was possible to edit DagRun State in the `/admin/dagrun/` page
 
 In Airflow 1.10.11+, the user can only choose the states from the list.
 
+### Experimental API will deny all requests by default.
+
+The previous default setting was to allow all API requests without authentication, but this poses security
+risks to users who miss this fact. This changes the default for new installs to deny all requests by default.
+
+**Note**: This will not change the behavior for existing installs; please check your airflow.cfg.
+
+If you wish to use the experimental API, and are aware of the risks of enabling this without authentication
+(or if you have your own authentication layer in front of Airflow), you can get
+the previous behaviour on a new install by setting this in your airflow.cfg:
+
+```
+[api]
+auth_backend = airflow.api.auth.backend.default
+```
+
 ## Airflow 1.10.10
 
 ### Setting Empty string to a Airflow Variable will return an empty string
diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index f632cd5..0d52426 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -524,11 +524,13 @@
   options:
 - name: auth_backend
   description: |
-How to authenticate users of the API
+How to authenticate users of the API. See
+https://airflow.apache.org/docs/stable/security.html for possible values.
+("airflow.api.auth.backend.default" allows all requests for historic reasons)
   version_added: ~
   type: string
   example: ~
-  default: "airflow.api.auth.backend.default"
+  default: "airflow.api.auth.backend.deny_all"
 - name: lineage
   description: ~
   options:
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index a061d46..63bd3cb 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -274,8 +274,10 @@ endpoint_url = http://localhost:8080
 fail_fast = False
 
 [api]
-# How to authenticate users of the API
-auth_backend = airflow.api.auth.backend.default
+# How to authenticate users of the API. See
+# https://airflow.apache.org/docs/stable/security.html for possible values.
+# ("airflow.api.auth.backend.default" allows all requests for historic reasons)
+auth_backend = airflow.api.auth.backend.deny_all
 
 [lineage]
 # what lineage backend to use
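An API auth backend like `deny_all` is just a module exposing `init_app` and a `requires_authentication` decorator. A hedged sketch in that shape; the endpoint function and the 403 response body below are illustrative, not Airflow's actual code:

```python
from functools import wraps

CLIENT_AUTH = None  # no client-side credentials are ever valid


def init_app(app):
    """Nothing to configure for a deny-all policy."""


def requires_authentication(function):
    """Reject every request to the wrapped API endpoint."""
    @wraps(function)
    def decorated(*args, **kwargs):
        # A real backend would return a Flask 403 response here.
        return "Forbidden", 403
    return decorated


# Hypothetical endpoint guarded by the backend:
@requires_authentication
def trigger_dag(dag_id):
    return "triggered %s" % dag_id, 200


print(trigger_dag("example_dag"))  # -> ('Forbidden', 403)
```

Swapping this module path into `auth_backend` is what flips the default from "wide open" to "deny everything".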



[airflow] 07/10: Update docs about the change to default auth for experimental API (#9617)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0d520f28c6232d94d1aae6118641ae4a06fae44f
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 22:59:13 2020 +0100

Update docs about the change to default auth for experimental API (#9617)

(cherry picked from commit 7ef7f5880dfefc6e33cb7bf331927aa08e1bb438)
---
 docs/security.rst | 18 +++---
 1 file changed, 15 insertions(+), 3 deletions(-)

diff --git a/docs/security.rst b/docs/security.rst
index 863a454..3817c7f 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -159,15 +159,27 @@ only the dags which it is owner of, unless it is a superuser.
 API Authentication
 --
 
-Authentication for the API is handled separately to the Web Authentication. The default is to not
-require any authentication on the API i.e. wide open by default. This is not recommended if your
-Airflow webserver is publicly accessible, and you should probably use the ``deny all`` backend:
+Authentication for the API is handled separately to the Web Authentication. The default is to
+deny all requests:
 
 .. code-block:: ini
 
 [api]
 auth_backend = airflow.api.auth.backend.deny_all
 
+.. versionchanged:: 1.10.11
+
+In Airflow <1.10.11, the default setting was to allow all API requests without authentication, but this
+posed security risks if the Webserver is publicly accessible.
+
+If you wish to use the experimental API, and are aware of the risks of enabling this without authentication
+(or if you have your own authentication layer in front of Airflow), you can set the following in ``airflow.cfg``:
+
+.. code-block:: ini
+
+[api]
+auth_backend = airflow.api.auth.backend.default
+
 Two "real" methods for authentication are currently supported for the API.
 
To enable Password authentication, set the following in the configuration:



[airflow] branch v1-10-test updated: Update version_added of configs added in 1.10.11

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 0e4c3a2  Update version_added of configs added in 1.10.11
0e4c3a2 is described below

commit 0e4c3a266d7b8f657a678a0ef78b7798ffbb6adc
Author: Kaxil Naik 
AuthorDate: Thu Jul 2 00:33:42 2020 +0100

Update version_added of configs added in 1.10.11
---
 airflow/config_templates/config.yml | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index 0d52426..3dd0a58 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -705,7 +705,7 @@
   description: |
If set to True, Airflow will track files in plugins_folder directory. When it detects changes,
then reload the gunicorn.
-  version_added: ~
+  version_added: 1.10.11
   type: boolean
   example: ~
   default: "False"
@@ -1749,7 +1749,7 @@
 - name: pod_template_file
   description: |
Path to the YAML pod file. If set, all other kubernetes-related fields are ignored.
-  version_added: ~
+  version_added: 1.10.11
   type: string
   example: ~
   default: ""
@@ -1776,7 +1776,7 @@
   description: |
 If False (and delete_worker_pods is True),
 failed worker pods will not be deleted so users can investigate them.
-  version_added: ~
+  version_added: 1.10.11
   type: string
   example: ~
   default: "False"
@@ -1847,7 +1847,7 @@
 - name: dags_volume_mount_point
   description: |
For either git sync or volume mounted DAGs, the worker will mount the volume in this path
-  version_added: ~
+  version_added: 1.10.11
   type: string
   example: ~
   default: ""



[airflow] 02/07: Change 'initiate' to 'initialize' in installation.rst (#9619)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit cc3f09f1a85bbecf51657def6c027d80bda21663
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 23:34:25 2020 +0100

Change 'initiate' to 'initialize' in installation.rst (#9619)

`Initiating Airflow Database` -> `Initializing Airflow Database`

(cherry picked from commit bc3f48c96603dbd2a94a33f05dd816097ccab3f1)
---
 docs/installation.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/installation.rst b/docs/installation.rst
index e7c8970..d5652cb 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -180,10 +180,10 @@ Here's the list of the subpackages and what they enable:
 | vertica | ``pip install 'apache-airflow[vertica]'``   | Vertica hook support as an Airflow backend   |
 
+-+-+--+
 
-Initiating Airflow Database
-'''''''''''''''''''''''''''
+Initializing Airflow Database
+'''''''''''''''''''''''''''''
 
-Airflow requires a database to be initiated before you can run tasks. If
+Airflow requires a database to be initialized before you can run tasks. If
 you're just experimenting and learning Airflow, you can stick with the
 default SQLite option. If you don't want to use SQLite, then take a look at
 :doc:`howto/initialize-database` to setup a different database.



[airflow] branch v1-10-test updated (d56fdb4 -> 011282e)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from d56fdb4  fixup! Add Changelog for 1.10.11
 new 670c7d4  Add docs to change Colors on the Webserver (#9607)
 new cc3f09f  Change 'initiate' to 'initialize' in installation.rst (#9619)
 new a177e5d  Replace old Variables View Screenshot with new (#9620)
 new ccf47ae  Restrict changing XCom values from the Webserver (#9614)
 new 3c8fb4a  Replace old SubDag zoom screenshot with new (#9621)
 new efe86e5  Update docs about the change to default auth for experimental API (#9617)
 new 011282e  fixup! fixup! Add Changelog for 1.10.11

The 7 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 CHANGELOG.txt|   6 +++
 UPDATING.md  |   6 +++
 airflow/www/views.py |   2 +
 airflow/www_rbac/views.py|   4 +-
 docs/howto/customize-state-colors-ui.rst |  70 +++
 docs/howto/index.rst |   1 +
 docs/img/change-ui-colors/dags-page-new.png  | Bin 0 -> 483599 bytes
 docs/img/change-ui-colors/dags-page-old.png  | Bin 0 -> 493009 bytes
 docs/img/change-ui-colors/graph-view-new.png | Bin 0 -> 56973 bytes
 docs/img/change-ui-colors/graph-view-old.png | Bin 0 -> 54884 bytes
 docs/img/change-ui-colors/tree-view-new.png  | Bin 0 -> 36934 bytes
 docs/img/change-ui-colors/tree-view-old.png  | Bin 0 -> 21601 bytes
 docs/img/subdag_zoom.png | Bin 150185 -> 255915 bytes
 docs/img/variable_hidden.png | Bin 154299 -> 121301 bytes
 docs/installation.rst|   6 +--
 docs/security.rst|  18 +--
 16 files changed, 104 insertions(+), 9 deletions(-)
 create mode 100644 docs/howto/customize-state-colors-ui.rst
 create mode 100644 docs/img/change-ui-colors/dags-page-new.png
 create mode 100644 docs/img/change-ui-colors/dags-page-old.png
 create mode 100644 docs/img/change-ui-colors/graph-view-new.png
 create mode 100644 docs/img/change-ui-colors/graph-view-old.png
 create mode 100644 docs/img/change-ui-colors/tree-view-new.png
 create mode 100644 docs/img/change-ui-colors/tree-view-old.png



[airflow] 03/07: Replace old Variables View Screenshot with new (#9620)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a177e5d4f4cd1d3d53a2897900fc964d1a6a752a
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 23:51:18 2020 +0100

Replace old Variables View Screenshot with new (#9620)

(cherry picked from commit 05c88cb392e63e9f12f05915b94d216f22b652dd)
---
 docs/img/variable_hidden.png | Bin 154299 -> 121301 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)

diff --git a/docs/img/variable_hidden.png b/docs/img/variable_hidden.png
index e081ca3..d982b92 100644
Binary files a/docs/img/variable_hidden.png and b/docs/img/variable_hidden.png 
differ



[airflow] 07/07: fixup! fixup! Add Changelog for 1.10.11

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 011282e55527d877fbd892f09e6790c527384c5b
Author: Kaxil Naik 
AuthorDate: Thu Jul 2 00:27:08 2020 +0100

fixup! fixup! Add Changelog for 1.10.11
---
 CHANGELOG.txt | 6 ++
 1 file changed, 6 insertions(+)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index bbf7ef9..f9e4dad 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -142,6 +142,7 @@ Improvements
 - Merging multiple sql operators (#9124)
 - Adds hive as extra in pyhive dependency (#9075)
 - Change default auth for experimental backend to deny_all (#9611)
+- Restrict changing XCom values from the Webserver (#9614)
 
 Doc only changes
 
@@ -187,6 +188,11 @@ Doc only changes
 - Remove non-existent chart value from readme (#9511)
 - Fix typo in helm chart upgrade command for 2.0 (#9484)
 - Don't use the term "whitelist" - language matters (#9174)
+- Add docs to change Colors on the Webserver (#9607)
+- Change 'initiate' to 'initialize' in installation.rst (#9619)
+- Replace old Variables View Screenshot with new (#9620)
+- Replace old SubDag zoom screenshot with new (#9621)
+- Update docs about the change to default auth for experimental API (#9617)
 
 
 Airflow 1.10.10, 2020-04-09



[airflow] 04/07: Restrict changing XCom values from the Webserver (#9614)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ccf47ae7548740a5f65443cbca85b22884548cc7
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 22:13:10 2020 +0100

Restrict changing XCom values from the Webserver (#9614)

(cherry-picked from 1655fa9253ba8f61ccda77780a9e94766c15f565)
---
 UPDATING.md   | 6 ++
 airflow/www/views.py  | 2 ++
 airflow/www_rbac/views.py | 4 +---
 3 files changed, 9 insertions(+), 3 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index ec193f9..61734bb 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -89,6 +89,12 @@ the previous behaviour on a new install by setting this in your airflow.cfg:
 auth_backend = airflow.api.auth.backend.default
 ```
 
+### XCom Values can no longer be added or changed from the Webserver
+
+Since XCom values can contain pickled data, we no longer allow adding or
+changing XCom values from the UI.
+
+
 ## Airflow 1.10.10
 
 ### Setting Empty string to a Airflow Variable will return an empty string
diff --git a/airflow/www/views.py b/airflow/www/views.py
index a3293c8..abd1b9e 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -2754,6 +2754,8 @@ class VariableView(wwwutils.DataProfilingMixin, AirflowModelView):
 
 
 class XComView(wwwutils.SuperUserMixin, AirflowModelView):
+can_create = False
+can_edit = False
 verbose_name = "XCom"
 verbose_name_plural = "XComs"
 
diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index 67a7493..96d4079 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -2233,12 +2233,10 @@ class XComModelView(AirflowModelView):
 
 datamodel = AirflowModelView.CustomSQLAInterface(XCom)
 
-base_permissions = ['can_add', 'can_list', 'can_edit', 'can_delete']
+base_permissions = ['can_list', 'can_delete']
 
 search_columns = ['key', 'value', 'timestamp', 'execution_date', 'task_id', 'dag_id']
 list_columns = ['key', 'value', 'timestamp', 'execution_date', 'task_id', 'dag_id']
-add_columns = ['key', 'value', 'execution_date', 'task_id', 'dag_id']
-edit_columns = ['key', 'value', 'execution_date', 'task_id', 'dag_id']
 base_order = ('execution_date', 'desc')
 
 base_filters = [['dag_id', DagFilter, lambda: []]]



[airflow] 01/07: Add docs to change Colors on the Webserver (#9607)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 670c7d4ffbcda101a5ac451aa1aa8a88a054e30f
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 16:17:29 2020 +0100

Add docs to change Colors on the Webserver (#9607)

Feature was added in https://github.com/apache/airflow/pull/9520

(cherry picked from commit 65855e558f9b070004ba817e8cf94fd834d2a67a)
---
 docs/howto/customize-state-colors-ui.rst |  70 +++
 docs/howto/index.rst |   1 +
 docs/img/change-ui-colors/dags-page-new.png  | Bin 0 -> 483599 bytes
 docs/img/change-ui-colors/dags-page-old.png  | Bin 0 -> 493009 bytes
 docs/img/change-ui-colors/graph-view-new.png | Bin 0 -> 56973 bytes
 docs/img/change-ui-colors/graph-view-old.png | Bin 0 -> 54884 bytes
 docs/img/change-ui-colors/tree-view-new.png  | Bin 0 -> 36934 bytes
 docs/img/change-ui-colors/tree-view-old.png  | Bin 0 -> 21601 bytes
 8 files changed, 71 insertions(+)

diff --git a/docs/howto/customize-state-colors-ui.rst b/docs/howto/customize-state-colors-ui.rst
new file mode 100644
index 0000000..c856950
--- /dev/null
+++ b/docs/howto/customize-state-colors-ui.rst
@@ -0,0 +1,70 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+Customizing state colours in UI
+===============================
+
+.. versionadded:: 1.10.11
+
+To change the colors for TaskInstance/DagRun State in the Airflow Webserver, perform the
+following steps:
+
+1.  Create ``airflow_local_settings.py`` file and put it on ``$PYTHONPATH`` or
+to ``$AIRFLOW_HOME/config`` folder. (Airflow adds ``$AIRFLOW_HOME/config`` on ``PYTHONPATH`` when
+Airflow is initialized)
+
+2.  Add the following contents to ``airflow_local_settings.py`` file. Change the colors to whatever you
+would like.
+
+.. code-block:: python
+
+  STATE_COLORS = {
+"queued": 'darkgray',
+"running": '#01FF70',
+"success": '#2ECC40',
+"failed": 'firebrick',
+"up_for_retry": 'yellow',
+"up_for_reschedule": 'turquoise',
+"upstream_failed": 'orange',
+"skipped": 'darkorchid',
+"scheduled": 'tan',
+  }
+
+
+
+3.  Restart Airflow Webserver.
+
+Screenshots
+-----------
+
+Before
+^^^^^^
+
+.. image:: ../img/change-ui-colors/dags-page-old.png
+
+.. image:: ../img/change-ui-colors/graph-view-old.png
+
+.. image:: ../img/change-ui-colors/tree-view-old.png
+
+After
+^^^^^
+
+.. image:: ../img/change-ui-colors/dags-page-new.png
+
+.. image:: ../img/change-ui-colors/graph-view-new.png
+
+.. image:: ../img/change-ui-colors/tree-view-new.png
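The ``STATE_COLORS`` override documented above is plain Python, so it can be sanity-checked before restarting the webserver. A hedged sketch of such a check; the set of known state names is copied from the docs example, not read from Airflow:

```python
# Known TaskInstance/DagRun states, as listed in the docs snippet above.
KNOWN_STATES = {
    "queued", "running", "success", "failed", "up_for_retry",
    "up_for_reschedule", "upstream_failed", "skipped", "scheduled",
}

# A candidate airflow_local_settings.py override to validate; the color
# choices here are illustrative.
STATE_COLORS = {
    "queued": "darkgray",
    "running": "#01FF70",
    "success": "#2ECC40",
    "failed": "firebrick",
}

# Keys that Airflow would silently ignore because they match no state.
unknown = set(STATE_COLORS) - KNOWN_STATES
print(sorted(unknown))  # -> [] : every key is a recognized state
```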
diff --git a/docs/howto/index.rst b/docs/howto/index.rst
index ae20c91..2b91c77 100644
--- a/docs/howto/index.rst
+++ b/docs/howto/index.rst
@@ -32,6 +32,7 @@ configuring an Airflow environment.
 set-config
 initialize-database
 operator/index
+customize-state-colors-ui
 custom-operator
 connection/index
 secure-connections
diff --git a/docs/img/change-ui-colors/dags-page-new.png b/docs/img/change-ui-colors/dags-page-new.png
new file mode 100644
index 0000000..d2ffe1f
Binary files /dev/null and b/docs/img/change-ui-colors/dags-page-new.png differ
diff --git a/docs/img/change-ui-colors/dags-page-old.png b/docs/img/change-ui-colors/dags-page-old.png
new file mode 100644
index 0000000..5078d01
Binary files /dev/null and b/docs/img/change-ui-colors/dags-page-old.png differ
diff --git a/docs/img/change-ui-colors/graph-view-new.png b/docs/img/change-ui-colors/graph-view-new.png
new file mode 100644
index 0000000..b367461
Binary files /dev/null and b/docs/img/change-ui-colors/graph-view-new.png differ
diff --git a/docs/img/change-ui-colors/graph-view-old.png b/docs/img/change-ui-colors/graph-view-old.png
new file mode 100644
index 0000000..ceaf8d4
Binary files /dev/null and b/docs/img/change-ui-colors/graph-view-old.png differ
diff --git a/docs/img/change-ui-colors/tree-view-new.png b/docs/img/change-ui-colors/tree-view-new.png
new file mode 100644
index 0000000..6a5b2d7
Binary files /dev/null and b/docs/img/change-ui-colors/tree-view-new.png differ
diff 

[airflow] 05/07: Replace old SubDag zoom screenshot with new (#9621)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3c8fb4a89f9827baa82d1cf6403ca253959bc4bd
Author: Kaxil Naik 
AuthorDate: Thu Jul 2 00:18:08 2020 +0100

Replace old SubDag zoom screenshot with new (#9621)

(cherry picked from commit 63a8c79aa9f2dfc3f0802e19b98b0c0b0f6b7858)
---
 docs/img/subdag_zoom.png | Bin 150185 -> 255915 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)

diff --git a/docs/img/subdag_zoom.png b/docs/img/subdag_zoom.png
index 08fcf5c..fe5ce5a 100644
Binary files a/docs/img/subdag_zoom.png and b/docs/img/subdag_zoom.png differ



[airflow] 06/07: Update docs about the change to default auth for experimental API (#9617)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit efe86e55f6af52f0eb0457625d7b92193b88a296
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 22:59:13 2020 +0100

Update docs about the change to default auth for experimental API (#9617)

(cherry picked from commit 7ef7f5880dfefc6e33cb7bf331927aa08e1bb438)
---
 docs/security.rst | 18 +++---
 1 file changed, 15 insertions(+), 3 deletions(-)

diff --git a/docs/security.rst b/docs/security.rst
index 863a454..3817c7f 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -159,15 +159,27 @@ only the dags which it is owner of, unless it is a superuser.
 API Authentication
 --
 
-Authentication for the API is handled separately to the Web Authentication. The default is to not
-require any authentication on the API i.e. wide open by default. This is not recommended if your
-Airflow webserver is publicly accessible, and you should probably use the ``deny all`` backend:
+Authentication for the API is handled separately to the Web Authentication. The default is to
+deny all requests:
 
 .. code-block:: ini
 
 [api]
 auth_backend = airflow.api.auth.backend.deny_all
 
+.. versionchanged:: 1.10.11
+
+In Airflow <1.10.11, the default setting was to allow all API requests without authentication, but this
+posed security risks if the Webserver is publicly accessible.
+
+If you wish to use the experimental API, and are aware of the risks of enabling this without authentication
+(or if you have your own authentication layer in front of Airflow), you can set the following in ``airflow.cfg``:
+
+.. code-block:: ini
+
+[api]
+auth_backend = airflow.api.auth.backend.default
+
 Two "real" methods for authentication are currently supported for the API.
 
To enable Password authentication, set the following in the configuration:



[GitHub] [airflow] vanka56 commented on pull request #9472: Add drop_partition functionality for HiveMetastoreHook

2020-07-01 Thread GitBox


vanka56 commented on pull request #9472:
URL: https://github.com/apache/airflow/pull/9472#issuecomment-652692779


   @jhtimmins @turbaszek all static checks have passed now. Is it good now?



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch master updated (05c88cb -> 63a8c79)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 05c88cb  Replace old Variables View Screenshot with new (#9620)
 add 63a8c79  Replace old SubDag zoom screenshot with new (#9621)

No new revisions were added by this update.

Summary of changes:
 docs/img/subdag_zoom.png | Bin 150185 -> 255915 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)



[GitHub] [airflow] kaxil merged pull request #9621: Replace old SubDag zoom screenshot with new

2020-07-01 Thread GitBox


kaxil merged pull request #9621:
URL: https://github.com/apache/airflow/pull/9621


   







[GitHub] [airflow] kaxil opened a new pull request #9621: Replace old SubDag zoom screenshot with new

2020-07-01 Thread GitBox


kaxil opened a new pull request #9621:
URL: https://github.com/apache/airflow/pull/9621


   Replace old SubDag zoom screenshot with new
   
   **Before**:
   
![image](https://user-images.githubusercontent.com/8811558/86298597-00eb7000-bbf6-11ea-8ced-2c97be593793.png)
   
   
   **After**:
   
![image](https://user-images.githubusercontent.com/8811558/86298606-05178d80-bbf6-11ea-83ba-9c1302d1558f.png)
   
   
   ---
   Make sure to mark the boxes below before creating PR: [x]

   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).

   ---
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines) for more information.







[airflow] branch master updated: Replace old Variables View Screenshot with new (#9620)

2020-07-01 Thread kaxilnaik

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 05c88cb  Replace old Variables View Screenshot with new (#9620)
05c88cb is described below

commit 05c88cb392e63e9f12f05915b94d216f22b652dd
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 23:51:18 2020 +0100

Replace old Variables View Screenshot with new (#9620)
---
 docs/img/variable_hidden.png | Bin 154299 -> 121301 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)

diff --git a/docs/img/variable_hidden.png b/docs/img/variable_hidden.png
index e081ca3..d982b92 100644
Binary files a/docs/img/variable_hidden.png and b/docs/img/variable_hidden.png 
differ



[GitHub] [airflow] subkanthi commented on issue #8471: Add integration with Google Calendar API

2020-07-01 Thread GitBox


subkanthi commented on issue #8471:
URL: https://github.com/apache/airflow/issues/8471#issuecomment-652681463


   @mik-laj, is this something I can take on? I'm trying to find a good first issue. I have been using Airflow for a few years and have written custom operators. Just thought I'd start contributing.







[GitHub] [airflow] kaxil opened a new pull request #9620: Replace old Variables View Screenshot with new

2020-07-01 Thread GitBox


kaxil opened a new pull request #9620:
URL: https://github.com/apache/airflow/pull/9620


   Replace old Variables View Screenshot with new
   







[GitHub] [airflow] kaxil merged pull request #9619: Change 'initiate' to 'initialize' in installation.rst

2020-07-01 Thread GitBox


kaxil merged pull request #9619:
URL: https://github.com/apache/airflow/pull/9619


   







[airflow] branch master updated (7ef7f58 -> bc3f48c)

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 7ef7f58  Update docs about the change to default auth for experimental 
API (#9617)
 add bc3f48c  Change 'initiate' to 'initialize' in installation.rst (#9619)

No new revisions were added by this update.

Summary of changes:
 docs/installation.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)



[GitHub] [airflow] mik-laj commented on pull request #9354: Task logging handlers can provide custom log links

2020-07-01 Thread GitBox


mik-laj commented on pull request #9354:
URL: https://github.com/apache/airflow/pull/9354#issuecomment-652676572


   @turbaszek Do you want to add anything else?







[GitHub] [airflow] kaxil opened a new pull request #9619: Change 'initiate' to 'initialize' in installation.rst

2020-07-01 Thread GitBox


kaxil opened a new pull request #9619:
URL: https://github.com/apache/airflow/pull/9619


   `Initiating Airflow Database` -> `Initializing Airflow Database`
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   







[GitHub] [airflow] vuppalli commented on issue #9418: Deprecated AI Platform Operators and Runtimes in Example DAG

2020-07-01 Thread GitBox


vuppalli commented on issue #9418:
URL: https://github.com/apache/airflow/issues/9418#issuecomment-652672779


   Thank you for the information! I created a PR for this issue here: 
https://github.com/apache/airflow/pull/9618. 







[GitHub] [airflow] vuppalli commented on pull request #9618: Fix typos, older versions, and deprecated operators with AI platform example DAG

2020-07-01 Thread GitBox


vuppalli commented on pull request #9618:
URL: https://github.com/apache/airflow/pull/9618#issuecomment-652672627


   @mik-laj You mentioned that this DAG does not have accompanying tests. How 
should I go about addressing the Unit tests coverage bullet above? 
Additionally, should I make this a draft PR while I work on the corresponding 
documentation?







[GitHub] [airflow] boring-cyborg[bot] commented on pull request #9618: Fix typos, older versions, and deprecated operators with AI platform example DAG

2020-07-01 Thread GitBox


boring-cyborg[bot] commented on pull request #9618:
URL: https://github.com/apache/airflow/pull/9618#issuecomment-652670230


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better .
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://apache-airflow-slack.herokuapp.com/
   







[GitHub] [airflow] vuppalli opened a new pull request #9618: Fix typos, older versions, and deprecated operators with AI platform example DAG

2020-07-01 Thread GitBox


vuppalli opened a new pull request #9618:
URL: https://github.com/apache/airflow/pull/9618


   The AI platform example DAG had typos, was using older versions, and had 
deprecated operators. This PR is an attempt to fix these problems and 
corresponds to [this Github 
issue](https://github.com/apache/airflow/issues/9418). The relevant 
documentation will be updated soon (most likely next week), which corresponds 
to [this Github issue](https://github.com/apache/airflow/issues/8207). I will 
address any comments next Monday (07/06) since we have US holidays for the rest 
of the week. I look forward to getting feedback!
   
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   







[GitHub] [airflow] kaxil merged pull request #9617: Update docs about the change to default auth for experimental API

2020-07-01 Thread GitBox


kaxil merged pull request #9617:
URL: https://github.com/apache/airflow/pull/9617


   







[airflow] branch master updated: Update docs about the change to default auth for experimental API (#9617)

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
 new 7ef7f58  Update docs about the change to default auth for experimental 
API (#9617)
7ef7f58 is described below

commit 7ef7f5880dfefc6e33cb7bf331927aa08e1bb438
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 22:59:13 2020 +0100

Update docs about the change to default auth for experimental API (#9617)
---
 docs/security.rst | 18 +++---
 1 file changed, 15 insertions(+), 3 deletions(-)

diff --git a/docs/security.rst b/docs/security.rst
index c8f6e1a..1e820d3 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -63,15 +63,27 @@ OAuth, OpenID, LDAP, REMOTE_USER. You can configure in 
``webserver_config.py``.
 API Authentication
 --
 
-Authentication for the API is handled separately to the Web Authentication. 
The default is to not
-require any authentication on the API i.e. wide open by default. This is not 
recommended if your
-Airflow webserver is publicly accessible, and you should probably use the 
``deny all`` backend:
+Authentication for the API is handled separately to the Web Authentication. 
The default is to
+deny all requests:
 
 .. code-block:: ini
 
 [api]
 auth_backend = airflow.api.auth.backend.deny_all
 
+.. versionchanged:: 1.10.11
+
+In Airflow <1.10.11, the default setting was to allow all API requests 
without authentication, but this
posed security risks if the Webserver is publicly accessible.
+
+If you wish to have the experimental API work, and are aware of the risks of 
enabling this without authentication
+(or if you have your own authentication layer in front of Airflow) you can set 
the following in ``airflow.cfg``:
+
+.. code-block:: ini
+
+[api]
+auth_backend = airflow.api.auth.backend.default
+
 Kerberos authentication is currently supported for the API.
 
 To enable Kerberos authentication, set the following in the configuration:



[GitHub] [airflow] mik-laj commented on a change in pull request #9531: Support .airflowignore for plugins

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9531:
URL: https://github.com/apache/airflow/pull/9531#discussion_r448633985



##
File path: airflow/plugins_manager.py
##
@@ -164,34 +165,31 @@ def load_plugins_from_plugin_directory():
 global plugins  # pylint: disable=global-statement
 log.debug("Loading plugins from directory: %s", settings.PLUGINS_FOLDER)
 
-# Crawl through the plugins folder to find AirflowPlugin derivatives
-for root, _, files in os.walk(settings.PLUGINS_FOLDER, followlinks=True):  
# noqa # pylint: disable=too-many-nested-blocks
-for f in files:
-filepath = os.path.join(root, f)
-try:
-if not os.path.isfile(filepath):
-continue
-mod_name, file_ext = os.path.splitext(
-os.path.split(filepath)[-1])
-if file_ext != '.py':
-continue
-
-log.debug('Importing plugin module %s', filepath)
-
-loader = importlib.machinery.SourceFileLoader(mod_name, 
filepath)
-spec = importlib.util.spec_from_loader(mod_name, loader)
-mod = importlib.util.module_from_spec(spec)
-sys.modules[spec.name] = mod
-loader.exec_module(mod)
-for mod_attr_value in list(mod.__dict__.values()):
-if is_valid_plugin(mod_attr_value):
-plugin_instance = mod_attr_value()
-plugins.append(plugin_instance)
-except Exception as e:  # pylint: disable=broad-except
-log.exception(e)
-path = filepath or str(f)
-log.error('Failed to import plugin %s', path)
-import_errors[path] = str(e)
+for file_path in find_path_from_directory(
+settings.PLUGINS_FOLDER, ".airflowignore"):
+
+if not os.path.isfile(file_path):
+continue
+mod_name, file_ext = os.path.splitext(os.path.split(file_path)[-1])
+if file_ext != '.py':
+continue

Review comment:
   ```suggestion
   ```
   Should this not be part of the find_path_from_directory function? I would 
like to see that there is no code in the plugins_manager.py file that is 
responsible for the file selection. Plugins_manager should only load the 
module. WDYT?
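To make the suggestion concrete: the idea is that all file selection (existence check, `.py` filter) lives in the directory-walking helper, so `plugins_manager.py` only imports modules. A minimal sketch of such a helper — `find_py_modules` is a hypothetical name, not the actual `find_path_from_directory` API, and `.airflowignore` handling is omitted:

```python
import os

def find_py_modules(directory):
    """Hypothetical helper sketching the review suggestion above: yield
    (module_name, path) pairs for every regular .py file under directory,
    so the caller never has to filter files itself."""
    for root, _, files in os.walk(directory, followlinks=True):
        for name in files:
            path = os.path.join(root, name)
            mod_name, ext = os.path.splitext(name)
            # Keep only importable Python source files.
            if ext == ".py" and os.path.isfile(path):
                yield mod_name, path
```

With this shape, the loader loop in `plugins_manager.py` reduces to iterating the generator and calling `importlib` on each returned path.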
   









[GitHub] [airflow] kaxil closed issue #8214: "Adding DAG and Tasks documentation" section rendered twice in Tutorial on website

2020-07-01 Thread GitBox


kaxil closed issue #8214:
URL: https://github.com/apache/airflow/issues/8214


   







[GitHub] [airflow] mik-laj commented on a change in pull request #9549: YAML file supports extra json parameters

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9549:
URL: https://github.com/apache/airflow/pull/9549#discussion_r448630190



##
File path: docs/howto/use-alternative-secrets-backend.rst
##
@@ -117,7 +118,10 @@ The following is a sample JSON file.
 }
 
 The YAML file structure is similar to that of a JSON. The key-value pair of 
connection ID and the definitions of one or more connections.
-The connection can be defined as a URI (string) or JSON object.
+The connection can be defined as a URI (string) or JSON object. Any extra json 
parameters can be provided using keys like ``extra_dejson`` and ``extra``.

Review comment:
   What do you think about putting it earlier in the text?
   
   > The file can be defined in ``JSON``, ``YAML`` or ``env`` format.  The 
``JSON`` and ``YAML`` formats support defining the connection as an object or 
as URI (string). The env format supports only URI (string).  For a description 
of the connection object parameters see 
:class:`~airflow.models.connection.Connection`. We additionally support 
``extra_dejson`` to allow you to define extra parameters as an object, whereas 
the key ``extra`` can be used in the case of a JSON string.
   The keys ``extra`` and ``extra_dejson`` are mutually exclusive.
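A small illustration of the two mutually exclusive shapes described above — the field values here are made up, only the key names (`extra_dejson` vs. `extra`) follow the text:

```python
import json

# Extras supplied as a nested object (the ``extra_dejson`` form):
conn_with_object = {
    "conn_type": "mysql",
    "host": "mysql",
    "login": "root",
    "extra_dejson": {"charset": "utf8"},
}

# Extras supplied as a JSON string (the ``extra`` form):
conn_with_string = {
    "conn_type": "postgres",
    "host": "pg",
    "extra": json.dumps({"sslmode": "require"}),
}
```

A secrets backend reading these definitions would use the object directly in the first case, and `json.loads` the string in the second.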
   
   
   









[GitHub] [airflow] kaxil merged pull request #9614: Restrict changing XCom values from the Webserver

2020-07-01 Thread GitBox


kaxil merged pull request #9614:
URL: https://github.com/apache/airflow/pull/9614


   







[airflow] branch master updated (a3a52c7 -> 1655fa9)

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from a3a52c7  Removes importlib usage - it's not needed (fails on Airflow 
1.10) (#9613)
 add 1655fa9  Restrict changing XCom values from the Webserver (#9614)

No new revisions were added by this update.

Summary of changes:
 UPDATING.md  | 5 +
 airflow/www/views.py | 4 +---
 2 files changed, 6 insertions(+), 3 deletions(-)



[jira] [Commented] (AIRFLOW-6062) Scheduler doesn't delete worker pods from namespaces different than the scheduler's namespace

2020-07-01 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6062?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17149699#comment-17149699
 ] 

ASF GitHub Bot commented on AIRFLOW-6062:
-

hcheng4 commented on pull request #8546:
URL: https://github.com/apache/airflow/pull/8546#issuecomment-652648926


   [Not a contribution]
   
   @dimberman a multi-namespace mode sounds like a great idea. Is there a 
discussion already ongoing? If there's agreement design-wise, I can see if 
there's appetite and bandwidth here for a patch contribution.
   
   It may be the case that longer-term the restrictions of an in-namespace 
Kubernetes deployment are too much and it's undesirable to support that going 
forward, but that doesn't feel like the type of compatibility-breaking change 
that's appropriate for a patch release IMHO.





> Scheduler doesn't delete worker pods from namespaces different than the 
> scheduler's namespace
> -
>
> Key: AIRFLOW-6062
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6062
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: executor-kubernetes
>Affects Versions: 1.10.5
>Reporter: Mihail Petkov
>Assignee: Daniel Imberman
>Priority: Blocker
> Fix For: 1.10.10
>
>
> When you run Airflow's task instances as worker pods in different namespaces 
> into a Kubernetes cluster, the scheduler can delete only the pods that are 
> living in the same namespace where the scheduler lives. It's trying to delete 
> all pods that are in the namespace defined in the airflow.cfg file.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] hcheng4 commented on pull request #8546: AIRFLOW-6062 Watch worker pods from all namespaces

2020-07-01 Thread GitBox


hcheng4 commented on pull request #8546:
URL: https://github.com/apache/airflow/pull/8546#issuecomment-652648926


   [Not a contribution]
   
   @dimberman a multi-namespace mode sounds like a great idea. Is there a 
discussion already ongoing? If there's agreement design-wise, I can see if 
there's appetite and bandwidth here for a patch contribution.
   
   It may be the case that longer-term the restrictions of an in-namespace 
Kubernetes deployment are too much and it's undesirable to support that going 
forward, but that doesn't feel like the type of compatibility-breaking change 
that's appropriate for a patch release IMHO.







[GitHub] [airflow] Acehaidrey commented on pull request #9544: Add metric for scheduling delay between first run task & expected start time

2020-07-01 Thread GitBox


Acehaidrey commented on pull request #9544:
URL: https://github.com/apache/airflow/pull/9544#issuecomment-652644579


   @mik-laj can you please take a look at this when you get a chance.
   Also @ashb if you have time too







[GitHub] [airflow] albertocalderari commented on a change in pull request #9590: Improve idempotency of BigQueryInsertJobOperator

2020-07-01 Thread GitBox


albertocalderari commented on a change in pull request #9590:
URL: https://github.com/apache/airflow/pull/9590#discussion_r448613980



##
File path: airflow/providers/google/cloud/operators/bigquery.py
##
@@ -1692,32 +1692,52 @@ def prepare_template(self) -> None:
 with open(self.configuration, 'r') as file:
 self.configuration = json.loads(file.read())
 
+def _submit_job(self, hook: BigQueryHook, job_id: str):
+# Submit a new job
+job = hook.insert_job(
+configuration=self.configuration,
+project_id=self.project_id,
+location=self.location,
+job_id=job_id,
+)
+# Start the job and wait for it to complete and get the result.
+job.result()

Review comment:
   the job polls here









[GitHub] [airflow] albertocalderari commented on a change in pull request #9590: Improve idempotency of BigQueryInsertJobOperator

2020-07-01 Thread GitBox


albertocalderari commented on a change in pull request #9590:
URL: https://github.com/apache/airflow/pull/9590#discussion_r448613697



##
File path: airflow/providers/google/cloud/operators/bigquery.py
##
@@ -1692,32 +1692,52 @@ def prepare_template(self) -> None:
 with open(self.configuration, 'r') as file:
 self.configuration = json.loads(file.read())
 
+def _submit_job(self, hook: BigQueryHook, job_id: str):
+# Submit a new job
+job = hook.insert_job(
+configuration=self.configuration,
+project_id=self.project_id,
+location=self.location,
+job_id=job_id,
+)
+# Start the job and wait for it to complete and get the result.
+job.result()
+return job
+
 def execute(self, context: Any):
 hook = BigQueryHook(
 gcp_conn_id=self.gcp_conn_id,
 delegate_to=self.delegate_to,
 )
 
-job_id = self.job_id or f"airflow_{self.task_id}_{int(time())}"
+exec_date = context['execution_date'].isoformat()
+job_id = self.job_id or 
f"airflow_{self.dag_id}_{self.task_id}_{exec_date}"
+
 try:
-job = hook.insert_job(
-configuration=self.configuration,
-project_id=self.project_id,
-location=self.location,
-job_id=job_id,
-)
-# Start the job and wait for it to complete and get the result.
-job.result()
+# Submit a new job
+job = self._submit_job(hook, job_id)
 except Conflict:
+# If the job already exists retrieve it
 job = hook.get_job(
 project_id=self.project_id,
 location=self.location,
 job_id=job_id,
 )
-# Get existing job and wait for it to be ready
-for time_to_wait in exponential_sleep_generator(initial=10, 
maximum=120):
-sleep(time_to_wait)
-job.reload()
-if job.done():
-break
+
+if job.done() and job.error_result:
+# The job exists and finished with an error and we are 
probably rerunning it
+# So we have to make a new job_id because it has to be unique
+job_id = f"{self.job_id}_{int(time())}"
+job = self._submit_job(hook, job_id)
+elif not job.done():
+# The job is still running so wait for it to be ready
+for time_to_wait in exponential_sleep_generator(initial=10, 
maximum=120):

Review comment:
   If you use the Google client you won't need to sleep and wait - the loop 
is built into the job itself
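For context on this review note: `exponential_sleep_generator` produces an unbounded stream of growing delays, which the diff pairs with `sleep()` and `job.reload()`. A minimal re-implementation of the backoff idea, for illustration only (the library version may differ, e.g. by adding jitter):

```python
import itertools

def exponential_sleep_generator(initial, maximum, multiplier=2.0):
    # Yields initial, initial*multiplier, initial*multiplier**2, ...
    # capped at maximum, forever; the caller decides when to stop polling.
    delay = initial
    while True:
        yield min(delay, maximum)
        delay = delay * multiplier

# First few delays with the parameters used in the diff above:
delays = list(itertools.islice(exponential_sleep_generator(10, 120), 5))
# delays == [10, 20, 40, 80, 120]
```

The reviewer's point is that `job.result()` already blocks until completion, making this manual polling loop unnecessary when using the Google client directly.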









[GitHub] [airflow] albertocalderari commented on a change in pull request #9590: Improve idempotency of BigQueryInsertJobOperator

2020-07-01 Thread GitBox


albertocalderari commented on a change in pull request #9590:
URL: https://github.com/apache/airflow/pull/9590#discussion_r448611939



##
File path: airflow/providers/google/cloud/operators/bigquery.py
##
@@ -1692,32 +1692,52 @@ def prepare_template(self) -> None:
 with open(self.configuration, 'r') as file:
 self.configuration = json.loads(file.read())
 
+def _submit_job(self, hook: BigQueryHook, job_id: str):
+# Submit a new job
+job = hook.insert_job(
+configuration=self.configuration,
+project_id=self.project_id,
+location=self.location,
+job_id=job_id,
+)
+# Start the job and wait for it to complete and get the result.
+job.result()
+return job
+
 def execute(self, context: Any):
 hook = BigQueryHook(
 gcp_conn_id=self.gcp_conn_id,
 delegate_to=self.delegate_to,
 )
 
-job_id = self.job_id or f"airflow_{self.task_id}_{int(time())}"
+exec_date = context['execution_date'].isoformat()
+job_id = self.job_id or 
f"airflow_{self.dag_id}_{self.task_id}_{exec_date}"

Review comment:
   I tried this before you LOL









[GitHub] [airflow] albertocalderari commented on a change in pull request #9590: Improve idempotency of BigQueryInsertJobOperator

2020-07-01 Thread GitBox


albertocalderari commented on a change in pull request #9590:
URL: https://github.com/apache/airflow/pull/9590#discussion_r448610696



##
File path: airflow/providers/google/cloud/operators/bigquery.py
##
@@ -1692,32 +1692,52 @@ def prepare_template(self) -> None:
 with open(self.configuration, 'r') as file:
 self.configuration = json.loads(file.read())
 
+def _submit_job(self, hook: BigQueryHook, job_id: str):
+# Submit a new job
+job = hook.insert_job(
+configuration=self.configuration,
+project_id=self.project_id,
+location=self.location,
+job_id=job_id,
+)
+# Start the job and wait for it to complete and get the result.
+job.result()
+return job
+
 def execute(self, context: Any):
 hook = BigQueryHook(
 gcp_conn_id=self.gcp_conn_id,
 delegate_to=self.delegate_to,
 )
 
-job_id = self.job_id or f"airflow_{self.task_id}_{int(time())}"
+exec_date = context['execution_date'].isoformat()
+job_id = self.job_id or 
f"airflow_{self.dag_id}_{self.task_id}_{exec_date}"

Review comment:
   f"airflow_{self.dag_id}_{self.task_id}_{exec_date}" the job won't be 
clearable this way, which might result in unexpected behaviours









[GitHub] [airflow] albertocalderari commented on a change in pull request #9590: Improve idempotency of BigQueryInsertJobOperator

2020-07-01 Thread GitBox


albertocalderari commented on a change in pull request #9590:
URL: https://github.com/apache/airflow/pull/9590#discussion_r448610696



##
File path: airflow/providers/google/cloud/operators/bigquery.py
##
@@ -1692,32 +1692,52 @@ def prepare_template(self) -> None:
 with open(self.configuration, 'r') as file:
 self.configuration = json.loads(file.read())
 
+def _submit_job(self, hook: BigQueryHook, job_id: str):
+# Submit a new job
+job = hook.insert_job(
+configuration=self.configuration,
+project_id=self.project_id,
+location=self.location,
+job_id=job_id,
+)
+# Start the job and wait for it to complete and get the result.
+job.result()
+return job
+
 def execute(self, context: Any):
 hook = BigQueryHook(
 gcp_conn_id=self.gcp_conn_id,
 delegate_to=self.delegate_to,
 )
 
-job_id = self.job_id or f"airflow_{self.task_id}_{int(time())}"
+exec_date = context['execution_date'].isoformat()
+job_id = self.job_id or 
f"airflow_{self.dag_id}_{self.task_id}_{exec_date}"

Review comment:
the job won't be clearable this way, which might result in unexpected 
behaviours









[GitHub] [airflow] albertocalderari commented on a change in pull request #9590: Improve idempotency of BigQueryInsertJobOperator

2020-07-01 Thread GitBox


albertocalderari commented on a change in pull request #9590:
URL: https://github.com/apache/airflow/pull/9590#discussion_r448610390



##
File path: airflow/providers/google/cloud/operators/bigquery.py
##
@@ -1692,32 +1692,52 @@ def prepare_template(self) -> None:
 with open(self.configuration, 'r') as file:
 self.configuration = json.loads(file.read())
 
+def _submit_job(self, hook: BigQueryHook, job_id: str):
+# Submit a new job
+job = hook.insert_job(
+configuration=self.configuration,
+project_id=self.project_id,
+location=self.location,
+job_id=job_id,
+)
+# Start the job and wait for it to complete and get the result.
+job.result()
+return job
+
 def execute(self, context: Any):
 hook = BigQueryHook(
 gcp_conn_id=self.gcp_conn_id,
 delegate_to=self.delegate_to,
 )
 
-job_id = self.job_id or f"airflow_{self.task_id}_{int(time())}"
+exec_date = context['execution_date'].isoformat()
+job_id = self.job_id or 
f"airflow_{self.dag_id}_{self.task_id}_{exec_date}"
+
 try:
-job = hook.insert_job(
-configuration=self.configuration,
-project_id=self.project_id,
-location=self.location,
-job_id=job_id,
-)
-# Start the job and wait for it to complete and get the result.
-job.result()
+# Submit a new job
+job = self._submit_job(hook, job_id)
 except Conflict:
+# If the job already exists retrieve it
 job = hook.get_job(
 project_id=self.project_id,
 location=self.location,
 job_id=job_id,
 )
-# Get existing job and wait for it to be ready
-for time_to_wait in exponential_sleep_generator(initial=10, maximum=120):
-sleep(time_to_wait)
-job.reload()
-if job.done():
-break
+
+if job.done() and job.error_result:
+# The job exists and finished with an error and we are probably rerunning it
+# So we have to make a new job_id because it has to be unique
+job_id = f"{self.job_id}_{int(time())}"

Review comment:
   the job won't be clearable this way, which might result in unexpected behaviours
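   The retry-on-Conflict flow in the diff above can be sketched outside Airflow. The snippet below is a minimal, illustrative sketch (the names `build_job_id`, `submit_idempotent`, `FakeHook`, and the `Conflict` stand-in are hypothetical, not Airflow or google-cloud-bigquery APIs): a deterministic job id keeps reruns idempotent, while a Conflict triggers a lookup of the existing job and, only if that earlier attempt failed, a retry under a fresh timestamped id.

   ```python
   import re
   from time import time


   class Conflict(Exception):
       """Illustrative stand-in for google.api_core.exceptions.Conflict."""


   def build_job_id(dag_id, task_id, execution_date):
       """Deterministic job id; BigQuery ids allow only letters, digits, _ and -."""
       raw = "airflow_{}_{}_{}".format(dag_id, task_id, execution_date)
       return re.sub(r"[^a-zA-Z0-9_-]", "_", raw)


   def submit_idempotent(hook, configuration, job_id):
       """Submit once per job_id; on Conflict reuse the prior job unless it
       finished with an error, in which case retry under a timestamped id."""
       try:
           return hook.insert_job(configuration=configuration, job_id=job_id)
       except Conflict:
           job = hook.get_job(job_id=job_id)
           if job.done() and job.error_result:
               # Prior attempt failed; a rerun needs a new unique id
               return hook.insert_job(
                   configuration=configuration,
                   job_id="{}_{}".format(job_id, int(time())),
               )
           return job
   ```

   Calling `submit_idempotent` twice with the same deterministic id returns the same job the second time instead of submitting a duplicate, which is the idempotency property the PR is after.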









[GitHub] [airflow] VinayGb665 commented on a change in pull request #9549: YAML file supports extra json parameters

2020-07-01 Thread GitBox


VinayGb665 commented on a change in pull request #9549:
URL: https://github.com/apache/airflow/pull/9549#discussion_r448609526



##
File path: tests/secrets/test_local_filesystem.py
##
@@ -226,9 +226,15 @@ def test_missing_file(self, mock_exists):
schema: lschema
login: Login
password: None
-   port: 1234""",
+   port: 1234
+   extra_dejson:

Review comment:
   Added









[GitHub] [airflow] ephraimbuddy commented on a change in pull request #9503: add date-time format validation for API spec

2020-07-01 Thread GitBox


ephraimbuddy commented on a change in pull request #9503:
URL: https://github.com/apache/airflow/pull/9503#discussion_r448607998



##
File path: airflow/api_connexion/endpoints/dag_run_endpoint.py
##
@@ -69,24 +61,24 @@ def get_dag_runs(session, dag_id, start_date_gte=None, start_date_lte=None,
 
 # filter start date
 if start_date_gte:
-query = query.filter(DagRun.start_date >= start_date_gte)
+query = query.filter(DagRun.start_date >= timezone.parse(start_date_gte))

Review comment:
   I personally like the `format_datetime` approach for a few reasons. One is that we were able to take the datetime from the URL without parsing it with quote_plus. Another was that we didn't parse with timezone.parse in view code.
   
   However, I would suggest we use a custom date format in the spec for those areas where we need to use `format_datetime`. The reason is that connexion may add the `rfc3339-validator`, and it would affect us if it gets installed with a new version.
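   As a hedged illustration of the `format_datetime` side of this trade-off, here is a minimal parser sketch using only the standard library. Airflow's real helper lives in `airflow.api_connexion.parameters` and delegates to `timezone.parse`; the name `parse_datetime_param` below is hypothetical.

   ```python
   from datetime import datetime


   def parse_datetime_param(value):
       """Parse an RFC3339-style query parameter into an aware datetime,
       mapping bad input to a ValueError the view layer can turn into a 400."""
       try:
           # fromisoformat on older Pythons rejects a trailing 'Z', so normalize it
           return datetime.fromisoformat(value.replace("Z", "+00:00"))
       except ValueError:
           raise ValueError("Incorrect datetime: {!r}".format(value))
   ```

   Converting at this boundary means the query code below can compare real datetime objects instead of strings.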









[GitHub] [airflow] kaxil opened a new pull request #9617: Update docs about the change to default auth for experimental API

2020-07-01 Thread GitBox


kaxil opened a new pull request #9617:
URL: https://github.com/apache/airflow/pull/9617


   https://github.com/apache/airflow/issues/9611 updated the default but we missed updating our docs
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines) for more information.
   







[GitHub] [airflow] mik-laj commented on a change in pull request #9503: add date-time format validation for API spec

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9503:
URL: https://github.com/apache/airflow/pull/9503#discussion_r448598638



##
File path: airflow/api_connexion/endpoints/dag_run_endpoint.py
##
@@ -69,24 +61,24 @@ def get_dag_runs(session, dag_id, start_date_gte=None, start_date_lte=None,
 
 # filter start date
 if start_date_gte:
-query = query.filter(DagRun.start_date >= start_date_gte)
+query = query.filter(DagRun.start_date >= timezone.parse(start_date_gte))

Review comment:
   To be honest I have mixed feelings. I wanted to remove custom code, but adding rfc3339-validator does not meet all of our requirements. We need to validate and convert parameters, and it is not possible to do that elegantly in connexion.
   
   When we convert these parameters before executing the function logic, we can write more generic code.
   
https://github.com/apache/airflow/blob/37761e39587634bf524342e7cf465b1397320659/airflow/api_connexion/endpoints/task_instance_endpoint.py
   The _apply_range_filter method can be used for datetime and duration. Both types are supported.
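   The generic gte/lte pattern referenced here can be shown without SQLAlchemy. The sketch below works over a plain iterable, under the assumption that `key` is a callable extracting the attribute to compare; the real `_apply_range_filter` applies the same two optional bounds to a SQLAlchemy query and column instead.

   ```python
   def apply_range_filter(items, key, value_range):
       """Keep items whose key(item) falls inside an inclusive (gte, lte) range;
       either bound may be None, mirroring the query helper discussed above."""
       gte_value, lte_value = value_range
       result = list(items)
       if gte_value is not None:
           result = [item for item in result if key(item) >= gte_value]
       if lte_value is not None:
           result = [item for item in result if key(item) <= lte_value]
       return result
   ```

   Because the comparison operators are generic, the same helper serves datetime bounds and float duration bounds alike, which is the point being made about the endpoint code.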
   









[airflow] 01/02: Change default auth for experimental backend to deny_all (#9611)

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 89329c4ac4e98f479003b217981dc4dfde2bf661
Author: Ash Berlin-Taylor 
AuthorDate: Wed Jul 1 17:04:35 2020 +0100

Change default auth for experimental backend to deny_all (#9611)

In a move that should surprise no one, a number of users do not read,
and leave the API wide open by default. Safe is better than powned

(cherry picked from commit 9e305d6b810a2a21e2591a80a80ec41acb3afed0)
---
 UPDATING.md  | 16 
 airflow/config_templates/config.yml  |  6 --
 airflow/config_templates/default_airflow.cfg |  6 --
 3 files changed, 24 insertions(+), 4 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 3dfda58..ec193f9 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -73,6 +73,22 @@ Before 1.10.11 it was possible to edit DagRun State in the `/admin/dagrun/` page
 
 In Airflow 1.10.11+, the user can only choose the states from the list.
 
+### Experimental API will deny all request by default.
+
+The previous default setting was to allow all API requests without authentication, but this poses security
+risks to users who miss this fact. This changes the default for new installs to deny all requests by default.
+
+**Note**: This will not change the behavior for existing installs, please update check your airflow.cfg
+
+If you wish to have the experimental API work, and aware of the risks of 
enabling this without authentication
+(or if you have your own authentication layer in front of Airflow) you can get
+the previous behaviour on a new install by setting this in your airflow.cfg:
+
+```
+[api]
+auth_backend = airflow.api.auth.backend.default
+```
+
 ## Airflow 1.10.10
 
 ### Setting Empty string to a Airflow Variable will return an empty string
diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index f632cd5..0d52426 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -524,11 +524,13 @@
   options:
 - name: auth_backend
   description: |
-How to authenticate users of the API
+How to authenticate users of the API. See
+https://airflow.apache.org/docs/stable/security.html for possible values.
+("airflow.api.auth.backend.default" allows all requests for historic reasons)
   version_added: ~
   type: string
   example: ~
-  default: "airflow.api.auth.backend.default"
+  default: "airflow.api.auth.backend.deny_all"
 - name: lineage
   description: ~
   options:
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index a061d46..63bd3cb 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -274,8 +274,10 @@ endpoint_url = http://localhost:8080
 fail_fast = False
 
 [api]
-# How to authenticate users of the API
-auth_backend = airflow.api.auth.backend.default
+# How to authenticate users of the API. See
+# https://airflow.apache.org/docs/stable/security.html for possible values.
+# ("airflow.api.auth.backend.default" allows all requests for historic reasons)
+auth_backend = airflow.api.auth.backend.deny_all
 
 [lineage]
 # what lineage backend to use



[airflow] 02/02: fixup! Add Changelog for 1.10.11

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d56fdb4072fb98f5b29ee039a398b110ac226fb7
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 21:24:42 2020 +0100

fixup! Add Changelog for 1.10.11
---
 CHANGELOG.txt | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index ca16351..bbf7ef9 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -138,10 +138,10 @@ Improvements
 - Docker Image: Add ADDITIONAL_AIRFLOW_EXTRAS (#9032)
 - Docker Image: Add ADDITIONAL_PYTHON_DEPS (#9031)
 - Remove httplib2 from Google requirements (#9194)
-- Helm Chart: Removes importlib usage (#9613)
+- Helm Chart: Remove importlib usage (#9613)
 - Merging multiple sql operators (#9124)
 - Adds hive as extra in pyhive dependency (#9075)
-
+- Change default auth for experimental backend to deny_all (#9611)
 
 Doc only changes
 



[airflow] branch v1-10-test updated (fae2e63 -> d56fdb4)

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from fae2e63  Add Changelog for 1.10.11
 new 89329c4  Change default auth for experimental backend to deny_all 
(#9611)
 new d56fdb4  fixup! Add Changelog for 1.10.11

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 CHANGELOG.txt|  4 ++--
 UPDATING.md  | 16 
 airflow/config_templates/config.yml  |  6 --
 airflow/config_templates/default_airflow.cfg |  6 --
 4 files changed, 26 insertions(+), 6 deletions(-)



[GitHub] [airflow] Acehaidrey commented on pull request #9472: Add drop_partition functionality for HiveMetastoreHook

2020-07-01 Thread GitBox


Acehaidrey commented on pull request #9472:
URL: https://github.com/apache/airflow/pull/9472#issuecomment-652627140


   @vanka56 
   ```
   Trim Trailing Whitespace..Failed
   - hook id: trailing-whitespace
   - exit code: 1
   - files were modified by this hook
   ```
   can you fix that







[GitHub] [airflow] Acehaidrey commented on pull request #9280: Functionality to shuffle HMS connections to be used by HiveMetastoreHook facilitating load balancing

2020-07-01 Thread GitBox


Acehaidrey commented on pull request #9280:
URL: https://github.com/apache/airflow/pull/9280#issuecomment-652626222


   @vanka56 can you try to rebase and see if that will help







[GitHub] [airflow] mik-laj commented on a change in pull request #9597: [WIP] Add read-only endpoints for task instances

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9597:
URL: https://github.com/apache/airflow/pull/9597#discussion_r448590758



##
File path: airflow/api_connexion/endpoints/task_instance_endpoint.py
##
@@ -14,23 +14,110 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import Any, List, Optional, Tuple
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: https://github.com/apache/airflow/issues/8132
+from sqlalchemy import and_, func, or_
 
+from airflow.api_connexion.exceptions import NotFound
+from airflow.api_connexion.parameters import format_datetime, format_parameters
+from airflow.api_connexion.schemas.task_instance_schema import (
+TaskInstanceCollection, task_instance_collection_schema, task_instance_schema,
+)
+from airflow.models.dagrun import DagRun as DR
+from airflow.models.taskinstance import TaskInstance as TI
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_task_instance(dag_id: str, dag_run_id: str, task_id: str, session=None):
+query = (
+session.query(TI)
+.filter(TI.dag_id == dag_id)
+.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == DR.execution_date))
+.filter(DR.run_id == dag_run_id)
+.filter(TI.task_id == task_id)
+)
+
+task_instance = query.one_or_none()
+
+if task_instance is None:
+raise NotFound("Task instance not found")
+
+return task_instance_schema.dump(task_instance)
 
-def get_task_instance():
-"""
-Get a task instance
-"""
-raise NotImplementedError("Not implemented yet.")
 
+def _apply_array_filter(query, key, values):
+if values is not None:
+query = query.filter(or_(*[key == v for v in values]))
+return query
 
-def get_task_instances():
+
+def _apply_range_filter(query, key, value_range: Tuple[Any, Any]):
+gte_value, lte_value = value_range
+if gte_value is not None:
+query = query.filter(key >= gte_value)
+if lte_value is not None:
+query = query.filter(key <= lte_value)
+return query
+
+
+@format_parameters(
+{
+'start_date_gte': format_datetime,
+'start_date_lte': format_datetime,
+'execution_date_gte': format_datetime,
+'execution_date_lte': format_datetime,
+'end_date_gte': format_datetime,
+'end_date_lte': format_datetime,
+}
+)
+@provide_session
+def get_task_instances(
+limit: int,
+dag_id: Optional[str] = None,
+dag_run_id: Optional[str] = None,
+execution_date_gte: Optional[str] = None,
+execution_date_lte: Optional[str] = None,
+start_date_gte: Optional[str] = None,
+start_date_lte: Optional[str] = None,
+end_date_gte: Optional[str] = None,
+end_date_lte: Optional[str] = None,
+duration_gte: Optional[float] = None,
+duration_lte: Optional[float] = None,
+state: Optional[str] = None,
+pool: Optional[List[str]] = None,
+queue: Optional[List[str]] = None,
+offset: Optional[int] = None,
+session=None,

Review comment:
   It is not correct with pylint if I remember correctly. I didn't specify the type here because SQLAlchemy does a lot of magic, and mypy likes to get lost when it is aware of its existence.









[GitHub] [airflow] mik-laj commented on a change in pull request #9597: [WIP] Add read-only endpoints for task instances

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9597:
URL: https://github.com/apache/airflow/pull/9597#discussion_r448588204



##
File path: airflow/api_connexion/endpoints/task_instance_endpoint.py
##
@@ -14,23 +14,110 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import Any, List, Optional, Tuple
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8132
+from sqlalchemy import and_, func, or_
 
+from airflow.api_connexion.exceptions import NotFound
+from airflow.api_connexion.parameters import format_datetime, format_parameters
+from airflow.api_connexion.schemas.task_instance_schema import (
+TaskInstanceCollection, task_instance_collection_schema, 
task_instance_schema,
+)
+from airflow.models.dagrun import DagRun as DR
+from airflow.models.taskinstance import TaskInstance as TI
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_task_instance(dag_id: str, dag_run_id: str, task_id: str, 
session=None):
+query = (
+session.query(TI)
+.filter(TI.dag_id == dag_id)
+.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == 
DR.execution_date))
+.filter(DR.run_id == dag_run_id)
+.filter(TI.task_id == task_id)
+)
+
+task_instance = query.one_or_none()
+
+if task_instance is None:
+raise NotFound("Task instance not found")
+
+return task_instance_schema.dump(task_instance)
 
-def get_task_instance():
-"""
-Get a task instance
-"""
-raise NotImplementedError("Not implemented yet.")
 
+def _apply_array_filter(query, key, values):
+if values is not None:
+query = query.filter(or_(*[key == v for v in values]))
+return query
 
-def get_task_instances():
+
+def _apply_range_filter(query, key, value_range: Tuple[Any, Any]):
+gte_value, lte_value = value_range
+if gte_value is not None:
+query = query.filter(key >= gte_value)
+if lte_value is not None:
+query = query.filter(key <= lte_value)
+return query
+
+
+@format_parameters(
+{
+'start_date_gte': format_datetime,
+'start_date_lte': format_datetime,
+'execution_date_gte': format_datetime,
+'execution_date_lte': format_datetime,
+'end_date_gte': format_datetime,
+'end_date_lte': format_datetime,
+}
+)
+@provide_session
+def get_task_instances(
+limit: int,
+dag_id: Optional[str] = None,
+dag_run_id: Optional[str] = None,
+execution_date_gte: Optional[str] = None,
+execution_date_lte: Optional[str] = None,
+start_date_gte: Optional[str] = None,
+start_date_lte: Optional[str] = None,
+end_date_gte: Optional[str] = None,
+end_date_lte: Optional[str] = None,
+duration_gte: Optional[float] = None,
+duration_lte: Optional[float] = None,
+state: Optional[str] = None,
+pool: Optional[List[str]] = None,
+queue: Optional[List[str]] = None,
+offset: Optional[int] = None,
+session=None,
+):
 """
-Get list of task instances of DAG.
+Get list of a task instances
 """
-raise NotImplementedError("Not implemented yet.")
+query = session.query(TI)
+
+if dag_id is not None:
+query = query.filter(TI.dag_id == dag_id)
+if dag_run_id is not None:
+query = query.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == DR.execution_date))
+query = query.filter(DR.run_id == dag_run_id)
+
+query = _apply_range_filter(
+query, key=DR.execution_date, value_range=(execution_date_gte, execution_date_lte)
+)
+query = _apply_range_filter(query, key=DR.start_date, value_range=(start_date_gte, start_date_lte))
+query = _apply_range_filter(query, key=DR.end_date, value_range=(end_date_gte, end_date_lte))
+query = _apply_range_filter(query, key=DR.end_date, value_range=(end_date_gte, end_date_lte))

Review comment:
   Good point. I will remove it









[GitHub] [airflow] mik-laj commented on a change in pull request #9597: [WIP] Add read-only endpoints for task instances

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9597:
URL: https://github.com/apache/airflow/pull/9597#discussion_r448588204



##
File path: airflow/api_connexion/endpoints/task_instance_endpoint.py
##
@@ -14,23 +14,110 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import Any, List, Optional, Tuple
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8132
+from sqlalchemy import and_, func, or_
 
+from airflow.api_connexion.exceptions import NotFound
+from airflow.api_connexion.parameters import format_datetime, format_parameters
+from airflow.api_connexion.schemas.task_instance_schema import (
+TaskInstanceCollection, task_instance_collection_schema, 
task_instance_schema,
+)
+from airflow.models.dagrun import DagRun as DR
+from airflow.models.taskinstance import TaskInstance as TI
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_task_instance(dag_id: str, dag_run_id: str, task_id: str, 
session=None):
+query = (
+session.query(TI)
+.filter(TI.dag_id == dag_id)
+.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == 
DR.execution_date))
+.filter(DR.run_id == dag_run_id)
+.filter(TI.task_id == task_id)
+)
+
+task_instance = query.one_or_none()
+
+if task_instance is None:
+raise NotFound("Task instance not found")
+
+return task_instance_schema.dump(task_instance)
 
-def get_task_instance():
-"""
-Get a task instance
-"""
-raise NotImplementedError("Not implemented yet.")
 
+def _apply_array_filter(query, key, values):
+if values is not None:
+query = query.filter(or_(*[key == v for v in values]))
+return query
 
-def get_task_instances():
+
+def _apply_range_filter(query, key, value_range: Tuple[Any, Any]):
+gte_value, lte_value = value_range
+if gte_value is not None:
+query = query.filter(key >= gte_value)
+if lte_value is not None:
+query = query.filter(key <= lte_value)
+return query
+
+
+@format_parameters(
+{
+'start_date_gte': format_datetime,
+'start_date_lte': format_datetime,
+'execution_date_gte': format_datetime,
+'execution_date_lte': format_datetime,
+'end_date_gte': format_datetime,
+'end_date_lte': format_datetime,
+}
+)
+@provide_session
+def get_task_instances(
+limit: int,
+dag_id: Optional[str] = None,
+dag_run_id: Optional[str] = None,
+execution_date_gte: Optional[str] = None,
+execution_date_lte: Optional[str] = None,
+start_date_gte: Optional[str] = None,
+start_date_lte: Optional[str] = None,
+end_date_gte: Optional[str] = None,
+end_date_lte: Optional[str] = None,
+duration_gte: Optional[float] = None,
+duration_lte: Optional[float] = None,
+state: Optional[str] = None,
+pool: Optional[List[str]] = None,
+queue: Optional[List[str]] = None,
+offset: Optional[int] = None,
+session=None,
+):
 """
-Get list of task instances of DAG.
+Get list of a task instances
 """
-raise NotImplementedError("Not implemented yet.")
+query = session.query(TI)
+
+if dag_id is not None:
+query = query.filter(TI.dag_id == dag_id)
+if dag_run_id is not None:
+query = query.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == DR.execution_date))
+query = query.filter(DR.run_id == dag_run_id)
+
+query = _apply_range_filter(
+query, key=DR.execution_date, value_range=(execution_date_gte, execution_date_lte)
+)
+query = _apply_range_filter(query, key=DR.start_date, value_range=(start_date_gte, start_date_lte))
+query = _apply_range_filter(query, key=DR.end_date, value_range=(end_date_gte, end_date_lte))
+query = _apply_range_filter(query, key=DR.end_date, value_range=(end_date_gte, end_date_lte))

Review comment:
   Good point. It should be execution_date.









[GitHub] [airflow] mik-laj commented on a change in pull request #9597: [WIP] Add read-only endpoints for task instances

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9597:
URL: https://github.com/apache/airflow/pull/9597#discussion_r448587966



##
File path: airflow/api_connexion/endpoints/task_instance_endpoint.py
##
@@ -14,23 +14,110 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import Any, List, Optional, Tuple
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8132
+from sqlalchemy import and_, func, or_
 
+from airflow.api_connexion.exceptions import NotFound
+from airflow.api_connexion.parameters import format_datetime, format_parameters
+from airflow.api_connexion.schemas.task_instance_schema import (
+TaskInstanceCollection, task_instance_collection_schema, 
task_instance_schema,
+)
+from airflow.models.dagrun import DagRun as DR
+from airflow.models.taskinstance import TaskInstance as TI
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_task_instance(dag_id: str, dag_run_id: str, task_id: str, 
session=None):
+query = (
+session.query(TI)
+.filter(TI.dag_id == dag_id)
+.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == 
DR.execution_date))
+.filter(DR.run_id == dag_run_id)
+.filter(TI.task_id == task_id)
+)
+
+task_instance = query.one_or_none()
+
+if task_instance is None:
+raise NotFound("Task instance not found")
+
+return task_instance_schema.dump(task_instance)
 
-def get_task_instance():
-"""
-Get a task instance
-"""
-raise NotImplementedError("Not implemented yet.")
 
+def _apply_array_filter(query, key, values):
+if values is not None:
+query = query.filter(or_(*[key == v for v in values]))
+return query
 
-def get_task_instances():
+
+def _apply_range_filter(query, key, value_range: Tuple[Any, Any]):
+gte_value, lte_value = value_range
+if gte_value is not None:
+query = query.filter(key >= gte_value)
+if lte_value is not None:
+query = query.filter(key <= lte_value)
+return query
+
+
+@format_parameters(
+{
+'start_date_gte': format_datetime,
+'start_date_lte': format_datetime,
+'execution_date_gte': format_datetime,
+'execution_date_lte': format_datetime,
+'end_date_gte': format_datetime,
+'end_date_lte': format_datetime,
+}
+)
+@provide_session
+def get_task_instances(
+limit: int,
+dag_id: Optional[str] = None,
+dag_run_id: Optional[str] = None,
+execution_date_gte: Optional[str] = None,
+execution_date_lte: Optional[str] = None,
+start_date_gte: Optional[str] = None,
+start_date_lte: Optional[str] = None,
+end_date_gte: Optional[str] = None,
+end_date_lte: Optional[str] = None,
+duration_gte: Optional[float] = None,
+duration_lte: Optional[float] = None,
+state: Optional[str] = None,
+pool: Optional[List[str]] = None,
+queue: Optional[List[str]] = None,
+offset: Optional[int] = None,
+session=None,

Review comment:
   We use connexion, which fills these parameters based on the API 
specification.
   
https://connexion.readthedocs.io/en/latest/request.html#automatic-parameter-handling
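   Since connexion injects the validated query parameters as keyword arguments, a conversion decorator can sit between connexion and the view function. The snippet below is a simplified sketch of how a `format_parameters`-style decorator can be built; Airflow's actual implementation in `airflow.api_connexion.parameters` may differ in detail.

   ```python
   import functools


   def format_parameters(formatters):
       """Return a decorator that runs each named keyword argument through its
       formatter before the wrapped view function executes."""
       def decorator(func):
           @functools.wraps(func)
           def wrapper(*args, **kwargs):
               for name, formatter in formatters.items():
                   # Only convert parameters that were actually supplied
                   if kwargs.get(name) is not None:
                       kwargs[name] = formatter(kwargs[name])
               return func(*args, **kwargs)
           return wrapper
       return decorator
   ```

   A view decorated with `@format_parameters({'start_date_gte': some_datetime_parser})` then receives a parsed value rather than the raw string connexion extracted from the request.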









[GitHub] [airflow] mik-laj commented on a change in pull request #9597: [WIP] Add read-only endpoints for task instances

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9597:
URL: https://github.com/apache/airflow/pull/9597#discussion_r448587313



##
File path: airflow/api_connexion/endpoints/task_instance_endpoint.py
##
@@ -14,23 +14,110 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import Any, List, Optional, Tuple
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8132
+from sqlalchemy import and_, func, or_
 
+from airflow.api_connexion.exceptions import NotFound
+from airflow.api_connexion.parameters import format_datetime, format_parameters
+from airflow.api_connexion.schemas.task_instance_schema import (
+TaskInstanceCollection, task_instance_collection_schema, 
task_instance_schema,
+)
+from airflow.models.dagrun import DagRun as DR
+from airflow.models.taskinstance import TaskInstance as TI
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_task_instance(dag_id: str, dag_run_id: str, task_id: str, 
session=None):
+query = (
+session.query(TI)
+.filter(TI.dag_id == dag_id)
+.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == 
DR.execution_date))
+.filter(DR.run_id == dag_run_id)
+.filter(TI.task_id == task_id)
+)
+
+task_instance = query.one_or_none()
+
+if task_instance is None:
+raise NotFound("Task instance not found")
+
+return task_instance_schema.dump(task_instance)
 
-def get_task_instance():
-"""
-Get a task instance
-"""
-raise NotImplementedError("Not implemented yet.")
 
+def _apply_array_filter(query, key, values):

Review comment:
   I do not know if I understand correctly. Can you say more?









[GitHub] [airflow] mik-laj commented on a change in pull request #9597: [WIP] Add read-only endpoints for task instances

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9597:
URL: https://github.com/apache/airflow/pull/9597#discussion_r448586807



##
File path: airflow/api_connexion/endpoints/task_instance_endpoint.py
##
@@ -14,23 +14,110 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import Any, List, Optional, Tuple
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8132
+from sqlalchemy import and_, func, or_
 
+from airflow.api_connexion.exceptions import NotFound
+from airflow.api_connexion.parameters import format_datetime, format_parameters
+from airflow.api_connexion.schemas.task_instance_schema import (
+TaskInstanceCollection, task_instance_collection_schema, 
task_instance_schema,
+)
+from airflow.models.dagrun import DagRun as DR
+from airflow.models.taskinstance import TaskInstance as TI
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_task_instance(dag_id: str, dag_run_id: str, task_id: str, 
session=None):
+query = (
+session.query(TI)
+.filter(TI.dag_id == dag_id)
+.join(DR, and_(TI.dag_id == DR.dag_id, TI.execution_date == 
DR.execution_date))
+.filter(DR.run_id == dag_run_id)
+.filter(TI.task_id == task_id)
+)
+
+task_instance = query.one_or_none()
+
+if task_instance is None:
+raise NotFound("Task instance not found")

Review comment:
   Yes. This is handled by connexion.
   https://connexion.readthedocs.io/en/latest/exceptions.html









[GitHub] [airflow] mik-laj commented on a change in pull request #9597: [WIP] Add read-only endpoints for task instances

2020-07-01 Thread GitBox


mik-laj commented on a change in pull request #9597:
URL: https://github.com/apache/airflow/pull/9597#discussion_r448586439



##
File path: airflow/api_connexion/endpoints/task_instance_endpoint.py
##
@@ -14,23 +14,110 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from typing import Any, List, Optional, Tuple
 
-# TODO(mik-laj): We have to implement it.
-# Do you want to help? Please look at: 
https://github.com/apache/airflow/issues/8132
+from sqlalchemy import and_, func, or_
 
+from airflow.api_connexion.exceptions import NotFound
+from airflow.api_connexion.parameters import format_datetime, format_parameters
+from airflow.api_connexion.schemas.task_instance_schema import (
+    TaskInstanceCollection, task_instance_collection_schema, task_instance_schema,
+)
+from airflow.models.dagrun import DagRun as DR
+from airflow.models.taskinstance import TaskInstance as TI
+from airflow.utils.session import provide_session
+
+
+@provide_session
+def get_task_instance(dag_id: str, dag_run_id: str, task_id: str, session=None):

Review comment:
   Inside Airflow, we use execution_date as the primary identifier. However, we voted to use dag_run_id in the API:
   https://lists.apache.org/thread.html/rd4be3829627dcef8b40314c62c041f460992786f3bfcc634d25a6664%40%3Cdev.airflow.apache.org%3E









[GitHub] [airflow] kaxil commented on a change in pull request #9614: Restrict changing XCom values from the Webserver

2020-07-01 Thread GitBox


kaxil commented on a change in pull request #9614:
URL: https://github.com/apache/airflow/pull/9614#discussion_r448578257



##
File path: airflow/www/views.py
##
@@ -2192,12 +2192,11 @@ class XComModelView(AirflowModelView):
 
 datamodel = AirflowModelView.CustomSQLAInterface(XCom)
 
-base_permissions = ['can_add', 'can_list', 'can_edit', 'can_delete']
+base_permissions = ['can_add', 'can_list', 'can_delete']

Review comment:
   Updated in 
https://github.com/apache/airflow/pull/9614/commits/550155c7fb793432521dc279ceca56ea0bd6e30e









[GitHub] [airflow] kaxil commented on a change in pull request #9614: Restrict changing XCom values from the Webserver

2020-07-01 Thread GitBox


kaxil commented on a change in pull request #9614:
URL: https://github.com/apache/airflow/pull/9614#discussion_r448578166



##
File path: airflow/www/views.py
##
@@ -2192,12 +2192,11 @@ class XComModelView(AirflowModelView):
 
 datamodel = AirflowModelView.CustomSQLAInterface(XCom)
 
-base_permissions = ['can_add', 'can_list', 'can_edit', 'can_delete']
+base_permissions = ['can_add', 'can_list', 'can_delete']

Review comment:
   Good point @potiuk 

##
File path: airflow/www/views.py
##
@@ -2192,12 +2192,11 @@ class XComModelView(AirflowModelView):
 
 datamodel = AirflowModelView.CustomSQLAInterface(XCom)
 
-base_permissions = ['can_add', 'can_list', 'can_edit', 'can_delete']
+base_permissions = ['can_add', 'can_list', 'can_delete']

Review comment:
   Good point @potiuk @ashb 









[GitHub] [airflow] pingzh opened a new pull request #9616: local job heartbeat callback should use session from provide_session

2020-07-01 Thread GitBox


pingzh opened a new pull request #9616:
URL: https://github.com/apache/airflow/pull/9616


   ---
   Make sure to mark the boxes below before creating PR: [x]

   - [ ] Description above provides context of the change
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [ ] Target Github ISSUE in description if exists
   - [ ] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).

   ---
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines) for more information.
   







[GitHub] [airflow] Unit03 commented on issue #9595: SparkSubmitOperator only masks one "form" of password arguments

2020-07-01 Thread GitBox


Unit03 commented on issue #9595:
URL: https://github.com/apache/airflow/issues/9595#issuecomment-652601598


   All right, then, PR opened. :)







[GitHub] [airflow] Unit03 commented on a change in pull request #9615: Mask other forms of password arguments in SparkSubmitOperator

2020-07-01 Thread GitBox


Unit03 commented on a change in pull request #9615:
URL: https://github.com/apache/airflow/pull/9615#discussion_r448568444



##
File path: tests/providers/apache/spark/hooks/test_spark_submit.py
##
@@ -748,3 +750,64 @@ def test_k8s_process_on_kill(self, mock_popen, mock_client_method):
         client.delete_namespaced_pod.assert_called_once_with(
             'spark-pi-edf2ace37be7353a958b38733a12f8e6-driver',
             'mynamespace', **kwargs)
+
+
+@pytest.mark.parametrize(
+    ("command", "expected"),
+    (
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--password='secret'"),
+            "spark-submit foo --bar baz --password='**'",
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", '--password="secret"'),
+            'spark-submit foo --bar baz --password="**"',
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--password=secret"),
+            "spark-submit foo --bar baz --password=**",
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--password 'secret'"),
+            "spark-submit foo --bar baz --password '**'",
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--password secret"),
+            "spark-submit foo --bar baz --password **",
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", '--password "secret"'),
+            'spark-submit foo --bar baz --password "**"',
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--secret='secret'"),
+            "spark-submit foo --bar baz --secret='**'",
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--foo.password='secret'"),
+            "spark-submit foo --bar baz --foo.password='**'",
+        ),
+        (
+            ("spark-submit",),
+            "spark-submit",
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--password \"secret'"),
+            "spark-submit foo --bar baz --password \"secret'",
+        ),
+        (
+            ("spark-submit", "foo", "--bar", "baz", "--password 'secret\""),
+            "spark-submit foo --bar baz --password 'secret\"",
+        ),
+    ),
+)
+def test_masks_passwords(command: str, expected: str) -> None:

Review comment:
   I figured it would be a good place to use `pytest.mark.parametrize` - [but it doesn't go along with `unittest.TestCase`](https://docs.pytest.org/en/stable/unittest.html#pytest-features-in-unittest-testcase-subclasses).

   On the other hand, I get it that going with a separate function outside of the `TestSparkSubmitHook` class looks kind of awkward. Which way do we want this test to go?
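For reference, the closest stdlib alternative that does work inside a `TestCase` is `unittest.TestCase.subTest`, which reports each case separately much like `parametrize` does. A rough sketch, where `mask_stub` is a hypothetical stand-in for the hook's `_mask_cmd` (handling only the `--password=value` form, just so the example is self-contained):

```python
import re
import unittest


def mask_stub(command: str) -> str:
    # Hypothetical stand-in for SparkSubmitHook._mask_cmd; the real
    # masking regex lives in the hook and covers more forms.
    return re.sub(r"(password\s*=\s*)\S+", r"\1**", command)


class TestMasking(unittest.TestCase):
    CASES = [
        ("spark-submit foo --password=secret", "spark-submit foo --password=**"),
        ("spark-submit", "spark-submit"),
    ]

    def test_masks_passwords(self):
        # subTest keeps each case's failure separate, usable in a TestCase,
        # at the cost of pytest's per-case collection and reporting.
        for command, expected in self.CASES:
            with self.subTest(command=command):
                self.assertEqual(mask_stub(command), expected)
```

The trade-off is exactly the one raised above: `subTest` keeps everything inside the class, while `parametrize` needs a module-level function.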









[GitHub] [airflow] Unit03 commented on a change in pull request #9615: Mask other forms of password arguments in SparkSubmitOperator

2020-07-01 Thread GitBox


Unit03 commented on a change in pull request #9615:
URL: https://github.com/apache/airflow/pull/9615#discussion_r448566919



##
File path: airflow/providers/apache/spark/hooks/spark_submit.py
##
@@ -237,8 +237,8 @@ def _mask_cmd(self, connection_cmd):
 # Mask any password related fields in application args with key value pair
 # where key contains password (case insensitive), e.g. HivePassword='abc'
 connection_cmd_masked = re.sub(
-r"(\S*?(?:secret|password)\S*?\s*=\s*')[^']*(?=')",
-r'\1**', ' '.join(connection_cmd), flags=re.I)
+r"(\S*?(?:secret|password)\S*?\s*(?:=|\s+)(['\"]?))[^'^\"]+(\2)",

Review comment:
   Does this regular expression deserve some more explaining, possibly in the form of [inline comments](https://docs.python.org/3/library/re.html#re.VERBOSE)?
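As a sketch of what such `re.VERBOSE` inline comments could look like, applied to the pattern quoted above. This is an illustration, not the PR's final code; the `r"\1**\3"` replacement is an assumption inferred from the expected test outputs, and the character class is simplified to `[^'\"]`:

```python
import re

# The quoted masking pattern, rewritten in verbose mode so each piece
# carries an inline comment (whitespace and # comments are ignored by re.VERBOSE).
MASK_PATTERN = re.compile(
    r"""
    (                                # group 1: everything up to the secret value
        \S*?(?:secret|password)\S*?  # an argument name containing secret/password
        \s*(?:=|\s+)                 # '=' or whitespace separating name and value
        (['\"]?)                     # group 2: optional opening quote
    )
    [^'\"]+                          # the secret value itself
    (\2)                             # group 3: closing quote matching the opening one
    """,
    flags=re.IGNORECASE | re.VERBOSE,
)


def mask(cmd: str) -> str:
    # Replace the secret value with '**', keeping the key and quoting intact.
    return MASK_PATTERN.sub(r"\1**\3", cmd)


print(mask("spark-submit foo --bar baz --password='secret'"))
# → spark-submit foo --bar baz --password='**'
```

Because `(\2)` backreferences the opening quote, the same pattern handles quoted and unquoted values without separate alternatives.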









[GitHub] [airflow] boring-cyborg[bot] commented on pull request #9615: Mask other forms of password arguments in SparkSubmitOperator

2020-07-01 Thread GitBox


boring-cyborg[bot] commented on pull request #9615:
URL: https://github.com/apache/airflow/pull/9615#issuecomment-652599302


   Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, pylint and type annotations). Our [pre-commits](https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in `docs/` directory). Adding a new operator? Check this short [guide](https://github.com/apache/airflow/blob/master/docs/howto/custom-operator.rst). Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for testing locally; it’s a heavy docker image but it ships with a working Airflow and a lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
   - Please follow [ASF Code of Conduct](https://www.apache.org/foundation/policies/conduct) for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
   - Be sure to read the [Airflow Coding style](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it better.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://apache-airflow-slack.herokuapp.com/
   







[GitHub] [airflow] Unit03 opened a new pull request #9615: Mask other forms of password arguments in SparkSubmitOperator

2020-07-01 Thread GitBox


Unit03 opened a new pull request #9615:
URL: https://github.com/apache/airflow/pull/9615


   This is a follow-up to #6917. Related: #9595.
   
   ---
   Make sure to mark the boxes below before creating PR: [x]

   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   







[GitHub] [airflow] dossett commented on pull request #9593: Improve handling Dataproc cluster creation with ERROR state

2020-07-01 Thread GitBox


dossett commented on pull request #9593:
URL: https://github.com/apache/airflow/pull/9593#issuecomment-652583992


   @turbaszek Thanks!  One other small behavior that I added in AIRFLOW-3149 was that if the cluster already existed in the DELETING state, the operator would wait for the DELETE to finish and then create a new cluster. (A rare condition, but one we hit in production more than once.)
   
   Thank you very much!







[GitHub] [airflow] ashb commented on a change in pull request #9614: Restrict changing XCom values from the Webserver

2020-07-01 Thread GitBox


ashb commented on a change in pull request #9614:
URL: https://github.com/apache/airflow/pull/9614#discussion_r448542230



##
File path: airflow/www/views.py
##
@@ -2192,12 +2192,11 @@ class XComModelView(AirflowModelView):
 
 datamodel = AirflowModelView.CustomSQLAInterface(XCom)
 
-base_permissions = ['can_add', 'can_list', 'can_edit', 'can_delete']
+base_permissions = ['can_add', 'can_list', 'can_delete']

Review comment:
   Yeah, add has the same potential problems as edit









[jira] [Commented] (AIRFLOW-3152) Kubernetes Pod Operator should support init containers

2020-07-01 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17149583#comment-17149583
 ] 

ASF subversion and git services commented on AIRFLOW-3152:
--

Commit 7ebd91380114d372fe0d114a2189905a4ad8cf70 in airflow's branch 
refs/heads/v1-10-test from Marina Pereira
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=7ebd913 ]

[AIRFLOW-3152] Kubernetes Pod Operator should support init containers. (#6196)

* Add support for init-containers to Kubernetes Pod Operator

Enables start-up related code to be added for an app container in K8s Pod 
operator.

Add new init_container resource that can be attached to the K8s Pod.

* Update init_container and fix tests

Fix the error in init_container and the associated tests.

* Refactor and fix tests

Fix tests for init_containers

* Fix init_container test errors

Remove unused mocks in init_container test

* Fix init_container test errors

Update volume mount object used in init_container test

* Fix init_container test errors

Add the missing volume setup for the init_container test.

* Fix init_container test failure.

Fix the expected result in the init_container test.

* Fix init_container test failures

Update expected results in the init_container tests

* Update the KubernetesPodOperator guide

Update the KubernetesPodOperator guide to document support for init containers

* Update init-container tests

Fix test failures caused due to python versions by sorting the output before asserting.

* Update init-container to use k8s V1Container object

Remove custom object InitContainer.
Allow users to pass List[k8s.V1Container] as init-container in K8sPodOperator

* Add missing init_containers initialization in K8s pod operator

Due to rebase from master, certain sections of the kubernetes_pod_operator.py file were refactored, which led to missing init_containers initialization in the K8s pod operator. Add the missing init_containers initialization in the K8s pod operator. Update kubernetes pod operator configurations in the init container test.

(cherry picked from commit 4e1b0aa7c52e79eeb99698e6190e6f39ca3cab7f)


> Kubernetes Pod Operator should support init containers
> --
>
> Key: AIRFLOW-3152
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3152
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: operators
>Affects Versions: 1.10.1
>Reporter: Sriraam AS
>Priority: Major
>  Labels: kubernetes
> Fix For: 2.0.0
>
>
> The pod generator has support for init containers, but the kubernetes pod 
> operator doesn't support init containers, yet.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[airflow] branch v1-10-test updated: [AIRFLOW-3152] Kubernetes Pod Operator should support init containers. (#6196)

2020-07-01 Thread dimberman
This is an automated email from the ASF dual-hosted git repository.

dimberman pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 7ebd913  [AIRFLOW-3152] Kubernetes Pod Operator should support init 
containers. (#6196)
7ebd913 is described below

commit 7ebd91380114d372fe0d114a2189905a4ad8cf70
Author: Marina Pereira <46900505+marinajpere...@users.noreply.github.com>
AuthorDate: Tue Dec 17 00:14:24 2019 -0500

[AIRFLOW-3152] Kubernetes Pod Operator should support init containers. 
(#6196)

* Add support for init-containers to Kubernetes Pod Operator

Enables start-up related code to be added for an app container in K8s Pod 
operator.

Add new init_container resource that can be attached to the K8s Pod.

* Update init_container and fix tests

Fix the error in init_container and the associated tests.

* Refactor and fix tests

Fix tests for init_containers

* Fix init_container test errors

Remove unused mocks in init_container test

* Fix init_container test errors

Update volume mount object used in init_container test

* Fix init_container test errors

Add the missing volume setup for the init_container test.

* Fix init_container test failure.

Fix the expected result in the init_container test.

* Fix init_container test failures

Update expected results in the init_container tests

* Update the KubernetesPodOperator guide

Update the KubernetesPodOperator guide to document support for init 
containers

* Update init-container tests

Fix test failures caused due to python versions by sorting the output before asserting.

* Update init-container to use k8s V1Container object

Remove custom object InitContainer.
Allow users to pass List[k8s.V1Container] as init-container in 
K8sPodOperator

* Add missing init_containers initialization in K8s pod operator

Due to rebase from master, certain sections of the kubernetes_pod_operator.py file were refactored, which led to missing init_containers initialization in the K8s pod operator. Add the missing init_containers initialization in the K8s pod operator. Update kubernetes pod operator configurations in the init container test.

(cherry picked from commit 4e1b0aa7c52e79eeb99698e6190e6f39ca3cab7f)
---
 .../contrib/operators/kubernetes_pod_operator.py   |   5 +-
 docs/howto/operator/kubernetes.rst | 155 +
 2 files changed, 159 insertions(+), 1 deletion(-)

diff --git a/airflow/contrib/operators/kubernetes_pod_operator.py 
b/airflow/contrib/operators/kubernetes_pod_operator.py
index 41f0df3..e64a912 100644
--- a/airflow/contrib/operators/kubernetes_pod_operator.py
+++ b/airflow/contrib/operators/kubernetes_pod_operator.py
@@ -138,6 +138,8 @@ class KubernetesPodOperator(BaseOperator):  # pylint: 
disable=too-many-instance-
 /airflow/xcom/return.json in the container will also be pushed to an
 XCom when the container completes.
 :type do_xcom_push: bool
+:param init_containers: init container for the launched Pod
+:type init_containers: list[kubernetes.client.models.V1Container]
 :param pod_template_file: path to pod template file
 :type pod_template_file: str
 """
@@ -178,11 +180,11 @@ class KubernetesPodOperator(BaseOperator):  # pylint: 
disable=too-many-instance-
  dnspolicy=None,
  schedulername=None,
  full_pod_spec=None,
- init_containers=None,
  log_events_on_failure=False,
  do_xcom_push=False,
  pod_template_file=None,
  priority_class_name=None,
+ init_containers=None,
  *args,
  **kwargs):
 if kwargs.get('xcom_push') is not None:
@@ -224,6 +226,7 @@ class KubernetesPodOperator(BaseOperator):  # pylint: 
disable=too-many-instance-
 self.schedulername = schedulername
 self.full_pod_spec = full_pod_spec
 self.init_containers = init_containers or []
+self.init_containers = init_containers or []
 self.log_events_on_failure = log_events_on_failure
 self.pod_template_file = pod_template_file
 self.priority_class_name = priority_class_name
diff --git a/docs/howto/operator/kubernetes.rst 
b/docs/howto/operator/kubernetes.rst
new file mode 100644
index 000..f612cce
--- /dev/null
+++ b/docs/howto/operator/kubernetes.rst
@@ -0,0 +1,155 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, 

[airflow] branch v1-10-test updated: Fix quarantined tests - TestCliWebServer (#9596)

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new 301a0b8  Fix quarantined tests - TestCliWebServer (#9596)
301a0b8 is described below

commit 301a0b8d7196244956f53970991d91b780dfdf2e
Author: Kamil Breguła 
AuthorDate: Wed Jul 1 19:10:58 2020 +0200

Fix quarantined tests - TestCliWebServer (#9596)
---
 airflow/bin/cli.py  |  24 ++--
 requirements/requirements-python2.7.txt |   9 +-
 requirements/requirements-python3.5.txt |  10 +-
 requirements/requirements-python3.6.txt |  14 +--
 requirements/requirements-python3.7.txt |  12 +-
 requirements/requirements-python3.8.txt |   6 +-
 requirements/setup-2.7.md5  |   2 +-
 requirements/setup-3.5.md5  |   2 +-
 requirements/setup-3.6.md5  |   2 +-
 requirements/setup-3.7.md5  |   2 +-
 requirements/setup-3.8.md5  |   2 +-
 setup.py|   4 +-
 tests/cli/test_cli.py   |   7 +-
 tests/insert_extras.py  |   2 +-
 tests/test_core.py  | 193 
 15 files changed, 167 insertions(+), 124 deletions(-)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index 8334af6..51cf49d 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -800,8 +800,8 @@ class GunicornMonitor(LoggingMixin):
 respectively. Gunicorn guarantees that on TTOU workers are terminated
 gracefully and that the oldest worker is terminated.
 
-:param gunicorn_master_proc:  handle for the main Gunicorn process
-:param num_workers_expected:  Number of workers to run the Gunicorn web server
+:param gunicorn_master_pid: pid of the main Gunicorn process
+:param num_workers_expected: Number of workers to run the Gunicorn web server
 :param master_timeout: Number of seconds the webserver waits before killing gunicorn master that doesn't respond
 :param worker_refresh_interval: Number of seconds to wait before refreshing a batch of workers.
@@ -813,7 +813,7 @@ class GunicornMonitor(LoggingMixin):
 """
 def __init__(
 self,
-gunicorn_master_proc,
+gunicorn_master_pid,
 num_workers_expected,
 master_timeout,
 worker_refresh_interval,
@@ -821,7 +821,7 @@ class GunicornMonitor(LoggingMixin):
 reload_on_plugin_change
 ):
 super(GunicornMonitor, self).__init__()
-self.gunicorn_master_proc = gunicorn_master_proc
+self.gunicorn_master_proc = psutil.Process(gunicorn_master_pid)
 self.num_workers_expected = num_workers_expected
 self.master_timeout = master_timeout
 self.worker_refresh_interval = worker_refresh_interval
@@ -936,14 +936,15 @@ class GunicornMonitor(LoggingMixin):
 """
 Starts monitoring the webserver.
 """
+self.log.debug("Start monitoring gunicorn")
 try:  # pylint: disable=too-many-nested-blocks
 self._wait_until_true(
 lambda: self.num_workers_expected == 
self._get_num_workers_running(),
 timeout=self.master_timeout
 )
 while True:
-if self.gunicorn_master_proc.poll() is not None:
-sys.exit(self.gunicorn_master_proc.returncode)
+if not self.gunicorn_master_proc.is_running():
+sys.exit(1)
 self._check_workers()
 # Throttle loop
 time.sleep(1)
@@ -1122,15 +1123,16 @@ def webserver(args):
 
 gunicorn_master_proc = None
 
-def kill_proc(dummy_signum, dummy_frame):
+def kill_proc(signum, _):
+log.info("Received signal: %s. Closing gunicorn.", signum)
 gunicorn_master_proc.terminate()
 gunicorn_master_proc.wait()
 sys.exit(0)
 
-def monitor_gunicorn(gunicorn_master_proc):
+def monitor_gunicorn(gunicorn_master_pid):
 # These run forever until SIG{INT, TERM, KILL, ...} signal is sent
 GunicornMonitor(
-gunicorn_master_proc=gunicorn_master_proc,
+gunicorn_master_pid=gunicorn_master_pid,
 num_workers_expected=num_workers,
 master_timeout=conf.getint('webserver', 
'web_server_master_timeout'),
 worker_refresh_interval=conf.getint('webserver', 
'worker_refresh_interval', fallback=30),
@@ -1167,7 +1169,7 @@ def webserver(args):
 time.sleep(0.1)
 
 gunicorn_master_proc = psutil.Process(gunicorn_master_proc_pid)
-monitor_gunicorn(gunicorn_master_proc)
+monitor_gunicorn(gunicorn_master_proc.pid)
 
 stdout.close()
 stderr.close()
@@ -1177,7 +1179,7 @@ 
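The refactor above hands the gunicorn master's pid to the monitor and rebuilds the handle with `psutil.Process(pid)`, checking liveness via `is_running()`. For illustration, a dependency-free sketch of an equivalent liveness probe on POSIX systems (not the code in this commit) uses signal 0:

```python
import os


def pid_is_running(pid: int) -> bool:
    """Probe a pid without delivering a real signal: os.kill(pid, 0)
    performs only the existence and permission checks."""
    try:
        os.kill(pid, 0)
    except ProcessLookupError:
        return False   # no such process
    except PermissionError:
        return True    # process exists but is owned by another user
    return True


print(pid_is_running(os.getpid()))  # → True
```

psutil additionally guards against pid reuse, which is why the commit builds a `psutil.Process` once and keeps it rather than re-probing a bare pid.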

[GitHub] [airflow] kaxil merged pull request #9596: Fix quarantined tests - TestCliWebServer (v1-10-test)

2020-07-01 Thread GitBox


kaxil merged pull request #9596:
URL: https://github.com/apache/airflow/pull/9596


   







[GitHub] [airflow] kaxil commented on issue #8717: DagRuns page renders the state column with artifacts

2020-07-01 Thread GitBox


kaxil commented on issue #8717:
URL: https://github.com/apache/airflow/issues/8717#issuecomment-652540406


   Fixed in 
https://github.com/apache/airflow/commit/dbe308c7ea31043d86ac1b563c5590fe6c40bcb5







[GitHub] [airflow] kaxil closed issue #8717: DagRuns page renders the state column with artifacts

2020-07-01 Thread GitBox


kaxil closed issue #8717:
URL: https://github.com/apache/airflow/issues/8717


   







[GitHub] [airflow] j-y-matsubara commented on pull request #9531: Support .airflowignore for plugins

2020-07-01 Thread GitBox


j-y-matsubara commented on pull request #9531:
URL: https://github.com/apache/airflow/pull/9531#issuecomment-652534766


   There was an error.
   I will fix it.







[GitHub] [airflow] potiuk commented on a change in pull request #9614: Restrict changing XCom values from the Webserver

2020-07-01 Thread GitBox


potiuk commented on a change in pull request #9614:
URL: https://github.com/apache/airflow/pull/9614#discussion_r448494205



##
File path: airflow/www/views.py
##
@@ -2192,12 +2192,11 @@ class XComModelView(AirflowModelView):
 
 datamodel = AirflowModelView.CustomSQLAInterface(XCom)
 
-base_permissions = ['can_add', 'can_list', 'can_edit', 'can_delete']
+base_permissions = ['can_add', 'can_list', 'can_delete']

Review comment:
   I think we should remove "can_add" as well? When you restart the task, the xcom is wiped and you will be able to add it then ...









[GitHub] [airflow] j-y-matsubara commented on a change in pull request #9531: Support .airflowignore for plugins

2020-07-01 Thread GitBox


j-y-matsubara commented on a change in pull request #9531:
URL: https://github.com/apache/airflow/pull/9531#discussion_r448459428



##
File path: airflow/utils/file.py
##
@@ -90,6 +90,48 @@ def open_maybe_zipped(fileloc, mode='r'):
 return io.open(fileloc, mode=mode)
 
 
+def find_path_from_directory(
+        base_dir_path: str,
+        ignore_list_file: str) -> Generator[str, None, None]:
+    """
+    Search the files and return the paths of the files that should not be ignored.
+
+    :param base_dir_path: the base path to be searched for.
+    :param ignore_list_file: the name of the file in which regular expression patterns are written.
+
+    :return: file path not to be ignored
+    """
+    patterns_by_dir: Dict[str, List[Pattern[str]]] = {}
+
+    for root, dirs, files in os.walk(str(base_dir_path), followlinks=True):
+        patterns: List[Pattern[str]] = patterns_by_dir.get(root, [])
+
+        ignore_list_file_path = os.path.join(root, ignore_list_file)
+        if os.path.isfile(ignore_list_file_path):
+            with open(ignore_list_file_path, 'r') as file:
+                lines_no_comments = [re.compile(r"\s*#.*").sub("", line) for line in file.read().split("\n")]

Review comment:
   It is not necessary.
   I fixed.
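For illustration, the comment-stripping step in the quoted hunk can be exercised on its own. A small sketch with a hypothetical helper, using the same `\s*#.*` pattern as the PR to drop `#` comments and blank lines before compiling each remaining line as a regex:

```python
import re


def read_ignore_patterns(text: str):
    """Strip '#' comments and blank lines from .airflowignore-style text,
    compiling what remains as regular expressions (sketch of the PR's approach)."""
    patterns = []
    for line in text.split("\n"):
        line = re.sub(r"\s*#.*", "", line).strip()
        if line:
            patterns.append(re.compile(line))
    return patterns


pats = read_ignore_patterns("venv/.*  # third-party\n\n# a comment\nscratch_.*")
print([p.pattern for p in pats])  # → ['venv/.*', 'scratch_.*']
```

Trailing comments and whole-line comments are both removed, since `\s*#.*` matches from the first `#` to the end of the line.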









[GitHub] [airflow] kaxil opened a new pull request #9614: Restrict changing XCom values from the Webserver

2020-07-01 Thread GitBox


kaxil opened a new pull request #9614:
URL: https://github.com/apache/airflow/pull/9614


   Since XCom values can contain pickled data, we no longer allow editing XCom values from the UI.
   
   We don't allow changing DAG files from the UI, so this brings XComs in line with that.
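The motivation is that pickled payloads are opaque binary blobs rather than text a user can safely hand-edit in a form field. A small illustration with plain `pickle`, independent of Airflow's actual XCom serialization:

```python
import pickle

# A typical structured value a task might push as an XCom.
value = {"records": [1, 2, 3], "source": "task_a"}

# Serializing it yields raw bytes, not human-editable text;
# editing those bytes by hand would corrupt the payload.
blob = pickle.dumps(value)

# Round-tripping recovers the original object intact.
restored = pickle.loads(blob)
```

Note also that unpickling untrusted bytes can execute arbitrary code, which is a further reason not to accept free-form edits of pickled values.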
   
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines) for more information.
   







[airflow] branch v1-10-test updated: Restrict editing DagRun State in the old UI (#9612)

2020-07-01 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v1-10-test by this push:
 new dbe308c  Restrict editing DagRun State in the old UI (#9612)
dbe308c is described below

commit dbe308c7ea31043d86ac1b563c5590fe6c40bcb5
Author: Kaxil Naik 
AuthorDate: Wed Jul 1 17:15:07 2020 +0100

Restrict editing DagRun State in the old UI (#9612)

closes #8717
---
 UPDATING.md  | 7 +++
 airflow/www/views.py | 1 -
 2 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/UPDATING.md b/UPDATING.md
index ea8c262..3dfda58 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -66,6 +66,13 @@ https://developers.google.com/style/inclusive-documentation
 
 Now use NULL as default value for dag.description in dag table
 
+### Restrict editing DagRun State in the old UI (Flask-admin based UI)
+
+Before 1.10.11 it was possible to edit DagRun State in the `/admin/dagrun/` page
+ to any text.
+
+In Airflow 1.10.11+, the user can only choose the states from the list.
+
 ## Airflow 1.10.10
 
 ### Setting Empty string to a Airflow Variable will return an empty string
diff --git a/airflow/www/views.py b/airflow/www/views.py
index 8b1e910..a3293c8 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -2818,7 +2818,6 @@ class DagRunModelView(ModelViewOnly):
 verbose_name_plural = "DAG Runs"
 can_edit = True
 can_create = True
-column_editable_list = ('state',)
 verbose_name = "dag run"
 column_default_sort = ('execution_date', True)
 form_choices = {
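The effect of removing `column_editable_list` while keeping `form_choices` is that the state field is set only through a fixed dropdown instead of free-text inline editing. The same constraint, sketched outside Flask-Admin (the state list here is illustrative, not Airflow's complete set of DagRun states):

```python
# Hypothetical allow-list mirroring what a form_choices dropdown enforces.
ALLOWED_DAGRUN_STATES = ("running", "success", "failed")

def set_dagrun_state(state):
    """Accept only a pre-defined state, rejecting arbitrary text."""
    if state not in ALLOWED_DAGRUN_STATES:
        raise ValueError("invalid DagRun state: %r" % (state,))
    return state
```

With inline editing enabled, any string could previously be written into the column; routing every change through a fixed choice list makes invalid states unrepresentable.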



[GitHub] [airflow] kaxil merged pull request #9612: Restrict editing DagRun State in the old UI

2020-07-01 Thread GitBox


kaxil merged pull request #9612:
URL: https://github.com/apache/airflow/pull/9612


   







[GitHub] [airflow] potiuk commented on a change in pull request #9612: Restrict editing DagRun State in the old UI

2020-07-01 Thread GitBox


potiuk commented on a change in pull request #9612:
URL: https://github.com/apache/airflow/pull/9612#discussion_r448471527



##
File path: UPDATING.md
##
@@ -66,6 +66,14 @@ https://developers.google.com/style/inclusive-documentation
 
 Now use NULL as default value for dag.description in dag table
 
+### Restrict editing DagRun State in the old UI (Flask-admin based UI)
+
+Before 1.10.11 it was possible to edit DagRun State in the `/admin/dagrun/` page
+ to any text.
+
+From Airflow 1.10.11, you would still be able to set from dropdown to a pre-defined option
+ but would not allow setting it to any text.

Review comment:
   ```suggestion
   Before 1.10.11 it was possible to edit DagRun State in the `/admin/dagrun/` page
   to any text.
   
   In Airflow 1.10.11+, the user can only choose the states from the list.
   ```









[airflow] branch v1-10-test updated (836f717 -> 4f40b88)

2020-07-01 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 836f717  Fix task and dag stats on home page (#8865)
 add dc9c7fd  Add Production Helm chart support (#8777)
 add d374278  Fix typo in helm chart upgrade command for 2.0 (#9484)
 add c363c1b  Remove non-existent chart value from readme (#9511)
 add dd40f47  Fix typo of resultBackendConnection in chart README (#9537)
 add 93570f3  Remove redundant airflowVersion from Helm Chart readme (#9592)
 add ad618a8  Fix broken link in chart/README.md (#9591)
 add 9253a8f  Switches to Helm Chart for Kubernetes tests (#9468)
 add ec3f4dc  Removes importlib usage - it's not needed (fails on Airflow 1.10) (#9613)
 add 4f40b88  Update Breeze documentation (#9608)

No new revisions were added by this update.

Summary of changes:
 .github/workflows/ci.yml   |  23 +-
 .pre-commit-config.yaml|   2 +-
 BREEZE.rst | 374 --
 CI.rst |   2 +-
 Dockerfile |   4 +
 IMAGES.rst |   3 +
 TESTING.rst|  67 ++--
 airflow/kubernetes/pod_launcher.py |   2 +-
 breeze |  51 ++-
 breeze-complete|  14 +-
 chart/.gitignore   |   9 +
 .../.helmignore|  33 +-
 .readthedocs.yml => chart/Chart.yaml   |  20 +-
 chart/README.md| 270 +
 chart/charts/postgresql-6.3.12.tgz | Bin 22754 -> 0 bytes
 chart/requirements.lock|   6 +
 .../libs/helper.py => chart/requirements.yaml  |  11 +-
 .../LICENSE.txt => chart/templates/NOTES.txt   |  13 +
 chart/templates/_helpers.yaml  | 260 
 chart/templates/cleanup/cleanup-cronjob.yaml   |  67 
 .../templates/cleanup/cleanup-serviceaccount.yaml  |  24 +-
 chart/templates/configmap.yaml | 119 ++
 chart/templates/create-user-job.yaml   |  87 
 chart/templates/flower/flower-deployment.yaml  | 102 +
 chart/templates/flower/flower-networkpolicy.yaml   |  51 +++
 .../templates/flower/flower-service.yaml   |  41 +-
 .../pod.yaml => chart/templates/limitrange.yaml|  33 +-
 .../templates/pgbouncer/pgbouncer-deployment.yaml  | 128 ++
 .../pgbouncer/pgbouncer-networkpolicy.yaml |  69 
 .../pgbouncer/pgbouncer-poddisruptionbudget.yaml   |  56 ++-
 chart/templates/pgbouncer/pgbouncer-service.yaml   |  56 +++
 .../templates/rbac/pod-cleanup-role.yaml   |  34 +-
 .../templates/rbac/pod-cleanup-rolebinding.yaml|  32 +-
 chart/templates/rbac/pod-launcher-role.yaml|  58 +++
 chart/templates/rbac/pod-launcher-rolebinding.yaml |  51 +++
 chart/templates/redis/redis-networkpolicy.yaml |  63 +++
 .../templates/redis/redis-service.yaml |  41 +-
 chart/templates/redis/redis-statefulset.yaml   |  99 +
 .../pod.yaml => chart/templates/resourcequota.yaml |  33 +-
 .../templates/scheduler/scheduler-deployment.yaml  | 195 +
 .../scheduler/scheduler-networkpolicy.yaml |  55 +++
 .../scheduler/scheduler-poddisruptionbudget.yaml   |  39 +-
 .../templates/scheduler/scheduler-service.yaml |  41 +-
 .../scheduler/scheduler-serviceaccount.yaml|  24 +-
 .../templates/secrets/elasticsearch-secret.yaml|  22 +-
 .../templates/secrets/fernetkey-secret.yaml|  27 +-
 .../secrets/metadata-connection-secret.yaml|  42 ++
 .../templates/secrets/pgbouncer-config-secret.yaml |  23 +-
 .../templates/secrets/pgbouncer-stats-secret.yaml  |  22 +-
 chart/templates/secrets/redis-secrets.yaml |  61 +++
 .../templates/secrets/registry-secret.yaml |  24 +-
 .../secrets/result-backend-connection-secret.yaml  |  37 ++
 chart/templates/statsd/statsd-deployment.yaml  |  87 
 chart/templates/statsd/statsd-networkpolicy.yaml   |  57 +++
 chart/templates/statsd/statsd-service.yaml |  56 +++
 .../templates/webserver/webserver-deployment.yaml  | 139 +++
 .../webserver/webserver-networkpolicy.yaml |  51 +++
 .../templates/webserver/webserver-service.yaml |  39 +-
 chart/templates/workers/worker-deployment.yaml | 161 
 chart/templates/workers/worker-kedaautoscaler.yaml |  47 +++
 chart/templates/workers/worker-networkpolicy.yaml  |  53 +++
 .../templates/workers/worker-service.yaml  |  41 +-
 .../templates/workers/worker-serviceaccount.yaml   |  24 +-
 chart/values.yaml  | 436 +
 
