dstandish commented on code in PR #23260:
URL: https://github.com/apache/airflow/pull/23260#discussion_r858942498


##########
RELEASE_NOTES.rst:
##########
@@ -21,6 +21,555 @@
 
 .. towncrier release notes start
 
+Airflow 2.3.0 (2022-04-29)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+Passing ``execution_date`` to ``XCom.set()``\ , ``XCom.clear()``\ , 
``XCom.get_one()``\ , and ``XCom.get_many()`` is deprecated (#19825)

Review Comment:
   ```suggestion
   Passing ``execution_date`` to ``XCom.set()``, ``XCom.clear()``, 
``XCom.get_one()``, and ``XCom.get_many()`` is deprecated (#19825)
   ```



##########
RELEASE_NOTES.rst:
##########
@@ -21,6 +21,555 @@
 
 .. towncrier release notes start
 
+Airflow 2.3.0 (2022-04-29)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+Passing ``execution_date`` to ``XCom.set()``\ , ``XCom.clear()``\ , 
``XCom.get_one()``\ , and ``XCom.get_many()`` is deprecated (#19825)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+Continuing the effort to bind TaskInstance to a DagRun, XCom entries are now 
also tied to a DagRun. Use the ``run_id`` argument to specify the DagRun 
instead.
+
+Task log templates are now read from the metadatabase instead of 
``airflow.cfg`` (#20165)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+  Previously, a task’s log was dynamically rendered from the ``[core] 
log_filename_template`` and ``[elasticsearch] log_id_template`` config values 
at runtime. This resulted in unfortunate characteristics, e.g. it is 
impractical to modify the config values after an Airflow instance has been 
running for a while, since all existing task logs have been saved under the 
previous format and cannot be found with the new values.
+
+  A new ``log_template`` table is introduced to solve this problem. This table 
is synchronised with the aforementioned config values every time Airflow 
starts, and a new field ``log_template_id`` is added to every DAG run to point 
to the format used by tasks (\ ``NULL`` indicates the first ever entry for 
compatibility).

Review Comment:
   ```suggestion
     A new ``log_template`` table is introduced to solve this problem. This 
table is synchronised with the aforementioned config values every time Airflow 
starts, and a new field ``log_template_id`` is added to every DAG run to point 
to the format used by tasks (``NULL`` indicates the first ever entry for 
compatibility).
   ```



##########
RELEASE_NOTES.rst:
##########
@@ -21,6 +21,555 @@
 
 .. towncrier release notes start
 
+Airflow 2.3.0 (2022-04-29)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+Passing ``execution_date`` to ``XCom.set()``\ , ``XCom.clear()``\ , 
``XCom.get_one()``\ , and ``XCom.get_many()`` is deprecated (#19825)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+Continuing the effort to bind TaskInstance to a DagRun, XCom entries are now 
also tied to a DagRun. Use the ``run_id`` argument to specify the DagRun 
instead.
+
+Task log templates are now read from the metadatabase instead of 
``airflow.cfg`` (#20165)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+  Previously, a task’s log was dynamically rendered from the ``[core] 
log_filename_template`` and ``[elasticsearch] log_id_template`` config values 
at runtime. This resulted in unfortunate characteristics, e.g. it is 
impractical to modify the config values after an Airflow instance has been 
running for a while, since all existing task logs have been saved under the 
previous format and cannot be found with the new values.
+
+  A new ``log_template`` table is introduced to solve this problem. This table 
is synchronised with the aforementioned config values every time Airflow 
starts, and a new field ``log_template_id`` is added to every DAG run to point 
to the format used by tasks (\ ``NULL`` indicates the first ever entry for 
compatibility).
+
+Minimum kubernetes library version bumped from ``3.0.0`` to ``21.7.0`` (#20759)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  No change in behavior is expected.  This was necessary in order to take 
advantage of a `bugfix 
<https://github.com/kubernetes-client/python-base/commit/70b78cd8488068c014b6d762a0c8d358273865b4>`_
 concerning refreshing of Kubernetes API tokens with EKS, which enabled the 
removal of some `workaround code 
<https://github.com/apache/airflow/pull/20759>`_.
+
+XCom now defines ``run_id`` instead of ``execution_date`` (#20975)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  As a continuation of the TaskInstance-DagRun relation change started in 
Airflow 2.2, the ``execution_date`` column on XCom has been removed from the 
database and replaced by an `association proxy 
<https://docs.sqlalchemy.org/en/13/orm/extensions/associationproxy.html>`_ 
field at the ORM level. If you access Airflow’s metadatabase directly, you 
should rewrite your implementation to use the ``run_id`` column instead.
+
+  Note that Airflow’s metadatabase definition, on both the database and ORM 
levels, is considered an implementation detail without strict backward 
compatibility guarantees.
+
+Non-JSON-serializable params deprecated (#21135)
+"""""""""""""""""""""""""""""""""""""""""""""""""
+
+  It was previously possible to use dag or task param defaults that were not 
JSON-serializable.
+
+  For example, this worked previously:
+
+  .. code-block:: python
+
+     @dag.task(params={"a": {1, 2, 3}, "b": pendulum.now()})
+     def datetime_param(value):
+         print(value)
+
+
+     datetime_param("{{ params.a }} | {{ params.b }}")
+
+  Note the use of ``set`` and ``datetime`` types, which are not 
JSON-serializable.  This behavior is problematic because overriding these 
values in a dag run conf requires JSON, which makes such params effectively 
non-overridable.  Another problem is that param validation assumes JSON.  Use 
of non-JSON-serializable params will be removed in Airflow 3.0; until then, 
using them will produce a warning at parse time.
+
+You have to use ``postgresql://`` instead of ``postgres://`` in 
``sql_alchemy_conn`` for SQLAlchemy 1.4.0+ (#21205)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  When you use SQLAlchemy 1.4.0+, you need to use ``postgresql://`` as the 
scheme in the ``sql_alchemy_conn``.
+  In previous versions of SQLAlchemy it was possible to use 
``postgres://``\ , but using it in
+  SQLAlchemy 1.4.0+ results in:
+
+  .. code-block::
+
+     >       raise exc.NoSuchModuleError(
+                 "Can't load plugin: %s:%s" % (self.group, name)
+             )
+     E       sqlalchemy.exc.NoSuchModuleError: Can't load plugin: 
sqlalchemy.dialects:postgres
+
+  If you cannot change the scheme of your URL immediately, Airflow continues 
to work with SQLAlchemy
+  1.3 and you can downgrade SQLAlchemy, but we recommend updating the scheme.
+  Details in the `SQLAlchemy Changelog 
<https://docs.sqlalchemy.org/en/14/changelog/changelog_14.html#change-3687655465c25a39b968b4f5f6e9170b>`_.
+
+``auth_backends`` replaces ``auth_backend`` configuration setting (#21472)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  Previously, only one backend was used to authorize use of the REST API. In 
2.3 this was changed to support multiple backends, separated by whitespace. 
Each will be tried in turn until a successful response is returned.
+
+  This setting is also used for the deprecated experimental API, which only 
uses the first option even if multiple are given.
+
+``airflow.models.base.Operator`` is removed (#21505)
+""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  Previously, there was an empty class ``airflow.models.base.Operator`` for 
“type hinting”. This class was never really useful for anything (everything it 
did could be done better with ``airflow.models.baseoperator.BaseOperator``\ ), 
and has been removed. If you are relying on the class’s existence, use 
``BaseOperator`` (for concrete operators), 
``airflow.models.abstractoperator.AbstractOperator`` (the base class of both 
``BaseOperator`` and the AIP-42 ``MappedOperator``\ ), or 
``airflow.models.operator.Operator`` (a union type ``BaseOperator | 
MappedOperator`` for type annotation).
+
+Zip files in the DAGs folder can no longer have a ``.py`` extension (#21538)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  It was previously possible for zip files in the DAGs folder to have any 
extension. Now ``.py`` files are loaded as modules without checking whether 
they are zip files, as this requires less IO. If a ``.py`` file in the DAGs 
folder is a zip-compressed file, parsing it will fail with an exception.
+
+``auth_backends`` includes session (#21640)
+"""""""""""""""""""""""""""""""""""""""""""
+
+  To allow the Airflow UI to use the API, the previous default authorization 
backend ``airflow.api.auth.backend.deny_all`` is changed to 
``airflow.api.auth.backend.session``\ , and this is automatically added to the 
list of API authorization backends if a non-default value is set.

Review Comment:
   ```suggestion
     To allow the Airflow UI to use the API, the previous default authorization 
backend ``airflow.api.auth.backend.deny_all`` is changed to 
``airflow.api.auth.backend.session``, and this is automatically added to the 
list of API authorization backends if a non-default value is set.
   ```



##########
RELEASE_NOTES.rst:
##########
@@ -21,6 +21,555 @@
 
 .. towncrier release notes start
 
+Airflow 2.3.0 (2022-04-29)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+Passing ``execution_date`` to ``XCom.set()``\ , ``XCom.clear()``\ , 
``XCom.get_one()``\ , and ``XCom.get_many()`` is deprecated (#19825)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+Continuing the effort to bind TaskInstance to a DagRun, XCom entries are now 
also tied to a DagRun. Use the ``run_id`` argument to specify the DagRun 
instead.
+
+Task log templates are now read from the metadatabase instead of 
``airflow.cfg`` (#20165)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+  Previously, a task’s log was dynamically rendered from the ``[core] 
log_filename_template`` and ``[elasticsearch] log_id_template`` config values 
at runtime. This resulted in unfortunate characteristics, e.g. it is 
impractical to modify the config values after an Airflow instance has been 
running for a while, since all existing task logs have been saved under the 
previous format and cannot be found with the new values.
+
+  A new ``log_template`` table is introduced to solve this problem. This table 
is synchronised with the aforementioned config values every time Airflow 
starts, and a new field ``log_template_id`` is added to every DAG run to point 
to the format used by tasks (\ ``NULL`` indicates the first ever entry for 
compatibility).
+
+Minimum kubernetes library version bumped from ``3.0.0`` to ``21.7.0`` (#20759)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  No change in behavior is expected.  This was necessary in order to take 
advantage of a `bugfix 
<https://github.com/kubernetes-client/python-base/commit/70b78cd8488068c014b6d762a0c8d358273865b4>`_
 concerning refreshing of Kubernetes API tokens with EKS, which enabled the 
removal of some `workaround code 
<https://github.com/apache/airflow/pull/20759>`_.
+
+XCom now defines ``run_id`` instead of ``execution_date`` (#20975)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  As a continuation of the TaskInstance-DagRun relation change started in 
Airflow 2.2, the ``execution_date`` column on XCom has been removed from the 
database and replaced by an `association proxy 
<https://docs.sqlalchemy.org/en/13/orm/extensions/associationproxy.html>`_ 
field at the ORM level. If you access Airflow’s metadatabase directly, you 
should rewrite your implementation to use the ``run_id`` column instead.
+
+  Note that Airflow’s metadatabase definition, on both the database and ORM 
levels, is considered an implementation detail without strict backward 
compatibility guarantees.
+
+Non-JSON-serializable params deprecated (#21135)
+"""""""""""""""""""""""""""""""""""""""""""""""""
+
+  It was previously possible to use dag or task param defaults that were not 
JSON-serializable.
+
+  For example, this worked previously:
+
+  .. code-block:: python
+
+     @dag.task(params={"a": {1, 2, 3}, "b": pendulum.now()})
+     def datetime_param(value):
+         print(value)
+
+
+     datetime_param("{{ params.a }} | {{ params.b }}")
+
+  Note the use of ``set`` and ``datetime`` types, which are not 
JSON-serializable.  This behavior is problematic because overriding these 
values in a dag run conf requires JSON, which makes such params effectively 
non-overridable.  Another problem is that param validation assumes JSON.  Use 
of non-JSON-serializable params will be removed in Airflow 3.0; until then, 
using them will produce a warning at parse time.
+
+You have to use ``postgresql://`` instead of ``postgres://`` in 
``sql_alchemy_conn`` for SQLAlchemy 1.4.0+ (#21205)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  When you use SQLAlchemy 1.4.0+, you need to use ``postgresql://`` as the 
scheme in the ``sql_alchemy_conn``.
+  In previous versions of SQLAlchemy it was possible to use 
``postgres://``\ , but using it in
+  SQLAlchemy 1.4.0+ results in:
+
+  .. code-block::
+
+     >       raise exc.NoSuchModuleError(
+                 "Can't load plugin: %s:%s" % (self.group, name)
+             )
+     E       sqlalchemy.exc.NoSuchModuleError: Can't load plugin: 
sqlalchemy.dialects:postgres
+
+  If you cannot change the scheme of your URL immediately, Airflow continues 
to work with SQLAlchemy
+  1.3 and you can downgrade SQLAlchemy, but we recommend updating the scheme.
+  Details in the `SQLAlchemy Changelog 
<https://docs.sqlalchemy.org/en/14/changelog/changelog_14.html#change-3687655465c25a39b968b4f5f6e9170b>`_.
+
+``auth_backends`` replaces ``auth_backend`` configuration setting (#21472)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  Previously, only one backend was used to authorize use of the REST API. In 
2.3 this was changed to support multiple backends, separated by whitespace. 
Each will be tried in turn until a successful response is returned.
+
+  This setting is also used for the deprecated experimental API, which only 
uses the first option even if multiple are given.
+
+``airflow.models.base.Operator`` is removed (#21505)
+""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  Previously, there was an empty class ``airflow.models.base.Operator`` for 
“type hinting”. This class was never really useful for anything (everything it 
did could be done better with ``airflow.models.baseoperator.BaseOperator``\ ), 
and has been removed. If you are relying on the class’s existence, use 
``BaseOperator`` (for concrete operators), 
``airflow.models.abstractoperator.AbstractOperator`` (the base class of both 
``BaseOperator`` and the AIP-42 ``MappedOperator``\ ), or 
``airflow.models.operator.Operator`` (a union type ``BaseOperator | 
MappedOperator`` for type annotation).

Review Comment:
   ```suggestion
     Previously, there was an empty class ``airflow.models.base.Operator`` for 
“type hinting”. This class was never really useful for anything (everything it 
did could be done better with ``airflow.models.baseoperator.BaseOperator``), 
and has been removed. If you are relying on the class’s existence, use 
``BaseOperator`` (for concrete operators), 
``airflow.models.abstractoperator.AbstractOperator`` (the base class of both 
``BaseOperator`` and the AIP-42 ``MappedOperator``), or 
``airflow.models.operator.Operator`` (a union type ``BaseOperator | 
MappedOperator`` for type annotation).
   ```



##########
RELEASE_NOTES.rst:
##########
@@ -21,6 +21,555 @@
 
 .. towncrier release notes start
 
+Airflow 2.3.0 (2022-04-29)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+Passing ``execution_date`` to ``XCom.set()``\ , ``XCom.clear()``\ , 
``XCom.get_one()``\ , and ``XCom.get_many()`` is deprecated (#19825)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+Continuing the effort to bind TaskInstance to a DagRun, XCom entries are now 
also tied to a DagRun. Use the ``run_id`` argument to specify the DagRun 
instead.
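
  For illustration, a minimal sketch of the new calling convention (the dag, 
task, and run identifiers here are hypothetical, and ``XCom.get_one`` is 
assumed to be invoked with an active metadatabase session):

  .. code-block:: python

     from airflow.models import XCom

     # Deprecated: scoping the lookup by execution_date
     # value = XCom.get_one(key="my_key", dag_id="demo", task_id="push", execution_date=dt)

     # Preferred: scope the lookup to a DagRun via run_id
     value = XCom.get_one(
         key="my_key",
         dag_id="demo",
         task_id="push",
         run_id="scheduled__2022-04-01T00:00:00+00:00",
     )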
+
+Task log templates are now read from the metadatabase instead of 
``airflow.cfg`` (#20165)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+  Previously, a task’s log was dynamically rendered from the ``[core] 
log_filename_template`` and ``[elasticsearch] log_id_template`` config values 
at runtime. This resulted in unfortunate characteristics, e.g. it is 
impractical to modify the config values after an Airflow instance has been 
running for a while, since all existing task logs have been saved under the 
previous format and cannot be found with the new values.
+
+  A new ``log_template`` table is introduced to solve this problem. This table 
is synchronised with the aforementioned config values every time Airflow 
starts, and a new field ``log_template_id`` is added to every DAG run to point 
to the format used by tasks (\ ``NULL`` indicates the first ever entry for 
compatibility).
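
  For illustration, one hypothetical way to inspect the synchronised entries 
through the ORM, assuming direct metadatabase access and that the model is 
exposed as ``airflow.models.tasklog.LogTemplate``:

  .. code-block:: python

     from airflow.models.tasklog import LogTemplate
     from airflow.utils.session import create_session

     with create_session() as session:
         for tmpl in session.query(LogTemplate).order_by(LogTemplate.id):
             # each row records the filename and elasticsearch id formats
             # that were in effect when Airflow started
             print(tmpl.id, tmpl.filename, tmpl.elasticsearch_id)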
+
+Minimum kubernetes library version bumped from ``3.0.0`` to ``21.7.0`` (#20759)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  No change in behavior is expected.  This was necessary in order to take 
advantage of a `bugfix 
<https://github.com/kubernetes-client/python-base/commit/70b78cd8488068c014b6d762a0c8d358273865b4>`_
 concerning refreshing of Kubernetes API tokens with EKS, which enabled the 
removal of some `workaround code 
<https://github.com/apache/airflow/pull/20759>`_.
+
+XCom now defines ``run_id`` instead of ``execution_date`` (#20975)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  As a continuation of the TaskInstance-DagRun relation change started in 
Airflow 2.2, the ``execution_date`` column on XCom has been removed from the 
database and replaced by an `association proxy 
<https://docs.sqlalchemy.org/en/13/orm/extensions/associationproxy.html>`_ 
field at the ORM level. If you access Airflow’s metadatabase directly, you 
should rewrite your implementation to use the ``run_id`` column instead.
+
+  Note that Airflow’s metadatabase definition, on both the database and ORM 
levels, is considered an implementation detail without strict backward 
compatibility guarantees.
+
+Non-JSON-serializable params deprecated (#21135)
+"""""""""""""""""""""""""""""""""""""""""""""""""
+
+  It was previously possible to use dag or task param defaults that were not 
JSON-serializable.
+
+  For example, this worked previously:
+
+  .. code-block:: python
+
+     @dag.task(params={"a": {1, 2, 3}, "b": pendulum.now()})
+     def datetime_param(value):
+         print(value)
+
+
+     datetime_param("{{ params.a }} | {{ params.b }}")
+
+  Note the use of ``set`` and ``datetime`` types, which are not 
JSON-serializable.  This behavior is problematic because overriding these 
values in a dag run conf requires JSON, which makes such params effectively 
non-overridable.  Another problem is that param validation assumes JSON.  Use 
of non-JSON-serializable params will be removed in Airflow 3.0; until then, 
using them will produce a warning at parse time.
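
  A JSON-serializable rewrite of the example above could look like the 
following sketch, where a list replaces the ``set`` and an ISO-format string 
replaces the ``datetime``:

  .. code-block:: python

     @dag.task(params={"a": [1, 2, 3], "b": pendulum.now().isoformat()})
     def datetime_param(value):
         print(value)


     datetime_param("{{ params.a }} | {{ params.b }}")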
+
+You have to use ``postgresql://`` instead of ``postgres://`` in 
``sql_alchemy_conn`` for SQLAlchemy 1.4.0+ (#21205)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  When you use SQLAlchemy 1.4.0+, you need to use ``postgresql://`` as the 
scheme in the ``sql_alchemy_conn``.
+  In previous versions of SQLAlchemy it was possible to use 
``postgres://``\ , but using it in
+  SQLAlchemy 1.4.0+ results in:
+
+  .. code-block::
+
+     >       raise exc.NoSuchModuleError(
+                 "Can't load plugin: %s:%s" % (self.group, name)
+             )
+     E       sqlalchemy.exc.NoSuchModuleError: Can't load plugin: 
sqlalchemy.dialects:postgres
+
+  If you cannot change the scheme of your URL immediately, Airflow continues 
to work with SQLAlchemy
+  1.3 and you can downgrade SQLAlchemy, but we recommend updating the scheme.
+  Details in the `SQLAlchemy Changelog 
<https://docs.sqlalchemy.org/en/14/changelog/changelog_14.html#change-3687655465c25a39b968b4f5f6e9170b>`_.
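
  To illustrate the scheme change (credentials are placeholders; 
``create_engine`` loads the dialect eagerly, so the old scheme fails 
immediately on 1.4.0+, while the new one assumes a driver such as 
``psycopg2`` is installed):

  .. code-block:: python

     from sqlalchemy import create_engine

     # Raises sqlalchemy.exc.NoSuchModuleError on SQLAlchemy 1.4.0+:
     # engine = create_engine("postgres://user:pass@localhost:5432/airflow")

     # Works on both 1.3 and 1.4.0+:
     engine = create_engine("postgresql://user:pass@localhost:5432/airflow")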
+
+``auth_backends`` replaces ``auth_backend`` configuration setting (#21472)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  Previously, only one backend was used to authorize use of the REST API. In 
2.3 this was changed to support multiple backends, separated by whitespace. 
Each will be tried in turn until a successful response is returned.
+
+  This setting is also used for the deprecated experimental API, which only 
uses the first option even if multiple are given.
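
  For illustration, the combined value can be read programmatically, assuming 
the option lives under ``[api] auth_backends``:

  .. code-block:: python

     from airflow.configuration import conf

     # the raw option value; multiple backends are listed in this one string
     print(conf.get("api", "auth_backends"))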
+
+``airflow.models.base.Operator`` is removed (#21505)
+""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  Previously, there was an empty class ``airflow.models.base.Operator`` for 
“type hinting”. This class was never really useful for anything (everything it 
did could be done better with ``airflow.models.baseoperator.BaseOperator``\ ), 
and has been removed. If you are relying on the class’s existence, use 
``BaseOperator`` (for concrete operators), 
``airflow.models.abstractoperator.AbstractOperator`` (the base class of both 
``BaseOperator`` and the AIP-42 ``MappedOperator``\ ), or 
``airflow.models.operator.Operator`` (a union type ``BaseOperator | 
MappedOperator`` for type annotation).
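
  Concretely, the migration is a matter of choosing the right import. A 
sketch; pick whichever variant matches your use case:

  .. code-block:: python

     # Before (removed):
     # from airflow.models.base import Operator

     # After, depending on intent:
     from airflow.models.baseoperator import BaseOperator  # concrete operators
     from airflow.models.abstractoperator import AbstractOperator  # shared base class
     from airflow.models.operator import Operator  # union type, for annotations only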
+
+Zip files in the DAGs folder can no longer have a ``.py`` extension (#21538)
+""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  It was previously possible for zip files in the DAGs folder to have any 
extension. Now ``.py`` files are loaded as modules without checking whether 
they are zip files, as this requires less IO. If a ``.py`` file in the DAGs 
folder is a zip-compressed file, parsing it will fail with an exception.
+
+``auth_backends`` includes session (#21640)
+"""""""""""""""""""""""""""""""""""""""""""
+
+  To allow the Airflow UI to use the API, the previous default authorization 
backend ``airflow.api.auth.backend.deny_all`` is changed to 
``airflow.api.auth.backend.session``\ , and this is automatically added to the 
list of API authorization backends if a non-default value is set.
+
+Default templates for log filenames and elasticsearch log_id changed (#21734)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+  In order to support Dynamic Task Mapping, the default templates for per-task 
instance logging have changed. If your config contains the old default values, 
they will be upgraded in place.
+
+  If you are happy with the new config values, you should *remove* the settings 
in ``airflow.cfg`` and let the default values be used. Old default values were:
+
+
+  * ``[core] log_filename_template``\ : ``{{ ti.dag_id }}/{{ ti.task_id }}/{{ 
ts }}/{{ try_number }}.log``
+  * ``[elasticsearch] log_id_template``\ : 
``{dag_id}-{task_id}-{execution_date}-{try_number}``

Review Comment:
   ```suggestion
     * ``[core] log_filename_template``: ``{{ ti.dag_id }}/{{ ti.task_id }}/{{ 
ts }}/{{ try_number }}.log``
     * ``[elasticsearch] log_id_template``: 
``{dag_id}-{task_id}-{execution_date}-{try_number}``
   ```
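
As a side note on the old defaults quoted above, here is a quick, hypothetical 
way to see what a running instance currently resolves these options to (on 
recent configs ``log_filename_template`` may live under ``[logging]`` rather 
than ``[core]``):

```python
from airflow.configuration import conf

# conf.get falls back through deprecated section names where applicable
print(conf.get("logging", "log_filename_template"))
print(conf.get("elasticsearch", "log_id_template"))
```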



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
