potiuk commented on pull request #15705:
URL: https://github.com/apache/airflow/pull/15705#issuecomment-834197690


   Random failure - @ashb do you want to take a look?
   
   Just to be sure, I ran `git log --oneline 2.0.2..HEAD -- airflow/` to filter out only `airflow` changes:
   ```
   492f84322 Adds interactivity when generating provider documentation. (#15518)
   4423ea7fc Use Pip 21.* to install airflow officially (#15513)
   ecff91caa Clarifies installation/runtime options for CI/PROD images. (#15320)
   19e41577f Fix critical CeleryKubernetesExecutor bug (#13247)
   ```
   
   And then `git show --oneline 2.0.2..HEAD -- airflow/` (filtering out the provider changes that were part of some doc fixes) - those are the changes.
   
   They are mainly about:
   
   * a new "additional-extras" field in `provider.yaml.schema.json` (not used in airflow itself - we have a runtime schema in airflow), part of the pip 21 change for providers
   * removing the pip warning from the README
   * pylint exclusion fixes to make the new version of pylint happy
   * the default value change for the dagbag import timeout (30.0 -> 30), needed to make migration from 1.10 work smoothly (the timeout was an int in 1.10 and some tests failed for our builds because of it). I can revert this default for 2.0.3 if needed, but I think it is very safe.
   * the only real fix - the critical fix to `CeleryKubernetesExecutor`
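For context on that last point: the `job_id` part of the `CeleryKubernetesExecutor` fix follows a property/setter propagation pattern - the wrapper stores the id itself and pushes every assignment down to both wrapped executors. A minimal sketch of that pattern (using hypothetical stub classes, not Airflow's real executors):

```python
# Sketch of the job_id propagation pattern from the CeleryKubernetesExecutor fix.
# StubExecutor and WrapperExecutor are illustrative stand-ins, not Airflow classes.
from typing import Optional


class StubExecutor:
    """Stand-in for a real executor where job_id is a plain attribute."""

    def __init__(self) -> None:
        self.job_id: Optional[str] = None


class WrapperExecutor:
    """Wraps two executors; not an executor itself, so job_id is a property
    with a custom setter that keeps both wrapped executors in sync."""

    def __init__(self, celery_executor: StubExecutor, kubernetes_executor: StubExecutor) -> None:
        self._job_id: Optional[str] = None
        self.celery_executor = celery_executor
        self.kubernetes_executor = kubernetes_executor

    @property
    def job_id(self) -> Optional[str]:
        return self._job_id

    @job_id.setter
    def job_id(self, value: Optional[str]) -> None:
        # The scheduler assigns job_id on the wrapper only, so the setter
        # must propagate it to the wrapped executors.
        self._job_id = value
        self.celery_executor.job_id = value
        self.kubernetes_executor.job_id = value


wrapper = WrapperExecutor(StubExecutor(), StubExecutor())
wrapper.job_id = "42"
print(wrapper.celery_executor.job_id, wrapper.kubernetes_executor.job_id)  # prints: 42 42
```

Without the custom setter, assigning to `job_id` on the wrapper would leave both inner executors with a stale (`None`) id, which is the bug the fix addresses.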
   
   ```diff
   4423ea7fc Use Pip 21.* to install airflow officially (#15513)
   diff --git a/airflow/provider.yaml.schema.json b/airflow/provider.yaml.schema.json
   index bdec41dc3..458ee2ef2 100644
   --- a/airflow/provider.yaml.schema.json
   +++ b/airflow/provider.yaml.schema.json
   @@ -191,6 +191,10 @@
          "items": {
            "type": "string"
          }
   +    },
   +    "additional-extras": {
   +      "type": "object",
   +      "description": "Additional extras that the provider should have"
        }
      },
      "additionalProperties": false,
   diff --git a/airflow/providers/apache/beam/README.md b/airflow/providers/apache/beam/README.md
   index 34f2863b5..d9294d2c1 100644
   --- a/airflow/providers/apache/beam/README.md
   +++ b/airflow/providers/apache/beam/README.md
   @@ -41,14 +41,6 @@ are in `airflow.providers.apache.beam` python package.
    
    ## Installation
    
   -NOTE!
   -
   -On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
   -does not yet work with Apache Airflow and might lead to errors in installation - depends on your choice
   -of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
   -`pip install --upgrade pip==20.2.4` or, in case you use Pip 20.3, you need to add option
   -`--use-deprecated legacy-resolver` to your pip install command.
   -
    You can install this package on top of an existing airflow 2.* installation via
    `pip install apache-airflow-providers-apache-beam`
    
   diff --git a/airflow/providers/apache/hive/transfers/mssql_to_hive.py b/airflow/providers/apache/hive/transfers/mssql_to_hive.py
   index 5e5af8c5d..090a70285 100644
   --- a/airflow/providers/apache/hive/transfers/mssql_to_hive.py
   +++ b/airflow/providers/apache/hive/transfers/mssql_to_hive.py
   @@ -15,7 +15,7 @@
    # KIND, either express or implied.  See the License for the
    # specific language governing permissions and limitations
    # under the License.
   -
   +# pylint: disable=no-member
    """This module contains operator to move data from MSSQL to Hive."""
    
    from collections import OrderedDict
   diff --git a/airflow/providers/microsoft/mssql/hooks/mssql.py b/airflow/providers/microsoft/mssql/hooks/mssql.py
   index 4acdb52d1..928d0c45d 100644
   --- a/airflow/providers/microsoft/mssql/hooks/mssql.py
   +++ b/airflow/providers/microsoft/mssql/hooks/mssql.py
   @@ -15,6 +15,7 @@
    # KIND, either express or implied.  See the License for the
    # specific language governing permissions and limitations
    # under the License.
   +# pylint: disable=no-member
    """Microsoft SQLServer hook module"""
    
    import pymssql
   ecff91caa Clarifies installation/runtime options for CI/PROD images. (#15320)
   diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
   index c92acf8d2..cbb90a6bb 100644
   --- a/airflow/config_templates/config.yml
   +++ b/airflow/config_templates/config.yml
   @@ -242,7 +242,7 @@
          version_added: ~
          type: float
          example: ~
   -      default: "30.0"
   +      default: "30"
        - name: dagbag_import_error_tracebacks
          description: |
            Should a traceback be shown in the UI for dagbag import errors,
   diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
   index bc4d54a0b..0b1aa493b 100644
   --- a/airflow/config_templates/default_airflow.cfg
   +++ b/airflow/config_templates/default_airflow.cfg
   @@ -147,7 +147,7 @@ fernet_key = {FERNET_KEY}
    donot_pickle = True
    
    # How long before timing out a python file import
   -dagbag_import_timeout = 30.0
   +dagbag_import_timeout = 30
    
    # Should a traceback be shown in the UI for dagbag import errors,
    # instead of just the exception message
   19e41577f Fix critical CeleryKubernetesExecutor bug (#13247)
   diff --git a/airflow/executors/celery_kubernetes_executor.py b/airflow/executors/celery_kubernetes_executor.py
   index 721db317b..b5146801a 100644
   --- a/airflow/executors/celery_kubernetes_executor.py
   +++ b/airflow/executors/celery_kubernetes_executor.py
   @@ -38,6 +38,7 @@ class CeleryKubernetesExecutor(LoggingMixin):
    
        def __init__(self, celery_executor, kubernetes_executor):
            super().__init__()
   +        self._job_id: Optional[str] = None
            self.celery_executor = celery_executor
            self.kubernetes_executor = kubernetes_executor
    
   @@ -54,11 +55,31 @@ class CeleryKubernetesExecutor(LoggingMixin):
            """Return running tasks from celery and kubernetes executor"""
            return self.celery_executor.running.union(self.kubernetes_executor.running)
    
   +    @property
   +    def job_id(self):
   +        """
   +        This is a class attribute in BaseExecutor but since this is not really an executor, but a wrapper
   +        of executors we implement as property so we can have custom setter.
   +        """
   +        return self._job_id
   +
   +    @job_id.setter
   +    def job_id(self, value):
   +        """job_id is manipulated by SchedulerJob.  We must propagate the 
job_id to wrapped executors."""
   +        self._job_id = value
   +        self.kubernetes_executor.job_id = value
   +        self.celery_executor.job_id = value
   +
        def start(self) -> None:
            """Start celery and kubernetes executor"""
            self.celery_executor.start()
            self.kubernetes_executor.start()
    
   +    @property
   +    def slots_available(self):
   +        """Number of new tasks this executor instance can accept"""
   +        return self.celery_executor.slots_available
   +
        def queue_command(
            self,
            task_instance: TaskInstance,
   ```
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

