kaxil commented on a change in pull request #7166: [AIRFLOW-XXXX] Move UPDATING changes into correct versions
URL: https://github.com/apache/airflow/pull/7166#discussion_r366702859
 
 

 ##########
 File path: UPDATING.md
 ##########
 @@ -938,17 +838,93 @@ The 'properties' and 'jars' properties for the Dataproc related operators (`Data
 and `dataproc_jars` respectively.
 Arguments for dataproc_properties dataproc_jars
 
-### Failure callback will be called when task is marked failed
-When task is marked failed by user or task fails due to system failures - on failure call back will be called as part of clean up
+## Airflow 1.10.7
 
-See [AIRFLOW-5621](https://jira.apache.org/jira/browse/AIRFLOW-5621) for details
+### Changes in experimental API execution_date microseconds replacement
+
+The default behavior was to strip the microseconds (and milliseconds, etc.) off of all dag runs
+triggered by the experimental REST API. This default changes when an explicit execution_date is
+passed in the request body: the microseconds are then kept, and the old stripping behavior can be
+restored by sending `replace_microseconds=true` along with the explicit execution_date.
+It is also now possible to have the execution_date generated but keep the microseconds by
+sending `replace_microseconds=false` in the request body.
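
For illustration, a minimal sketch of a trigger request against the experimental endpoint (the webserver URL and DAG id are placeholders, and the endpoint path is assumed from the 1.10.x experimental API):
```python
import requests

# Let the server generate the execution_date but keep its microseconds
# by sending replace_microseconds=false in the request body.
resp = requests.post(
    "http://localhost:8080/api/experimental/dags/example_dag/dag_runs",
    json={"replace_microseconds": "false"},
)
print(resp.json())
```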
+
+### Viewer won't have edit permissions on DAG view.
+
+### Renamed "extra" requirements for cloud providers
+
+Subpackages for specific services have been combined into one variant for
+each cloud provider. The name of the subpackage for the Google Cloud Platform
+has changed to `gcp` to follow this style.
+
+If you want to install integration for Microsoft Azure, then instead of
+```
+pip install 'apache-airflow[azure_blob_storage,azure_data_lake,azure_cosmos,azure_container_instances]'
+```
+you should execute `pip install 'apache-airflow[azure]'`
+
+If you want to install integration for Amazon Web Services, then instead of
+`pip install 'apache-airflow[s3,emr]'`, you should execute `pip install 'apache-airflow[aws]'`
+
+If you want to install integration for Google Cloud Platform, then instead of
+`pip install 'apache-airflow[gcp_api]'`, you should execute `pip install 'apache-airflow[gcp]'`.
+The old way will work until the release of Airflow 2.1.
+
+## Airflow 1.10.6
+
+### BaseOperator::render_template function signature changed
+
+Previous versions of the `BaseOperator::render_template` function required an `attr` argument as the first
+positional argument, along with `content` and `context`. This function signature was changed in 1.10.6 and
+the `attr` argument is no longer required (or accepted).
+
+In order to use this function in subclasses of the `BaseOperator`, the `attr` argument must be removed:
+```python
+result = self.render_template('myattr', self.myattr, context)  # Pre-1.10.6 call
+...
+result = self.render_template(self.myattr, context)  # Post-1.10.6 call
+```
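
As a fuller illustration (the operator class, attribute name, and logging call here are hypothetical, not part of the change itself), a subclass that renders one of its own attributes would now use the two-argument form:
```python
from airflow.models import BaseOperator


class MyOperator(BaseOperator):

    def __init__(self, myattr, *args, **kwargs):
        super(MyOperator, self).__init__(*args, **kwargs)
        self.myattr = myattr

    def execute(self, context):
        # Pre-1.10.6 this was: self.render_template('myattr', self.myattr, context)
        rendered = self.render_template(self.myattr, context)
        self.log.info("Rendered value: %s", rendered)
```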
+
+### Changes to `aws_default` Connection's default region
+
+The region of Airflow's default connection to AWS (`aws_default`) was previously
+set to `us-east-1` during installation.
+
+The region now needs to be set manually, either in the connection screens in
+Airflow, via the `~/.aws` config files, or via the `AWS_DEFAULT_REGION` environment
+variable.
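
For example, to set it via the environment variable for the Airflow processes (the region value is only a placeholder):
```
export AWS_DEFAULT_REGION=eu-west-1
```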
+
+### Some DAG Processing metrics have been renamed
+
+The following metrics are deprecated and won't be emitted in Airflow 2.0:
+
+- `scheduler.dagbag.errors` and `dagbag_import_errors` -- use `dag_processing.import_errors` instead
+- `dag_file_processor_timeouts` -- use `dag_processing.processor_timeouts` instead
+- `collect_dags` -- use `dag_processing.total_parse_time` instead
+- `dag.loading-duration.<basename>` -- use `dag_processing.last_duration.<basename>` instead
+- `dag_processing.last_runtime.<basename>` -- use `dag_processing.last_duration.<basename>` instead
 
 ## Airflow 1.10.5
 
 No breaking changes.
 
 ## Airflow 1.10.4
 
+### Export MySQL timestamps as UTC
+
+`MySqlToGoogleCloudStorageOperator` now exports TIMESTAMP columns as UTC
+by default, rather than using the default timezone of the MySQL server.
+This is the correct behavior for use with BigQuery, since BigQuery
+assumes that TIMESTAMP columns without time zones are in UTC. To
+preserve the previous behavior, set `ensure_utc` to `False`.
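
A minimal sketch of opting out (the task id, SQL, bucket, and filename are placeholders, and the import path follows the 1.10-era contrib location):
```python
from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator

# Keep the pre-1.10.4 behavior: export TIMESTAMP columns in the MySQL
# server's default timezone rather than converting them to UTC.
export_events = MySqlToGoogleCloudStorageOperator(
    task_id='export_events',
    sql='SELECT * FROM events',
    bucket='my-export-bucket',
    filename='events/export_{}.json',
    ensure_utc=False,
)
```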
 
 Review comment:
   This is correct

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
