This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
     new f02ad50  Delete irrelevant entries from UPDATING.md (#10093)
f02ad50 is described below

commit f02ad50814f1af8be5f111e23a62f12bab61d7a8
Author: Kamil Breguła <[email protected]>
AuthorDate: Mon Aug 3 14:27:58 2020 +0200

    Delete irrelevant entries from UPDATING.md (#10093)
---
 UPDATING.md | 34 ----------------------------------
 1 file changed, 34 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 404e265..5475974 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -342,16 +342,6 @@ ie. running, {"name": "bob"}
 whereas in in prior releases it just printed the state:
 ie. running
 
-#### Added `airflow dags test` CLI command
-
-A new command was added to the CLI for executing one full run of a DAG for a given execution date, similar to
-`airflow tasks test`. Example usage:
-
-```
-airflow dags test [dag_id] [execution_date]
-airflow dags test example_branch_operator 2018-01-01
-```
-
 #### Deprecating ignore_first_depends_on_past on backfill command and default it to True
 
 When doing backfill with `depends_on_past` dags, users will need to pass `--ignore-first-depends-on-past`.
@@ -983,16 +973,6 @@ arguments, please change `store_serialized_dags` to `read_dags_from_db`.
 Similarly, if you were using `DagBag().store_serialized_dags` property, change it to
 `DagBag().read_dags_from_db`.
 
-#### `airflow.models.baseoperator.BaseOperator`
-It was not possible to patch pool in BaseOperator as the signature sets the default value of pool
-as Pool.DEFAULT_POOL_NAME.
-While using subdagoperator in unittest(without initializing the sqlite db), it was throwing the
-following error:
-```
-sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: slot_pool.
-```
-Fix for this, https://github.com/apache/airflow/pull/8587
-
 ### Changes in `google` provider package
 
 We strive to ensure that there are no changes that may affect the end user and your Python files, but this
@@ -1655,20 +1635,6 @@ If you want to install integration for Amazon Web Services, then instead of
 
 The deprecated extras will be removed in 2.1:
 
-#### Added mypy plugin to preserve types of decorated functions
-
-Mypy currently doesn't support precise type information for decorated
-functions; see https://github.com/python/mypy/issues/3157 for details.
-To preserve precise type definitions for decorated functions, we now
-include a mypy plugin to preserve precise type definitions for decorated
-functions. To use the plugin, update your setup.cfg:
-
-```
-[mypy]
-plugins =
-  airflow.mypy.plugin.decorators
-```
-
 #### Simplify the response payload of endpoints /dag_stats and /task_stats
 
 The response of endpoints `/dag_stats` and `/task_stats` help UI fetch brief statistics about DAGs and Tasks. The format was like
