amaliujia commented on a change in pull request #11729:
URL: https://github.com/apache/beam/pull/11729#discussion_r431358877



##########
File path: website/www/site/content/en/blog/beam-2.21.0.md
##########
@@ -0,0 +1,97 @@
+---
+title:  "Apache Beam 2.21.0"
+date:   2020-05-27 00:00:01 -0800
+categories: 
+  - blog
+authors:
+  - ibzib
+---
+<!--
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+http://www.apache.org/licenses/LICENSE-2.0
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+
+We are happy to present the new 2.21.0 release of Beam. This release includes both improvements and new functionality.
+See the [download page](/get-started/downloads/#xxxx-xxxx) for this release.
+For more information on changes in 2.21.0, check out the
+[detailed release notes](https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12319527&version=12347143).
+
+## I/Os
+* Python: Deprecated module `apache_beam.io.gcp.datastore.v1` has been removed
+as the client it uses is out of date and does not support Python 3
+([BEAM-9529](https://issues.apache.org/jira/browse/BEAM-9529)).
+Please migrate your code to use
+[apache_beam.io.gcp.datastore.**v1new**](https://beam.apache.org/releases/pydoc/current/apache_beam.io.gcp.datastore.v1new.datastoreio.html).
+See the updated
+[datastore_wordcount](https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/cookbook/datastore_wordcount.py)
+for example usage.
+* Python SDK: Added integration tests and updated batch write functionality for Google Cloud Spanner transform ([BEAM-8949](https://issues.apache.org/jira/browse/BEAM-8949)).
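
(Editorial aside, not part of the diff above: a minimal sketch of the `apache_beam.io.gcp.datastore.v1new` write path described in the first I/O item, assuming placeholder project and kind names; the linked datastore_wordcount example remains the canonical reference.)

```python
# Sketch only: writing entities with the v1new Datastore transform that
# replaces the removed apache_beam.io.gcp.datastore.v1 module.
# The project ID and kind below are placeholders, not values from the PR.
import uuid

import apache_beam as beam
from apache_beam.io.gcp.datastore.v1new.datastoreio import WriteToDatastore
from apache_beam.io.gcp.datastore.v1new.types import Entity, Key


def to_entity(content, kind='demo_kind', project='my-gcp-project'):
    """Wrap a string in a v1new Entity keyed by a random UUID."""
    key = Key([kind, str(uuid.uuid4())], project=project)
    entity = Entity(key)
    entity.set_properties({'content': content})
    return entity


with beam.Pipeline() as p:
    (p
     | beam.Create(['alpha', 'beta'])
     | beam.Map(to_entity)
     | WriteToDatastore('my-gcp-project'))
```
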
+
+## New Features / Improvements
+* Python SDK will now use Python 3 type annotations as pipeline type hints.
+([#10717](https://github.com/apache/beam/pull/10717))
+
+    If you suspect that this feature is causing your pipeline to fail, calling
+    `apache_beam.typehints.disable_type_annotations()` before pipeline creation
+    will disable it completely, and decorating specific functions (such as 
+    `process()`) with `@apache_beam.typehints.no_annotations` will disable it
+    for that function.
+
+    More details will be in 
+    [Ensuring Python Type Safety](https://beam.apache.org/documentation/sdks/python-type-safety/)
+    and an upcoming
+    [blog post](https://beam.apache.org/blog/python/typing/2020/03/06/python-typing.html).
+
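
(Editorial aside, not part of the diff: a short sketch of Python 3 annotations being used as pipeline type hints, together with the per-function and global opt-outs quoted in the item above. The DoFns and element values are invented for illustration.)

```python
# Sketch only: annotations doubling as Beam type hints in 2.21.0, plus the
# two opt-outs named in the release note.
import typing

import apache_beam as beam
from apache_beam import typehints


class ParseInt(beam.DoFn):
    # These annotations are now picked up as Beam type hints.
    def process(self, element: str) -> typing.Iterable[int]:
        yield int(element)


class PassThrough(beam.DoFn):
    # Per-function opt-out if annotations trip up type checking.
    @typehints.no_annotations
    def process(self, element):
        yield element


# Global opt-out, to be called before constructing the pipeline:
# typehints.disable_type_annotations()

with beam.Pipeline() as p:
    _ = (p
         | beam.Create(['1', '2', '3'])
         | beam.ParDo(ParseInt())
         | beam.ParDo(PassThrough()))
```
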
+* Java SDK: Introducing the concept of options in Beam Schemas. These options add extra
+context to fields and schemas. This replaces the current Beam metadata that is present
+in a FieldType only; options are available in both fields and row schemas. Schema options are
+fully typed and can contain complex rows. *Remark: Schema awareness is still experimental.*
+([BEAM-9035](https://issues.apache.org/jira/browse/BEAM-9035))
+* Java SDK: The protobuf extension is fully schema aware and also includes protobuf option
+conversion to Beam Schema options. *Remark: Schema awareness is still experimental.*
+([BEAM-9044](https://issues.apache.org/jira/browse/BEAM-9044))
+* Added ability to write to BigQuery via Avro file loads (Python) ([BEAM-8841](https://issues.apache.org/jira/browse/BEAM-8841))
+
+    By default, file loads will be done using JSON, but it is possible to

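(Editorial aside, not part of the diff: the Avro file-loads item above is truncated at the review anchor. The sketch below assumes the relevant knob is WriteToBigQuery's `temp_file_format` argument; that parameter name and the table/schema/bucket values are assumptions, not text from the diff.)

```python
# Sketch only: requesting Avro (rather than the JSON default) for BigQuery
# file loads. temp_file_format and all table/schema/bucket values here are
# assumptions/placeholders, not taken from the release notes.
import apache_beam as beam
from apache_beam.io.gcp.bigquery import WriteToBigQuery

with beam.Pipeline() as p:
    _ = (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | WriteToBigQuery(
             table='my-project:my_dataset.word_counts',
             schema='word:STRING,count:INTEGER',
             method=WriteToBigQuery.Method.FILE_LOADS,
             temp_file_format='AVRO',
             custom_gcs_temp_location='gs://my-bucket/tmp'))
```
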
Review comment:
       nit: I personally prefer moving such details into the JIRAs; the release highlights could be kept short.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

