This is an automated email from the ASF dual-hosted git repository.

sivabalan pushed a change to branch release-0.5.3
in repository https://gitbox.apache.org/repos/asf/hudi.git.


      at 54bbe32  [HUDI-938] Removing incubating/incubator from project (#1658)

This branch includes the following new commits:

     new f6c87b5  Moving to 0.5.3-SNAPSHOT for bug-fix release over 0.5.2
     new dd7952e  [HUDI-652] Decouple HoodieReadClient and AbstractHoodieClient 
to break the inheritance chain (#1372)
     new 1d16b15  [HUDI-681]Remove embeddedTimelineService from 
HoodieReadClient (#1388)
     new cebce61  [HUDI-629]: Replace Guava's Hashing with an equivalent in 
NumericUtils.java (#1350)
     new 2112d0a  [HUDI - 738] Add validation to DeltaStreamer to fail fast 
when filterDupes is enabled on UPSERT mode. (#1505)
     new e3f7659  [HUDI-799] Use appropriate FS when loading configs (#1517)
     new 18453a4  [HUDI-713] Fix conversion of Spark array of struct type to 
Avro schema (#1406)
     new 9a2b3e3  [HUDI-656][Performance] Return a dummy Spark relation after 
writing the DataFrame (#1394)
     new ea48ca9  [HUDI-850] Avoid unnecessary listings in incremental cleaning 
mode (#1576)
     new 96b2359  [HUDI-724] Parallelize getSmallFiles for partitions (#1421)
     new 0050e00  [HUDI-607] Fix to allow creation/syncing of Hive tables 
partitioned by Date type columns (#1330)
     new d535381  Add constructor to HoodieROTablePathFilter (#1413)
     new bd70601  [HUDI-539] Make ROPathFilter conf member serializable (#1415)
     new d476ac5  Add changes for presto mor queries (#1578)
     new cc4aa45  [HUDI-782] Add support of Aliyun object storage service. 
(#1506)
     new 86698b5  [HUDI-716] Exception: Not an Avro data file when running 
HoodieCleanClient.runClean (#1432)
     new 2e5f601  [HUDI-400] Check upgrade from old plan to new plan for 
compaction (#1422)
     new 45efa69  [MINOR] Update DOAP with 0.5.2 Release (#1448)
     new 2f4ee0a  [HUDI-742] Fix Java Math Exception (#1466)
     new e80dd04  [HUDI-717] Fixed usage of HiveDriver for DDL statements. 
(#1416)
     new f9c3b07  [HUDI-727]: Copy default values of fields if not present when 
rewriting incoming record with new schema (#1427)
     new 500b390  [HUDI-795] Handle auto-deleted empty aux folder (#1515)
     new 0dcc204  [MINOR]: Fix cli docs for DeltaStreamer (#1547)
     new 288bc4d  [HUDI-852] adding check for table name for Append Save mode  
(#1580)
     new be205ed  [MINOR] fixed building IndexFileFilter with a wrong condition 
in HoodieGlobalBloomIndex class (#1537)
     new 77d7500  [HUDI-616] Fixed parquet files getting created on local FS 
(#1434)
     new 5fbb5a7  [HUDI-902] Avoid exception when getSchemaProvider (#1584)
     new 8324f6e  [HUDI-789]Adjust logic of upsert in HDFSParquetImporter 
(#1511)
     new da043af  [HUDI-889] Writer supports useJdbc configuration when hive 
synchronization is enabled (#1627)
     new 4095de4  [HUDI-820] cleaner repair command should only inspect clean 
metadata files (#1542)
     new 7c766c3  HUDI-528 Handle empty commit in incremental pulling (#1612)
     new 0fb02e7  [HUDI-895] Remove unnecessary listing .hoodie folder when 
using timeline server (#1636)
     new 555ff1a  [HUDI-858] Allow multiple operations to be executed within a 
single commit (#1633)
     new 4192cc6  [HUDI-863] get decimal properties from derived spark DataType 
(#1596)
     new b69bb18  [HUDI-846][HUDI-848] Enable Incremental cleaning and embedded 
timeline-server by default (#1634)
     new 07345ba  Fixing test failure in TestHoodieClientOnCopyOnWriteStorage
     new 0a77a3b  Fixing test failures for TestRepairsCommand
     new 498c86b  [MINOR] Remove incubating from README
     new 3a776f6  [HUDI-926] Removing DISCLAIMER from the repo (#1657)
     new 54bbe32  [HUDI-938] Removing incubating/incubator from project (#1658)

The 40 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.

