This is an automated email from the ASF dual-hosted git repository.

xushiyan pushed a change to branch release-0.11.0
in repository https://gitbox.apache.org/repos/asf/hudi.git


    from c0cadbca65 [HUDI-3894] Fix gcp bundle to include HBase dependencies and shading (#5349)
     new 40c80f6627 [HUDI-3902] Fallback to `HadoopFsRelation` in cases non-involving Schema Evolution (#5352)
     new a462584e04 [HUDI-3905] Add S3 related setup in Kafka Connect quick start (#5356)
     new 4c40d1645c [HUDI-3920] Fix partition path construction in metadata table validator (#5365)
     new 89a27a1ccf [HUDI-3917] Flink write task hangs if last checkpoint has no data input (#5360)
     new 5c57920fe7 [HUDI-3904] Claim RFC number for Improve timeline server (#5354)
     new fcaeaea62f [HUDI-3912] Fix lose data when rollback in flink async compact (#5357)
     new 2ec27cc436 [HUDI-3204] Fixing partition-values being derived from partition-path instead of source columns (#5364)

The 7 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../spark/sql/HoodieCatalystExpressionUtils.scala  |  20 +-
 .../org/apache/spark/sql/hudi/SparkAdapter.scala   |   4 +-
 .../java/org/apache/hudi/avro/AvroSchemaUtils.java | 112 +++++
 .../java/org/apache/hudi/avro/HoodieAvroUtils.java |  80 +---
 .../hudi/common/model/HoodiePartitionMetadata.java |   5 +-
 .../hudi/common/table/HoodieTableConfig.java       |   2 +-
 .../hudi/common/table/TableSchemaResolver.java     |  17 +-
 .../hudi/metadata/HoodieTableMetadataUtil.java     |   2 +-
 .../hudi/common/table/TestTableSchemaResolver.java |   3 +-
 .../hudi/sink/StreamWriteOperatorCoordinator.java  |  31 ++
 .../hudi/sink/append/AppendWriteFunction.java      |   2 +-
 .../sink/common/AbstractStreamWriteFunction.java   |   9 +-
 .../hudi/sink/compact/CompactionCommitSink.java    |  17 +-
 .../utils/HoodieRealtimeRecordReaderUtils.java     |  29 +-
 hudi-kafka-connect/README.md                       |  21 +-
 ...org.apache.spark.sql.sources.DataSourceRegister |   2 +-
 .../org/apache/hudi/BaseFileOnlyRelation.scala     |  78 +++-
 .../main/scala/org/apache/hudi/DefaultSource.scala |  36 +-
 .../scala/org/apache/hudi/HoodieBaseRelation.scala | 217 +++++----
 .../org/apache/hudi/HoodieDataSourceHelper.scala   |  20 +-
 .../org/apache/hudi/HoodieSparkSqlWriter.scala     |   4 +-
 .../org/apache/hudi/IncrementalRelation.scala      |   5 +-
 .../hudi/MergeOnReadIncrementalRelation.scala      |  12 +-
 .../apache/hudi/MergeOnReadSnapshotRelation.scala  |  13 +-
 .../apache/hudi/SparkHoodieTableFileIndex.scala    |   3 +
 ...eFormat.scala => HoodieParquetFileFormat.scala} |  30 +-
 .../scala/org/apache/hudi/SparkDatasetMixin.scala} |  30 +-
 .../apache/hudi/functional/TestCOWDataSource.scala |  37 +-
 .../hudi/functional/TestCOWDataSourceStorage.scala |   2 +-
 .../apache/hudi/functional/TestMORDataSource.scala |  20 +-
 .../functional/TestParquetColumnProjection.scala   |   9 +-
 .../apache/spark/sql/adapter/Spark2Adapter.scala   |   6 +-
 .../parquet/Spark24HoodieParquetFileFormat.scala   | 229 ++++++++++
 .../apache/spark/sql/adapter/Spark3_1Adapter.scala |  22 +-
 .../parquet/Spark312HoodieParquetFileFormat.scala  | 507 +++++++++++----------
 .../apache/spark/sql/adapter/Spark3_2Adapter.scala |  13 +-
 .../parquet/Spark32HoodieParquetFileFormat.scala   | 449 ++++++++++--------
 .../utilities/HoodieMetadataTableValidator.java    |   7 +-
 rfc/README.md                                      |   1 +
 39 files changed, 1360 insertions(+), 746 deletions(-)
 create mode 100644 hudi-common/src/main/java/org/apache/hudi/avro/AvroSchemaUtils.java
 rename hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/{SparkHoodieParquetFileFormat.scala => HoodieParquetFileFormat.scala} (58%)
 copy hudi-spark-datasource/hudi-spark/src/{main/scala/org/apache/spark/sql/hudi/command/UuidKeyGenerator.scala => test/scala/org/apache/hudi/SparkDatasetMixin.scala} (51%)
 create mode 100644 hudi-spark-datasource/hudi-spark2/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/Spark24HoodieParquetFileFormat.scala
