This is an automated email from the ASF dual-hosted git repository.
bhavanisudha pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/hudi.git
The following commit(s) were added to refs/heads/asf-site by this push:
new d56fbc5088f Added rli issue fix guidance (#11716)
d56fbc5088f is described below
commit d56fbc5088f1ff9a41a758bfff2c65005517c457
Author: Aditya Goenka <[email protected]>
AuthorDate: Thu Aug 1 21:44:27 2024 +0530
Added rli issue fix guidance (#11716)
---
website/releases/release-0.15.0.md | 10 ++++++++++
website/versioned_docs/version-0.14.0/troubleshooting.md | 6 ++++++
website/versioned_docs/version-0.14.1/troubleshooting.md | 6 ++++++
website/versioned_docs/version-0.15.0/troubleshooting.md | 6 ++++++
4 files changed, 28 insertions(+)
diff --git a/website/releases/release-0.15.0.md b/website/releases/release-0.15.0.md
index 3ba36000ac7..77e1c35ea98 100644
--- a/website/releases/release-0.15.0.md
+++ b/website/releases/release-0.15.0.md
@@ -317,6 +317,16 @@ Recent Athena version silently drops Hudi data when the partition location has a
 partition `s3` scheme fixes the issue. We have added a fix to use `s3` scheme for the Hudi table partitions in AWS Glue Catalog sync ([HUDI-7362](https://issues.apache.org/jira/browse/HUDI-7362)).
+## Known Regressions
+The Hudi 0.15.0 release introduces a regression in complex key generation (`ComplexKeyGenerator`) when the record key consists of a
+single field. This issue was also present in version 0.14.1. When a table is upgraded to these versions from an earlier release,
+writes may silently ingest duplicate records.
+
+:::tip
+Avoid upgrading any existing table to 0.14.1 or 0.15.0 from any prior version if you are using `ComplexKeyGenerator` and
+the number of fields in the record key is 1.
+:::
+
## Raw Release Notes
The raw release notes are
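
[Editor's note, not part of the commit: the regression scenario above corresponds roughly to a writer configured as sketched below. The record key field name `order_id` is a placeholder; only the single-field record key combined with `ComplexKeyGenerator` matters.]

```properties
# Hypothetical writer options illustrating the affected combination:
# ComplexKeyGenerator with a record key consisting of exactly one field.
hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.ComplexKeyGenerator
hoodie.datasource.write.recordkey.field=order_id
```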
diff --git a/website/versioned_docs/version-0.14.0/troubleshooting.md b/website/versioned_docs/version-0.14.0/troubleshooting.md
index 6398cfc7245..aaa3f4feb63 100644
--- a/website/versioned_docs/version-0.14.0/troubleshooting.md
+++ b/website/versioned_docs/version-0.14.0/troubleshooting.md
@@ -175,6 +175,12 @@ Caused by: [CIRCULAR REFERENCE: java.io.IOException: Write end dead]
 We have an active patch([https://github.com/apache/hudi/pull/7245](https://github.com/apache/hudi/pull/7245)) on fixing the issue. Until we land this, you can use above config to bypass the issue.
+#### Hudi upserts with RLI fail with ClassCastException
+
+In certain environments, Hudi upserts with the record level index (RLI) enabled may fail with the error: `Caused by: java.lang.ClassCastException: class org.apache.avro.generic.GenericData$Record cannot be cast to class org.apache.hudi.avro.model.HoodieDeleteRecordList`.
+
+To resolve this issue, add the Hudi bundle JAR to both `spark.driver.extraClassPath` and `spark.executor.extraClassPath`.
+
### Hive Sync
#### SQLException: following columns have types incompatible
diff --git a/website/versioned_docs/version-0.14.1/troubleshooting.md b/website/versioned_docs/version-0.14.1/troubleshooting.md
index 6398cfc7245..aaa3f4feb63 100644
--- a/website/versioned_docs/version-0.14.1/troubleshooting.md
+++ b/website/versioned_docs/version-0.14.1/troubleshooting.md
@@ -175,6 +175,12 @@ Caused by: [CIRCULAR REFERENCE: java.io.IOException: Write end dead]
 We have an active patch([https://github.com/apache/hudi/pull/7245](https://github.com/apache/hudi/pull/7245)) on fixing the issue. Until we land this, you can use above config to bypass the issue.
+#### Hudi upserts with RLI fail with ClassCastException
+
+In certain environments, Hudi upserts with the record level index (RLI) enabled may fail with the error: `Caused by: java.lang.ClassCastException: class org.apache.avro.generic.GenericData$Record cannot be cast to class org.apache.hudi.avro.model.HoodieDeleteRecordList`.
+
+To resolve this issue, add the Hudi bundle JAR to both `spark.driver.extraClassPath` and `spark.executor.extraClassPath`.
+
### Hive Sync
#### SQLException: following columns have types incompatible
diff --git a/website/versioned_docs/version-0.15.0/troubleshooting.md b/website/versioned_docs/version-0.15.0/troubleshooting.md
index db93a76d187..f16fa458ee7 100644
--- a/website/versioned_docs/version-0.15.0/troubleshooting.md
+++ b/website/versioned_docs/version-0.15.0/troubleshooting.md
@@ -175,6 +175,12 @@ Caused by: [CIRCULAR REFERENCE: java.io.IOException: Write end dead]
 We have an active patch([https://github.com/apache/hudi/pull/7245](https://github.com/apache/hudi/pull/7245)) on fixing the issue. Until we land this, you can use above config to bypass the issue.
+#### Hudi upserts with RLI fail with ClassCastException
+
+In certain environments, Hudi upserts with the record level index (RLI) enabled may fail with the error: `Caused by: java.lang.ClassCastException: class org.apache.avro.generic.GenericData$Record cannot be cast to class org.apache.hudi.avro.model.HoodieDeleteRecordList`.
+
+To resolve this issue, add the Hudi bundle JAR to both `spark.driver.extraClassPath` and `spark.executor.extraClassPath`.
+
### Hive Sync
#### SQLException: following columns have types incompatible
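
[Editor's note, not part of the commit: the classpath fix added in the troubleshooting sections above can be applied roughly as sketched below. The JAR path, bundle file name, and job script are placeholders for your environment; only the two `extraClassPath` settings come from the doc text.]

```shell
# Hypothetical spark-submit invocation -- substitute the Hudi Spark bundle JAR
# that matches your Spark/Scala version (e.g. a 0.15.0 spark bundle).
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/hudi-spark-bundle.jar \
  --conf spark.executor.extraClassPath=/path/to/hudi-spark-bundle.jar \
  your_hudi_upsert_job.py
```

The same two settings can instead be placed in `spark-defaults.conf` so they apply to every job on the cluster.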