[incubator-hudi] branch master updated (89f0968 -> 845e261)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from 89f0968  [DOCS] Update the build source link (#1071)
     add 845e261  [MINOR] Update some urls from http to https in the README file (#1074)

No new revisions were added by this update.

Summary of changes:
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
[incubator-hudi] branch hudi_test_suite updated: Rename module name from hudi-bench to hudi-end-to-end-tests
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch hudi_test_suite
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/hudi_test_suite by this push:
     new 4ac72f4  Rename module name from hudi-bench to hudi-end-to-end-tests
4ac72f4 is described below

commit 4ac72f4f62800725a469b9a8bc8110acb0b18fb1
Author: yanghua
AuthorDate: Tue Dec 3 19:44:15 2019 +0800

    Rename module name from hudi-bench to hudi-end-to-end-tests
---
 docker/hoodie/hadoop/hive_base/Dockerfile | 2 +-
 docker/hoodie/hadoop/hive_base/pom.xml | 4 ++--
 {hudi-bench => hudi-end-to-end-tests}/pom.xml | 2 +-
 .../prepare_integration_suite.sh | 4 ++--
 .../apache/hudi/e2e}/DFSDeltaWriterAdapter.java | 6 +++---
 .../apache/hudi/e2e}/DFSSparkAvroDeltaWriter.java | 8
 .../org/apache/hudi/e2e}/DeltaInputFormat.java | 2 +-
 .../java/org/apache/hudi/e2e}/DeltaOutputType.java | 2 +-
 .../org/apache/hudi/e2e}/DeltaWriterAdapter.java | 4 ++--
 .../org/apache/hudi/e2e}/DeltaWriterFactory.java | 10 -
 .../hudi/e2e}/configuration/DFSDeltaConfig.java | 6 +++---
 .../hudi/e2e}/configuration/DeltaConfig.java | 6 +++---
 .../hudi/e2e}/converter/UpdateConverter.java | 6 +++---
 .../java/org/apache/hudi/e2e}/dag/DagUtils.java | 9
 .../org/apache/hudi/e2e}/dag/ExecutionContext.java | 9
 .../java/org/apache/hudi/e2e}/dag/WorkflowDag.java | 4 ++--
 .../apache/hudi/e2e}/dag/WorkflowDagGenerator.java | 12 +--
 .../apache/hudi/e2e}/dag/nodes/BulkInsertNode.java | 6 +++---
 .../org/apache/hudi/e2e}/dag/nodes/CleanNode.java | 4 ++--
 .../apache/hudi/e2e}/dag/nodes/CompactNode.java | 6 +++---
 .../org/apache/hudi/e2e}/dag/nodes/DagNode.java | 6 +++---
 .../apache/hudi/e2e}/dag/nodes/HiveQueryNode.java | 8
 .../apache/hudi/e2e}/dag/nodes/HiveSyncNode.java | 8
 .../org/apache/hudi/e2e}/dag/nodes/InsertNode.java | 10 -
 .../apache/hudi/e2e}/dag/nodes/RollbackNode.java | 6 +++---
 .../hudi/e2e}/dag/nodes/ScheduleCompactNode.java | 6 +++---
 .../hudi/e2e}/dag/nodes/SparkSQLQueryNode.java | 8
 .../org/apache/hudi/e2e}/dag/nodes/UpsertNode.java | 8
 .../apache/hudi/e2e}/dag/nodes/ValidateNode.java | 6 +++---
 .../hudi/e2e}/dag/scheduler/DagScheduler.java | 12 +--
 .../apache/hudi/e2e}/generator/DeltaGenerator.java | 24 +++---
 .../FlexibleSchemaRecordGenerationIterator.java | 2 +-
 .../GenericRecordFullPayloadGenerator.java | 2 +-
 .../GenericRecordFullPayloadSizeEstimator.java | 2 +-
 .../GenericRecordPartialPayloadGenerator.java | 2 +-
 .../generator/LazyRecordGeneratorIterator.java | 2 +-
 .../e2e}/generator/UpdateGeneratorIterator.java | 2 +-
 .../e2e}/helpers/DFSTestSuitePathSelector.java | 2 +-
 .../hudi/e2e}/helpers/HiveServiceProvider.java | 6 +++---
 .../hudi/e2e}/job/HoodieDeltaStreamerWrapper.java | 2 +-
 .../apache/hudi/e2e}/job/HoodieTestSuiteJob.java | 24 +++---
 .../hudi/e2e}/reader/DFSAvroDeltaInputReader.java | 8
 .../hudi/e2e}/reader/DFSDeltaInputReader.java | 2 +-
 .../e2e}/reader/DFSHoodieDatasetInputReader.java | 2 +-
 .../e2e}/reader/DFSParquetDeltaInputReader.java | 6 +++---
 .../apache/hudi/e2e}/reader/DeltaInputReader.java | 2 +-
 .../apache/hudi/e2e}/reader/SparkBasedReader.java | 2 +-
 .../hudi/e2e}/writer/AvroDeltaInputWriter.java | 2 +-
 .../apache/hudi/e2e}/writer/DeltaInputWriter.java | 2 +-
 .../org/apache/hudi/e2e}/writer/DeltaWriter.java | 6 +++---
 .../hudi/e2e}/writer/FileDeltaInputWriter.java | 2 +-
 .../e2e}/writer/SparkAvroDeltaInputWriter.java | 2 +-
 .../org/apache/hudi/e2e}/writer/WriteStats.java | 2 +-
 .../hudi/e2e}/TestDFSDeltaWriterAdapter.java | 16 +++
 .../apache/hudi/e2e}/TestFileDeltaInputWriter.java | 14 ++---
 .../e2e}/configuration/TestWorkflowBuilder.java | 12 +--
 .../hudi/e2e}/converter/TestUpdateConverter.java | 4 ++--
 .../org/apache/hudi/e2e}/dag/TestComplexDag.java | 14 +++--
 .../org/apache/hudi/e2e}/dag/TestDagUtils.java | 13 +++-
 .../org/apache/hudi/e2e}/dag/TestHiveSyncDag.java | 14 +++--
 .../apache/hudi/e2e}/dag/TestInsertOnlyDag.java | 10 +
 .../apache/hudi/e2e}/dag/TestInsertUpsertDag.java | 12 ++-
 .../TestGenericRecordPayloadEstimator.java | 7 ---
 .../TestGenericRecordPayloadGenerator.java | 16 ---
 .../hudi/e2e}/generator/TestWorkloadGenerator.java | 22 ++--
 .../hudi/e2e}/job/TestHoodieTestSuiteJob.java | 22 ++--
 .../e2e}/reader/TestDFSAvroDeltaInputReader.java | 4 ++--
 .../reader/TestDFSHoodieDatasetInputReader.java | 5 +++--
 .../java/org/apache/hudi/e2e}/ut
[incubator-hudi] branch hudi_test_suite created (now afe00ff)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

      at afe00ff  Hudi Test Suite
                  - Flexible schema payload generation
                  - Different types of workload generation such as inserts, upserts etc
                  - Post process actions to perform validations
                  - Interoperability of test suite to use HoodieWriteClient and HoodieDeltaStreamer so both code paths can be tested
                  - Custom workload sequence generator
                  - Ability to perform parallel operations, such as upsert and compaction

No new revisions were added by this update.
[incubator-hudi] branch hudi_test_suite_refactor updated (c82d6d9 -> c2c9347)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 discard c82d6d9  Hudi Test Suite
                  - Flexible schema payload generation
                  - Different types of workload generation such as inserts, upserts etc
                  - Post process actions to perform validations
                  - Interoperability of test suite to use HoodieWriteClient and HoodieDeltaStreamer so both code paths can be tested
                  - Custom workload sequence generator
                  - Ability to perform parallel operations, such as upsert and compaction
     new c2c9347  [HUDI-394] Provide a basic implementation of test suite

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (c82d6d9)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (c2c9347)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B.

Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.

Summary of changes:
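The discard/new situation described in this notification can be reproduced locally. The sketch below (all paths and commit subjects are hypothetical; the real push went to gitbox.apache.org) builds a branch with a base B and an old revision O, then force-pushes a replacement N, leaving the remote branch in exactly the state the diagram above shows:

```shell
set -e
git init -q --bare remote.git                 # stand-in for the shared repository
git init -q work
cd work
git config user.email "dev@example.org"       # placeholder identity
git config user.name "Example Dev"
git commit -q --allow-empty -m "B"            # common base
git commit -q --allow-empty -m "O"            # old revision, later discarded
git push -q ../remote.git HEAD:refs/heads/hudi_test_suite_refactor
git reset -q --hard HEAD~1                    # rewind to B, dropping O locally
git commit -q --allow-empty -m "N"            # replacement revision
# A plain push would be rejected as non-fast-forward; --force rewrites the
# branch, after which O is no longer reachable from the ref.
git push -q --force ../remote.git HEAD:refs/heads/hudi_test_suite_refactor
cd ..
git --git-dir=remote.git log --format=%s refs/heads/hudi_test_suite_refactor
```

Since no other reference points at O here, it matches the "discard" case ("gone forever") rather than "omit".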
[incubator-hudi] branch hudi_test_suite_refactor created (now c2c9347)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

      at c2c9347  [HUDI-394] Provide a basic implementation of test suite

No new revisions were added by this update.
[incubator-hudi] branch hudi_test_suite created (now 4ac72f4)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

      at 4ac72f4  Rename module name from hudi-bench to hudi-end-to-end-tests

This branch includes the following new commits:

     new 4ac72f4  Rename module name from hudi-bench to hudi-end-to-end-tests

The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
[incubator-hudi] 01/01: Rename module name from hudi-bench to hudi-end-to-end-tests
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch hudi_test_suite
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

commit 4ac72f4f62800725a469b9a8bc8110acb0b18fb1
Author: yanghua
AuthorDate: Tue Dec 3 19:44:15 2019 +0800

    Rename module name from hudi-bench to hudi-end-to-end-tests
---
 docker/hoodie/hadoop/hive_base/Dockerfile | 2 +-
 docker/hoodie/hadoop/hive_base/pom.xml | 4 ++--
 {hudi-bench => hudi-end-to-end-tests}/pom.xml | 2 +-
 .../prepare_integration_suite.sh | 4 ++--
 .../apache/hudi/e2e}/DFSDeltaWriterAdapter.java | 6 +++---
 .../apache/hudi/e2e}/DFSSparkAvroDeltaWriter.java | 8
 .../org/apache/hudi/e2e}/DeltaInputFormat.java | 2 +-
 .../java/org/apache/hudi/e2e}/DeltaOutputType.java | 2 +-
 .../org/apache/hudi/e2e}/DeltaWriterAdapter.java | 4 ++--
 .../org/apache/hudi/e2e}/DeltaWriterFactory.java | 10 -
 .../hudi/e2e}/configuration/DFSDeltaConfig.java | 6 +++---
 .../hudi/e2e}/configuration/DeltaConfig.java | 6 +++---
 .../hudi/e2e}/converter/UpdateConverter.java | 6 +++---
 .../java/org/apache/hudi/e2e}/dag/DagUtils.java | 9
 .../org/apache/hudi/e2e}/dag/ExecutionContext.java | 9
 .../java/org/apache/hudi/e2e}/dag/WorkflowDag.java | 4 ++--
 .../apache/hudi/e2e}/dag/WorkflowDagGenerator.java | 12 +--
 .../apache/hudi/e2e}/dag/nodes/BulkInsertNode.java | 6 +++---
 .../org/apache/hudi/e2e}/dag/nodes/CleanNode.java | 4 ++--
 .../apache/hudi/e2e}/dag/nodes/CompactNode.java | 6 +++---
 .../org/apache/hudi/e2e}/dag/nodes/DagNode.java | 6 +++---
 .../apache/hudi/e2e}/dag/nodes/HiveQueryNode.java | 8
 .../apache/hudi/e2e}/dag/nodes/HiveSyncNode.java | 8
 .../org/apache/hudi/e2e}/dag/nodes/InsertNode.java | 10 -
 .../apache/hudi/e2e}/dag/nodes/RollbackNode.java | 6 +++---
 .../hudi/e2e}/dag/nodes/ScheduleCompactNode.java | 6 +++---
 .../hudi/e2e}/dag/nodes/SparkSQLQueryNode.java | 8
 .../org/apache/hudi/e2e}/dag/nodes/UpsertNode.java | 8
 .../apache/hudi/e2e}/dag/nodes/ValidateNode.java | 6 +++---
 .../hudi/e2e}/dag/scheduler/DagScheduler.java | 12 +--
 .../apache/hudi/e2e}/generator/DeltaGenerator.java | 24 +++---
 .../FlexibleSchemaRecordGenerationIterator.java | 2 +-
 .../GenericRecordFullPayloadGenerator.java | 2 +-
 .../GenericRecordFullPayloadSizeEstimator.java | 2 +-
 .../GenericRecordPartialPayloadGenerator.java | 2 +-
 .../generator/LazyRecordGeneratorIterator.java | 2 +-
 .../e2e}/generator/UpdateGeneratorIterator.java | 2 +-
 .../e2e}/helpers/DFSTestSuitePathSelector.java | 2 +-
 .../hudi/e2e}/helpers/HiveServiceProvider.java | 6 +++---
 .../hudi/e2e}/job/HoodieDeltaStreamerWrapper.java | 2 +-
 .../apache/hudi/e2e}/job/HoodieTestSuiteJob.java | 24 +++---
 .../hudi/e2e}/reader/DFSAvroDeltaInputReader.java | 8
 .../hudi/e2e}/reader/DFSDeltaInputReader.java | 2 +-
 .../e2e}/reader/DFSHoodieDatasetInputReader.java | 2 +-
 .../e2e}/reader/DFSParquetDeltaInputReader.java | 6 +++---
 .../apache/hudi/e2e}/reader/DeltaInputReader.java | 2 +-
 .../apache/hudi/e2e}/reader/SparkBasedReader.java | 2 +-
 .../hudi/e2e}/writer/AvroDeltaInputWriter.java | 2 +-
 .../apache/hudi/e2e}/writer/DeltaInputWriter.java | 2 +-
 .../org/apache/hudi/e2e}/writer/DeltaWriter.java | 6 +++---
 .../hudi/e2e}/writer/FileDeltaInputWriter.java | 2 +-
 .../e2e}/writer/SparkAvroDeltaInputWriter.java | 2 +-
 .../org/apache/hudi/e2e}/writer/WriteStats.java | 2 +-
 .../hudi/e2e}/TestDFSDeltaWriterAdapter.java | 16 +++
 .../apache/hudi/e2e}/TestFileDeltaInputWriter.java | 14 ++---
 .../e2e}/configuration/TestWorkflowBuilder.java | 12 +--
 .../hudi/e2e}/converter/TestUpdateConverter.java | 4 ++--
 .../org/apache/hudi/e2e}/dag/TestComplexDag.java | 14 +++--
 .../org/apache/hudi/e2e}/dag/TestDagUtils.java | 13 +++-
 .../org/apache/hudi/e2e}/dag/TestHiveSyncDag.java | 14 +++--
 .../apache/hudi/e2e}/dag/TestInsertOnlyDag.java | 10 +
 .../apache/hudi/e2e}/dag/TestInsertUpsertDag.java | 12 ++-
 .../TestGenericRecordPayloadEstimator.java | 7 ---
 .../TestGenericRecordPayloadGenerator.java | 16 ---
 .../hudi/e2e}/generator/TestWorkloadGenerator.java | 22 ++--
 .../hudi/e2e}/job/TestHoodieTestSuiteJob.java | 22 ++--
 .../e2e}/reader/TestDFSAvroDeltaInputReader.java | 4 ++--
 .../reader/TestDFSHoodieDatasetInputReader.java | 5 +++--
 .../java/org/apache/hudi/e2e}/utils/TestUtils.java | 2 +-
 .../apache/hudi/e2e}/writer/TestDeltaWriter.java | 7 ---
 .../resources/hudi-bench-config/base.properties | 0
 .../hudi-bench-config/complex-source.a
[incubator-hudi] branch feature_hudi_test_suite created (now eaaf3f6)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch feature_hudi_test_suite
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

      at eaaf3f6  [HUDI-394] Provide a basic implementation of test suite

This branch includes the following new commits:

     new eaaf3f6  [HUDI-394] Provide a basic implementation of test suite

The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
[incubator-hudi] branch hudi_test_suite_refactor created (now c2c9347)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

      at c2c9347  [HUDI-394] Provide a basic implementation of test suite

No new revisions were added by this update.
[incubator-hudi] branch hudi_test_suite_refactor created (now eaaf3f6)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

      at eaaf3f6  [HUDI-394] Provide a basic implementation of test suite

No new revisions were added by this update.
[incubator-hudi] branch master updated: [HUDI-378] Refactor the rest codes based on new ImportOrder code style rule (#1078)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new c06d89b  [HUDI-378] Refactor the rest codes based on new ImportOrder code style rule (#1078)
c06d89b is described below

commit c06d89b648bbdaaad9c68fb9d3c6d53b4d68541c
Author: lamber-ken
AuthorDate: Thu Dec 5 17:25:03 2019 +0800

    [HUDI-378] Refactor the rest codes based on new ImportOrder code style rule (#1078)
---
 .../apache/hudi/config/HoodieStorageConfig.java | 3 +-
 .../apache/hudi/metrics/JmxMetricsReporter.java | 16 +++---
 .../src/test/java/org/apache/hudi/TestCleaner.java | 1 +
 .../apache/hudi/metrics/TestHoodieJmxMetrics.java | 10 ++--
 .../org/apache/hudi/common/util/CleanerUtils.java | 6 ++-
 .../versioning/clean/CleanMetadataMigrator.java | 3 +-
 .../versioning/clean/CleanV1MigrationHandler.java | 3 +-
 .../versioning/clean/CleanV2MigrationHandler.java | 3 +-
 .../org/apache/hudi/hive/HoodieHiveClient.java | 58 +++---
 style/checkstyle.xml | 1 -
 10 files changed, 57 insertions(+), 47 deletions(-)

diff --git a/hudi-client/src/main/java/org/apache/hudi/config/HoodieStorageConfig.java b/hudi-client/src/main/java/org/apache/hudi/config/HoodieStorageConfig.java
index 90fdb6c..f9c98c7 100644
--- a/hudi-client/src/main/java/org/apache/hudi/config/HoodieStorageConfig.java
+++ b/hudi-client/src/main/java/org/apache/hudi/config/HoodieStorageConfig.java
@@ -18,11 +18,12 @@
 package org.apache.hudi.config;
 
+import javax.annotation.concurrent.Immutable;
+
 import java.io.File;
 import java.io.FileReader;
 import java.io.IOException;
 import java.util.Properties;
-import javax.annotation.concurrent.Immutable;
 
 /**
  * Storage related config
diff --git a/hudi-client/src/main/java/org/apache/hudi/metrics/JmxMetricsReporter.java b/hudi-client/src/main/java/org/apache/hudi/metrics/JmxMetricsReporter.java
index 7bc73d2..d00ec67 100644
--- a/hudi-client/src/main/java/org/apache/hudi/metrics/JmxMetricsReporter.java
+++ b/hudi-client/src/main/java/org/apache/hudi/metrics/JmxMetricsReporter.java
@@ -18,18 +18,20 @@
 package org.apache.hudi.metrics;
 
+import org.apache.hudi.config.HoodieWriteConfig;
+import org.apache.hudi.exception.HoodieException;
+
 import com.google.common.base.Preconditions;
-import java.io.Closeable;
+import org.apache.log4j.LogManager;
+import org.apache.log4j.Logger;
-import java.lang.management.ManagementFactory;
-import java.rmi.registry.LocateRegistry;
 import javax.management.remote.JMXConnectorServer;
 import javax.management.remote.JMXConnectorServerFactory;
 import javax.management.remote.JMXServiceURL;
-import org.apache.hudi.config.HoodieWriteConfig;
-import org.apache.hudi.exception.HoodieException;
-import org.apache.log4j.LogManager;
-import org.apache.log4j.Logger;
+
+import java.io.Closeable;
+import java.lang.management.ManagementFactory;
+import java.rmi.registry.LocateRegistry;
 
 /**
  * Implementation of Jmx reporter, which used to report jmx metric.
diff --git a/hudi-client/src/test/java/org/apache/hudi/TestCleaner.java b/hudi-client/src/test/java/org/apache/hudi/TestCleaner.java
index 370021a..200575a 100644
--- a/hudi-client/src/test/java/org/apache/hudi/TestCleaner.java
+++ b/hudi-client/src/test/java/org/apache/hudi/TestCleaner.java
@@ -78,6 +78,7 @@ import java.util.TreeSet;
 import java.util.function.Predicate;
 import java.util.stream.Collectors;
 import java.util.stream.Stream;
+
 import scala.Tuple3;
 
 import static org.apache.hudi.common.model.HoodieTestUtils.DEFAULT_PARTITION_PATHS;
diff --git a/hudi-client/src/test/java/org/apache/hudi/metrics/TestHoodieJmxMetrics.java b/hudi-client/src/test/java/org/apache/hudi/metrics/TestHoodieJmxMetrics.java
index 7260774..b014329 100644
--- a/hudi-client/src/test/java/org/apache/hudi/metrics/TestHoodieJmxMetrics.java
+++ b/hudi-client/src/test/java/org/apache/hudi/metrics/TestHoodieJmxMetrics.java
@@ -18,16 +18,16 @@
 package org.apache.hudi.metrics;
 
-import static org.apache.hudi.metrics.Metrics.registerGauge;
-import static org.junit.Assert.assertTrue;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.when;
-
 import org.apache.hudi.config.HoodieMetricsConfig;
 import org.apache.hudi.config.HoodieWriteConfig;
 import org.junit.Test;
 
+import static org.apache.hudi.metrics.Metrics.registerGauge;
+import static org.junit.Assert.assertTrue;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
+
 /**
  * Test for the Jmx metrics report.
  */
diff --git a/hudi-common/src/main/java/org/apache/hudi/common/util/CleanerUtils.java b/hudi-common/src/main/java/org/apache/hudi/common/util/CleanerUtils.java
index 4d4ccb9..0e8c460 100644
--- a/hudi-common/src/main/java/org/apache/hudi/common/util/CleanerUtils.java
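These ImportOrder commits only show the resulting import layout, not the rule itself (the emails touch style/checkstyle.xml but never quote it). As a hypothetical sketch, a checkstyle module producing the grouping visible in the diffs above (org.apache.hudi first, then third-party, then javax, then java, with static imports moved to the bottom) could look like this; the exact group list is inferred, not confirmed by the source:

```xml
<!-- Hypothetical sketch of an ImportOrder rule; the actual checkstyle.xml
     content is not shown in these emails, so the groups below are inferred
     from the reordered imports in the diffs. -->
<module name="ImportOrder">
  <!-- project imports first, then third-party (*), then javax, then java -->
  <property name="groups" value="org.apache.hudi,*,javax,java"/>
  <property name="ordered" value="true"/>
  <property name="separated" value="true"/>
  <!-- static imports last, as in the TestHoodieJmxMetrics diff above -->
  <property name="option" value="bottom"/>
  <property name="sortStaticImportsAlphabetically" value="true"/>
</module>
```

With such a rule in place, checkstyle flags files whose imports deviate from the configured group order, which is what drove the mechanical reshuffling seen in these commits.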
[incubator-hudi] branch hudi_test_suite_refactor updated (eaaf3f6 -> ae5bd06)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from eaaf3f6  [HUDI-394] Provide a basic implementation of test suite
     add ae5bd06  [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102)

No new revisions were added by this update.

Summary of changes:
 docker/hoodie/hadoop/hive_base/Dockerfile | 2 +-
 docker/hoodie/hadoop/hive_base/pom.xml | 4 ++--
 {hudi-bench => hudi-test-suite}/pom.xml | 2 +-
 .../prepare_integration_suite.sh | 4 ++--
 .../hudi/testsuite}/DFSDeltaWriterAdapter.java | 6 +++---
 .../hudi/testsuite}/DFSSparkAvroDeltaWriter.java | 8
 .../apache/hudi/testsuite}/DeltaInputFormat.java | 2 +-
 .../apache/hudi/testsuite}/DeltaOutputType.java | 2 +-
 .../apache/hudi/testsuite}/DeltaWriterAdapter.java | 4 ++--
 .../apache/hudi/testsuite}/DeltaWriterFactory.java | 10 -
 .../testsuite}/configuration/DFSDeltaConfig.java | 6 +++---
 .../hudi/testsuite}/configuration/DeltaConfig.java | 6 +++---
 .../hudi/testsuite}/converter/UpdateConverter.java | 6 +++---
 .../org/apache/hudi/testsuite}/dag/DagUtils.java | 8
 .../hudi/testsuite}/dag/ExecutionContext.java | 9
 .../apache/hudi/testsuite}/dag/WorkflowDag.java | 4 ++--
 .../hudi/testsuite}/dag/WorkflowDagGenerator.java | 12 +--
 .../hudi/testsuite}/dag/nodes/BulkInsertNode.java | 6 +++---
 .../hudi/testsuite}/dag/nodes/CleanNode.java | 4 ++--
 .../hudi/testsuite}/dag/nodes/CompactNode.java | 6 +++---
 .../apache/hudi/testsuite}/dag/nodes/DagNode.java | 6 +++---
 .../hudi/testsuite}/dag/nodes/HiveQueryNode.java | 8
 .../hudi/testsuite}/dag/nodes/HiveSyncNode.java | 8
 .../hudi/testsuite}/dag/nodes/InsertNode.java | 10 -
 .../hudi/testsuite}/dag/nodes/RollbackNode.java | 6 +++---
 .../testsuite}/dag/nodes/ScheduleCompactNode.java | 6 +++---
 .../testsuite}/dag/nodes/SparkSQLQueryNode.java | 8
 .../hudi/testsuite}/dag/nodes/UpsertNode.java | 8
 .../hudi/testsuite}/dag/nodes/ValidateNode.java | 6 +++---
 .../testsuite}/dag/scheduler/DagScheduler.java | 12 +--
 .../hudi/testsuite}/generator/DeltaGenerator.java | 24 +++---
 .../FlexibleSchemaRecordGenerationIterator.java | 2 +-
 .../GenericRecordFullPayloadGenerator.java | 2 +-
 .../GenericRecordFullPayloadSizeEstimator.java | 2 +-
 .../GenericRecordPartialPayloadGenerator.java | 2 +-
 .../generator/LazyRecordGeneratorIterator.java | 2 +-
 .../generator/UpdateGeneratorIterator.java | 2 +-
 .../helpers/DFSTestSuitePathSelector.java | 2 +-
 .../testsuite}/helpers/HiveServiceProvider.java | 6 +++---
 .../testsuite}/job/HoodieDeltaStreamerWrapper.java | 2 +-
 .../hudi/testsuite}/job/HoodieTestSuiteJob.java | 24 +++---
 .../testsuite}/reader/DFSAvroDeltaInputReader.java | 8
 .../testsuite}/reader/DFSDeltaInputReader.java | 2 +-
 .../reader/DFSHoodieDatasetInputReader.java | 2 +-
 .../reader/DFSParquetDeltaInputReader.java | 6 +++---
 .../hudi/testsuite}/reader/DeltaInputReader.java | 2 +-
 .../hudi/testsuite}/reader/SparkBasedReader.java | 2 +-
 .../testsuite}/writer/AvroDeltaInputWriter.java | 2 +-
 .../hudi/testsuite}/writer/DeltaInputWriter.java | 2 +-
 .../apache/hudi/testsuite}/writer/DeltaWriter.java | 6 +++---
 .../testsuite}/writer/FileDeltaInputWriter.java | 2 +-
 .../writer/SparkAvroDeltaInputWriter.java | 2 +-
 .../apache/hudi/testsuite}/writer/WriteStats.java | 2 +-
 .../hudi/testsuite}/TestDFSDeltaWriterAdapter.java | 18
 .../hudi/testsuite}/TestFileDeltaInputWriter.java | 14 ++---
 .../configuration/TestWorkflowBuilder.java | 12 +--
 .../testsuite}/converter/TestUpdateConverter.java | 4 ++--
 .../apache/hudi/testsuite}/dag/TestComplexDag.java | 12 +--
 .../apache/hudi/testsuite}/dag/TestDagUtils.java | 10 -
 .../hudi/testsuite}/dag/TestHiveSyncDag.java | 12 +--
 .../hudi/testsuite}/dag/TestInsertOnlyDag.java | 8
 .../hudi/testsuite}/dag/TestInsertUpsertDag.java | 10 -
 .../TestGenericRecordPayloadEstimator.java | 6 +++---
 .../TestGenericRecordPayloadGenerator.java | 14 ++---
 .../generator/TestWorkloadGenerator.java | 22 ++--
 .../testsuite}/job/TestHoodieTestSuiteJob.java | 22 ++--
 .../reader/TestDFSAvroDeltaInputReader.java | 4 ++--
 .../reader/TestDFSHoodieDatasetInputReader.java | 4 ++--
 .../apache/hudi/testsuite}/utils/TestUtils.java | 2 +-
 .../hudi/testsuite}/writer/TestDeltaWriter.java | 6 +++---
 .../hudi-t
[incubator-hudi] branch master updated (d6e83e8 -> b77fad3)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from d6e83e8  [HUDI-325] Fix Hive partition error for updated HDFS Hudi table (#1001)
     add b77fad3  [HUDI-364] Refactor hudi-hive based on new ImportOrder code style rule (#1048)

No new revisions were added by this update.

Summary of changes:
 .../hudi/common/util/collection/DiskBasedMap.java | 3 +-
 .../java/org/apache/hudi/hive/HiveSyncConfig.java | 1 +
 .../java/org/apache/hudi/hive/HiveSyncTool.java | 21 +-
 .../org/apache/hudi/hive/HoodieHiveClient.java | 4 +-
 .../hudi/hive/MultiPartKeysValueExtractor.java | 1 +
 .../org/apache/hudi/hive/SchemaDifference.java | 3 +-
 .../SlashEncodedDayPartitionValueExtractor.java | 3 +-
 .../apache/hudi/hive/util/ColumnNameXLator.java | 1 +
 .../java/org/apache/hudi/hive/util/SchemaUtil.java | 22 +-
 .../org/apache/hudi/hive/TestHiveSyncTool.java | 24 ++-
 .../test/java/org/apache/hudi/hive/TestUtil.java | 48 +++---
 .../org/apache/hudi/hive/util/HiveTestService.java | 20 +
 .../adhoc/UpgradePayloadFromUberToApache.java | 2 +-
 style/checkstyle.xml | 2 +-
 14 files changed, 84 insertions(+), 71 deletions(-)
[incubator-hudi] branch master updated: [HUDI-366] Refactor some module codes based on new ImportOrder code style rule (#1055)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new f9139c0  [HUDI-366] Refactor some module codes based on new ImportOrder code style rule (#1055)
f9139c0 is described below

commit f9139c0f616775f4b3d0df95772f2621f0e7c9f1
Author: 谢磊
AuthorDate: Wed Nov 27 21:32:43 2019 +0800

    [HUDI-366] Refactor some module codes based on new ImportOrder code style rule (#1055)

    [HUDI-366] Refactor hudi-hadoop-mr / hudi-timeline-service / hudi-spark / hudi-integ-test / hudi-utilities based on new ImportOrder code style rule
---
 .../hudi/hadoop/HoodieParquetInputFormat.java | 34 +++--
 .../hudi/hadoop/HoodieROTablePathFilter.java | 20
 .../hudi/hadoop/RecordReaderValueIterator.java | 10 ++--
 .../hadoop/SafeParquetRecordReaderWrapper.java | 3 +-
 .../hadoop/hive/HoodieCombineHiveInputFormat.java | 38 +++---
 .../realtime/AbstractRealtimeRecordReader.java | 34 +++--
 .../realtime/HoodieParquetRealtimeInputFormat.java | 46 +
 .../hadoop/realtime/HoodieRealtimeFileSplit.java | 3 +-
 .../realtime/HoodieRealtimeRecordReader.java | 6 ++-
 .../realtime/RealtimeCompactedRecordReader.java | 18 ---
 .../realtime/RealtimeUnmergedRecordReader.java | 20
 .../apache/hudi/hadoop/InputFormatTestUtil.java | 28 +-
 .../org/apache/hudi/hadoop/TestAnnotation.java | 5 +-
 .../apache/hudi/hadoop/TestHoodieInputFormat.java | 10 ++--
 .../hudi/hadoop/TestHoodieROTablePathFilter.java | 16 +++---
 .../hudi/hadoop/TestRecordReaderValueIterator.java | 12 +++--
 .../realtime/TestHoodieRealtimeRecordReader.java | 59 --
 .../java/org/apache/hudi/integ/ITTestBase.java | 18 ---
 .../org/apache/hudi/integ/ITTestHoodieDemo.java | 6 ++-
 .../org/apache/hudi/integ/ITTestHoodieSanity.java | 1 +
 .../main/java/org/apache/hudi/BaseAvroPayload.java | 8 +--
 .../java/org/apache/hudi/ComplexKeyGenerator.java | 8 +--
 .../main/java/org/apache/hudi/DataSourceUtils.java | 18 ---
 .../org/apache/hudi/HoodieDataSourceHelpers.java | 10 ++--
 .../main/java/org/apache/hudi/KeyGenerator.java | 6 ++-
 .../apache/hudi/NonpartitionedKeyGenerator.java | 3 +-
 .../hudi/OverwriteWithLatestAvroPayload.java | 10 ++--
 .../main/java/org/apache/hudi/QuickstartUtils.java | 18 ---
 .../java/org/apache/hudi/SimpleKeyGenerator.java | 3 +-
 hudi-spark/src/test/java/DataSourceTestUtils.java | 7 +--
 hudi-spark/src/test/java/HoodieJavaApp.java | 12 +++--
 .../src/test/java/HoodieJavaStreamingApp.java | 18 ---
 .../timeline/service/FileSystemViewHandler.java | 24 +
 .../hudi/timeline/service/TimelineService.java | 16 +++---
 .../timeline/service/handlers/DataFileHandler.java | 8 +--
 .../service/handlers/FileSliceHandler.java | 12 +++--
 .../hudi/timeline/service/handlers/Handler.java | 6 ++-
 .../timeline/service/handlers/TimelineHandler.java | 10 ++--
 .../view/TestRemoteHoodieTableFileSystemView.java | 1 +
 .../apache/hudi/utilities/HDFSParquetImporter.java | 43
 .../hudi/utilities/HiveIncrementalPuller.java | 32 ++--
 .../org/apache/hudi/utilities/HoodieCleaner.java | 18 ---
 .../hudi/utilities/HoodieCompactionAdminTool.java | 18 ---
 .../org/apache/hudi/utilities/HoodieCompactor.java | 16 +++---
 .../hudi/utilities/HoodieSnapshotCopier.java | 25 +
 .../hudi/utilities/HoodieWithTimelineServer.java | 11 ++--
 .../org/apache/hudi/utilities/UtilHelpers.java | 26 +-
 .../adhoc/UpgradePayloadFromUberToApache.java | 20
 .../AbstractDeltaStreamerService.java | 8 +--
 .../hudi/utilities/deltastreamer/Compactor.java | 6 ++-
 .../hudi/utilities/deltastreamer/DeltaSync.java | 42 ---
 .../deltastreamer/HoodieDeltaStreamer.java | 50 +-
 .../deltastreamer/HoodieDeltaStreamerMetrics.java | 3 +-
 .../deltastreamer/SchedulerConfGenerator.java | 12 +++--
 .../deltastreamer/SourceFormatAdapter.java | 11 ++--
 .../exception/HoodieIncrementalPullException.java | 3 +-
 .../keygen/TimestampBasedKeyGenerator.java | 18 ---
 .../hudi/utilities/perf/TimelineServerPerf.java | 36 ++---
 .../utilities/schema/FilebasedSchemaProvider.java | 12 +++--
 .../schema/NullTargetSchemaRegistryProvider.java | 3 +-
 .../utilities/schema/RowBasedSchemaProvider.java | 3 +-
 .../hudi/utilities/schema/SchemaProvider.java | 6 ++-
 .../utilities/schema/SchemaRegistryProvider.java | 12 +++--
 .../hudi/utilities/sources/AvroDFSSource.java | 9 ++--
 .../hudi/utilities/sources/AvroKafkaSource.java | 7 +--
 .../apache/hudi/utilities/sources/AvroSource.java | 3 +-
 .../hudi
[incubator-hudi] branch hudi_test_suite_refactor updated (ae5bd06 -> 9151ccf)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from ae5bd06  [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102)
     add 9151ccf  [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE (#1118)

No new revisions were added by this update.

Summary of changes:
 hudi-spark/src/main/java/org/apache/hudi/ComplexKeyGenerator.java | 5 -
 1 file changed, 5 deletions(-)
[incubator-hudi] branch master updated: [HUDI-365] Refactor hudi-cli based on new ImportOrder code style rule (#1076)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new b2d9638 [HUDI-365] Refactor hudi-cli based on new ImportOrder code style rule (#1076) b2d9638 is described below commit b2d9638bea43f6b0a9c3412362dc4cb08e214957 Author: Gurudatt Kulkarni AuthorDate: Wed Dec 4 12:40:40 2019 +0530 [HUDI-365] Refactor hudi-cli based on new ImportOrder code style rule (#1076) --- .../main/java/org/apache/hudi/cli/HoodieCLI.java | 8 +++--- .../org/apache/hudi/cli/HoodiePrintHelper.java | 5 +++- .../src/main/java/org/apache/hudi/cli/Main.java| 3 ++- .../src/main/java/org/apache/hudi/cli/Table.java | 3 ++- .../hudi/cli/commands/ArchivedCommitsCommand.java | 22 --- .../apache/hudi/cli/commands/CleansCommand.java| 14 +- .../apache/hudi/cli/commands/CommitsCommand.java | 16 ++- .../hudi/cli/commands/CompactionCommand.java | 28 ++- .../apache/hudi/cli/commands/DatasetsCommand.java | 10 --- .../hudi/cli/commands/FileSystemViewCommand.java | 28 ++- .../cli/commands/HDFSParquetImportCommand.java | 2 ++ .../hudi/cli/commands/HoodieLogFileCommand.java| 31 -- .../hudi/cli/commands/HoodieSyncCommand.java | 6 +++-- .../apache/hudi/cli/commands/RepairsCommand.java | 8 +++--- .../apache/hudi/cli/commands/RollbacksCommand.java | 18 +++-- .../hudi/cli/commands/SavepointsCommand.java | 8 +++--- .../org/apache/hudi/cli/commands/SparkMain.java| 1 + .../org/apache/hudi/cli/commands/StatsCommand.java | 30 +++-- .../java/org/apache/hudi/cli/utils/CommitUtil.java | 5 ++-- .../java/org/apache/hudi/cli/utils/HiveUtil.java | 6 +++-- .../java/org/apache/hudi/cli/utils/SparkUtil.java | 6 +++-- 21 files changed, 149 insertions(+), 109 deletions(-) diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieCLI.java b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieCLI.java index 0dafdc4..1b2dd86 100644 --- 
a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieCLI.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieCLI.java @@ -18,13 +18,15 @@ package org.apache.hudi.cli; -import java.io.IOException; -import org.apache.hadoop.conf.Configuration; -import org.apache.hadoop.fs.FileSystem; import org.apache.hudi.common.table.HoodieTableMetaClient; import org.apache.hudi.common.util.ConsistencyGuardConfig; import org.apache.hudi.common.util.FSUtils; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FileSystem; + +import java.io.IOException; + /** * This class is responsible to load table metadata and hoodie related configs. */ diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodiePrintHelper.java b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodiePrintHelper.java index 0e48911..5325432 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodiePrintHelper.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodiePrintHelper.java @@ -18,11 +18,14 @@ package org.apache.hudi.cli; +import org.apache.hudi.common.util.Option; + import com.jakewharton.fliptables.FlipTable; + import java.util.List; import java.util.Map; import java.util.function.Function; -import org.apache.hudi.common.util.Option; + /** * Helper class to render table for hoodie-cli. diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/Main.java b/hudi-cli/src/main/java/org/apache/hudi/cli/Main.java index 99627b0..e924be9 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/Main.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/Main.java @@ -18,9 +18,10 @@ package org.apache.hudi.cli; -import java.io.IOException; import org.springframework.shell.Bootstrap; +import java.io.IOException; + /** * Main class that delegates to Spring Shell's Bootstrap class in order to simplify debugging inside an IDE. 
*/ diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/Table.java b/hudi-cli/src/main/java/org/apache/hudi/cli/Table.java index 5a446e7..2efad37 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/Table.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/Table.java @@ -18,6 +18,8 @@ package org.apache.hudi.cli; +import org.apache.hudi.common.util.Option; + import java.util.ArrayList; import java.util.Arrays; import java.util.Comparator; @@ -28,7 +30,6 @@ import java.util.function.Consumer; import java.util.function.Function; import java.util.stream.Collectors; import java.util.stream.IntStream; -import org.apache.hudi.common.util.Option; /** * Table to be rendered. This class takes care of ordering rows and limiting before renderer renders it. diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/ArchivedCommitsCommand.java b/hudi-cli/src/main/java/org/apache
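The import reordering shown in the diff above follows one pattern throughout: project-local `org.apache.hudi` imports first, then third-party imports, then `javax`/`java` imports last, with blank lines between groups. Such an ordering is typically enforced with Checkstyle's `ImportOrder` module; the fragment below is a sketch inferred from the diff, not copied from Hudi's actual checkstyle.xml, so the exact groups and options may differ:

```xml
<!-- Sketch of an ImportOrder rule matching the ordering in the diff above;
     the exact groups/options in Hudi's checkstyle.xml may differ. -->
<module name="ImportOrder">
  <!-- org.apache.hudi first, then everything else, then javax/java last -->
  <property name="groups" value="org.apache.hudi,*,javax,java"/>
  <!-- imports alphabetically ordered within each group -->
  <property name="ordered" value="true"/>
  <!-- blank line required between groups -->
  <property name="separated" value="true"/>
  <property name="sortStaticImportsAlphabetically" value="true"/>
</module>
```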
[incubator-hudi] branch hudi_test_suite_refactor updated (5f22849 -> 7a0794a)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. omit 5f22849 [HUDI-592] Remove duplicated dependencies in the pom file of test suite module omit 044759a [HUDI-503] Add hudi test suite documentation into the README file of the test suite module (#1191) omit b66ba8d [MINOR] Fix TestHoodieTestSuiteJob#testComplexDag failure omit eaa4fb0 [HUDI-441] Rename WorkflowDagGenerator and some class names in test package omit 33246c4 Fixed resource leak in HiveTestService about hive meta store omit 7d67d5e [HUDI-591] Support Spark version upgrade omit 0456214 [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE omit f8d25b1 [MINOR] Fix compile error about the deletion of HoodieActiveTimeline#createNewCommitTime omit 53e73e5 [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102) omit d5549f5 [HUDI-394] Provide a basic implementation of test suite add 2bb0c21 Fix conversion of Spark struct type to Avro schema add 9b2944a [MINOR] Refactor unnecessary boxing inside TypedProperties code (#1227) add 87fdb76 Adding util methods to assist in adding deletion support to Quick Start add 2b2f23a Fixing delete util method add 2248fd9 Fixing checkstyle issues add 7aa3ce3 [MINOR] Fix redundant judgment statement (#1231) add dd09abb [HUDI-335] Improvements to DiskBasedMap used by ExternalSpillableMap, for write and random/sequential read paths, by introducing bufferedRandmomAccessFile add 1daba24 Add GlobalDeleteKeyGenerator add b39458b [MINOR] Make constant fields final in HoodieTestDataGenerator (#1234) add 8a3a503 [MINOR] Fix missing @Override annotation on BufferedRandomAccessFile method (#1236) add c2c0f6b [HUDI-509] Renaming code in sync with cWiki restructuring (#1212) add baa6b5e [HUDI-537] Introduce `repair overwrite-hoodie-props` CLI 
command (#1241) add 0a07752 [HUDI-527] scalastyle-maven-plugin moved to pluginManagement as it is only used in hoodie-spark and hoodie-cli modules. add 923e2b4 [HUDI-535] Ensure Compaction Plan is always written in .aux folder to avoid 0.5.0/0.5.1 reader-writer compatibility issues (#1229) add 292c1e2 [HUDI-238] Make Hudi support Scala 2.12 (#1226) add 5471d8f [MINOR] Add toString method to TimelineLayoutVersion to make it more readable (#1244) add 3f4966d [MINOR] Fix PMC in DOAP] (#1247) add d0ee95e [HUDI-552] Fix the schema mismatch in Row-to-Avro conversion (#1246) add 9489d0f [HUDI-551] Abstract a test case class for DFS Source to make it extensible (#1239) add 7087e7d [HUDI-556] Add lisence for PR#1233 add ba54a7e [HUDI-559] : Make the timeline layout version default to be null version add 6e59c1c Moving to 0.5.2-SNAPSHOT on master branch. add 924bf51 [MINOR] Download KEYS file when validating release candidate (#1259) add b6e2993 [MINOR] Update the javadoc of HoodieTableMetaClient#scanFiles (#1263) add a54535e [MINOR] Fix invalid maven repo address (#1265) add a46fea9 [MINOR] Change deploy_staging_jars script to take in scala version (#1269) add 8e3d81c [MINOR] Change deploy_staging_jars script to take in scala version (#1270) add ed54eb2 [MINOR] Add missing licenses (#1271) add fc8d4a7 [MINOR] fix license issue (#1273) add 1e79cbc [HUDI-549] update Github README with instructions to build with Scala 2.12 (#1275) add cdb028f [MINOR] Fix missing groupId / version property of dependency add 56a4e0d [MINOR] Fix invalid issue url & quickstart url (#1282) add 362a9b9 [MINOR] Remove junit-dep dependency add c06ec8b [MINOR] Fix assigning to configuration more times (#1291) add 6f34be1 HUDI-117 Close file handle before throwing an exception due to append failure. Add test cases to handle/verify stage failure scenarios. 
add 652224e [HUDI-578] Trim recordKeyFields and partitionPathFields in ComplexKeyGenerator (#1281) add f27c7a1 [HUDI-564] Added new test cases for HoodieLogFormat and HoodieLogFormatVersion. add 5b7bb14 [HUDI-583] Code Cleanup, remove redundant code, and other changes (#1237) add 0026234 [MINOR] Updated DOAP with 0.5.1 release (#1300) add fcf9e4a [MINOR] Updated DOAP with 0.5.1 release (#1301) add d07ac58 Increase test coverage for HoodieReadClient add 347e297 [HUDI-596] Close KafkaConsumer every time (#1303) add 594da28 [HUDI-595] code cleanup, refactoring code out of PR# 1159 (#1302) add 4de0fcf [HUDI-566] Added new test cases for class HoodieTimeline, HoodieDefaultTimeline and HoodieActiveTimeline. add 425e3e6 [HUDI-585] Optimize the steps of building with scala-2.12 (#1293)
[incubator-hudi] branch hudi_test_suite_refactor updated (7a0794a -> b04b037)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. discard 7a0794a Fix compile error after rebasing the branch add b04b037 [MINOR] Fix compile error after rebasing the branch This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this: * -- * -- B -- O -- O -- O (7a0794a) \ N -- N -- N refs/heads/hudi_test_suite_refactor (b04b037) You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. No new revisions were added by this update. Summary of changes: hudi-test-suite/pom.xml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-)
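The "discard"/force-push situation the notification describes can be reproduced in a throwaway local repository; the sketch below (all paths and commit messages are hypothetical, and it assumes `git` is installed) rewrites a branch so that an old revision O is replaced by a new revision N on top of the common base B:

```shell
# Minimal local reproduction of the force-push scenario described above
# (temporary repo; all names and commit messages are hypothetical).
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email demo@example.invalid
git config user.name demo
echo base > file && git add file && git commit -qm "B (common base)"
echo old > file && git commit -qam "O (old revision)"
old_tip=$(git rev-parse HEAD)
# Rewrite history: drop O and commit N on top of B -- this is the state a
# subsequent "git push --force" would publish.
git reset -q --hard HEAD~1
echo new > file && git commit -qam "N (new revision)"
# O is no longer reachable from the branch ("discarded"); locally only the
# reflog still refers to it.
if git merge-base --is-ancestor "$old_tip" HEAD; then
  result="O still on branch"
else
  result="O discarded"
fi
echo "$result"
```

As the notification text explains, revisions marked "omit" remain reachable through other references, while "discard" means no reference points at them any more.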
svn commit: r38339 - /dev/incubator/hudi/KEYS
Author: vinoyang Date: Sun Mar 1 04:49:47 2020 New Revision: 38339 Log: Update KEYS file in release repo Modified: dev/incubator/hudi/KEYS Modified: dev/incubator/hudi/KEYS == --- dev/incubator/hudi/KEYS (original) +++ dev/incubator/hudi/KEYS Sun Mar 1 04:49:47 2020 @@ -332,3 +332,40 @@ ePILQEvmZ8GgwevHx170WUjKWBpLLSFl4zgXIl9Q =FZo/ -----END PGP PUBLIC KEY BLOCK----- +pub rsa2048 2020-03-01 [SC] + C3A96EC77149571AE89F82764C86684D047DE03C +uid [ultimate] vinoyang (apache gpg) +sig 3 4C86684D047DE03C 2020-03-01 vinoyang (apache gpg) +sub rsa2048 2020-03-01 [E] +sig 4C86684D047DE03C 2020-03-01 vinoyang (apache gpg) + +-----BEGIN PGP PUBLIC KEY BLOCK----- + +mQENBF5bOScBCADbO+U5FnwUK2Nf+NjwTRK2HZYsckLbjpO9we6OuASbYmWSqMhu +qdhGgwu37E7qjQW4nE8s1hzOPrVUlSzFhXTRjMmRtR1wviWs1ibwH/AQA0HGH9jE +1p9O6Dy6tASSE0Hd+rxUaFmYxabC6BLopJccD9ANhkhnH9UVbHSDn5RXVV7eEvy5 +wzFHCQavqj4oMAGRUXIHl3uh4U9ROiAU3UQRkZ3pnf3R6tbCtw1Jv/NcWztaLQLm +aMkr3hSFrrZ8xR2hAOhOs4+Y+i3bELg9bBmvz1lCQRYDEY82lugPyqpdKc98KTAw +gVgQoXjJhw2+bC/4KTaXwwAcexuh7YfcNJAvABEBAAG0K3Zpbm95YW5nIChhcGFj +aGUgZ3BnKSA8dmlub3lhbmdAYXBhY2hlLm9yZz6JAU4EEwEIADgWIQTDqW7HcUlX +GuifgnZMhmhNBH3gPAUCXls5JwIbAwULCQgHAgYVCgkICwIEFgIDAQIeAQIXgAAK +CRBMhmhNBH3gPJvxB/9gRVhXMXJXH7Z1JNf7zjdRXbdtQgMr2/WVLnbAevjU+Btf +gHO+KllJZoFXFhHJWSRld7OaNJC0k9V9UyMgOBS1hvCRFEbMH8mRisux4JDx9/F6 +0/gWCzoK3pY2EevzDtr5bOx6C4GPoSjAfBUifTB5YixnX57ePxLnZmeeME+dQ+/1 +fMkpR6a5Eo4sLiVOwhBgYMGKHr6GZSVd23CSyPVxDKuImNfehZQLCtNq8LiKFTY9 +tGhInVdbLUJCPiXoxIKpinVtYxUGoW8LHUFaVq3BenG4wKx5s/ImU824pAtWJ0CK +NHbzwF7oPckfM1hdT3ZycfUK1qoX6FoEVtZZ1QxTuQENBF5bOScBCADKl9sswehe +S+2zrhR1C5U3gNqZYD/MSIdx3K2k//BjweYZCqxtzR1J9JtitrA0WJKF8NnK2dF+ +FkpC1iDduvAAZXw94tGb9qyTeSXhZ4gFAxfbRwthEhP0GYya1bhtM+gi2zOW+tsp +KSYwCBUoAk8PKI3ZPyiWJNhlmsOolSg4IF50PWzhXet0t+OeJaBGNffdfERF7TF/ +y1lBu5RLiLxUDYc5tV80dA3MNLDkCKW16OlCAkxH8+IkZ5Z2eprDaFDBwDo0/5jk +pET8XBCjBReaFsleYBkZmwdbzeurkj8sTa0GQZKdeBynciDqbREmWulkkTp9jGTv +lRarr3woa2f7ABEBAAGJATYEGAEIACAWIQTDqW7HcUlXGuifgnZMhmhNBH3gPAUC
+Xls5JwIbDAAKCRBMhmhNBH3gPM9FCACO9+sqdi7wkp8asbpS6WzjZ0FS3KbW3IoW +QgbVx9t4mB4cGq91h6CnbDGZnr2qlRKwCCAijuUfBTPER8lzyltOVos22FbHXWa+ +Oqicjn336aysnFZuNTvnvYsWvlwvW5AAVCZn4YfE0qYB6oHCBZLdg4YFQRx6U1t5 +CXIaSBYhtOhp0VJ4+0X9chmMmSpJayutFaykU2AnZwLe8a5EppT/NXe6db1oV/c5 +k2TGkmCbCkVobp4AElQ28fQ/sAYtVWLO6wGEpFH/HWUeMjXTsun2mY25jVr5X4CJ +FA3V4MZ7SGwKjZZa6oep6lPoig/R4MfsDwQ2zW/vLFPel1am406v +=Z0T0 +-----END PGP PUBLIC KEY BLOCK-----
[incubator-hudi] branch master updated: Moving to 0.6.0-SNAPSHOT on master branch.
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 0dc8e49 Moving to 0.6.0-SNAPSHOT on master branch. 0dc8e49 is described below commit 0dc8e493aa1658910a3519df3941278d9d072c18 Author: yanghua AuthorDate: Sun Mar 1 15:08:30 2020 +0800 Moving to 0.6.0-SNAPSHOT on master branch. --- docker/hoodie/hadoop/base/pom.xml | 2 +- docker/hoodie/hadoop/datanode/pom.xml | 2 +- docker/hoodie/hadoop/historyserver/pom.xml| 2 +- docker/hoodie/hadoop/hive_base/pom.xml| 2 +- docker/hoodie/hadoop/namenode/pom.xml | 2 +- docker/hoodie/hadoop/pom.xml | 2 +- docker/hoodie/hadoop/prestobase/pom.xml | 2 +- docker/hoodie/hadoop/spark_base/pom.xml | 2 +- docker/hoodie/hadoop/sparkadhoc/pom.xml | 2 +- docker/hoodie/hadoop/sparkmaster/pom.xml | 2 +- docker/hoodie/hadoop/sparkworker/pom.xml | 2 +- hudi-cli/pom.xml | 2 +- hudi-client/pom.xml | 2 +- hudi-common/pom.xml | 2 +- hudi-hadoop-mr/pom.xml| 2 +- hudi-hive/pom.xml | 2 +- hudi-integ-test/pom.xml | 2 +- hudi-spark/pom.xml| 2 +- hudi-timeline-service/pom.xml | 2 +- hudi-utilities/pom.xml| 2 +- packaging/hudi-hadoop-mr-bundle/pom.xml | 2 +- packaging/hudi-hive-bundle/pom.xml| 2 +- packaging/hudi-presto-bundle/pom.xml | 2 +- packaging/hudi-spark-bundle/pom.xml | 2 +- packaging/hudi-timeline-server-bundle/pom.xml | 2 +- packaging/hudi-utilities-bundle/pom.xml | 2 +- pom.xml | 2 +- 27 files changed, 27 insertions(+), 27 deletions(-) diff --git a/docker/hoodie/hadoop/base/pom.xml b/docker/hoodie/hadoop/base/pom.xml index 0cbd377..55205ee 100644 --- a/docker/hoodie/hadoop/base/pom.xml +++ b/docker/hoodie/hadoop/base/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT 4.0.0 pom diff --git a/docker/hoodie/hadoop/datanode/pom.xml b/docker/hoodie/hadoop/datanode/pom.xml index 034aebe..e8c95f9 100644 --- 
a/docker/hoodie/hadoop/datanode/pom.xml +++ b/docker/hoodie/hadoop/datanode/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT 4.0.0 pom diff --git a/docker/hoodie/hadoop/historyserver/pom.xml b/docker/hoodie/hadoop/historyserver/pom.xml index b41ca5c..725cdcf 100644 --- a/docker/hoodie/hadoop/historyserver/pom.xml +++ b/docker/hoodie/hadoop/historyserver/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT 4.0.0 pom diff --git a/docker/hoodie/hadoop/hive_base/pom.xml b/docker/hoodie/hadoop/hive_base/pom.xml index d65e230..04aac75 100644 --- a/docker/hoodie/hadoop/hive_base/pom.xml +++ b/docker/hoodie/hadoop/hive_base/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT 4.0.0 pom diff --git a/docker/hoodie/hadoop/namenode/pom.xml b/docker/hoodie/hadoop/namenode/pom.xml index c35ff45..4ec1f9a 100644 --- a/docker/hoodie/hadoop/namenode/pom.xml +++ b/docker/hoodie/hadoop/namenode/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT 4.0.0 pom diff --git a/docker/hoodie/hadoop/pom.xml b/docker/hoodie/hadoop/pom.xml index e2d0482..bedd3b4 100644 --- a/docker/hoodie/hadoop/pom.xml +++ b/docker/hoodie/hadoop/pom.xml @@ -19,7 +19,7 @@ hudi org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT ../../../pom.xml 4.0.0 diff --git a/docker/hoodie/hadoop/prestobase/pom.xml b/docker/hoodie/hadoop/prestobase/pom.xml index fd96e21..2ba319c 100644 --- a/docker/hoodie/hadoop/prestobase/pom.xml +++ b/docker/hoodie/hadoop/prestobase/pom.xml @@ -22,7 +22,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT 4.0.0 pom diff --git a/docker/hoodie/hadoop/spark_base/pom.xml b/docker/hoodie/hadoop/spark_base/pom.xml index e9a4d5a..6385305 100644 --- a/docker/hoodie/hadoop/spark_base/pom.xml +++ b/docker/hoodie/hadoop/spark_base/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-SNAPSHOT +0.6.0-SNAPSHOT 
4.0.0 pom diff --git a/docker/hoodie/hadoop/sparkadhoc/pom.xml b/docker/hoodie/hadoop/sparkadhoc/pom.xml index 1e008e5..c1babf4 100644 --- a/docker/hoodie/hadoop/sparkadhoc/pom.xml +++ b/docker/hoodie/hadoop/sparkadhoc/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2
[incubator-hudi] branch release-0.5.2 created (now afaf4ba)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch release-0.5.2 in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. at afaf4ba Create release branch for version 0.5.2. This branch includes the following new commits: new afaf4ba Create release branch for version 0.5.2. The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
svn commit: r38340 - /release/incubator/hudi/KEYS
Author: vinoyang Date: Sun Mar 1 05:47:02 2020 New Revision: 38340 Log: Update KEYS file Modified: release/incubator/hudi/KEYS Modified: release/incubator/hudi/KEYS == --- release/incubator/hudi/KEYS (original) +++ release/incubator/hudi/KEYS Sun Mar 1 05:47:02 2020 @@ -332,3 +332,40 @@ ePILQEvmZ8GgwevHx170WUjKWBpLLSFl4zgXIl9Q =FZo/ -----END PGP PUBLIC KEY BLOCK----- +pub rsa2048 2020-03-01 [SC] + C3A96EC77149571AE89F82764C86684D047DE03C +uid [ultimate] vinoyang (apache gpg) +sig 3 4C86684D047DE03C 2020-03-01 vinoyang (apache gpg) +sub rsa2048 2020-03-01 [E] +sig 4C86684D047DE03C 2020-03-01 vinoyang (apache gpg) + +-----BEGIN PGP PUBLIC KEY BLOCK----- + +mQENBF5bOScBCADbO+U5FnwUK2Nf+NjwTRK2HZYsckLbjpO9we6OuASbYmWSqMhu +qdhGgwu37E7qjQW4nE8s1hzOPrVUlSzFhXTRjMmRtR1wviWs1ibwH/AQA0HGH9jE +1p9O6Dy6tASSE0Hd+rxUaFmYxabC6BLopJccD9ANhkhnH9UVbHSDn5RXVV7eEvy5 +wzFHCQavqj4oMAGRUXIHl3uh4U9ROiAU3UQRkZ3pnf3R6tbCtw1Jv/NcWztaLQLm +aMkr3hSFrrZ8xR2hAOhOs4+Y+i3bELg9bBmvz1lCQRYDEY82lugPyqpdKc98KTAw +gVgQoXjJhw2+bC/4KTaXwwAcexuh7YfcNJAvABEBAAG0K3Zpbm95YW5nIChhcGFj +aGUgZ3BnKSA8dmlub3lhbmdAYXBhY2hlLm9yZz6JAU4EEwEIADgWIQTDqW7HcUlX +GuifgnZMhmhNBH3gPAUCXls5JwIbAwULCQgHAgYVCgkICwIEFgIDAQIeAQIXgAAK +CRBMhmhNBH3gPJvxB/9gRVhXMXJXH7Z1JNf7zjdRXbdtQgMr2/WVLnbAevjU+Btf +gHO+KllJZoFXFhHJWSRld7OaNJC0k9V9UyMgOBS1hvCRFEbMH8mRisux4JDx9/F6 +0/gWCzoK3pY2EevzDtr5bOx6C4GPoSjAfBUifTB5YixnX57ePxLnZmeeME+dQ+/1 +fMkpR6a5Eo4sLiVOwhBgYMGKHr6GZSVd23CSyPVxDKuImNfehZQLCtNq8LiKFTY9 +tGhInVdbLUJCPiXoxIKpinVtYxUGoW8LHUFaVq3BenG4wKx5s/ImU824pAtWJ0CK +NHbzwF7oPckfM1hdT3ZycfUK1qoX6FoEVtZZ1QxTuQENBF5bOScBCADKl9sswehe +S+2zrhR1C5U3gNqZYD/MSIdx3K2k//BjweYZCqxtzR1J9JtitrA0WJKF8NnK2dF+ +FkpC1iDduvAAZXw94tGb9qyTeSXhZ4gFAxfbRwthEhP0GYya1bhtM+gi2zOW+tsp +KSYwCBUoAk8PKI3ZPyiWJNhlmsOolSg4IF50PWzhXet0t+OeJaBGNffdfERF7TF/ +y1lBu5RLiLxUDYc5tV80dA3MNLDkCKW16OlCAkxH8+IkZ5Z2eprDaFDBwDo0/5jk +pET8XBCjBReaFsleYBkZmwdbzeurkj8sTa0GQZKdeBynciDqbREmWulkkTp9jGTv +lRarr3woa2f7ABEBAAGJATYEGAEIACAWIQTDqW7HcUlXGuifgnZMhmhNBH3gPAUC
+Xls5JwIbDAAKCRBMhmhNBH3gPM9FCACO9+sqdi7wkp8asbpS6WzjZ0FS3KbW3IoW +QgbVx9t4mB4cGq91h6CnbDGZnr2qlRKwCCAijuUfBTPER8lzyltOVos22FbHXWa+ +Oqicjn336aysnFZuNTvnvYsWvlwvW5AAVCZn4YfE0qYB6oHCBZLdg4YFQRx6U1t5 +CXIaSBYhtOhp0VJ4+0X9chmMmSpJayutFaykU2AnZwLe8a5EppT/NXe6db1oV/c5 +k2TGkmCbCkVobp4AElQ28fQ/sAYtVWLO6wGEpFH/HWUeMjXTsun2mY25jVr5X4CJ +FA3V4MZ7SGwKjZZa6oep6lPoig/R4MfsDwQ2zW/vLFPel1am406v +=Z0T0 +-----END PGP PUBLIC KEY BLOCK-----
[incubator-hudi] branch master updated: [MINOR] Fix cut_release_branch script missed a double quotation marks (#1365)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 9160084 [MINOR] Fix cut_release_branch script missed a double quotation marks (#1365) 9160084 is described below commit 9160084bb147552d670235d27ecb198239ee32e5 Author: vinoyang AuthorDate: Sun Mar 1 14:34:15 2020 +0800 [MINOR] Fix cut_release_branch script missed a double quotation marks (#1365) --- scripts/release/cut_release_branch.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/scripts/release/cut_release_branch.sh b/scripts/release/cut_release_branch.sh index 8e6e923..f8f0d9f 100755 --- a/scripts/release/cut_release_branch.sh +++ b/scripts/release/cut_release_branch.sh @@ -73,7 +73,7 @@ echo "next_release: ${NEXT_VERSION_IN_BASE_BRANCH}" echo "working master branch: ${MASTER_BRANCH}" echo "working release branch: ${RELEASE_BRANCH}" echo "local repo dir: ~/${LOCAL_CLONE_DIR}/${HUDI_ROOT_DIR}" -echo "RC_NUM: $RC_NUM +echo "RC_NUM: $RC_NUM" echo "===" cd ~
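The one-character fix in the diff above closes a double quote that was left unterminated. An unterminated quote makes the shell fold the following lines into one string, and (when the total number of quotes is odd, as here) the script fails to parse at end of file. A minimal reproduction using the same two lines from the diff (the temp-file path is incidental):

```shell
# Reproduce the class of bug fixed above: a missing closing double quote
# leaves a string open, and the script no longer parses.
demo=$(mktemp)
cat > "$demo" <<'EOF'
RC_NUM=1
echo "RC_NUM: $RC_NUM
echo "==="
EOF
# 'bash -n' parses without executing; the odd number of quotes leaves a
# string unterminated at end of file, so the syntax check fails.
if bash -n "$demo" 2>/dev/null; then
  verdict="parses"
else
  verdict="syntax error"
fi
echo "$verdict"
```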
[incubator-hudi] branch master updated: [HUDI-599] Update release guide & release scripts due to the change of scala 2.12 build (#1364)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 0cde27e [HUDI-599] Update release guide & release scripts due to the change of scala 2.12 build (#1364) 0cde27e is described below commit 0cde27e63c2cf9b70f24f0ae6b63fad9259b28d3 Author: leesf <490081...@qq.com> AuthorDate: Sun Mar 1 14:30:32 2020 +0800 [HUDI-599] Update release guide & release scripts due to the change of scala 2.12 build (#1364) --- scripts/release/deploy_staging_jars.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/scripts/release/deploy_staging_jars.sh b/scripts/release/deploy_staging_jars.sh index b02a7d4..885c25f 100755 --- a/scripts/release/deploy_staging_jars.sh +++ b/scripts/release/deploy_staging_jars.sh @@ -54,5 +54,5 @@ cd .. echo "Deploying to repository.apache.org with scala version ${SCALA_VERSION}" -COMMON_OPTIONS="-Pscala-${SCALA_VERSION} -Prelease -DskipTests -DretryFailedDeploymentCount=10 -DdeployArtifacts=true" +COMMON_OPTIONS="-Dscala-${SCALA_VERSION} -Prelease -DskipTests -DretryFailedDeploymentCount=10 -DdeployArtifacts=true" $MVN clean deploy $COMMON_OPTIONS
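Note the substance of the one-line change above: `-P` asks Maven to activate a profile by id, while `-D` defines a system property. The latter selects a build profile only if the pom declares property-based activation for it, along the lines of this illustrative fragment (property and version values are assumptions, not copied from Hudi's actual pom):

```xml
<!-- Illustrative property-activated profile; Hudi's real pom may differ. -->
<profile>
  <id>scala-2.12</id>
  <activation>
    <!-- Activated when the build is run with -Dscala-2.12 -->
    <property>
      <name>scala-2.12</name>
    </property>
  </activation>
  <properties>
    <scala.version>2.12.10</scala.version>
    <scala.binary.version>2.12</scala.binary.version>
  </properties>
</profile>
```

With such a declaration, `mvn clean deploy -Dscala-2.12 ...` and `mvn clean deploy -Pscala-2.12 ...` both enable the profile; the release script change standardizes on the property form.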
[incubator-hudi] branch master updated (924bf51 -> b6e2993)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from 924bf51 [MINOR] Download KEYS file when validating release candidate (#1259) add b6e2993 [MINOR] Update the javadoc of HoodieTableMetaClient#scanFiles (#1263) No new revisions were added by this update. Summary of changes: .../org/apache/hudi/common/table/HoodieTableMetaClient.java| 10 +- 1 file changed, 9 insertions(+), 1 deletion(-)
[incubator-hudi] branch hudi_test_suite_refactor updated (33246c4 -> eaa4fb0)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from 33246c4 Fixed resource leak in HiveTestService about hive meta store add eaa4fb0 [HUDI-441] Rename WorkflowDagGenerator and some class names in test package No new revisions were added by this update. Summary of changes: ...erator.java => SimpleWorkflowDagGenerator.java} | 3 +- .../hudi/testsuite/dag/WorkflowDagGenerator.java | 56 +++--- ...estComplexDag.java => ComplexDagGenerator.java} | 2 +- ...tHiveSyncDag.java => HiveSyncDagGenerator.java} | 2 +- ...ertOnlyDag.java => InsertOnlyDagGenerator.java} | 2 +- ...psertDag.java => InsertUpsertDagGenerator.java} | 2 +- .../apache/hudi/testsuite/dag/TestDagUtils.java| 4 +- .../hudi/testsuite/job/TestHoodieTestSuiteJob.java | 12 ++--- 8 files changed, 20 insertions(+), 63 deletions(-) copy hudi-test-suite/src/main/java/org/apache/hudi/testsuite/dag/{WorkflowDagGenerator.java => SimpleWorkflowDagGenerator.java} (96%) rename hudi-test-suite/src/test/java/org/apache/hudi/testsuite/dag/{TestComplexDag.java => ComplexDagGenerator.java} (97%) rename hudi-test-suite/src/test/java/org/apache/hudi/testsuite/dag/{TestHiveSyncDag.java => HiveSyncDagGenerator.java} (96%) rename hudi-test-suite/src/test/java/org/apache/hudi/testsuite/dag/{TestInsertOnlyDag.java => InsertOnlyDagGenerator.java} (96%) rename hudi-test-suite/src/test/java/org/apache/hudi/testsuite/dag/{TestInsertUpsertDag.java => InsertUpsertDagGenerator.java} (96%)
[incubator-hudi] branch asf-site updated: [HUDI-602] Add some guidance about how to judge the scope of MINOR to the contribution guidance (#1311)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/asf-site by this push: new b1bb7b0 [HUDI-602] Add some guidance about how to judge the scope of MINOR to the contribution guidance (#1311) b1bb7b0 is described below commit b1bb7b0e050b2d7acfb7c5bccdbc2a98e6a35743 Author: vinoyang AuthorDate: Fri Feb 7 15:00:43 2020 +0800 [HUDI-602] Add some guidance about how to judge the scope of MINOR to the contribution guidance (#1311) * [HUDI-602] Add some guidance about how to judge the scope of MINOR to the contribution guidance --- docs/_pages/contributing.cn.md | 6 +- docs/_pages/contributing.md| 6 +- 2 files changed, 10 insertions(+), 2 deletions(-) diff --git a/docs/_pages/contributing.cn.md b/docs/_pages/contributing.cn.md index 9e017ad..e66deea 100644 --- a/docs/_pages/contributing.cn.md +++ b/docs/_pages/contributing.cn.md @@ -88,7 +88,11 @@ and more importantly also try to improve the process along the way as well. - Once you finalize on a project/task, please open a new JIRA or assign an existing one to yourself. - Almost all PRs should be linked to a JIRA. It's always good to have a JIRA upfront to avoid duplicating efforts. - - If the changes are minor, then `[MINOR]` prefix can be added to Pull Request title without a JIRA. + - If the changes are minor, then `[MINOR]` prefix can be added to Pull Request title without a JIRA. 
Below are some tips to judge **MINOR** Pull Request : +- trivial fixes (for example, a typo, a broken link, intellisense or an obvious error) +- the change does not alter functionality or performance in any way +- changed lines less than 100 +- obviously judge that the PR would pass without waiting for CI / CD verification - But, you may be asked to file a JIRA, if reviewer deems it necessary - Before you begin work, - Claim the JIRA using the process above and assign the JIRA to yourself. diff --git a/docs/_pages/contributing.md b/docs/_pages/contributing.md index 3745edd..9d84872 100644 --- a/docs/_pages/contributing.md +++ b/docs/_pages/contributing.md @@ -87,7 +87,11 @@ and more importantly also try to improve the process along the way as well. - Once you finalize on a project/task, please open a new JIRA or assign an existing one to yourself. - Almost all PRs should be linked to a JIRA. It's always good to have a JIRA upfront to avoid duplicating efforts. - - If the changes are minor, then `[MINOR]` prefix can be added to Pull Request title without a JIRA. + - If the changes are minor, then `[MINOR]` prefix can be added to Pull Request title without a JIRA. Below are some tips to judge **MINOR** Pull Request : +- trivial fixes (for example, a typo, a broken link, intellisense or an obvious error) +- the change does not alter functionality or performance in any way +- changed lines less than 100 +- obviously judge that the PR would pass without waiting for CI / CD verification - But, you may be asked to file a JIRA, if reviewer deems it necessary - Before you begin work, - Claim the JIRA using the process above and assign the JIRA to yourself.
[incubator-hudi] branch restructure-hudi-client created (now d26dc0b)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch restructure-hudi-client in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. at d26dc0b [HUDI-587] Fixed generation of jacoco coverage reports. No new revisions were added by this update.
[incubator-hudi] branch restructure-hudi-client updated: [HUDI-542] Introduce a new pom module named hudi-writer-common (#1314)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch restructure-hudi-client in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/restructure-hudi-client by this push: new e7b1961 [HUDI-542] Introduce a new pom module named hudi-writer-common (#1314) e7b1961 is described below commit e7b1961de082b626fdb27890eb4ff701f11dffb4 Author: vinoyang AuthorDate: Sat Feb 8 16:20:33 2020 +0800 [HUDI-542] Introduce a new pom module named hudi-writer-common (#1314) --- hudi-writer-common/pom.xml | 15 + .../org/apache/hudi/writer/common/Placeholder.java | 26 ++ .../apache/hudi/writer/common/PlaceholderTest.java | 26 ++ pom.xml| 1 + 4 files changed, 68 insertions(+) diff --git a/hudi-writer-common/pom.xml b/hudi-writer-common/pom.xml new file mode 100644 index 000..eed8c5a --- /dev/null +++ b/hudi-writer-common/pom.xml @@ -0,0 +1,15 @@ + +http://maven.apache.org/POM/4.0.0; + xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance; + xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd;> + +hudi +org.apache.hudi +0.5.2-SNAPSHOT + + 4.0.0 + + hudi-writer-common + + + \ No newline at end of file diff --git a/hudi-writer-common/src/main/java/org/apache/hudi/writer/common/Placeholder.java b/hudi-writer-common/src/main/java/org/apache/hudi/writer/common/Placeholder.java new file mode 100644 index 000..93d8a35 --- /dev/null +++ b/hudi-writer-common/src/main/java/org/apache/hudi/writer/common/Placeholder.java @@ -0,0 +1,26 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hudi.writer.common; + +/** + * Used for placeholder purpose. + */ +public class Placeholder { + +} diff --git a/hudi-writer-common/src/test/java/org/apache/hudi/writer/common/PlaceholderTest.java b/hudi-writer-common/src/test/java/org/apache/hudi/writer/common/PlaceholderTest.java new file mode 100644 index 000..2a10a58 --- /dev/null +++ b/hudi-writer-common/src/test/java/org/apache/hudi/writer/common/PlaceholderTest.java @@ -0,0 +1,26 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hudi.writer.common; + +/** + * Used for placeholder purpose. 
+ */ +public class PlaceholderTest { + +} diff --git a/pom.xml b/pom.xml index 8bdb4a6..b6782ad 100644 --- a/pom.xml +++ b/pom.xml @@ -51,6 +51,7 @@ packaging/hudi-timeline-server-bundle docker/hoodie/hadoop hudi-integ-test +hudi-writer-common
[incubator-hudi] branch hudi_test_suite_refactor updated (fa07df4 -> a31a8f6)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. discard fa07df4 trigger rebuild discard bb58a5f [MINOR] Fix compile error after rebasing the branch add a31a8f6 [MINOR] Fix compile error after rebasing the branch This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this: * -- * -- B -- O -- O -- O (fa07df4) \ N -- N -- N refs/heads/hudi_test_suite_refactor (a31a8f6) You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. No new revisions were added by this update. Summary of changes: hudi-test-suite/pom.xml| 4 ++-- .../hudi/testsuite/reader/DFSHoodieDatasetInputReader.java | 10 +- .../org/apache/hudi/testsuite/job/TestHoodieTestSuiteJob.java | 8 .../test/java/org/apache/hudi/testsuite/utils/TestUtils.java | 4 ++-- packaging/hudi-test-suite-bundle/pom.xml | 4 ++-- 5 files changed, 15 insertions(+), 15 deletions(-)
[incubator-hudi] branch hudi_test_suite_refactor updated (162eb50 -> f380005)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from 162eb50 [MINOR] Fix compile error after rebasing the branch
     add f380005 trigger rebuild

No new revisions were added by this update.

Summary of changes:
[incubator-hudi] branch master updated: [HUDI-624]: Split some of the code from PR for HUDI-479 (#1344)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 078d482 [HUDI-624]: Split some of the code from PR for HUDI-479 (#1344)

078d482 is described below

commit 078d4825d909b2c469398f31c97d2290687321a8
Author: Suneel Marthi
AuthorDate: Fri Feb 21 01:22:21 2020 -0500

    [HUDI-624]: Split some of the code from PR for HUDI-479 (#1344)
---
 .../hudi/cli/commands/HoodieLogFileCommand.java    | 11 +++
 .../org/apache/hudi/cli/commands/SparkMain.java    | 10 +++---
 .../java/org/apache/hudi/cli/utils/SparkUtil.java  |  5 ++-
 .../org/apache/hudi/config/HoodieWriteConfig.java  |  4 +--
 .../org/apache/hudi/func/LazyIterableIterator.java |  2 +-
 .../hudi/index/bloom/BloomIndexFileInfo.java       |  9 +++---
 .../org/apache/hudi/io/HoodieAppendHandle.java     |  4 +--
 .../org/apache/hudi/io/HoodieCommitArchiveLog.java |  8 ++---
 .../strategy/BoundedIOCompactionStrategy.java      |  5 ++-
 .../io/compact/strategy/CompactionStrategy.java    |  5 ++-
 .../apache/hudi/metrics/JmxMetricsReporter.java    |  4 +--
 .../org/apache/hudi/table/RollbackExecutor.java    |  6 ++--
 .../org/apache/hudi/TestCompactionAdminClient.java |  1 -
 .../apache/hudi/config/TestHoodieWriteConfig.java  |  4 +--
 .../hudi/index/bloom/TestHoodieBloomIndex.java     | 22 ++---
 .../index/bloom/TestHoodieGlobalBloomIndex.java    | 10 +++---
 .../org/apache/hudi/common/model/HoodieKey.java    |  7 ++---
 .../org/apache/hudi/common/model/HoodieRecord.java |  8 ++---
 .../hudi/common/model/HoodieRecordLocation.java    |  7 ++---
 .../hudi/common/util/BufferedRandomAccessFile.java |  6 +---
 .../java/org/apache/hudi/common/util/FSUtils.java  |  3 +-
 .../hudi/common/util/ObjectSizeCalculator.java     | 36 +++---
 .../log/TestHoodieLogFormatAppendFailure.java      |  4 +--
 .../table/string/TestHoodieActiveTimeline.java     |  1 -
 .../table/view/TestHoodieTableFileSystemView.java  |  5 ++-
 .../hudi/common/util/TestCompactionUtils.java      | 21 ++---
 .../org/apache/hudi/hive/SchemaDifference.java     | 28 +
 .../java/org/apache/hudi/hive/util/SchemaUtil.java |  8 ++---
 .../org/apache/hudi/hive/TestHiveSyncTool.java     |  3 +-
 .../test/java/org/apache/hudi/hive/TestUtil.java   | 15 -
 .../org/apache/hudi/utilities/UtilHelpers.java     |  9 +++---
 31 files changed, 130 insertions(+), 141 deletions(-)

diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HoodieLogFileCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HoodieLogFileCommand.java
index 8a50309..2bb87e0 100644
--- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HoodieLogFileCommand.java
+++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HoodieLogFileCommand.java
@@ -38,8 +38,6 @@
 import org.apache.hudi.config.HoodieCompactionConfig;
 import org.apache.hudi.config.HoodieMemoryConfig;
 import org.apache.hudi.hive.util.SchemaUtil;
-import com.google.common.base.Preconditions;
-import com.google.common.collect.Maps;
 import com.fasterxml.jackson.databind.ObjectMapper;
 import org.apache.avro.Schema;
@@ -59,6 +57,7 @@
 import java.util.Arrays;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
+import java.util.Objects;
 import java.util.stream.Collectors;
 import scala.Tuple2;
@@ -85,14 +84,14 @@ public class HoodieLogFileCommand implements CommandMarker {
     List logFilePaths = Arrays.stream(fs.globStatus(new Path(logFilePathPattern)))
         .map(status -> status.getPath().toString()).collect(Collectors.toList());
     Map, Map>, Integer>>> commitCountAndMetadata =
-        Maps.newHashMap();
+        new HashMap<>();
     int numCorruptBlocks = 0;
     int dummyInstantTimeCount = 0;
     for (String logFilePath : logFilePaths) {
       FileStatus[] fsStatus = fs.listStatus(new Path(logFilePath));
       Schema writerSchema = new AvroSchemaConverter()
-          .convert(Preconditions.checkNotNull(SchemaUtil.readSchemaFromLogFile(fs, new Path(logFilePath;
+          .convert(Objects.requireNonNull(SchemaUtil.readSchemaFromLogFile(fs, new Path(logFilePath;
       Reader reader = HoodieLogFormat.newReader(fs, new HoodieLogFile(fsStatus[0].getPath()), writerSchema);
       // read the avro blocks
@@ -181,7 +180,7 @@ public class HoodieLogFileCommand implements CommandMarker {
     AvroSchemaConverter converter = new AvroSchemaConverter();
     // get schema from last log file
     Schema readerSchema =
-        converter.convert(Preconditions.checkNotNull(SchemaUtil.readSchemaFromLogFile(fs, new Path(logFilePaths.get(logFilePaths.size() - 1);
+        converter.convert(Objects.requireNonNull(SchemaUtil.readSchemaFromLogFile(fs, new Path(logFilePaths.get(logFilePaths.size() - 1);
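The diff above swaps Guava helpers for their JDK equivalents: `Maps.newHashMap()` becomes `new HashMap<>()` and `Preconditions.checkNotNull(...)` becomes `Objects.requireNonNull(...)`. A minimal standalone sketch of the pattern — the class name, values, and message below are illustrative, not Hudi code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

public class GuavaToJdk {

  // Before: Map<String, Integer> counts = Maps.newHashMap();
  // After: the diamond operator (JDK 7+) makes the Guava factory method redundant.
  static Map<String, Integer> newCounts() {
    return new HashMap<>();
  }

  // Before: Preconditions.checkNotNull(schema)
  // After: Objects.requireNonNull returns its argument, or throws
  // NullPointerException with the supplied message when it is null.
  static String requireSchema(String schema) {
    return Objects.requireNonNull(schema, "schema must not be null");
  }

  public static void main(String[] args) {
    Map<String, Integer> counts = newCounts();
    counts.put("blocks", 3);
    System.out.println(counts.get("blocks"));         // 3
    System.out.println(requireSchema("avro-schema")); // avro-schema
    try {
      requireSchema(null);
    } catch (NullPointerException e) {
      System.out.println(e.getMessage());             // schema must not be null
    }
  }
}
```

The two replacements are behavior-preserving, which is why the commit can drop the `com.google.common` imports without touching any test.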
[incubator-hudi] branch master updated: [HUDI-560] Remove legacy IdentityTransformer (#1264)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 5fdf5a1 [HUDI-560] Remove legacy IdentityTransformer (#1264)

5fdf5a1 is described below

commit 5fdf5a192706d8c8a0432297cca1e6e097de0e58
Author: Mathieu <49835526+wangxian...@users.noreply.github.com>
AuthorDate: Mon Feb 10 10:04:58 2020 +0800

    [HUDI-560] Remove legacy IdentityTransformer (#1264)
---
 .../utilities/transform/IdentityTransformer.java   | 38 --
 1 file changed, 38 deletions(-)

diff --git a/hudi-utilities/src/main/java/org/apache/hudi/utilities/transform/IdentityTransformer.java b/hudi-utilities/src/main/java/org/apache/hudi/utilities/transform/IdentityTransformer.java
deleted file mode 100644
index 31f0ce6..000
--- a/hudi-utilities/src/main/java/org/apache/hudi/utilities/transform/IdentityTransformer.java
+++ /dev/null
@@ -1,38 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *      http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.hudi.utilities.transform;
-
-import org.apache.hudi.common.util.TypedProperties;
-
-import org.apache.spark.api.java.JavaSparkContext;
-import org.apache.spark.sql.Dataset;
-import org.apache.spark.sql.Row;
-import org.apache.spark.sql.SparkSession;
-
-/**
- * Identity transformer.
- */
-public class IdentityTransformer implements Transformer {
-
-  @Override
-  public Dataset<Row> apply(JavaSparkContext jsc, SparkSession sparkSession, Dataset<Row> rowDataset,
-      TypedProperties properties) {
-    return rowDataset;
-  }
-}
[incubator-hudi] branch hudi_test_suite_refactor updated (6fbd285 -> bb58a5f)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 discard 6fbd285 [MINOR] Fix compile error after rebasing the branch
     add bb58a5f [MINOR] Fix compile error after rebasing the branch

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (6fbd285)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (bb58a5f)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .../src/test/java/org/apache/hudi/utilities/UtilitiesTestBase.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
[incubator-hudi] branch master updated: [HUDI-615]: Add some methods and test cases for StringUtils. (#1338)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new b8f9d0e [HUDI-615]: Add some methods and test cases for StringUtils. (#1338)

b8f9d0e is described below

commit b8f9d0ec4548a2487914c4cc982fc5e1f4e0c88c
Author: Suneel Marthi
AuthorDate: Mon Feb 17 01:13:33 2020 -0500

    [HUDI-615]: Add some methods and test cases for StringUtils. (#1338)
---
 .../org/apache/hudi/common/util/StringUtils.java   | 27 +
 .../apache/hudi/common/util/TestStringUtils.java   | 64 ++
 2 files changed, 91 insertions(+)

diff --git a/hudi-common/src/main/java/org/apache/hudi/common/util/StringUtils.java b/hudi-common/src/main/java/org/apache/hudi/common/util/StringUtils.java
index 63f02fe..d1e3305 100644
--- a/hudi-common/src/main/java/org/apache/hudi/common/util/StringUtils.java
+++ b/hudi-common/src/main/java/org/apache/hudi/common/util/StringUtils.java
@@ -18,6 +18,8 @@
 package org.apache.hudi.common.util;
+import javax.annotation.Nullable;
+
 /**
  * Simple utility for operations on strings.
  */
@@ -67,4 +69,29 @@
   public static boolean isNullOrEmpty(String str) {
     return str == null || str.length() == 0;
   }
+
+
+  /**
+   * Returns the given string if it is non-null; the empty string otherwise.
+   *
+   * @param string the string to test and possibly return
+   * @return {@code string} itself if it is non-null; {@code ""} if it is null
+   */
+  public static String nullToEmpty(@Nullable String string) {
+    return string == null ? "" : string;
+  }
+
+  /**
+   * Returns the given string if it is nonempty; {@code null} otherwise.
+   *
+   * @param string the string to test and possibly return
+   * @return {@code string} itself if it is nonempty; {@code null} if it is empty or null
+   */
+  public static @Nullable String emptyToNull(@Nullable String string) {
+    return stringIsNullOrEmpty(string) ? null : string;
+  }
+
+  private static boolean stringIsNullOrEmpty(@Nullable String string) {
+    return string == null || string.isEmpty();
+  }
 }

diff --git a/hudi-common/src/test/java/org/apache/hudi/common/util/TestStringUtils.java b/hudi-common/src/test/java/org/apache/hudi/common/util/TestStringUtils.java
new file mode 100644
index 000..05a0825
--- /dev/null
+++ b/hudi-common/src/test/java/org/apache/hudi/common/util/TestStringUtils.java
@@ -0,0 +1,64 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *      http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hudi.common.util;
+
+import org.junit.Test;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotEquals;
+import static org.junit.Assert.assertNull;
+import static org.junit.Assert.assertTrue;
+
+public class TestStringUtils {
+
+  private static final String[] STRINGS = {"This", "is", "a", "test"};
+
+  @Test
+  public void testStringJoinWithDelim() {
+    String joinedString = StringUtils.joinUsingDelim("-", STRINGS);
+    assertEquals(STRINGS.length, joinedString.split("-").length);
+  }
+
+  @Test
+  public void testStringJoin() {
+    assertNotEquals(null, StringUtils.join(""));
+    assertNotEquals(null, StringUtils.join(STRINGS));
+  }
+
+  @Test
+  public void testStringNullToEmpty() {
+    String str = "This is a test";
+    assertEquals(str, StringUtils.nullToEmpty(str));
+    assertEquals("", StringUtils.nullToEmpty(null));
+  }
+
+  @Test
+  public void testStringEmptyToNull() {
+    assertNull(StringUtils.emptyToNull(""));
+    assertEquals("Test String", StringUtils.emptyToNull("Test String"));
+  }
+
+  @Test
+  public void testStringNullOrEmpty() {
+    assertTrue(StringUtils.isNullOrEmpty(null));
+    assertTrue(StringUtils.isNullOrEmpty(""));
+    assertNotEquals(null, StringUtils.isNullOrEmpty("this is not empty"));
+    assertTrue(StringUtils.isNullOrEmpty(""));
+  }
+}
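The two new helpers are inverses of each other on non-empty input. A small standalone sketch — the helpers are re-implemented locally here so the snippet compiles without the Hudi classpath, and the class name is illustrative:

```java
public class NullEmptyHelpers {

  // Mirrors StringUtils.nullToEmpty: null collapses to "".
  static String nullToEmpty(String s) {
    return s == null ? "" : s;
  }

  // Mirrors StringUtils.emptyToNull: "" (or null) collapses to null.
  static String emptyToNull(String s) {
    return (s == null || s.isEmpty()) ? null : s;
  }

  public static void main(String[] args) {
    System.out.println("[" + nullToEmpty(null) + "]");    // []
    System.out.println(emptyToNull(""));                  // null
    // Round trip: a non-empty string survives both conversions unchanged.
    System.out.println(emptyToNull(nullToEmpty("hudi"))); // hudi
  }
}
```

This pair is handy at API boundaries: normalize to `""` for display and string concatenation, normalize to `null` when "absent" must be distinguishable downstream.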
[incubator-hudi] branch hudi_test_suite_refactor updated (bb58a5f -> fa07df4)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from bb58a5f [MINOR] Fix compile error after rebasing the branch
     add fa07df4 trigger rebuild

No new revisions were added by this update.

Summary of changes:
[incubator-hudi] branch hudi_test_suite_refactor updated (b04b037 -> 3063cd7)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    omit b04b037 [MINOR] Fix compile error after rebasing the branch
     add 3063cd7 [MINOR] Fix compile error after rebasing the branch

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (b04b037)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (3063cd7)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .../main/java/org/apache/hudi/testsuite/generator/DeltaGenerator.java | 2 +-
 .../main/java/org/apache/hudi/testsuite/job/HoodieTestSuiteJob.java   | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)
[incubator-hudi] branch hudi_test_suite_refactor updated (3063cd7 -> 67fdda3)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 discard 3063cd7 [MINOR] Fix compile error after rebasing the branch
 discard e064262 [HUDI-592] Remove duplicated dependencies in the pom file of test suite module
 discard 1f64ca0 [HUDI-503] Add hudi test suite documentation into the README file of the test suite module (#1191)
 discard 4eafca9 [MINOR] Fix TestHoodieTestSuiteJob#testComplexDag failure
 discard e175677 [HUDI-441] Rename WorkflowDagGenerator and some class names in test package
 discard aeac933 Fixed resource leak in HiveTestService about hive meta store
 discard 1136002 [HUDI-591] Support Spark version upgrade
 discard 99b2ae0 [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE
 discard 3471112 [MINOR] Fix compile error about the deletion of HoodieActiveTimeline#createNewCommitTime
 discard 43fbc0c [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102)
 discard 285c691 [HUDI-394] Provide a basic implementation of test suite
     add 01c868a [HUDI-574] Fix CLI counts small file inserts as updates (#1321)
     add 175de0d [MINOR] Fix typo (#1331)
     add dfbee67 [HUDI-514] A schema provider to get metadata through Jdbc (#1200)
     add 1f5051d [HUDI-394] Provide a basic implementation of test suite
     add c3caeb9 [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102)
     add 89788a5 [MINOR] Fix compile error about the deletion of HoodieActiveTimeline#createNewCommitTime
     add c9c3319 [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE
     add 188199b [HUDI-591] Support Spark version upgrade
     add ea6389d Fixed resource leak in HiveTestService about hive meta store
     add a823b93 [HUDI-441] Rename WorkflowDagGenerator and some class names in test package
     add 64f9924 [MINOR] Fix TestHoodieTestSuiteJob#testComplexDag failure
     add 5b2 [HUDI-503] Add hudi test suite documentation into the README file of the test suite module (#1191)
     add c1035d0 [HUDI-592] Remove duplicated dependencies in the pom file of test suite module
     add 67fdda3 [MINOR] Fix compile error after rebasing the branch

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (3063cd7)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (67fdda3)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .../apache/hudi/cli/commands/CommitsCommand.java   |   2 +-
 hudi-utilities/pom.xml                             |   7 ++
 .../org/apache/hudi/utilities/UtilHelpers.java     | 101 +
 .../AbstractDeltaStreamerService.java              |   2 +-
 .../utilities/schema/JdbcbasedSchemaProvider.java  |  84 +
 .../hudi/utilities/TestHoodieDeltaStreamer.java    |  11 ++-
 .../utilities/TestJdbcbasedSchemaProvider.java     |  88 ++
 .../apache/hudi/utilities/UtilitiesTestBase.java   |  18 +++-
 .../delta-streamer-config/source-jdbc.avsc         |  43 -
 .../resources/delta-streamer-config/triprec.sql    |  21 +++--
 pom.xml                                            |   1 +
 11 files changed, 351 insertions(+), 27 deletions(-)
 create mode 100644 hudi-utilities/src/main/java/org/apache/hudi/utilities/schema/JdbcbasedSchemaProvider.java
 create mode 100644 hudi-utilities/src/test/java/org/apache/hudi/utilities/TestJdbcbasedSchemaProvider.java
 copy hudi-common/src/test/resources/timestamp-test-evolved.avsc => hudi-utilities/src/test/resources/delta-streamer-config/source-jdbc.avsc (60%)
 copy hudi-common/src/test/resources/simple-test.avsc => hudi-utilities/src/test/resources/delta-streamer-config/triprec.sql (75%)
[incubator-hudi] branch hudi_test_suite_refactor updated (67fdda3 -> 6fbd285)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 discard 67fdda3 [MINOR] Fix compile error after rebasing the branch
     add 6fbd285 [MINOR] Fix compile error after rebasing the branch

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (67fdda3)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (6fbd285)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 hudi-test-suite/README.md                | 2 +-
 packaging/hudi-test-suite-bundle/pom.xml | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
[incubator-hudi] branch master updated: [MINOR] Code Cleanup, remove redundant code (#1337)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 24e7381 [MINOR] Code Cleanup, remove redundant code (#1337)

24e7381 is described below

commit 24e73816b2b50af518576907379bf9202d6b8dd2
Author: Suneel Marthi
AuthorDate: Sat Feb 15 09:03:29 2020 -0500

    [MINOR] Code Cleanup, remove redundant code (#1337)
---
 .../hudi/index/bloom/TestHoodieBloomIndex.java     | 13 ++--
 .../index/bloom/TestHoodieGlobalBloomIndex.java    |  3 +-
 .../apache/hudi/io/TestHoodieCommitArchiveLog.java |  8 +--
 .../io/strategy/TestHoodieCompactionStrategy.java  | 73 +++---
 .../table/log/block/HoodieAvroDataBlock.java       |  3 -
 .../common/table/log/block/HoodieLogBlock.java     | 19 +++---
 .../table/timeline/HoodieDefaultTimeline.java      | 16 +++--
 .../table/view/RocksDbBasedFileSystemView.java     |  2 +-
 .../apache/hudi/common/model/HoodieTestUtils.java  |  6 +-
 .../hudi/common/table/log/TestHoodieLogFormat.java | 50 +++
 .../table/string/TestHoodieActiveTimeline.java     | 50 +++
 .../table/view/TestHoodieTableFileSystemView.java  | 19 +++---
 .../hadoop/hive/HoodieCombineHiveInputFormat.java  |  8 +--
 .../realtime/RealtimeUnmergedRecordReader.java     |  3 +-
 .../realtime/TestHoodieRealtimeRecordReader.java   |  8 +--
 .../org/apache/hudi/hive/HoodieHiveClient.java     | 14 ++---
 .../org/apache/hudi/hive/SchemaDifference.java     | 10 +--
 .../apache/hudi/hive/util/ColumnNameXLator.java    | 13 ++--
 18 files changed, 148 insertions(+), 170 deletions(-)

diff --git a/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestHoodieBloomIndex.java b/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestHoodieBloomIndex.java
index c121c14..8bbd527 100644
--- a/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestHoodieBloomIndex.java
+++ b/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestHoodieBloomIndex.java
@@ -52,6 +52,7 @@
 import java.io.File;
 import java.io.IOException;
 import java.util.Arrays;
 import java.util.Collection;
+import java.util.Collections;
 import java.util.HashMap;
 import java.util.HashSet;
 import java.util.List;
@@ -407,11 +408,11 @@ public class TestHoodieBloomIndex extends HoodieClientTestHarness {
     // We create three parquet file, each having one record. (two different partitions)
     String filename1 =
-        HoodieClientTestUtils.writeParquetFile(basePath, "2016/01/31", Arrays.asList(record1), schema, null, true);
+        HoodieClientTestUtils.writeParquetFile(basePath, "2016/01/31", Collections.singletonList(record1), schema, null, true);
     String filename2 =
-        HoodieClientTestUtils.writeParquetFile(basePath, "2016/01/31", Arrays.asList(record2), schema, null, true);
+        HoodieClientTestUtils.writeParquetFile(basePath, "2016/01/31", Collections.singletonList(record2), schema, null, true);
     String filename3 =
-        HoodieClientTestUtils.writeParquetFile(basePath, "2015/01/31", Arrays.asList(record4), schema, null, true);
+        HoodieClientTestUtils.writeParquetFile(basePath, "2015/01/31", Collections.singletonList(record4), schema, null, true);
     // We do the tag again
     metaClient = HoodieTableMetaClient.reload(metaClient);
@@ -431,7 +432,7 @@ public class TestHoodieBloomIndex extends HoodieClientTestHarness {
           assertEquals(FSUtils.getFileId(filename2), record._2.get().getRight());
         }
       } else if (record._1.getRecordKey().equals("3eb5b87c-1fej-4edd-87b4-6ec96dc405a0")) {
-        assertTrue(!record._2.isPresent());
+        assertFalse(record._2.isPresent());
       }
     }
   }
@@ -456,7 +457,7 @@ public class TestHoodieBloomIndex extends HoodieClientTestHarness {
         BloomFilterTypeCode.SIMPLE.name());
     filter.add(record2.getRecordKey());
     String filename =
-        HoodieClientTestUtils.writeParquetFile(basePath, "2016/01/31", Arrays.asList(record1), schema, filter, true);
+        HoodieClientTestUtils.writeParquetFile(basePath, "2016/01/31", Collections.singletonList(record1), schema, filter, true);
     assertTrue(filter.mightContain(record1.getRecordKey()));
     assertTrue(filter.mightContain(record2.getRecordKey()));
@@ -472,7 +473,7 @@ public class TestHoodieBloomIndex extends HoodieClientTestHarness {
     // Check results
     for (HoodieRecord record : taggedRecordRDD.collect()) {
       if (record.getKey().equals("1eb5b87a-1feh-4edd-87b4-6ec96dc405a0")) {
-        assertTrue(record.getCurrentLocation().getFileId().equals(FSUtils.getFileId(filename)));
+        assertEquals(record.getCurrentLocation().getFileId(), FSUtils.getFileId(filename));
       } else if (record.getRecordKey().equals("2eb5b
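Two of the recurring cleanups in the commit above: a one-element `Arrays.asList(x)` becomes `Collections.singletonList(x)`, and `assertTrue(!cond)` becomes `assertFalse(cond)`. A minimal sketch of why the first swap matters — the variable names and values are hypothetical, not from the Hudi tests:

```java
import java.util.Collections;
import java.util.List;

public class CleanupPatterns {
  public static void main(String[] args) {
    // Before: List<String> records = Arrays.asList("record1");
    // After: singletonList allocates a dedicated one-element immutable view.
    List<String> records = Collections.singletonList("record1");
    System.out.println(records.size()); // 1

    // Unlike Arrays.asList (fixed-size but mutable via set()),
    // singletonList rejects all structural and element modification.
    try {
      records.add("record2");
    } catch (UnsupportedOperationException e) {
      System.out.println("immutable"); // immutable
    }

    // The assertTrue(!x) -> assertFalse(x) change is purely for readability:
    // JUnit then reports "expected false" instead of a negated-expression failure.
    boolean present = false;
    System.out.println(!present); // true — reads better as assertFalse(present)
  }
}
```

Both rewrites are behavior-preserving in the tests, which is why the commit touches 18 files without changing any assertion outcome.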
[incubator-hudi] branch master updated: [HUDI-622]: Remove VisibleForTesting annotation and import from code (#1343)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new f9d2f66 [HUDI-622]: Remove VisibleForTesting annotation and import from code (#1343)

f9d2f66 is described below

commit f9d2f66dc16540e3e5c1cb1f7f23b4fca7c656c3
Author: Suneel Marthi
AuthorDate: Thu Feb 20 02:17:53 2020 -0500

    [HUDI-622]: Remove VisibleForTesting annotation and import from code (#1343)

    * HUDI:622: Remove VisibleForTesting annotation and import from code
---
 hudi-client/src/main/java/org/apache/hudi/HoodieCleanClient.java    | 3 ---
 hudi-client/src/main/java/org/apache/hudi/HoodieWriteClient.java    | 3 ---
 .../apache/hudi/index/bloom/BucketizedBloomCheckPartitioner.java    | 2 --
 .../src/main/java/org/apache/hudi/index/bloom/HoodieBloomIndex.java | 4 
 .../java/org/apache/hudi/index/bloom/HoodieGlobalBloomIndex.java    | 3 ---
 .../src/main/java/org/apache/hudi/index/hbase/HBaseIndex.java       | 4 
 .../compact/strategy/BoundedPartitionAwareCompactionStrategy.java   | 3 ---
 .../apache/hudi/io/compact/strategy/DayBasedCompactionStrategy.java | 3 ---
 .../src/main/java/org/apache/hudi/metrics/HoodieMetrics.java        | 2 --
 .../src/main/java/org/apache/hudi/table/HoodieMergeOnReadTable.java | 2 --
 .../java/org/apache/hudi/common/bloom/filter/InternalFilter.java    | 4 ++--
 .../java/org/apache/hudi/common/util/BufferedRandomAccessFile.java  | 2 +-
 hudi-common/src/main/java/org/apache/hudi/common/util/FSUtils.java  | 2 --
 .../main/java/org/apache/hudi/common/util/ObjectSizeCalculator.java | 3 ---
 .../src/main/java/org/apache/hudi/common/util/RocksDBDAO.java       | 2 --
 .../org/apache/hudi/common/util/queue/BoundedInMemoryQueue.java     | 5 -
 .../src/main/scala/org/apache/hudi/AvroConversionHelper.scala       | 6 +++---
 .../main/java/org/apache/hudi/utilities/HDFSParquetImporter.java    | 2 --
 18 files changed, 6 insertions(+), 49 deletions(-)

diff --git a/hudi-client/src/main/java/org/apache/hudi/HoodieCleanClient.java b/hudi-client/src/main/java/org/apache/hudi/HoodieCleanClient.java
index 9411782..fe0cc60 100644
--- a/hudi-client/src/main/java/org/apache/hudi/HoodieCleanClient.java
+++ b/hudi-client/src/main/java/org/apache/hudi/HoodieCleanClient.java
@@ -37,7 +37,6 @@
 import org.apache.hudi.metrics.HoodieMetrics;
 import org.apache.hudi.table.HoodieTable;
 import com.codahale.metrics.Timer;
-import com.google.common.annotations.VisibleForTesting;
 import com.google.common.base.Preconditions;
 import org.apache.log4j.LogManager;
 import org.apache.log4j.Logger;
@@ -108,7 +107,6 @@
    * @param startCleanTime Cleaner Instant Time
    * @return Cleaner Plan if generated
    */
-  @VisibleForTesting
   protected Option scheduleClean(String startCleanTime) {
     // Create a Hoodie table which encapsulated the commits and files visible
     HoodieTable table = HoodieTable.getHoodieTable(createMetaClient(true), config, jsc);
@@ -138,7 +136,6 @@
    * @param table Hoodie Table
    * @param cleanInstant Cleaner Instant
    */
-  @VisibleForTesting
   protected HoodieCleanMetadata runClean(HoodieTable table, HoodieInstant cleanInstant) {
     try {
       HoodieCleanerPlan cleanerPlan = CleanerUtils.getCleanerPlan(table.getMetaClient(), cleanInstant);

diff --git a/hudi-client/src/main/java/org/apache/hudi/HoodieWriteClient.java b/hudi-client/src/main/java/org/apache/hudi/HoodieWriteClient.java
index 23055da..931ca07 100644
--- a/hudi-client/src/main/java/org/apache/hudi/HoodieWriteClient.java
+++ b/hudi-client/src/main/java/org/apache/hudi/HoodieWriteClient.java
@@ -61,7 +61,6 @@
 import org.apache.hudi.table.WorkloadProfile;
 import org.apache.hudi.table.WorkloadStat;
 import com.codahale.metrics.Timer;
-import com.google.common.annotations.VisibleForTesting;
 import com.google.common.base.Preconditions;
 import com.google.common.collect.ImmutableMap;
 import org.apache.log4j.LogManager;
@@ -121,7 +120,6 @@ public class HoodieWriteClient extends AbstractHo
     this(jsc, clientConfig, rollbackPending, HoodieIndex.createIndex(clientConfig, jsc));
   }
-  @VisibleForTesting
   HoodieWriteClient(JavaSparkContext jsc, HoodieWriteConfig clientConfig, boolean rollbackPending, HoodieIndex index) {
     this(jsc, clientConfig, rollbackPending, index, Option.empty());
   }
@@ -1113,7 +,6 @@ public class HoodieWriteClient extends AbstractHo
    * @param inflightInstant Inflight Compaction Instant
    * @param table Hoodie Table
    */
-  @VisibleForTesting
   void rollbackInflightCompaction(HoodieInstant inflightInstant, HoodieTable table) throws IOException {
     table.rollback(jsc, inflightInstant, false);
     // Revert instant state file

diff --git a/hudi-client/src/main/java/org
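Dropping `@VisibleForTesting` is safe because the annotation is documentation-only: it never changed what the compiler allows. The methods it marked are package-private or protected, which same-package tests can already call. A hedged sketch of the convention — `CleanPlanner` and its method are hypothetical stand-ins, not the Hudi classes:

```java
public class CleanPlanner {

  // Package-private (no access modifier): a unit test living in the same
  // package can invoke this directly, so no @VisibleForTesting marker is
  // needed to "open it up" — the annotation only ever signaled intent.
  String scheduleClean(String instantTime) {
    return "clean-plan@" + instantTime;
  }

  public static void main(String[] args) {
    // Stands in for a same-package test such as TestCleanPlanner.
    System.out.println(new CleanPlanner().scheduleClean("20200220")); // clean-plan@20200220
  }
}
```

The trade-off: without the annotation, readers lose the hint that a member is wider than strictly necessary for testing, so projects that remove it (as this commit does) usually rely on naming and javadoc instead.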
[incubator-hudi] branch hudi_test_suite_refactor updated (a31a8f6 -> 162eb50)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 discard a31a8f6 [MINOR] Fix compile error after rebasing the branch
     add 162eb50 [MINOR] Fix compile error after rebasing the branch

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (a31a8f6)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (162eb50)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 hudi-utilities/pom.xml | 10 ++
 pom.xml                |  6 +++---
 2 files changed, 13 insertions(+), 3 deletions(-)
[incubator-hudi] branch hudi_test_suite_refactor updated (09c34a0 -> 3dc85eb)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 omit 09c34a0  [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE
 omit 66463ff  [MINOR] Fix compile error about the deletion of HoodieActiveTimeline#createNewCommitTime
 omit 1d2ecbc  [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102)
 omit 9b55d37  [HUDI-394] Provide a basic implementation of test suite
  add 8172197  Fix Error: java.lang.IllegalArgumentException: Can not create a Path from an empty string in HoodieCopyOnWrite#deleteFilesFunc (#1126)
  add 4b1b3fc  [MINOR] Set info servity for ImportOrder temporarily (#1127)
  add 41f3677  [MINOR] fix typo
  add dd06660  [MINOR] fix typo
  add 94aec96  [minor] Fix few typos in the java docs (#1132)
  add 9c4217a  [HUDI-389] Fixing Index look up to return right partitions for a given key along with fileId with Global Bloom (#1091)
  add 8affdf8  [HUDI-416] Improve hint information for cli (#1110)
  add 3c811ec  [MINOR] fix typos
  add def18a5  [MINOR] optimize hudi timeline service (#1137)
  add 842eabb  [HUDI-470] Fix NPE when print result via hudi-cli (#1138)
  add f20a130  [MINOR] typo fix (#1142)
  add 01c25d6  [MINOR] Update the java doc of HoodieTableType (#1148)
  add 58c5bed  [HUDI-453] Fix throw failed to archive commits error when writing data to MOR/COW table
  add 179837e  Fix checkstyle
  add 2f25416  Skip setting commit metadata
  add 8440482  Fix empty content clean plan
  add e4ea7a2  Update comment
  add 2a823f3  [MINOR]: alter some wrong params which bring fatal exception
  add ab6ae5c  [HUDI-482] Fix missing @Override annotation on methods (#1156)
  add e637d9e  [HUDI-455] Redo hudi-client log statements using SLF4J (#1145)
  add bb90ded  [MINOR] Fix out of limits for results
  add 36c0e6b  [MINOR] Fix out of limits for results
  add 74b00d1  trigger rebuild
  add 619f501  Clean up code
  add add4b1e  Merge pull request #1143 from BigDataArtisans/outoflimit
  add 47c1f74  [HUDI-343]: Create a DOAP file for Hudi
  add 98c0d8c  Merge pull request #1160 from smarthi/HUDI-343
  add dde21e7  [HUDI-402]: code clean up in test cases
  add e1e5fe3  [MINOR] Fix error usage of String.format (#1169)
  add ff1113f  [HUDI-492]Fix show env all in hudi-cli
  add 290278f  [HUDI-118]: Options provided for passing properties to Cleaner, compactor and importer commands
  add a733f4e  [MINOR] Optimize hudi-cli module (#1136)
  add 726ae47  [MINOR]Optimize hudi-client module (#1139)
  add 7031445  [HUDI-377] Adding Delete() support to DeltaStreamer (#1073)
  add 28ccf8c  [HUDI-484] Fix NPE when reading IncrementalPull.sqltemplate in HiveIncrementalPuller (#1167)
  add b9fab0b  Revert "[HUDI-455] Redo hudi-client log statements using SLF4J (#1145)" (#1181)
  add 2d5b79d  [HUDI-438] Merge duplicated code fragment in HoodieSparkSqlWriter (#1114)
  add 8f935e7  [HUDI-406]: added default partition path in TimestampBasedKeyGenerator
  add c78092d  [HUDI-501] Execute docker/setup_demo.sh in any directory
  add 75c3f63  [HUDI-405] Remove HIVE_ASSUME_DATE_PARTITION_OPT_KEY config from DataSource
  add b5df672  [HUDI-464] Use Hive Exec Core for tests (#1125)
  add 8306f74  [HUDI-417] Refactor HoodieWriteClient so that commit logic can be shareable by both bootstrap and normal write operations (#1166)
  add 9706f65  [HUDI-508] Standardizing on "Table" instead of "Dataset" across code (#1197)
  add 9884972  [MINOR] Remove old jekyll config file (#1198)
  add aba8387  Update deprecated HBase API
  add 480fc78  [HUDI-319] Add a new maven profile to generate unified Javadoc for all Java and Scala classes (#1195)
  add d09eacd  [HUDI-25] Optimize HoodieInputformat.listStatus() for faster Hive incremental queries on Hoodie
  add 5af3dc6  [HUDI-331]Fix java docs for all public apis in HoodieWriteClient (#)
  add 3c90d25  [HUDI-114]: added option to overwrite payload implementation in hoodie.properties file
  add 04afac9  [HUDI-248] CLI doesn't allow rolling back a Delta commit
  add b95367d  [HUDI-469] Fix: HoodieCommitMetadata only show first commit insert rows.
  add e103165  [CLEAN] replace utf-8 constant with StandardCharsets.UTF_8
  add 017ee8e  [MINOR] Fix partition typo (#1209)
  add d9675c4  [HUDI-522] Use the same version jcommander uniformly (#1214)
  add ad50008  [HUDI-91][HUDI-12]Migrate to spark 2.4.4, migrate to spark-avro library instead of databricks-avro, add support for Decimal/Date types
  add 971c7d4  [HUDI-322] DeltaSteamer should pick checkpoints off only deltacommits for MOR tables
  add a44c61b  [HUDI-5
[incubator-hudi] branch hudi_test_suite_refactor updated (3dc85eb -> 0456214)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 discard 3dc85eb  [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE
     add 0456214  [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE

This update added new revisions after undoing existing revisions. That is
to say, some revisions that were in the old version of the branch are not
in the new version. This situation occurs when a user --force pushes a
change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (3dc85eb)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (0456214)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions from
the common base, B.

Any revisions marked "omit" are not gone; other references still refer to
them. Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .../java/org/apache/hudi/testsuite/helpers/HiveServiceProvider.java | 1 -
 .../main/java/org/apache/hudi/testsuite/job/HoodieTestSuiteJob.java | 3 ++-
 .../java/org/apache/hudi/testsuite/job/TestHoodieTestSuiteJob.java  | 1 -
 3 files changed, 2 insertions(+), 3 deletions(-)
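The omit/discard situation this notification describes arises when a branch's history is rewritten and force-pushed. As a rough illustration — a hypothetical throwaway repository, not the Hudi history itself — the following sketch shows how a reset-and-recommit makes an old commit unreachable from the branch, which is exactly what a later `git push --force` publishes:

```shell
# Hypothetical throwaway repo illustrating the B--O vs. B--N picture above.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email ci@example.com
git config user.name ci

echo base > f; git add f; git commit -qm "B"   # common base B
echo old  > f; git commit -qam "O1"            # old revision O1
git reset -q --hard HEAD~1                     # undo O1 (rewrites local history)
echo new  > f; git commit -qam "N1"            # new revision N1

# A subsequent `git push --force` would publish B--N1: O1 is no longer
# reachable from the branch ("discard"), though reflogs or other refs
# may still keep it alive ("omit").
git log --oneline
```

After the reset, `git log` lists only N1 and B; O1 survives solely in the reflog until it is garbage-collected.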
[incubator-hudi] branch master updated (2a823f3 -> ab6ae5c)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from 2a823f3 [MINOR]: alter some wrong params which bring fatal exception add ab6ae5c [HUDI-482] Fix missing @Override annotation on methods (#1156) No new revisions were added by this update. Summary of changes: .../java/org/apache/hudi/cli/HoodieHistoryFileNameProvider.java | 1 + hudi-cli/src/main/java/org/apache/hudi/cli/HoodieSplashScreen.java | 3 +++ hudi-client/src/main/java/org/apache/hudi/AbstractHoodieClient.java | 1 + hudi-client/src/main/java/org/apache/hudi/HoodieWriteClient.java| 1 + .../java/org/apache/hudi/func/SparkBoundedInMemoryExecutor.java | 1 + .../main/java/org/apache/hudi/index/bloom/BloomIndexFileInfo.java | 1 + .../src/main/java/org/apache/hudi/index/hbase/HBaseIndex.java | 2 ++ .../src/main/java/org/apache/hudi/io/HoodieCreateHandle.java| 1 + hudi-client/src/main/java/org/apache/hudi/io/HoodieMergeHandle.java | 6 ++ .../main/java/org/apache/hudi/io/storage/HoodieParquetWriter.java | 1 + .../src/main/java/org/apache/hudi/table/HoodieCopyOnWriteTable.java | 1 + .../java/org/apache/hudi/common/table/log/HoodieLogFileReader.java | 1 + .../org/apache/hudi/common/table/log/HoodieLogFormatWriter.java | 2 ++ .../org/apache/hudi/common/table/timeline/HoodieActiveTimeline.java | 1 + .../apache/hudi/common/table/view/AbstractTableFileSystemView.java | 3 +++ .../apache/hudi/common/table/view/HoodieTableFileSystemView.java| 2 ++ .../hudi/common/table/view/SpillableMapBasedFileSystemView.java | 2 ++ .../main/java/org/apache/hudi/common/util/ObjectSizeCalculator.java | 1 + .../java/org/apache/hudi/common/util/collection/DiskBasedMap.java | 1 + .../org/apache/hudi/common/util/collection/LazyFileIterable.java| 1 + .../main/java/org/apache/hudi/hadoop/HoodieParquetInputFormat.java | 2 ++ .../org/apache/hudi/hadoop/hive/HoodieCombineHiveInputFormat.java | 6 ++ 
.../apache/hudi/utilities/deltastreamer/HoodieDeltaStreamer.java| 1 + 23 files changed, 42 insertions(+)
[incubator-hudi] branch master updated (ab6ae5c -> e637d9e)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from ab6ae5c [HUDI-482] Fix missing @Override annotation on methods (#1156) add e637d9e [HUDI-455] Redo hudi-client log statements using SLF4J (#1145) No new revisions were added by this update. Summary of changes: hudi-client/pom.xml| 5 ++ .../java/org/apache/hudi/AbstractHoodieClient.java | 6 +-- .../org/apache/hudi/CompactionAdminClient.java | 17 +++ .../java/org/apache/hudi/HoodieCleanClient.java| 16 +++ .../java/org/apache/hudi/HoodieReadClient.java | 6 +-- .../java/org/apache/hudi/HoodieWriteClient.java| 56 +++--- .../client/embedded/EmbeddedTimelineService.java | 10 ++-- .../hbase/DefaultHBaseQPSResourceAllocator.java| 10 ++-- .../org/apache/hudi/index/hbase/HBaseIndex.java| 42 .../org/apache/hudi/io/HoodieAppendHandle.java | 20 .../java/org/apache/hudi/io/HoodieCleanHelper.java | 16 +++ .../org/apache/hudi/io/HoodieCommitArchiveLog.java | 22 - .../org/apache/hudi/io/HoodieCreateHandle.java | 17 --- .../org/apache/hudi/io/HoodieKeyLookupHandle.java | 19 .../java/org/apache/hudi/io/HoodieMergeHandle.java | 39 --- .../java/org/apache/hudi/io/HoodieWriteHandle.java | 10 ++-- .../io/compact/HoodieRealtimeTableCompactor.java | 28 +-- .../org/apache/hudi/metrics/HoodieMetrics.java | 17 +++ .../apache/hudi/metrics/JmxMetricsReporter.java| 6 +-- .../main/java/org/apache/hudi/metrics/Metrics.java | 6 +-- .../hudi/metrics/MetricsGraphiteReporter.java | 6 +-- .../hudi/metrics/MetricsReporterFactory.java | 8 ++-- .../apache/hudi/table/HoodieCopyOnWriteTable.java | 52 ++-- .../apache/hudi/table/HoodieMergeOnReadTable.java | 27 +-- .../java/org/apache/hudi/table/HoodieTable.java| 12 ++--- .../org/apache/hudi/table/RollbackExecutor.java| 14 +++--- 26 files changed, 242 insertions(+), 245 deletions(-)
[incubator-hudi] branch hudi_test_suite_refactor updated (dcfbab1 -> 1d2ecbc)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 omit dcfbab1  [HUDI-441] Rename WorkflowDagGenerator and some class names in test package
 omit 9151ccf  [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE (#1118)
 omit ae5bd06  [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102)
 omit eaaf3f6  [HUDI-394] Provide a basic implementation of test suite
  add f324057  [MINOR] Unify Lists import (#1103)
  add 8963a68  [HUDI-398]Add spark env set/get for spark launcher (#1096)
  add 9a1f698  [HUDI-308] Avoid Renames for tracking state transitions of all actions on dataset
  add 7498ca7  [MINOR] Add slack invite icon in README (#1108)
  add 14881e9  [HUDI-106] Adding support for DynamicBloomFilter (#976)
  add 36b3b6f  [HUDI-415] Get commit time when Spark start (#1113)
  add b284091  [HUDI-386] Refactor hudi scala checkstyle rules (#1099)
  add 313fab5  [HUDI-444] Refactor the codes based on scala codestyle ReturnChecker rule (#1121)
  add 350b0ec  [HUDI-311] : Support for AWS Database Migration Service in DeltaStreamer
  add 9b55d37  [HUDI-394] Provide a basic implementation of test suite
  add 1d2ecbc  [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102)

This update added new revisions after undoing existing revisions. That is
to say, some revisions that were in the old version of the branch are not
in the new version. This situation occurs when a user --force pushes a
change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (dcfbab1)
            \
             N -- N -- N   refs/heads/hudi_test_suite_refactor (1d2ecbc)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions from
the common base, B.
Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. No new revisions were added by this update. Summary of changes: LICENSE| 28 +++ README.md | 1 + .../main/java/org/apache/hudi/cli/HoodieCLI.java | 15 +- .../hudi/cli/commands/CompactionCommand.java | 30 +-- .../apache/hudi/cli/commands/DatasetsCommand.java | 10 +- .../apache/hudi/cli/commands/SparkEnvCommand.java | 68 ++ .../java/org/apache/hudi/cli/utils/SparkUtil.java | 6 +- .../scala/org/apache/hudi/cli/DedupeSparkJob.scala | 4 +- .../scala/org/apache/hudi/cli/SparkHelpers.scala | 10 +- .../org/apache/hudi/CompactionAdminClient.java | 8 +- .../java/org/apache/hudi/HoodieCleanClient.java| 26 ++- .../java/org/apache/hudi/HoodieWriteClient.java| 125 ++- .../org/apache/hudi/client/utils/ClientUtils.java | 4 +- .../org/apache/hudi/config/HoodieIndexConfig.java | 11 +- .../org/apache/hudi/config/HoodieWriteConfig.java | 29 ++- .../org/apache/hudi/io/HoodieCommitArchiveLog.java | 42 +++- .../org/apache/hudi/io/HoodieKeyLookupHandle.java | 8 +- .../io/storage/HoodieStorageWriterFactory.java | 8 +- .../apache/hudi/table/HoodieCopyOnWriteTable.java | 128 ++- .../apache/hudi/table/HoodieMergeOnReadTable.java | 42 ++-- .../java/org/apache/hudi/table/HoodieTable.java| 10 +- .../java/org/apache/hudi/TestAsyncCompaction.java | 10 +- .../src/test/java/org/apache/hudi/TestCleaner.java | 31 ++- .../java/org/apache/hudi/TestClientRollback.java | 6 +- .../hudi/TestHoodieClientOnCopyOnWriteStorage.java | 47 +++- .../apache/hudi/common/HoodieClientTestUtils.java | 18 +- .../hudi/common/HoodieTestDataGenerator.java | 37 +-- .../hudi/func/TestBoundedInMemoryExecutor.java | 2 +- .../apache/hudi/func/TestBoundedInMemoryQueue.java | 2 +- .../java/org/apache/hudi/index/TestHbaseIndex.java | 11 +- .../hudi/index/bloom/TestHoodieBloomIndex.java | 9 +- .../apache/hudi/io/TestHoodieCommitArchiveLog.java | 34 +-- .../org/apache/hudi/io/TestHoodieCompactor.java| 6 +- 
.../apache/hudi/table/TestCopyOnWriteTable.java| 2 +- .../apache/hudi/table/TestMergeOnReadTable.java| 12 +- hudi-common/pom.xml| 1 + .../src/main/avro/HoodieArchivedMetaEntry.avsc | 22 ++ .../apache/hudi/avro/HoodieAvroWriteSupport.java | 7 +- .../hudi/common/bloom/filter/BloomFilter.java | 35 +-- .../common/bloom/filter/BloomFilterFactory.java| 63 ++ .../common/bloom/filter/BloomFilterTypeCode.java | 10 +- .../filter/BloomFilterUtils.java} | 34 +-- .../filter/HoodieDynamicBoundedBloomFilter.java| 109 +++
[incubator-hudi] branch master updated (350b0ec -> 8172197)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from 350b0ec [HUDI-311] : Support for AWS Database Migration Service in DeltaStreamer add 8172197 Fix Error: java.lang.IllegalArgumentException: Can not create a Path from an empty string in HoodieCopyOnWrite#deleteFilesFunc (#1126) No new revisions were added by this update. Summary of changes: .../src/main/java/org/apache/hudi/table/HoodieCopyOnWriteTable.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-)
[incubator-hudi] branch hudi_test_suite_refactor updated (1d2ecbc -> 66463ff)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from 1d2ecbc [HUDI-391] Rename module name from hudi-bench to hudi-test-suite and fix some checkstyle issues (#1102) add 66463ff [MINOR] Fix compile error about the deletion of HoodieActiveTimeline#createNewCommitTime No new revisions were added by this update. Summary of changes: .../src/main/java/org/apache/hudi/testsuite/writer/DeltaWriter.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-)
[incubator-hudi] branch hudi_test_suite_refactor updated (66463ff -> 09c34a0)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch hudi_test_suite_refactor in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from 66463ff [MINOR] Fix compile error about the deletion of HoodieActiveTimeline#createNewCommitTime add 09c34a0 [HUDI-442] Fix TestComplexKeyGenerator#testSingleValueKeyGenerator and testMultipleValueKeyGenerator NPE No new revisions were added by this update. Summary of changes: hudi-spark/src/main/java/org/apache/hudi/ComplexKeyGenerator.java | 5 - 1 file changed, 5 deletions(-)
[incubator-hudi] branch master updated (def18a5 -> 842eabb)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. from def18a5 [MINOR] optimize hudi timeline service (#1137) add 842eabb [HUDI-470] Fix NPE when print result via hudi-cli (#1138) No new revisions were added by this update. Summary of changes: .../main/java/org/apache/hudi/cli/commands/HoodieLogFileCommand.java| 2 +- hudi-cli/src/main/java/org/apache/hudi/cli/commands/RepairsCommand.java | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-)
[incubator-hudi] branch master updated: [MINOR] Update the java doc of HoodieTableType (#1148)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 01c25d6  [MINOR] Update the java doc of HoodieTableType (#1148)
01c25d6 is described below

commit 01c25d6afff703156c31c4f4c12138113dca494a
Author: Mathieu <49835526+wangxian...@users.noreply.github.com>
AuthorDate: Sun Dec 29 09:57:19 2019 +0800

    [MINOR] Update the java doc of HoodieTableType (#1148)
---
 .../src/main/java/org/apache/hudi/common/model/HoodieTableType.java | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/hudi-common/src/main/java/org/apache/hudi/common/model/HoodieTableType.java b/hudi-common/src/main/java/org/apache/hudi/common/model/HoodieTableType.java
index 17c7fd2..6c24e39 100644
--- a/hudi-common/src/main/java/org/apache/hudi/common/model/HoodieTableType.java
+++ b/hudi-common/src/main/java/org/apache/hudi/common/model/HoodieTableType.java
@@ -21,14 +21,14 @@ package org.apache.hudi.common.model;
 /**
  * Type of the Hoodie Table.
  *
- * Currently, 1 type is supported
+ * Currently, 2 types are supported.
  *
  * COPY_ON_WRITE - Performs upserts by versioning entire files, with later versions containing newer value of a record.
  *
- * In the future, following might be added.
- *
  * MERGE_ON_READ - Speeds up upserts, by delaying merge until enough work piles up.
  *
+ * In the future, following might be added.
+ *
  * SIMPLE_LSM - A simple 2 level LSM tree.
  */
 public enum HoodieTableType {
[incubator-hudi] annotated tag 0.5.2-incubating-rc1 updated (a355c76 -> 625ea09)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to annotated tag 0.5.2-incubating-rc1
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

*** WARNING: tag 0.5.2-incubating-rc1 was modified! ***

    from a355c76  (commit)
      to 625ea09  (tag)
 tagging a355c7611c55151e6cfd5942c9eb067b622ae57b (commit)
 replaces hoodie-0.4.7
      by yanghua
      on Thu Mar 12 16:45:05 2020 +0800

- Log -----------------------------------------------------------------
0.5.2
-----BEGIN PGP SIGNATURE-----

iQEzBAABCAAdFiEEw6lux3FJVxron4J2TIZoTQR94DwFAl5p9pEACgkQTIZoTQR9
4DxpZwf9Fm306l380Myz8erlqG8PGSGa4fSUBNfd1paHzGLxT6BIO5wNWGdF1Jw1
xCAOrSVBRO6/lOUlem3sjEDswEOkV9VU4K1dbN8DnFheUQVwzd3BM8Q+70PdGDQW
cxzeHgTRYKr+gaZdS6NySwkfbagFzWnmUomxDNFVfQf9qQ6JqtDDeoF/jNqluJPz
WqksQyzOhXDtiDnW4Yb1jMs1/eLL/YUay1KSCsOMJb+0csZhyxnkzRdLmL+lP3aM
mnXS9nRovoTAVpE4MSJBDG8geXoTczb6BPa/P9Bu5KS218QBt39nIatOxgnYBGdL
WqdBem8oxbhV307kgROdb1aeAhoAGQ==
=VP9J
-----END PGP SIGNATURE-----
-----------------------------------------------------------------------

No new revisions were added by this update.

Summary of changes:
svn commit: r38489 - in /dev/incubator/hudi/hudi-0.5.2-incubating-rc1: ./ hudi-0.5.2-incubating-rc1/ hudi-0.5.2-incubating-rc1/docker/ hudi-0.5.2-incubating-rc1/docker/compose/ hudi-0.5.2-incubating-r
Author: vinoyang
Date: Thu Mar 12 11:46:45 2020
New Revision: 38489

Log:
Upload hudi-0.5.2-incubating-rc1

[This commit notification would consist of 62 parts, which exceeds the limit of 50, so it was shortened to the summary.]
[incubator-hudi] annotated tag release-0.5.2-incubating-rc1 updated (a355c76 -> 9e31684)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to annotated tag release-0.5.2-incubating-rc1
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

*** WARNING: tag release-0.5.2-incubating-rc1 was modified! ***

    from a355c76  (commit)
      to 9e31684  (tag)
 tagging a355c7611c55151e6cfd5942c9eb067b622ae57b (commit)
 replaces hoodie-0.4.7
      by yanghua
      on Thu Mar 12 20:46:03 2020 +0800

- Log -----------------------------------------------------------------
release-0.5.2
-----BEGIN PGP SIGNATURE-----

iQEzBAABCAAdFiEEw6lux3FJVxron4J2TIZoTQR94DwFAl5qLwsACgkQTIZoTQR9
4Dx3/wf+NaYw+EMtjappWE1sPYZ6gJURcufaOHiD24vZD6gqPX9xButSKjkuNR1I
+bcK+6b1jQ5aYuRjwo3ubHt8p87y651uCNhMGfro/3MmwFksmwmH8MYYkdDbbJCo
082ix+k00QxZZtNQJ7+TWbAZEZ/GMt/w9fADLP4cd3wgpMM2pzOB36bDKItDpoJe
C7VBzDu5v1OKF76egkUd6AY21eMoovDsn0lQOPwHcMLr4dc8d+chkeuB1BH8E1J9
W3RaPAkVT4Zp3Zu3Snxe574xqIH9B4/Jlz319apIgOdyhdVYKRWm7zrxebmXY37R
XRzwsazQb8IpyzZ4kiK2djiIlYB/Dg==
=AW8T
-----END PGP SIGNATURE-----
-----------------------------------------------------------------------

No new revisions were added by this update.

Summary of changes:
svn commit: r38490 - /dev/incubator/hudi/hudi-0.5.2-incubating-rc1/hudi-0.5.2-incubating-rc1/
Author: vinoyang
Date: Thu Mar 12 12:20:07 2020
New Revision: 38490

Log:
Delete source dir

Removed:
    dev/incubator/hudi/hudi-0.5.2-incubating-rc1/hudi-0.5.2-incubating-rc1/
[incubator-hudi] branch release-0.5.2 updated: [HUDI-581] NOTICE need more work as it missing content form included 3rd party ALv2 licensed NOTICE files (#1354)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch release-0.5.2 in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/release-0.5.2 by this push: new e9f114f [HUDI-581] NOTICE need more work as it missing content form included 3rd party ALv2 licensed NOTICE files (#1354) e9f114f is described below commit e9f114f3dd8673b8b66b024e62cc1e3e808e7ea9 Author: Suneel Marthi AuthorDate: Sat Mar 7 22:08:35 2020 -0500 [HUDI-581] NOTICE need more work as it missing content form included 3rd party ALv2 licensed NOTICE files (#1354) * [HUDI-581] - Add 3rd party library NOTICE * [HUDI-581]: NOTICE need more work as it missing content form included 3rd party ALv2 licensed NOTICE files --- NOTICE | 83 +- 1 file changed, 82 insertions(+), 1 deletion(-) diff --git a/NOTICE b/NOTICE index c2961ac..c0469fa 100644 --- a/NOTICE +++ b/NOTICE @@ -1,5 +1,86 @@ Apache Hudi (incubating) -Copyright 2020 The Apache Software Foundation +Copyright 2019 and onwards The Apache Software Foundation This product includes software developed at The Apache Software Foundation (http://www.apache.org/). + +This project bundles the following dependencies + + +Metrics +Copyright 2010-2013 Coda Hale and Yammer, Inc. + +This product includes software developed by Coda Hale and Yammer, Inc. + +- +Guava +Copyright (C) 2007 The Guava Authors + +Licensed under the Apache License, Version 2.0 + +- +Kryo (https://github.com/EsotericSoftware/kryo) +Copyright (c) 2008-2018, Nathan Sweet All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the +following conditions are met: + +Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 
+Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. + +Neither the name of Esoteric Software nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROF [...] + + +Jackson JSON Processor + +This copy of Jackson JSON processor streaming parser/generator is licensed under the +Apache (Software) License, version 2.0 ("the License"). +See the License for details about distribution rights, and the +specific rights regarding derivate works. + +You may obtain a copy of the License at: + +http://www.apache.org/licenses/LICENSE-2.0 + +-- + +Gson +Copyright 2008 Google Inc. + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + +http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. 
+ + += Apache Hadoop 2.8.5 = +Apache Hadoop +Copyright 2009-2017 The Apache Software Foundation + += Apache Hive 2.3.1 = +Apache Hive +Copyright 2008-2017 The Apache Software Foundation + += Apache Spark 2.4.4 = +Apache Spark +Copyright 2014 and onwards The Apache Software Foundation + += Apache Kafka 2.0.0 = +Apache Kafka +Copyright 2020 The Apache Software Foundation. + += Apache HBase 1.2.3 = +Apache HBase +Copyright 2007-2019 The Apache Software Foundation. + += Apache Avro 1.8.2 = +Apache Avro +Copyright 2010-2019 The Apache Software Foundation. \ No newline at end of file
[incubator-hudi] branch master updated: [HUDI-681]Remove embeddedTimelineService from HoodieReadClient (#1388)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new f93e64f [HUDI-681]Remove embeddedTimelineService from HoodieReadClient (#1388) f93e64f is described below commit f93e64fee413ed1b774156e688794ee7937cc01a Author: hongdd AuthorDate: Mon Mar 9 18:31:04 2020 +0800 [HUDI-681]Remove embeddedTimelineService from HoodieReadClient (#1388) * [HUDI-681]Remove embeddedTimelineService from HoodieReadClient --- .../org/apache/hudi/client/HoodieReadClient.java| 21 ++--- .../main/java/org/apache/hudi/DataSourceUtils.java | 10 -- .../org/apache/hudi/HoodieSparkSqlWriter.scala | 5 + .../hudi/utilities/deltastreamer/DeltaSync.java | 2 +- 4 files changed, 8 insertions(+), 30 deletions(-) diff --git a/hudi-client/src/main/java/org/apache/hudi/client/HoodieReadClient.java b/hudi-client/src/main/java/org/apache/hudi/client/HoodieReadClient.java index 33d661b..d1e92b5 100644 --- a/hudi-client/src/main/java/org/apache/hudi/client/HoodieReadClient.java +++ b/hudi-client/src/main/java/org/apache/hudi/client/HoodieReadClient.java @@ -19,7 +19,6 @@ package org.apache.hudi.client; import org.apache.hudi.avro.model.HoodieCompactionPlan; -import org.apache.hudi.client.embedded.EmbeddedTimelineService; import org.apache.hudi.common.model.HoodieBaseFile; import org.apache.hudi.common.model.HoodieKey; import org.apache.hudi.common.model.HoodieRecord; @@ -72,18 +71,10 @@ public class HoodieReadClient implements Serializ /** * @param basePath path to Hoodie table */ - public HoodieReadClient(JavaSparkContext jsc, String basePath, Option timelineService) { + public HoodieReadClient(JavaSparkContext jsc, String basePath) { this(jsc, HoodieWriteConfig.newBuilder().withPath(basePath) // by default we use HoodieBloomIndex - 
.withIndexConfig(HoodieIndexConfig.newBuilder().withIndexType(HoodieIndex.IndexType.BLOOM).build()).build(), -timelineService); - } - - /** - * @param basePath path to Hoodie table - */ - public HoodieReadClient(JavaSparkContext jsc, String basePath) { -this(jsc, basePath, Option.empty()); + .withIndexConfig(HoodieIndexConfig.newBuilder().withIndexType(HoodieIndex.IndexType.BLOOM).build()).build()); } /** @@ -100,14 +91,6 @@ public class HoodieReadClient implements Serializ * @param clientConfig instance of HoodieWriteConfig */ public HoodieReadClient(JavaSparkContext jsc, HoodieWriteConfig clientConfig) { -this(jsc, clientConfig, Option.empty()); - } - - /** - * @param clientConfig instance of HoodieWriteConfig - */ - public HoodieReadClient(JavaSparkContext jsc, HoodieWriteConfig clientConfig, - Option timelineService) { this.jsc = jsc; final String basePath = clientConfig.getBasePath(); // Create a Hoodie table which encapsulated the commits and files visible diff --git a/hudi-spark/src/main/java/org/apache/hudi/DataSourceUtils.java b/hudi-spark/src/main/java/org/apache/hudi/DataSourceUtils.java index 6a4ad03..99a795d 100644 --- a/hudi-spark/src/main/java/org/apache/hudi/DataSourceUtils.java +++ b/hudi-spark/src/main/java/org/apache/hudi/DataSourceUtils.java @@ -23,11 +23,9 @@ import org.apache.avro.Schema; import org.apache.hudi.client.HoodieReadClient; import org.apache.hudi.client.HoodieWriteClient; import org.apache.hudi.client.WriteStatus; -import org.apache.hudi.client.embedded.EmbeddedTimelineService; import org.apache.hudi.common.model.HoodieKey; import org.apache.hudi.common.model.HoodieRecord; import org.apache.hudi.common.model.HoodieRecordPayload; -import org.apache.hudi.common.util.Option; import org.apache.hudi.common.util.ReflectionUtils; import org.apache.hudi.common.util.TypedProperties; import org.apache.hudi.config.HoodieCompactionConfig; @@ -222,9 +220,9 @@ public class DataSourceUtils { @SuppressWarnings("unchecked") public static JavaRDD 
dropDuplicates(JavaSparkContext jssc, JavaRDD incomingHoodieRecords, - HoodieWriteConfig writeConfig, Option timelineService) { + HoodieWriteConfig writeConfig) { try { - HoodieReadClient client = new HoodieReadClient<>(jssc, writeConfig, timelineService); + HoodieReadClient client = new HoodieReadClient<>(jssc, writeConfig); return client.tagLocation(incomingHoodieRecords) .filter(r -> !((HoodieRecord) r).isCurrentLocationKnown()); } catch (TableNotFoundException e) { @@ -236,10 +234,10 @@ public class DataSourceUtils { @SuppressWarnings("unchecked") public static JavaRDD dropDuplicates(JavaSparkContext jssc, JavaRDD incomingHoodieRecords, -
[incubator-hudi] branch release-0.5.2 updated: [HUDI-676] Address issues towards removing use of WIP Disclaimer (#1386)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch release-0.5.2 in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/release-0.5.2 by this push: new 3639241 [HUDI-676] Address issues towards removing use of WIP Disclaimer (#1386) 3639241 is described below commit 3639241d24178292223996fd591a7d0c6e27a528 Author: vinoyang AuthorDate: Mon Mar 9 14:14:49 2020 +0800 [HUDI-676] Address issues towards removing use of WIP Disclaimer (#1386) * rename DISCLAIMER-STANDARD TO DISCLAIMER --- DISCLAIMER | 10 ++ DISCLAIMER-WIP | 26 -- pom.xml| 2 +- scripts/release/validate_staged_release.sh | 8 4 files changed, 15 insertions(+), 31 deletions(-) diff --git a/DISCLAIMER b/DISCLAIMER new file mode 100644 index 000..a4db88b --- /dev/null +++ b/DISCLAIMER @@ -0,0 +1,10 @@ +Apache Hudi (incubating) is an effort undergoing incubation +at The Apache Software Foundation (ASF), sponsored by the Apache Incubator. + +Incubation is required of all newly accepted projects until a further review +indicates that the infrastructure, communications, and decision making process +have stabilized in a manner consistent with other successful ASF projects. + +While incubation status is not necessarily a reflection of the +completeness or stability of the code, it does indicate that the +project has yet to be fully endorsed by the ASF. diff --git a/DISCLAIMER-WIP b/DISCLAIMER-WIP deleted file mode 100644 index 62fbb5f..000 --- a/DISCLAIMER-WIP +++ /dev/null @@ -1,26 +0,0 @@ -Apache Hudi (incubating) is an effort undergoing incubation -at The Apache Software Foundation (ASF), sponsored by the Apache Incubator. - -Incubation is required of all newly accepted projects until a further review -indicates that the infrastructure, communications, and decision making process -have stabilized in a manner consistent with other successful ASF projects. 
- -While incubation status is not necessarily a reflection of the -completeness or stability of the code, it does indicate that the -project has yet to be fully endorsed by the ASF. - -Some of the incubating project's releases may not be fully compliant -with ASF policy. For example, releases may have incomplete or -un-reviewed licensing conditions. What follows is a list of known -issues the project is currently aware of (note that this list, by -definition, is likely to be incomplete): - - * The LICENSE and NOTICE files may not be complete and will be fixed with the next release. - -If you are planning to incorporate this work into your -product or project, please be aware that you will need to conduct a -thorough licensing review to determine the overall implications of -including this work. For the current status of this project through the Apache -Incubator visit: - -http://incubator.apache.org/projects/hudi.html diff --git a/pom.xml b/pom.xml index 15ab109..3bd2ad2 100644 --- a/pom.xml +++ b/pom.xml @@ -335,7 +335,7 @@ NOTICE - DISCLAIMER-WIP + DISCLAIMER **/.* **/*.json **/*.log diff --git a/scripts/release/validate_staged_release.sh b/scripts/release/validate_staged_release.sh index 429047b..b90e5cf 100755 --- a/scripts/release/validate_staged_release.sh +++ b/scripts/release/validate_staged_release.sh @@ -107,11 +107,11 @@ fi echo -e "\t\tNo Binary Files in Source Release? - [OK]\n" ### END: Binary Files Check -### Checking for DISCLAIMER-WIP -echo "Checking for DISCLAIMERi-WIP" -disclaimerFile="./DISCLAIMER-WIP" +### Checking for DISCLAIMER +echo "Checking for DISCLAIMER" +disclaimerFile="./DISCLAIMER" if [ ! -f "$disclaimerFile" ]; then - echo "DISCLAIMER-WIP file missing" + echo "DISCLAIMER file missing" exit -1 fi echo -e "\t\tDISCLAIMER file exists ? [OK]\n"
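The patch above swaps the release validation from the work-in-progress disclaimer to the standard one, including the existence check in `validate_staged_release.sh`. The core of that check can be sketched standalone as follows — this is a minimal illustration run against a hypothetical staging directory with stand-in file contents, not Hudi's actual release tooling:

```shell
# Minimal sketch of the renamed DISCLAIMER existence check, assuming a
# hypothetical staging directory (the real script validates a full
# extracted source release).
set -e
stage=$(mktemp -d)
cd "$stage"
# Stand-in content; a real staging dir would contain the full DISCLAIMER.
echo "Apache Hudi (incubating) is an effort undergoing incubation" > DISCLAIMER

disclaimerFile="./DISCLAIMER"
if [ ! -f "$disclaimerFile" ]; then
  echo "DISCLAIMER file missing"
  exit 1
fi
echo "DISCLAIMER file exists ? [OK]"
```

The design point of the original change is simply that the file name checked for must track the rename: once `DISCLAIMER-WIP` becomes `DISCLAIMER`, a validator still looking for the old name would fail every release candidate.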
[incubator-hudi] branch master updated: [HUDI-715] Fix duplicate name in TableCommand (#1410)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 3ef9e88  [HUDI-715] Fix duplicate name in TableCommand (#1410)

3ef9e88 is described below

commit 3ef9e885cacc064fc316c61c7c826f3a1cb96da0
Author: hongdd
AuthorDate: Mon Mar 16 17:19:57 2020 +0800

    [HUDI-715] Fix duplicate name in TableCommand (#1410)
---
 hudi-cli/src/main/java/org/apache/hudi/cli/commands/TableCommand.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/TableCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/TableCommand.java
index 439b9c8..9dcd5c9 100644
--- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/TableCommand.java
+++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/TableCommand.java
@@ -54,7 +54,7 @@ public class TableCommand implements CommandMarker {
           help = "Enable eventual consistency") final boolean eventuallyConsistent,
       @CliOption(key = {"initialCheckIntervalMs"}, unspecifiedDefaultValue = "2000",
           help = "Initial wait time for eventual consistency") final Integer initialConsistencyIntervalMs,
-      @CliOption(key = {"maxCheckIntervalMs"}, unspecifiedDefaultValue = "30",
+      @CliOption(key = {"maxWaitIntervalMs"}, unspecifiedDefaultValue = "30",
           help = "Max wait time for eventual consistency") final Integer maxConsistencyIntervalMs,
       @CliOption(key = {"maxCheckIntervalMs"}, unspecifiedDefaultValue = "7",
           help = "Max checks for eventual consistency") final Integer maxConsistencyChecks)
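The bug fixed above is that two `@CliOption` parameters shared the key `maxCheckIntervalMs`, so one option shadowed the other at registration time. A minimal sketch of why duplicate option keys are a problem (a hypothetical `OptionRegistry`, not Spring Shell's actual internals):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class OptionRegistry {
    private final Map<String, String> options = new LinkedHashMap<>();

    /** Registers a CLI option key; returns false if the key is already taken. */
    public boolean register(String key, String help) {
        return options.putIfAbsent(key, help) == null;
    }

    public static void main(String[] args) {
        OptionRegistry registry = new OptionRegistry();
        // Before the fix: both the wait-time and the check-count options used "maxCheckIntervalMs".
        boolean first = registry.register("maxCheckIntervalMs", "Max wait time for eventual consistency");
        boolean second = registry.register("maxCheckIntervalMs", "Max checks for eventual consistency");
        System.out.println(first + " " + second); // the second registration collides
        // After the fix the wait-time option uses its own key, so both register cleanly.
        System.out.println(registry.register("maxWaitIntervalMs", "Max wait time for eventual consistency"));
    }
}
```

The rename to `maxWaitIntervalMs` gives each parameter a distinct key while leaving the help text and defaults untouched.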
[incubator-hudi] branch master updated (661b0b3 -> 644c1cc)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from 661b0b3  [HUDI-761] Refactoring rollback and restore actions using the ActionExecutor abstraction (#1492)
     add 644c1cc  [HUDI-698]Add unit test for CleansCommand (#1449)

No new revisions were added by this update.

Summary of changes:
 hudi-cli/pom.xml                                   | 25 +++
 .../apache/hudi/cli/HoodieTableHeaderFields.java   |  9 +
 .../apache/hudi/cli/commands/CleansCommand.java    | 15 +-
 .../org/apache/hudi/cli/commands/SparkMain.java    |  2 +-
 .../hudi/cli/commands/TestCleansCommand.java       | 183 +
 .../apache/hudi/cli/integ/ITTestCleansCommand.java | 101 
 .../src/test/resources/clean.properties            |  5 +-
 scripts/run_travis_tests.sh                        |  7 +
 8 files changed, 338 insertions(+), 9 deletions(-)
 create mode 100644 hudi-cli/src/test/java/org/apache/hudi/cli/commands/TestCleansCommand.java
 create mode 100644 hudi-cli/src/test/java/org/apache/hudi/cli/integ/ITTestCleansCommand.java
 copy hudi-utilities/src/test/resources/delta-streamer-config/base.properties => hudi-cli/src/test/resources/clean.properties (87%)
[incubator-hudi] branch master updated: [HUDI-789]Adjust logic of upsert in HDFSParquetImporter (#1511)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 84dd904 [HUDI-789]Adjust logic of upsert in HDFSParquetImporter (#1511) 84dd904 is described below commit 84dd9047d3902650d7ff5bc95b9789d6880ca8e2 Author: hongdd AuthorDate: Tue Apr 21 14:21:30 2020 +0800 [HUDI-789]Adjust logic of upsert in HDFSParquetImporter (#1511) --- .../apache/hudi/utilities/HDFSParquetImporter.java | 22 +- .../hudi/utilities/TestHDFSParquetImporter.java| 255 - 2 files changed, 217 insertions(+), 60 deletions(-) diff --git a/hudi-utilities/src/main/java/org/apache/hudi/utilities/HDFSParquetImporter.java b/hudi-utilities/src/main/java/org/apache/hudi/utilities/HDFSParquetImporter.java index f389c58..4befaec 100644 --- a/hudi-utilities/src/main/java/org/apache/hudi/utilities/HDFSParquetImporter.java +++ b/hudi-utilities/src/main/java/org/apache/hudi/utilities/HDFSParquetImporter.java @@ -100,6 +100,10 @@ public class HDFSParquetImporter implements Serializable { } + private boolean isUpsert() { +return "upsert".equals(cfg.command.toLowerCase()); + } + public int dataImport(JavaSparkContext jsc, int retry) { this.fs = FSUtils.getFs(cfg.targetPath, jsc.hadoopConfiguration()); this.props = cfg.propsFilePath == null ? UtilHelpers.buildProperties(cfg.configs) @@ -108,7 +112,7 @@ public class HDFSParquetImporter implements Serializable { int ret = -1; try { // Verify that targetPath is not present. 
- if (fs.exists(new Path(cfg.targetPath))) { + if (fs.exists(new Path(cfg.targetPath)) && !isUpsert()) { throw new HoodieIOException(String.format("Make sure %s is not present.", cfg.targetPath)); } do { @@ -122,20 +126,22 @@ public class HDFSParquetImporter implements Serializable { protected int dataImport(JavaSparkContext jsc) throws IOException { try { - if (fs.exists(new Path(cfg.targetPath))) { + if (fs.exists(new Path(cfg.targetPath)) && !isUpsert()) { // cleanup target directory. fs.delete(new Path(cfg.targetPath), true); } + if (!fs.exists(new Path(cfg.targetPath))) { +// Initialize target hoodie table. +Properties properties = new Properties(); +properties.put(HoodieTableConfig.HOODIE_TABLE_NAME_PROP_NAME, cfg.tableName); +properties.put(HoodieTableConfig.HOODIE_TABLE_TYPE_PROP_NAME, cfg.tableType); + HoodieTableMetaClient.initTableAndGetMetaClient(jsc.hadoopConfiguration(), cfg.targetPath, properties); + } + // Get schema. String schemaStr = UtilHelpers.parseSchema(fs, cfg.schemaFile); - // Initialize target hoodie table. 
- Properties properties = new Properties(); - properties.put(HoodieTableConfig.HOODIE_TABLE_NAME_PROP_NAME, cfg.tableName); - properties.put(HoodieTableConfig.HOODIE_TABLE_TYPE_PROP_NAME, cfg.tableType); - HoodieTableMetaClient.initTableAndGetMetaClient(jsc.hadoopConfiguration(), cfg.targetPath, properties); - HoodieWriteClient client = UtilHelpers.createHoodieClient(jsc, cfg.targetPath, schemaStr, cfg.parallelism, Option.empty(), props); diff --git a/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java b/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java index c94edf3..a4711b5 100644 --- a/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java +++ b/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java @@ -20,6 +20,7 @@ package org.apache.hudi.utilities; import org.apache.hudi.client.HoodieReadClient; import org.apache.hudi.client.HoodieWriteClient; +import org.apache.hudi.common.HoodieClientTestUtils; import org.apache.hudi.common.HoodieTestDataGenerator; import org.apache.hudi.common.minicluster.HdfsTestService; import org.apache.hudi.common.model.HoodieTestUtils; @@ -37,8 +38,13 @@ import org.apache.parquet.avro.AvroParquetWriter; import org.apache.parquet.hadoop.ParquetWriter; import org.apache.spark.SparkConf; import org.apache.spark.api.java.JavaSparkContext; +import org.apache.spark.sql.Dataset; +import org.apache.spark.sql.Row; import org.apache.spark.sql.SQLContext; + +import org.junit.After; import org.junit.AfterClass; +import org.junit.Before; import org.junit.BeforeClass; import org.junit.Test; @@ -50,8 +56,10 @@ import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.Map.Entry; +import java.util.Objects; import java.util.concurrent.TimeUnit; import java.util.concurrent.atomic.AtomicInteger; +import java.util.stream.Collectors; import static org.junit.Assert.assertEquals; import static org.junit
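The HDFSParquetImporter patch above changes the import flow: the "target path must not exist" guard and the pre-import directory cleanup are now skipped when the command is `upsert`, and the Hudi table is initialized only when the target path is absent. A simplified decision function capturing that flow (an illustrative sketch, not the actual importer code):

```java
public class ImportFlow {
    enum Action { FAIL_TARGET_EXISTS, INIT_TABLE, REUSE_TABLE }

    /**
     * Mirrors the post-patch logic: upserts may write into an existing table,
     * while insert-style imports still refuse to overwrite an existing path.
     */
    static Action decide(String command, boolean targetExists) {
        boolean isUpsert = "upsert".equals(command.toLowerCase());
        if (targetExists && !isUpsert) {
            // Pre-patch, this branch was taken for every command.
            return Action.FAIL_TARGET_EXISTS;
        }
        // Initialize the table only when it does not already exist.
        return targetExists ? Action.REUSE_TABLE : Action.INIT_TABLE;
    }

    public static void main(String[] args) {
        System.out.println(decide("insert", true));  // still rejected
        System.out.println(decide("upsert", true));  // new behavior: reuse table
        System.out.println(decide("upsert", false)); // first run: init table
    }
}
```

This is why the `initTableAndGetMetaClient` call moved inside an `if (!fs.exists(...))` block: re-initializing an existing table on upsert would clobber its metadata.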
[incubator-hudi] branch master updated (ddd105b -> 2a2f31d)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from ddd105b  [HUDI-772] Make UserDefinedBulkInsertPartitioner configurable for DataSource (#1500)
     add 2a2f31d  [MINOR] Remove reduntant code and fix typo in HoodieDefaultTimeline (#1535)

No new revisions were added by this update.

Summary of changes:
 .../apache/hudi/common/table/timeline/HoodieDefaultTimeline.java | 6 +-
 1 file changed, 1 insertion(+), 5 deletions(-)
[incubator-hudi] branch master updated: [HUDI-700]Add unit test for FileSystemViewCommand (#1490)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new a464a29 [HUDI-700]Add unit test for FileSystemViewCommand (#1490) a464a29 is described below commit a464a2972e83e648585277ebe567703c8285cf1e Author: hongdd AuthorDate: Sat Apr 11 10:12:21 2020 +0800 [HUDI-700]Add unit test for FileSystemViewCommand (#1490) --- .../apache/hudi/cli/HoodieTableHeaderFields.java | 56 + .../hudi/cli/commands/FileSystemViewCommand.java | 47 ++-- .../cli/commands/TestFileSystemViewCommand.java| 267 + 3 files changed, 351 insertions(+), 19 deletions(-) diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieTableHeaderFields.java b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieTableHeaderFields.java new file mode 100644 index 000..001a54a --- /dev/null +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieTableHeaderFields.java @@ -0,0 +1,56 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hudi.cli; + +/** + * Fields of print table header. 
+ */ +public class HoodieTableHeaderFields { + public static final String HEADER_PARTITION = "Partition"; + public static final String HEADER_FILE_ID = "FileId"; + public static final String HEADER_BASE_INSTANT = "Base-Instant"; + + /** + * Fields of data header. + */ + public static final String HEADER_DATA_FILE = "Data-File"; + public static final String HEADER_DATA_FILE_SIZE = HEADER_DATA_FILE + " Size"; + + /** + * Fields of delta header. + */ + public static final String HEADER_DELTA_SIZE = "Delta Size"; + public static final String HEADER_DELTA_FILES = "Delta Files"; + public static final String HEADER_TOTAL_DELTA_SIZE = "Total " + HEADER_DELTA_SIZE; + public static final String HEADER_TOTAL_DELTA_FILE_SIZE = "Total Delta File Size"; + public static final String HEADER_NUM_DELTA_FILES = "Num " + HEADER_DELTA_FILES; + + /** + * Fields of compaction scheduled header. + */ + private static final String COMPACTION_SCHEDULED_SUFFIX = " - compaction scheduled"; + private static final String COMPACTION_UNSCHEDULED_SUFFIX = " - compaction unscheduled"; + + public static final String HEADER_DELTA_SIZE_SCHEDULED = HEADER_DELTA_SIZE + COMPACTION_SCHEDULED_SUFFIX; + public static final String HEADER_DELTA_SIZE_UNSCHEDULED = HEADER_DELTA_SIZE + COMPACTION_UNSCHEDULED_SUFFIX; + public static final String HEADER_DELTA_BASE_SCHEDULED = "Delta To Base Ratio" + COMPACTION_SCHEDULED_SUFFIX; + public static final String HEADER_DELTA_BASE_UNSCHEDULED = "Delta To Base Ratio" + COMPACTION_UNSCHEDULED_SUFFIX; + public static final String HEADER_DELTA_FILES_SCHEDULED = "Delta Files" + COMPACTION_SCHEDULED_SUFFIX; + public static final String HEADER_DELTA_FILES_UNSCHEDULED = "Delta Files" + COMPACTION_UNSCHEDULED_SUFFIX; +} diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/FileSystemViewCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/FileSystemViewCommand.java index a7025f8..cf86184 100644 --- 
a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/FileSystemViewCommand.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/FileSystemViewCommand.java @@ -20,6 +20,7 @@ package org.apache.hudi.cli.commands; import org.apache.hudi.cli.HoodieCLI; import org.apache.hudi.cli.HoodiePrintHelper; +import org.apache.hudi.cli.HoodieTableHeaderFields; import org.apache.hudi.cli.TableHeader; import org.apache.hudi.common.model.FileSlice; import org.apache.hudi.common.model.HoodieLogFile; @@ -99,14 +100,18 @@ public class FileSystemViewCommand implements CommandMarker { Function converterFunction = entry -> NumericUtils.humanReadableByteCount((Double.parseDouble(entry.toString(; Map> fieldNameToConverterMap = new HashMap&l
[incubator-hudi] branch master updated: [HUDI-798] Migrate to Mockito Jupiter for JUnit 5 (#1521)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new acdc4a8 [HUDI-798] Migrate to Mockito Jupiter for JUnit 5 (#1521) acdc4a8 is described below commit acdc4a8d004394590f9e5ffcc703ea23624e66d9 Author: Raymond Xu <2701446+xushi...@users.noreply.github.com> AuthorDate: Thu Apr 16 01:07:32 2020 -0700 [HUDI-798] Migrate to Mockito Jupiter for JUnit 5 (#1521) --- hudi-cli/pom.xml | 2 +- hudi-client/pom.xml| 2 +- .../org/apache/hudi/client/TestWriteStatus.java| 14 +-- .../client/utils/TestParquetReaderIterator.java| 19 ++-- .../java/org/apache/hudi/index/TestHbaseIndex.java | 45 .../apache/hudi/table/TestHoodieRecordSizing.java | 6 +- hudi-common/pom.xml| 2 +- .../view/TestPriorityBasedFileSystemView.java | 124 +++-- hudi-hadoop-mr/pom.xml | 2 +- .../realtime/TestHoodieRealtimeFileSplit.java | 40 +++ hudi-hive-sync/pom.xml | 2 +- hudi-spark/pom.xml | 2 +- hudi-timeline-service/pom.xml | 2 +- hudi-utilities/pom.xml | 2 +- pom.xml| 6 +- 15 files changed, 135 insertions(+), 135 deletions(-) diff --git a/hudi-cli/pom.xml b/hudi-cli/pom.xml index 2f0f4b4..fed2bf9 100644 --- a/hudi-cli/pom.xml +++ b/hudi-cli/pom.xml @@ -276,7 +276,7 @@ org.mockito - mockito-all + mockito-junit-jupiter test diff --git a/hudi-client/pom.xml b/hudi-client/pom.xml index 9c716f2..326cf83 100644 --- a/hudi-client/pom.xml +++ b/hudi-client/pom.xml @@ -265,7 +265,7 @@ org.mockito - mockito-all + mockito-junit-jupiter test diff --git a/hudi-client/src/test/java/org/apache/hudi/client/TestWriteStatus.java b/hudi-client/src/test/java/org/apache/hudi/client/TestWriteStatus.java index 945759f..91878e1 100644 --- a/hudi-client/src/test/java/org/apache/hudi/client/TestWriteStatus.java +++ b/hudi-client/src/test/java/org/apache/hudi/client/TestWriteStatus.java @@ -20,11 +20,11 @@ package org.apache.hudi.client; 
import org.apache.hudi.common.model.HoodieRecord; -import org.junit.Test; -import org.mockito.Mockito; +import org.junit.jupiter.api.Test; -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertTrue; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.mock; public class TestWriteStatus { @Test @@ -32,7 +32,7 @@ public class TestWriteStatus { WriteStatus status = new WriteStatus(true, 0.1); Throwable t = new Exception("some error in writing"); for (int i = 0; i < 1000; i++) { - status.markFailure(Mockito.mock(HoodieRecord.class), t, null); + status.markFailure(mock(HoodieRecord.class), t, null); } assertTrue(status.getFailedRecords().size() > 0); assertTrue(status.getFailedRecords().size() < 150); // 150 instead of 100, to prevent flaky test @@ -44,8 +44,8 @@ public class TestWriteStatus { WriteStatus status = new WriteStatus(false, 1.0); Throwable t = new Exception("some error in writing"); for (int i = 0; i < 1000; i++) { - status.markSuccess(Mockito.mock(HoodieRecord.class), null); - status.markFailure(Mockito.mock(HoodieRecord.class), t, null); + status.markSuccess(mock(HoodieRecord.class), null); + status.markFailure(mock(HoodieRecord.class), t, null); } assertEquals(1000, status.getFailedRecords().size()); assertTrue(status.hasErrors()); diff --git a/hudi-client/src/test/java/org/apache/hudi/client/utils/TestParquetReaderIterator.java b/hudi-client/src/test/java/org/apache/hudi/client/utils/TestParquetReaderIterator.java index 4e291aa..f20c5f9 100644 --- a/hudi-client/src/test/java/org/apache/hudi/client/utils/TestParquetReaderIterator.java +++ b/hudi-client/src/test/java/org/apache/hudi/client/utils/TestParquetReaderIterator.java @@ -21,11 +21,14 @@ package org.apache.hudi.client.utils; import org.apache.hudi.exception.HoodieIOException; import org.apache.parquet.hadoop.ParquetReader; -import org.junit.Assert; -import 
org.junit.Test; +import org.junit.jupiter.api.Test; import java.io.IOException; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; import static org.mockito.Mockito.mock; import stati
[incubator-hudi] branch master updated: [HUDI-740]Fix can not specify the sparkMaster and code clean for SparkUtil (#1452)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 4e5c867 [HUDI-740]Fix can not specify the sparkMaster and code clean for SparkUtil (#1452) 4e5c867 is described below commit 4e5c8671ef3213ffa5c40f09aae27aacfa20f907 Author: hongdd AuthorDate: Wed Apr 8 21:33:15 2020 +0800 [HUDI-740]Fix can not specify the sparkMaster and code clean for SparkUtil (#1452) --- .../org/apache/hudi/cli/HoodieCliSparkConfig.java | 46 .../apache/hudi/cli/commands/CleansCommand.java| 2 +- .../hudi/cli/commands/CompactionCommand.java | 16 +++--- .../org/apache/hudi/cli/commands/SparkMain.java| 62 +- .../java/org/apache/hudi/cli/utils/SparkUtil.java | 36 - 5 files changed, 103 insertions(+), 59 deletions(-) diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieCliSparkConfig.java b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieCliSparkConfig.java new file mode 100644 index 000..0d64135 --- /dev/null +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieCliSparkConfig.java @@ -0,0 +1,46 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hudi.cli; + +/** + * Class storing configs for init spark. + */ +public class HoodieCliSparkConfig { + /** + * Configs to start spark application. + */ + public static final String CLI_SPARK_MASTER = "SPARK_MASTER"; + public static final String CLI_SERIALIZER = "spark.serializer"; + public static final String CLI_DRIVER_MAX_RESULT_SIZE = "spark.driver.maxResultSize"; + public static final String CLI_EVENT_LOG_OVERWRITE = "spark.eventLog.overwrite"; + public static final String CLI_EVENT_LOG_ENABLED = "spark.eventLog.enabled"; + public static final String CLI_EXECUTOR_MEMORY = "spark.executor.memory"; + + /** + * Hadoop output config. + */ + public static final String CLI_MAPRED_OUTPUT_COMPRESS = "spark.hadoop.mapred.output.compress"; + public static final String CLI_MAPRED_OUTPUT_COMPRESSION_CODEC = "spark.hadoop.mapred.output.compression.codec"; + public static final String CLI_MAPRED_OUTPUT_COMPRESSION_TYPE = "spark.hadoop.mapred.output.compression.type"; + + /** + * Parquet file config. 
+ */ + public static final String CLI_PARQUET_ENABLE_SUMMARY_METADATA = "parquet.enable.summary-metadata"; +} diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CleansCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CleansCommand.java index 34321ef..609e44b 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CleansCommand.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CleansCommand.java @@ -139,7 +139,7 @@ public class CleansCommand implements CommandMarker { SparkLauncher sparkLauncher = SparkUtil.initLauncher(sparkPropertiesPath); String cmd = SparkMain.SparkCommand.CLEAN.toString(); -sparkLauncher.addAppArgs(cmd, metaClient.getBasePath(), master, propsFilePath, sparkMemory); +sparkLauncher.addAppArgs(cmd, master, sparkMemory, metaClient.getBasePath(), propsFilePath); UtilHelpers.validateAndAddProperties(configs, sparkLauncher); Process process = sparkLauncher.launch(); InputStreamConsumer.captureOutput(process); diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CompactionCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CompactionCommand.java index 0843a87..a4c70da 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CompactionCommand.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/CompactionCommand.java @@ -423,8 +423,8 @@ public class CompactionCommand implements CommandMarker { String sparkPropertiesPath = Utils .getDefaultPropertiesFile(scala.collection.JavaConversions.propertiesAsScalaMap(System.getProperties())); SparkLauncher sparkLauncher = SparkUtil.initLaun
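Part of the CleansCommand fix above reorders the launcher's positional arguments from `(cmd, basePath, master, propsFilePath, sparkMemory)` to `(cmd, master, sparkMemory, basePath, propsFilePath)` so that both ends of the launch agree on the positions `SparkMain` parses. A hypothetical sketch of that failure mode (the receiver here is illustrative, not the real `SparkMain`):

```java
public class ArgProtocol {
    /** Hypothetical receiver side: expects [cmd, master, memory, basePath, propsFile]. */
    static String basePathOf(String[] args) {
        return args[3];
    }

    public static void main(String[] args) {
        // Before the fix the CLI sent [cmd, basePath, master, propsFile, memory] ...
        String[] beforeFix = {"CLEAN", "/tmp/table", "local[2]", "props", "4G"};
        // ... so the receiver read "props" where it expected the base path.
        System.out.println(basePathOf(beforeFix));
        // After the fix both sides use the same positional order.
        String[] afterFix = {"CLEAN", "local[2]", "4G", "/tmp/table", "props"};
        System.out.println(basePathOf(afterFix));
    }
}
```

With a positional argument protocol like `SparkLauncher.addAppArgs`, sender and parser must be changed in lockstep; extracting the config keys into `HoodieCliSparkConfig` serves the same goal of removing silently-diverging string literals.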
[incubator-hudi] branch asf-site updated: Update the old hudi version to 0.5.2 (#1407)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/asf-site by this push: new 8504a50 Update the old hudi version to 0.5.2 (#1407) 8504a50 is described below commit 8504a50a966defa30212cf9df09e10528997294c Author: vinoyang AuthorDate: Sun Mar 15 18:56:38 2020 +0800 Update the old hudi version to 0.5.2 (#1407) --- docs/_docs/0.5.2/1_1_quick_start_guide.cn.md | 4 ++-- docs/_docs/0.5.2/1_1_quick_start_guide.md| 4 ++-- docs/_docs/0.5.2/2_2_writing_data.md | 2 +- docs/_docs/0.5.2/2_3_querying_data.md| 4 ++-- docs/_docs/0.5.2/2_6_deployment.md | 4 ++-- 5 files changed, 9 insertions(+), 9 deletions(-) diff --git a/docs/_docs/0.5.2/1_1_quick_start_guide.cn.md b/docs/_docs/0.5.2/1_1_quick_start_guide.cn.md index a8f4d49..c56774b 100644 --- a/docs/_docs/0.5.2/1_1_quick_start_guide.cn.md +++ b/docs/_docs/0.5.2/1_1_quick_start_guide.cn.md @@ -15,7 +15,7 @@ Hudi适用于Spark-2.x版本。您可以按照[此处](https://spark.apache.org/ 在提取的目录中,使用spark-shell运行Hudi: ```scala -bin/spark-shell --packages org.apache.hudi:hudi-spark-bundle:0.5.0-incubating --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' +bin/spark-shell --packages org.apache.hudi:hudi-spark-bundle:0.5.2-incubating --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' ``` 设置表名、基本路径和数据生成器来为本指南生成记录。 @@ -153,7 +153,7 @@ spark.sql("select `_hoodie_commit_time`, fare, begin_lon, begin_lat, ts from hu 您也可以通过[自己构建hudi](https://github.com/apache/incubator-hudi#building-apache-hudi-from-source)来快速开始, 并在spark-shell命令中使用`--jars /packaging/hudi-spark-bundle/target/hudi-spark-bundle-*.*.*-SNAPSHOT.jar`, -而不是`--packages org.apache.hudi:hudi-spark-bundle:0.5.0-incubating` +而不是`--packages org.apache.hudi:hudi-spark-bundle:0.5.2-incubating` 这里我们使用Spark演示了Hudi的功能。但是,Hudi可以支持多种存储类型/视图,并且可以从Hive,Spark,Presto等查询引擎中查询Hudi数据集。 diff --git 
a/docs/_docs/0.5.2/1_1_quick_start_guide.md b/docs/_docs/0.5.2/1_1_quick_start_guide.md index d7bd0ff..ab4e37c 100644 --- a/docs/_docs/0.5.2/1_1_quick_start_guide.md +++ b/docs/_docs/0.5.2/1_1_quick_start_guide.md @@ -18,7 +18,7 @@ From the extracted directory run spark-shell with Hudi as: ```scala spark-2.4.4-bin-hadoop2.7/bin/spark-shell \ - --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.1-incubating,org.apache.spark:spark-avro_2.11:2.4.4 \ + --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.2-incubating,org.apache.spark:spark-avro_2.11:2.4.4 \ --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' ``` @@ -209,7 +209,7 @@ Note: Only `Append` mode is supported for delete operation. You can also do the quickstart by [building hudi yourself](https://github.com/apache/incubator-hudi#building-apache-hudi-from-source), and using `--jars /packaging/hudi-spark-bundle/target/hudi-spark-bundle_2.11-*.*.*-SNAPSHOT.jar` in the spark-shell command above -instead of `--packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.1-incubating`. Hudi also supports scala 2.12. Refer [build with scala 2.12](https://github.com/apache/incubator-hudi#build-with-scala-212) +instead of `--packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.2-incubating`. Hudi also supports scala 2.12. Refer [build with scala 2.12](https://github.com/apache/incubator-hudi#build-with-scala-212) for more info. Also, we used Spark here to show case the capabilities of Hudi. 
However, Hudi can support multiple table types/query types and diff --git a/docs/_docs/0.5.2/2_2_writing_data.md b/docs/_docs/0.5.2/2_2_writing_data.md index 3dc85d0..10b6189 100644 --- a/docs/_docs/0.5.2/2_2_writing_data.md +++ b/docs/_docs/0.5.2/2_2_writing_data.md @@ -205,7 +205,7 @@ cd hudi-hive ./run_sync_tool.sh --jdbc-url jdbc:hive2:\/\/hiveserver:1 --user hive --pass hive --partitioned-by partition --base-path --database default --table ``` -Starting with Hudi 0.5.1 version read optimized version of merge-on-read tables are suffixed '_ro' by default. For backwards compatibility with older Hudi versions, +Starting with Hudi 0.5.2 version read optimized version of merge-on-read tables are suffixed '_ro' by default. For backwards compatibility with older Hudi versions, an optional HiveSyncConfig - `--skip-ro-suffix`, has been provided to turn off '_ro' suffixing if desired. Explore other hive sync options using the following command: ```java diff --git a/docs/_docs/0.5.2/2_3_querying_data.md b/docs/_docs/0.5.2/2_3_querying_data.md index 242c84a..0c28b12 100644 --- a/docs/_docs/0.5.2/2_3_querying_data.md +++ b/docs/_docs/0.5.2/2_3_querying_data.md @@ -118,7 +118,7 @@ both parquet and avro data, this default setting needs to be turned off using se This will force Spark to fallback to using the Hive Serde to read the data (planning/executions is still Spark). ``
[incubator-hudi] branch master updated: [HUDI-710] Fixing failure in Staging Validation Script (#1403)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 23afe7a  [HUDI-710] Fixing failure in Staging Validation Script (#1403)

23afe7a is described below

commit 23afe7a4872fca66d9aeb36d209c6538a17d81f1
Author: Balaji Varadarajan
AuthorDate: Sun Mar 15 07:13:20 2020 -0700

    [HUDI-710] Fixing failure in Staging Validation Script (#1403)
---
 scripts/release/validate_staged_release.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/scripts/release/validate_staged_release.sh b/scripts/release/validate_staged_release.sh
index b90e5cf..4dcb9dd 100755
--- a/scripts/release/validate_staged_release.sh
+++ b/scripts/release/validate_staged_release.sh
@@ -97,7 +97,7 @@ cd hudi-${RELEASE_VERSION}-incubating-rc${RC_NUM}

 ### BEGIN: Binary Files Check
 echo "Checking for binary files in source release"
-numBinaryFiles=`find . -iname '*' | xargs -I {} file -I {} | grep -va directory | grep -va 'text/' | grep -va 'application/xml' | wc -l | sed -e s'/ //g'`
+numBinaryFiles=`find . -iname '*' | xargs -I {} file -I {} | grep -va directory | grep -va 'text/' | grep -va 'application/xml' | grep -va 'application/json' | wc -l | sed -e s'/ //g'`

 if [ "$numBinaryFiles" -gt "0" ]; then
   echo -e "There were non-text files in source release. Please check below\n"
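The one-line fix adds `application/json` to the MIME types the release check accepts as text, so JSON test fixtures no longer fail staging validation. A rough restatement of the filter predicate in Java (an illustrative helper, not part of the shell script):

```java
public class BinaryFileCheck {
    /** MIME types the release check treats as acceptable source content (post-patch). */
    static boolean isTextLike(String mimeType) {
        return mimeType != null
            && (mimeType.contains("directory")
                || mimeType.startsWith("text/")
                || mimeType.startsWith("application/xml")
                || mimeType.startsWith("application/json"));
    }

    public static void main(String[] args) {
        System.out.println(isTextLike("text/x-shellscript"));
        System.out.println(isTextLike("application/json")); // newly allowed by this commit
        System.out.println(isTextLike("application/zip"));  // still counted as binary
    }
}
```

Any file whose `file -I` MIME type fails this predicate counts toward `numBinaryFiles`, and a non-zero count aborts the release validation.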
[incubator-hudi] branch master updated: [HUDI-694]Add unit test for SparkEnvCommand (#1401)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 55e6d34 [HUDI-694]Add unit test for SparkEnvCommand (#1401) 55e6d34 is described below commit 55e6d348155f63eb128cd208687d02206bad66a5 Author: hongdd AuthorDate: Mon Mar 16 11:52:40 2020 +0800 [HUDI-694]Add unit test for SparkEnvCommand (#1401) * Add test for SparkEnvCommand --- .../hudi/cli/AbstractShellIntegrationTest.java | 46 +++ .../hudi/cli/commands/TestSparkEnvCommand.java | 53 ++ .../test/resources/log4j-surefire-quiet.properties | 23 ++ .../src/test/resources/log4j-surefire.properties | 25 ++ 4 files changed, 147 insertions(+) diff --git a/hudi-cli/src/test/java/org/apache/hudi/cli/AbstractShellIntegrationTest.java b/hudi-cli/src/test/java/org/apache/hudi/cli/AbstractShellIntegrationTest.java new file mode 100644 index 000..6db65a7 --- /dev/null +++ b/hudi-cli/src/test/java/org/apache/hudi/cli/AbstractShellIntegrationTest.java @@ -0,0 +1,46 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.hudi.cli; + +import org.junit.AfterClass; +import org.junit.BeforeClass; +import org.springframework.shell.Bootstrap; +import org.springframework.shell.core.JLineShellComponent; + +/** + * Class to start Bootstrap and JLineShellComponent. + */ +public abstract class AbstractShellIntegrationTest { + private static JLineShellComponent shell; + + @BeforeClass + public static void startup() { +Bootstrap bootstrap = new Bootstrap(); +shell = bootstrap.getJLineShellComponent(); + } + + @AfterClass + public static void shutdown() { +shell.stop(); + } + + protected static JLineShellComponent getShell() { +return shell; + } +} diff --git a/hudi-cli/src/test/java/org/apache/hudi/cli/commands/TestSparkEnvCommand.java b/hudi-cli/src/test/java/org/apache/hudi/cli/commands/TestSparkEnvCommand.java new file mode 100644 index 000..f54fcae --- /dev/null +++ b/hudi-cli/src/test/java/org/apache/hudi/cli/commands/TestSparkEnvCommand.java @@ -0,0 +1,53 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package org.apache.hudi.cli.commands; + +import org.apache.hudi.cli.AbstractShellIntegrationTest; +import org.apache.hudi.cli.HoodiePrintHelper; +import org.junit.Test; +import org.springframework.shell.core.CommandResult; + +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertTrue; + +/** + * Test Cases for {@link SparkEnvCommand}. + */ +public class TestSparkEnvCommand extends AbstractShellIntegrationTest { + + /** + * Test Cases for set and get spark env. + */ + @Test + public void testSetAndGetSparkEnv() { +// First, be empty +CommandResult cr = getShell().executeCommand("show envs all"); +String nullResult = HoodiePrintHelper.print(new String[] {"key", "value"}, new String[0][2]); +assertEquals(nullResult, cr.getResult().toString()); + +// Set SPARK_HOME +cr = getShell().executeCommand("set --conf SPARK_HOME=/usr/etc/spark"); +assertTrue(cr.isSuccess()); + +//Get +cr = getShell().executeCommand("show env --key SPARK_HOME"); +String
[incubator-hudi] annotated tag release-0.5.2-incubating updated (41202da -> 8c4a620)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a change to annotated tag release-0.5.2-incubating in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git. *** WARNING: tag release-0.5.2-incubating was modified! *** from 41202da (commit) to 8c4a620 (tag) tagging 41202da7788193da77f1ae4b784127bb93eaae2c (commit) replaces release-0.5.2-incubating-rc2 by yanghua on Thu Mar 26 09:42:40 2020 +0800 - Log - release-0.5.2-incubating -----BEGIN PGP SIGNATURE----- iQEzBAABCAAdFiEEw6lux3FJVxron4J2TIZoTQR94DwFAl58CJAACgkQTIZoTQR9 4Dxa3wf8CQZ7DVVJb9z9NO4hhl+ObBUKk9XJtBcL8tW60nVI7bP6gJ4Egq4a2wpG qHUX6lsvBKZ+mnOHlEk3pCwu+D/x1pRprD6qcSGvAjafVnsDeAybNI6qSsuRaRdL 68CQsdR7tLyLibEQ24RukHs0CU38mc1GviUuRFxmrPmlFKZP+LCs+Ym21vmOjo1F 6FwLcjjUgweZsEm92zgvWSN2tbrKRXtLu1i6oRZSlX2HkdQ7ULDUFF5hmRwY1eS3 sOWsOzuzvkySkE0J4rvyh6NHEMtA4uGgbq9LtQJIrLjAmKXV369MSPWG9058bqQk fKItkwYmzn4BRf+cyplKJC3hqmqfLg== =c25o -----END PGP SIGNATURE----- --- No new revisions were added by this update. Summary of changes:
[incubator-hudi] branch asf-site updated: Update the release note url for 0.5.2 (#1447)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/asf-site by this push: new f33188a Update the release note url for 0.5.2 (#1447) f33188a is described below commit f33188a8c619d1434201856f582b6be0c81a92ff Author: vinoyang AuthorDate: Thu Mar 26 10:05:35 2020 +0800 Update the release note url for 0.5.2 (#1447) --- content/cn/docs/powered_by.html | 2 +- content/cn/releases.html| 2 +- content/docs/powered_by.html| 2 +- content/releases.html | 2 +- docs/_pages/releases.cn.md | 2 +- docs/_pages/releases.md | 2 +- 6 files changed, 6 insertions(+), 6 deletions(-) diff --git a/content/cn/docs/powered_by.html b/content/cn/docs/powered_by.html index 5866edd..f852d64 100644 --- a/content/cn/docs/powered_by.html +++ b/content/cn/docs/powered_by.html @@ -351,7 +351,7 @@ Hudi还支持几个增量的Hive ETL管道，并且目前已集成到Uber的数据分散系统 Yields.io -Yields.io是第一个使用AI在企业范围内进行自动模型验证和实时监控的金融科技平台。他们的数据湖由Hudi管理，他们还积极使用Hudi为增量式、跨语言/平台机器学习构建基础架构。 +<a href="https://www.yields.io/Blog/Apache-Hudi-at-Yields">Yields.io</a>是第一个使用AI在企业范围内进行自动模型验证和实时监控的金融科技平台。他们的数据湖由Hudi管理，他们还积极使用Hudi为增量式、跨语言/平台机器学习构建基础架构。 Yotpo diff --git a/content/cn/releases.html b/content/cn/releases.html index ed77b36..a48f73c 100644 --- a/content/cn/releases.html +++ b/content/cn/releases.html @@ -245,7 +245,7 @@ Raw Release Notes -The raw release notes are available <a href="https://issues.apache.org/jira/projects/HUDI/versions/12346606#release-report-tab-body">here</a> +The raw release notes are available <a href="https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12322822&version=12346606">here</a> <a href="https://github.com/apache/incubator-hudi/releases/tag/release-0.5.1-incubating">Release 0.5.1-incubating</a> (docs) diff --git a/content/docs/powered_by.html b/content/docs/powered_by.html index 86e6f32..e5b52e3 100644 --- a/content/docs/powered_by.html +++ b/content/docs/powered_by.html @@ -355,7 +355,7
@@ offering, providing means for AWS users to perform record-level updates/deletes Yields.io -Yields.io is the first FinTech platform that uses AI for automated model validation and real-time monitoring on an enterprise-wide scale. Their data lake is managed by Hudi. They are also actively building their infrastructure for incremental, cross language/platform machine learning using Hudi. +Yields.io is the first FinTech platform that uses AI for automated model validation and real-time monitoring on an enterprise-wide scale. Their <a href="https://www.yields.io/Blog/Apache-Hudi-at-Yields">data lake</a> is managed by Hudi. They are also actively building their infrastructure for incremental, cross language/platform machine learning using Hudi. Yotpo diff --git a/content/releases.html b/content/releases.html index 9f97bff..303ba95 100644 --- a/content/releases.html +++ b/content/releases.html @@ -252,7 +252,7 @@ Raw Release Notes -The raw release notes are available <a href="https://issues.apache.org/jira/projects/HUDI/versions/12346606#release-report-tab-body">here</a> +The raw release notes are available <a href="https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12322822&version=12346606">here</a> <a href="https://github.com/apache/incubator-hudi/releases/tag/release-0.5.1-incubating">Release 0.5.1-incubating</a> (docs) diff --git a/docs/_pages/releases.cn.md b/docs/_pages/releases.cn.md index 181df17..211b55a 100644 --- a/docs/_pages/releases.cn.md +++ b/docs/_pages/releases.cn.md @@ -31,7 +31,7 @@ temp_query --sql "select Instant, NumInserts, NumWrites from satishkotha_debug w ``` ### Raw Release Notes - The raw release notes are available [here](https://issues.apache.org/jira/projects/HUDI/versions/12346606#release-report-tab-body) + The raw release notes are available [here](https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12322822&version=12346606) ## [Release 0.5.1-incubating](https://github.com/apache/incubator-hudi/releases/tag/release-0.5.1-incubating)
([docs](/docs/0.5.1-quick-start-guide.html)) diff --git a/docs/_pages/releases.md b/docs/_pages/releases.md index c4d34bb..aacdcde 100644 --- a/docs/_pages/releases.md +++ b/docs/_pages/releases.md @@ -30,7 +30,7 @@ temp_query --sql "select Instant, NumInserts, NumWrites from satishkotha_debug w ``` ### Raw Release Notes - The raw release notes are available [here](https://issues.apache.org/jira/projects/HUDI/versions/12346606#release-report-tab-body) + The raw release notes are available [here](https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12322822&version=12346606) ## [Release 0.5.1-incubating](https://github.com/apache/incubator-hudi/releases/tag/release-0.5.1-incubating) ([docs](/docs/0.5.1-quick-start-guide.html))
[incubator-hudi] branch master updated: [MINOR] Fix javadoc of InsertBucket (#1445)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 5eed6c9 [MINOR] Fix javadoc of InsertBucket (#1445) 5eed6c9 is described below commit 5eed6c98a8880dc3b4e64ec9ff9dae4859e89b4c Author: Mathieu AuthorDate: Wed Mar 25 22:25:47 2020 +0800 [MINOR] Fix javadoc of InsertBucket (#1445) --- .../src/main/java/org/apache/hudi/table/HoodieCopyOnWriteTable.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/hudi-client/src/main/java/org/apache/hudi/table/HoodieCopyOnWriteTable.java b/hudi-client/src/main/java/org/apache/hudi/table/HoodieCopyOnWriteTable.java index 94c520c..0b16efe 100644 --- a/hudi-client/src/main/java/org/apache/hudi/table/HoodieCopyOnWriteTable.java +++ b/hudi-client/src/main/java/org/apache/hudi/table/HoodieCopyOnWriteTable.java @@ -499,7 +499,7 @@ public class HoodieCopyOnWriteTable extends Hoodi } /** - * Helper class for an insert bucket along with the weight [0.0, 0.1] that defines the amount of incoming inserts that + * Helper class for an insert bucket along with the weight [0.0, 1.0] that defines the amount of incoming inserts that * should be allocated to the bucket. */ class InsertBucket implements Serializable {
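The javadoc fix above corrects an insert bucket's weight range from [0.0, 0.1] to [0.0, 1.0]: each bucket's weight is the fraction of incoming inserts it should absorb. As a hypothetical sketch only (not Hudi's actual HoodieCopyOnWriteTable logic, and the class name is invented for illustration), weighted assignment can be pictured as mapping a draw in [0, 1) onto cumulative weights:

```java
// Hypothetical sketch of weighted bucket selection: weights sum to 1.0, each
// in [0.0, 1.0]; a draw in [0, 1) is mapped onto the cumulative weight ranges,
// so over many draws bucket i receives roughly weights[i] of the inserts.
public class WeightedBucketPicker {
    public static int pickBucket(double[] weights, double draw) {
        double cumulative = 0.0;
        for (int i = 0; i < weights.length; i++) {
            cumulative += weights[i];
            if (draw < cumulative) {
                return i;
            }
        }
        return weights.length - 1; // guard against floating-point rounding drift
    }

    public static void main(String[] args) {
        double[] weights = {0.2, 0.3, 0.5};
        System.out.println(pickBucket(weights, 0.1)); // falls in [0.0, 0.2) -> bucket 0
        System.out.println(pickBucket(weights, 0.4)); // falls in [0.2, 0.5) -> bucket 1
        System.out.println(pickBucket(weights, 0.9)); // falls in [0.5, 1.0) -> bucket 2
    }
}
```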
[incubator-hudi] branch master updated: [HUDI-743]: Remove FileIOUtils.close() (#1461)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 04449f3 [HUDI-743]: Remove FileIOUtils.close() (#1461) 04449f3 is described below commit 04449f33feb300b99750c52ec37f2561aa644456 Author: Suneel Marthi AuthorDate: Sat Mar 28 06:03:15 2020 -0400 [HUDI-743]: Remove FileIOUtils.close() (#1461) --- .../main/java/org/apache/hudi/metrics/Metrics.java | 3 +-- .../org/apache/hudi/common/util/FileIOUtils.java | 25 -- 2 files changed, 1 insertion(+), 27 deletions(-) diff --git a/hudi-client/src/main/java/org/apache/hudi/metrics/Metrics.java b/hudi-client/src/main/java/org/apache/hudi/metrics/Metrics.java index b6d2f7a..b62279e 100644 --- a/hudi-client/src/main/java/org/apache/hudi/metrics/Metrics.java +++ b/hudi-client/src/main/java/org/apache/hudi/metrics/Metrics.java @@ -18,7 +18,6 @@ package org.apache.hudi.metrics; -import org.apache.hudi.common.util.FileIOUtils; import org.apache.hudi.config.HoodieWriteConfig; import org.apache.hudi.exception.HoodieException; @@ -53,7 +52,7 @@ public class Metrics { Runtime.getRuntime().addShutdownHook(new Thread(() -> { try { reporter.report(); -FileIOUtils.close(reporter.getReporter(), true); +getReporter().close(); } catch (Exception e) { e.printStackTrace(); } diff --git a/hudi-common/src/main/java/org/apache/hudi/common/util/FileIOUtils.java b/hudi-common/src/main/java/org/apache/hudi/common/util/FileIOUtils.java index 65a28b0..f1095b6 100644 --- a/hudi-common/src/main/java/org/apache/hudi/common/util/FileIOUtils.java +++ b/hudi-common/src/main/java/org/apache/hudi/common/util/FileIOUtils.java @@ -18,10 +18,7 @@ package org.apache.hudi.common.util; -import javax.annotation.Nullable; - import java.io.ByteArrayOutputStream; -import java.io.Closeable; import java.io.File; import java.io.FileOutputStream; import java.io.IOException; @@ 
-94,26 +91,4 @@ public class FileIOUtils { out.flush(); out.close(); } - - /** - * Closes a {@link Closeable}, with control over whether an {@code IOException} may be thrown. - * @param closeable the {@code Closeable} object to be closed, or null, - * in which case this method does nothing. - * @param swallowIOException if true, don't propagate IO exceptions thrown by the {@code close} methods. - * - * @throws IOException if {@code swallowIOException} is false and {@code close} throws an {@code IOException}. - */ - public static void close(@Nullable Closeable closeable, boolean swallowIOException) - throws IOException { -if (closeable == null) { - return; -} -try { - closeable.close(); -} catch (IOException e) { - if (!swallowIOException) { -throw e; - } -} - } }
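The commit above deletes FileIOUtils.close() and closes the metrics reporter directly. For context, here is a minimal re-creation of the removed helper next to the plainer alternatives the codebase moves toward; the helper body is copied from the removed code in the diff, the surrounding class is an illustrative sketch:

```java
import java.io.Closeable;
import java.io.IOException;

// The removed FileIOUtils.close(): optionally swallows IOException from close().
// Closing directly, or using try-with-resources, keeps the error-handling
// policy visible at the call site instead of hiding it behind a boolean flag.
public class CloseExamples {
    static void close(Closeable closeable, boolean swallowIOException) throws IOException {
        if (closeable == null) {
            return;
        }
        try {
            closeable.close();
        } catch (IOException e) {
            if (!swallowIOException) {
                throw e;
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // A Closeable whose close() always fails; the helper swallows the error.
        Closeable failing = () -> { throw new IOException("boom"); };
        close(failing, true);
        // The preferred pattern: try-with-resources invokes close() automatically.
        try (Closeable ok = () -> { /* released normally */ }) {
            // use the resource
        }
        System.out.println("done");
    }
}
```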
[incubator-hudi] branch master updated: [HUDI-679] Make io package Spark free (#1460)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 07c3c5d [HUDI-679] Make io package Spark free (#1460) 07c3c5d is described below commit 07c3c5d797f612f9b73b2805b65275c2029c2442 Author: leesf <490081...@qq.com> AuthorDate: Sun Mar 29 16:54:00 2020 +0800 [HUDI-679] Make io package Spark free (#1460) * [HUDI-679] Make io package Spark free --- .../scala/org/apache/hudi/cli/SparkHelpers.scala | 3 +- .../hudi/client/SparkTaskContextSupplier.java | 42 ++ .../hudi/execution/BulkInsertMapFunction.java | 2 +- .../execution/CopyOnWriteLazyInsertIterable.java | 10 -- .../execution/MergeOnReadLazyInsertIterable.java | 9 ++--- .../org/apache/hudi/io/HoodieAppendHandle.java | 14 .../org/apache/hudi/io/HoodieCreateHandle.java | 14 .../java/org/apache/hudi/io/HoodieMergeHandle.java | 14 .../java/org/apache/hudi/io/HoodieWriteHandle.java | 25 + .../hudi/io/storage/HoodieParquetWriter.java | 10 +++--- .../io/storage/HoodieStorageWriterFactory.java | 13 +++ .../apache/hudi/table/HoodieCopyOnWriteTable.java | 8 ++--- .../apache/hudi/table/HoodieMergeOnReadTable.java | 4 +-- .../java/org/apache/hudi/table/HoodieTable.java| 7 .../hudi/client/TestUpdateSchemaEvolution.java | 4 +-- .../hudi/common/HoodieClientTestHarness.java | 3 ++ .../apache/hudi/common/HoodieClientTestUtils.java | 4 ++- .../io/storage/TestHoodieStorageWriterFactory.java | 6 ++-- .../apache/hudi/table/TestCopyOnWriteTable.java| 2 +- 19 files changed, 136 insertions(+), 58 deletions(-) diff --git a/hudi-cli/src/main/scala/org/apache/hudi/cli/SparkHelpers.scala b/hudi-cli/src/main/scala/org/apache/hudi/cli/SparkHelpers.scala index 4c8e4c1..6fdac1c 100644 --- a/hudi-cli/src/main/scala/org/apache/hudi/cli/SparkHelpers.scala +++ b/hudi-cli/src/main/scala/org/apache/hudi/cli/SparkHelpers.scala @@ -22,6 +22,7 @@ import 
org.apache.avro.generic.IndexedRecord import org.apache.hadoop.conf.Configuration import org.apache.hadoop.fs.{FileSystem, Path} import org.apache.hudi.avro.HoodieAvroWriteSupport +import org.apache.hudi.client.SparkTaskContextSupplier import org.apache.hudi.common.HoodieJsonPayload import org.apache.hudi.common.bloom.filter.{BloomFilter, BloomFilterFactory} import org.apache.hudi.common.model.HoodieRecord @@ -45,7 +46,7 @@ object SparkHelpers { HoodieIndexConfig.DEFAULT_HOODIE_BLOOM_INDEX_FILTER_DYNAMIC_MAX_ENTRIES.toInt, HoodieIndexConfig.DEFAULT_BLOOM_INDEX_FILTER_TYPE); val writeSupport: HoodieAvroWriteSupport = new HoodieAvroWriteSupport(new AvroSchemaConverter().convert(schema), schema, filter) val parquetConfig: HoodieParquetConfig = new HoodieParquetConfig(writeSupport, CompressionCodecName.GZIP, HoodieStorageConfig.DEFAULT_PARQUET_BLOCK_SIZE_BYTES.toInt, HoodieStorageConfig.DEFAULT_PARQUET_PAGE_SIZE_BYTES.toInt, HoodieStorageConfig.DEFAULT_PARQUET_FILE_MAX_BYTES.toInt, fs.getConf, HoodieStorageConfig.DEFAULT_STREAM_COMPRESSION_RATIO.toDouble) -val writer = new HoodieParquetWriter[HoodieJsonPayload, IndexedRecord](instantTime, destinationFile, parquetConfig, schema) +val writer = new HoodieParquetWriter[HoodieJsonPayload, IndexedRecord](instantTime, destinationFile, parquetConfig, schema, new SparkTaskContextSupplier()) for (rec <- sourceRecords) { val key: String = rec.get(HoodieRecord.RECORD_KEY_METADATA_FIELD).toString if (!keysToSkip.contains(key)) { diff --git a/hudi-client/src/main/java/org/apache/hudi/client/SparkTaskContextSupplier.java b/hudi-client/src/main/java/org/apache/hudi/client/SparkTaskContextSupplier.java new file mode 100644 index 000..601dd98 --- /dev/null +++ b/hudi-client/src/main/java/org/apache/hudi/client/SparkTaskContextSupplier.java @@ -0,0 +1,42 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. 
See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.hudi.client; + +import org.apache.spark.TaskContext; +
[incubator-hudi] branch master updated: [HUDI-751] Fix some coding issues reported by FindBugs (#1470)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 78b3194 [HUDI-751] Fix some coding issues reported by FindBugs (#1470) 78b3194 is described below commit 78b3194e8241c519a85310997f31b2b55df487e1 Author: Shaofeng Shi AuthorDate: Tue Mar 31 21:19:32 2020 +0800 [HUDI-751] Fix some coding issues reported by FindBugs (#1470) --- .../org/apache/hudi/cli/commands/RollbacksCommand.java | 2 +- .../org/apache/hudi/cli/commands/SparkEnvCommand.java| 6 +++--- .../java/org/apache/hudi/cli/commands/StatsCommand.java | 6 +++--- .../main/java/org/apache/hudi/cli/utils/HiveUtil.java| 16 .../java/org/apache/hudi/client/HoodieCleanClient.java | 1 + .../java/org/apache/hudi/client/HoodieReadClient.java| 1 + .../java/org/apache/hudi/client/HoodieWriteClient.java | 1 + .../main/java/org/apache/hudi/client/WriteStatus.java| 1 + .../org/apache/hudi/config/HoodieHBaseIndexConfig.java | 2 +- .../BoundedPartitionAwareCompactionStrategy.java | 2 +- .../compact/strategy/DayBasedCompactionStrategy.java | 6 +++--- .../hudi/common/config/SerializableConfiguration.java| 1 + .../org/apache/hudi/common/model/HoodieBaseFile.java | 1 + .../java/org/apache/hudi/common/model/HoodieLogFile.java | 2 ++ .../java/org/apache/hudi/common/model/HoodieRecord.java | 10 +- .../hudi/common/model/HoodieRollingStatMetadata.java | 2 +- .../apache/hudi/common/table/HoodieTableMetaClient.java | 7 --- .../common/table/timeline/HoodieDefaultTimeline.java | 1 + .../java/org/apache/hudi/common/util/HoodieTimer.java| 2 +- .../apache/hudi/common/util/collection/DiskBasedMap.java | 2 +- .../org/apache/hudi/hadoop/HoodieROTablePathFilter.java | 1 + .../hive/SlashEncodedDayPartitionValueExtractor.java | 1 + .../org/apache/hudi/utilities/HDFSParquetImporter.java | 1 + .../apache/hudi/utilities/deltastreamer/Compactor.java | 1 + 
.../apache/hudi/utilities/deltastreamer/DeltaSync.java | 5 +++-- .../utilities/deltastreamer/HoodieDeltaStreamer.java | 3 +++ .../apache/hudi/utilities/perf/TimelineServerPerf.java | 1 + .../org/apache/hudi/utilities/sources/CsvDFSSource.java | 8 +--- .../hudi/utilities/sources/HiveIncrPullSource.java | 2 ++ .../hudi/utilities/sources/helpers/AvroConvertor.java| 1 + .../hudi/utilities/sources/helpers/DFSPathSelector.java | 2 +- 31 files changed, 57 insertions(+), 41 deletions(-) diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java index 9e4bf28..70b34bc 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java @@ -120,7 +120,7 @@ public class RollbacksCommand implements CommandMarker { /** * An Active timeline containing only rollbacks. */ - class RollbackTimeline extends HoodieActiveTimeline { + static class RollbackTimeline extends HoodieActiveTimeline { public RollbackTimeline(HoodieTableMetaClient metaClient) { super(metaClient, CollectionUtils.createImmutableSet(HoodieTimeline.ROLLBACK_EXTENSION)); diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkEnvCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkEnvCommand.java index d209a08..7969808 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkEnvCommand.java +++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkEnvCommand.java @@ -34,7 +34,7 @@ import java.util.Map; @Component public class SparkEnvCommand implements CommandMarker { - public static Map env = new HashMap(); + public static Map env = new HashMap<>(); @CliCommand(value = "set", help = "Set spark launcher env to cli") public void setEnv(@CliOption(key = {"conf"}, help = "Env config to be set") final String confMap) { @@ -49,8 +49,8 @@ public class SparkEnvCommand 
implements CommandMarker { public String showAllEnv() { String[][] rows = new String[env.size()][2]; int i = 0; -for (String key: env.keySet()) { - rows[i] = new String[]{key, env.get(key)}; +for (Map.Entry entry: env.entrySet()) { + rows[i] = new String[]{entry.getKey(), entry.getValue()}; i++; } return HoodiePrintHelper.print(new String[] {"key", "value"}, rows); diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/StatsCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/StatsCommand.java index 9db544c..e5be0e4 100644 --- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/StatsCommand.java +++ b/hud
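Several of the FindBugs fixes above replace iteration over keySet() followed by a get(key) lookup with a single entrySet() pass, saving one hash lookup per key. A small self-contained sketch of the rewritten SparkEnvCommand loop (the class name here is invented for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// The entrySet() idiom from the FindBugs fix: one pass over the map yields
// both key and value, instead of keySet() iteration plus env.get(key).
public class EntrySetIdiom {
    public static String[][] rows(Map<String, String> env) {
        String[][] rows = new String[env.size()][2];
        int i = 0;
        for (Map.Entry<String, String> entry : env.entrySet()) {
            rows[i] = new String[]{entry.getKey(), entry.getValue()};
            i++;
        }
        return rows;
    }

    public static void main(String[] args) {
        Map<String, String> env = new LinkedHashMap<>(); // predictable order for the demo
        env.put("SPARK_HOME", "/usr/etc/spark");
        env.put("SPARK_MASTER", "local[2]");
        System.out.println(rows(env)[0][0]); // SPARK_HOME
    }
}
```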
[incubator-hudi] branch master updated: [HUDI-754] Configure .asf.yaml for Hudi Github repository (#1472)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new c146ca9 [HUDI-754] Configure .asf.yaml for Hudi Github repository (#1472) c146ca9 is described below

commit c146ca90fdd102f2bda90a2b15f6aef56414f9f4
Author: vinoyang
AuthorDate: Wed Apr 1 10:02:47 2020 +0800

    [HUDI-754] Configure .asf.yaml for Hudi Github repository (#1472)

    * [HUDI-754] Configure .asf.yaml for Hudi Github repository
---
 .asf.yaml | 12
 1 file changed, 12 insertions(+)

diff --git a/.asf.yaml b/.asf.yaml
new file mode 100644
index 000..4a74628
--- /dev/null
+++ b/.asf.yaml
@@ -0,0 +1,12 @@
+github:
+  description: "Upserts, Deletes And Incremental Processing on Big Data."
+  homepage: https://hudi.apache.org/
+  labels:
+    - hudi
+    - apachehudi
+    - datalake
+    - incremental processing
+    - bigdata
+    - stream processing
+    - data integration
+    - apachespark
[incubator-hudi] branch master updated: [HUDI-731] Add ChainedTransformer (#1440)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 5b53b0d [HUDI-731] Add ChainedTransformer (#1440) 5b53b0d is described below commit 5b53b0d85e0d60a37c37941b5a653b0718534e7b Author: Raymond Xu <2701446+xushi...@users.noreply.github.com> AuthorDate: Wed Apr 1 08:21:31 2020 -0700 [HUDI-731] Add ChainedTransformer (#1440) * [HUDI-731] Add ChainedTransformer --- .../org/apache/hudi/utilities/UtilHelpers.java | 13 ++- .../hudi/utilities/deltastreamer/DeltaSync.java| 8 +- .../deltastreamer/HoodieDeltaStreamer.java | 21 - .../utilities/transform/ChainedTransformer.java| 54 +++ .../hudi/utilities/TestHoodieDeltaStreamer.java| 67 +++--- .../org/apache/hudi/utilities/TestUtilHelpers.java | 101 + .../transform/TestChainedTransformer.java | 92 +++ .../{ => transform}/TestFlatteningTransformer.java | 4 +- 8 files changed, 314 insertions(+), 46 deletions(-) diff --git a/hudi-utilities/src/main/java/org/apache/hudi/utilities/UtilHelpers.java b/hudi-utilities/src/main/java/org/apache/hudi/utilities/UtilHelpers.java index 8930084..222a391 100644 --- a/hudi-utilities/src/main/java/org/apache/hudi/utilities/UtilHelpers.java +++ b/hudi-utilities/src/main/java/org/apache/hudi/utilities/UtilHelpers.java @@ -34,6 +34,7 @@ import org.apache.hudi.exception.HoodieIOException; import org.apache.hudi.index.HoodieIndex; import org.apache.hudi.utilities.schema.SchemaProvider; import org.apache.hudi.utilities.sources.Source; +import org.apache.hudi.utilities.transform.ChainedTransformer; import org.apache.hudi.utilities.transform.Transformer; import org.apache.avro.Schema; @@ -67,7 +68,9 @@ import java.sql.DriverManager; import java.sql.PreparedStatement; import java.sql.ResultSet; import java.sql.SQLException; +import java.util.ArrayList; import java.util.Arrays; +import java.util.Collections; 
import java.util.Enumeration; import java.util.HashMap; import java.util.List; @@ -102,11 +105,15 @@ public class UtilHelpers { } } - public static Transformer createTransformer(String transformerClass) throws IOException { + public static Option createTransformer(List classNames) throws IOException { try { - return transformerClass == null ? null : (Transformer) ReflectionUtils.loadClass(transformerClass); + List transformers = new ArrayList<>(); + for (String className : Option.ofNullable(classNames).orElse(Collections.emptyList())) { +transformers.add(ReflectionUtils.loadClass(className)); + } + return transformers.isEmpty() ? Option.empty() : Option.of(new ChainedTransformer(transformers)); } catch (Throwable e) { - throw new IOException("Could not load transformer class " + transformerClass, e); + throw new IOException("Could not load transformer class(es) " + classNames, e); } } diff --git a/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java b/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java index 99cb497..5cc33ee 100644 --- a/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java +++ b/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java @@ -106,7 +106,7 @@ public class DeltaSync implements Serializable { /** * Allows transforming source to target table before writing. */ - private transient Transformer transformer; + private transient Option transformer; /** * Extract the key for the target table. 
@@ -173,7 +173,7 @@ public class DeltaSync implements Serializable { refreshTimeline(); -this.transformer = UtilHelpers.createTransformer(cfg.transformerClassName); +this.transformer = UtilHelpers.createTransformer(cfg.transformerClassNames); this.keyGenerator = DataSourceUtils.createKeyGenerator(props); this.formatAdapter = new SourceFormatAdapter( @@ -281,14 +281,14 @@ public class DeltaSync implements Serializable { final Option> avroRDDOptional; final String checkpointStr; final SchemaProvider schemaProvider; -if (transformer != null) { +if (transformer.isPresent()) { // Transformation is needed. Fetch New rows in Row Format, apply transformation and then convert them // to generic records for writing InputBatch> dataAndCheckpoint = formatAdapter.fetchNewDataInRowFormat(resumeCheckpointStr, cfg.sourceLimit); Option> transformed = - dataAndCheckpoint.getBatch().map(data -> transformer.apply(jssc, sparkSession, data, props)); + dataAndCheckpoint.getBatch().map(data -> transformer.get().apply(jssc,
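The ChainedTransformer introduced above feeds each transformer's output into the next, so a list of transformer class names behaves like a single transformer. A minimal sketch of that chaining idea, with Hudi's Transformer interface simplified to UnaryOperator for illustration (an assumption: the real apply() also takes a JavaSparkContext, SparkSession, and properties):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of the chaining pattern: fold a list of single-step transformers into
// one composite transformer by threading each output into the next input.
public class ChainedTransformerSketch {
    static <T> UnaryOperator<T> chain(List<UnaryOperator<T>> transformers) {
        return input -> {
            T current = input;
            for (UnaryOperator<T> t : transformers) {
                current = t.apply(current); // output of step i is input of step i+1
            }
            return current;
        };
    }

    public static void main(String[] args) {
        List<UnaryOperator<String>> steps = Arrays.asList(String::trim, String::toUpperCase);
        System.out.println(chain(steps).apply("  hudi ")); // HUDI
    }
}
```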
[incubator-hudi] branch master updated: [MINIOR] Add license header for .asf.yaml and adjust labels
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new bd716ec [MINIOR] Add license header for .asf.yaml and adjust labels bd716ec is described below

commit bd716ece1826fcc44092bd0add6cbbc097eeaec2
Author: yanghua
AuthorDate: Thu Apr 2 16:14:35 2020 +0800

    [MINIOR] Add license header for .asf.yaml and adjust labels
---
 .asf.yaml | 24 +---
 1 file changed, 21 insertions(+), 3 deletions(-)

diff --git a/.asf.yaml b/.asf.yaml
index 6065545..9abfd9b 100644
--- a/.asf.yaml
+++ b/.asf.yaml
@@ -1,3 +1,21 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
 github:
   description: "Upserts, Deletes And Incremental Processing on Big Data."
   homepage: https://hudi.apache.org/
@@ -5,8 +23,8 @@ github:
     - hudi
     - apachehudi
     - datalake
-    - incrementalprocessing
+    - incremental-processing
     - bigdata
-    - streamprocessing
-    - dataintegration
+    - stream-processing
+    - data-integration
     - apachespark
[incubator-hudi] branch asf-site updated: [MINOR] Add Hudi Online Meetup (#1441)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/asf-site by this push: new 691c522 [MINOR] Add Hudi Online Meetup (#1441) 691c522 is described below commit 691c52212a109a1e564363d3b658008fad062f8e Author: leesf <490081...@qq.com> AuthorDate: Wed Mar 25 13:36:26 2020 +0800 [MINOR] Add Hudi Online Meetup (#1441) * [MINOR] Add Hudi Online Meetup --- docs/_docs/1_4_powered_by.cn.md | 11 ++- docs/_docs/1_4_powered_by.md| 4 2 files changed, 14 insertions(+), 1 deletion(-) diff --git a/docs/_docs/1_4_powered_by.cn.md b/docs/_docs/1_4_powered_by.cn.md index 2bb41f5..82f9c02 100644 --- a/docs/_docs/1_4_powered_by.cn.md +++ b/docs/_docs/1_4_powered_by.cn.md @@ -48,10 +48,19 @@ Hudi在Yotpo有不少用途。首先,在他们的[开源ETL框架](https://git 7. ["Building highly efficient data lakes using Apache Hudi (Incubating)"](https://www.slideshare.net/ChesterChen/sf-big-analytics-20190612-building-highly-efficient-data-lakes-using-apache-hudi) - By Vinoth Chandar June 2019, SF Big Analytics Meetup, San Mateo, CA - + 8. ["Apache Hudi (Incubating) - The Past, Present and Future Of Efficient Data Lake Architectures"](https://docs.google.com/presentation/d/1FHhsvh70ZP6xXlHdVsAI0g__B_6Mpto5KQFlZ0b8-mM) - By Vinoth Chandar & Balaji Varadarajan September 2019, ApacheCon NA 19, Las Vegas, NV, USA +9. ["Insert, upsert, and delete data in Amazon S3 using Amazon EMR"](https://www.portal.reinvent.awsevents.com/connect/sessionDetail.ww?SESSION_ID=98662=YS67-AG7B-QIAV-ZZBK-E6TT-MD4Q-1HEP-747P) - By Paul Codding & Vinoth Chandar + December 2019, AWS re:Invent 2019, Las Vegas, NV, USA + +10. 
["Building Robust CDC Pipeline With Apache Hudi And Debezium"](https://www.slideshare.net/SyedKather/building-robust-cdc-pipeline-with-apache-hudi-and-debezium) - By Pratyaksh, Purushotham, Syed and Shaik December 2019, Hadoop Summit Bangalore, India + +11. ["Using Apache Hudi to build the next-generation data lake and its application in medical big data"](https://drive.google.com/open?id=1dmH2kWJF69PNdifPp37QBgjivOHaSLDn) - By JingHuang & Leesf March 2020, Apache Hudi & Apache Kylin Online Meetup, China + +12. ["Building a near real-time, high-performance data warehouse based on Apache Hudi and Apache Kylin"](https://drive.google.com/open?id=1Pk_WdFxfEZxMMfAOn0R8-m3ALkcN6G9e) - By ShaoFeng Shi March 2020, Apache Hudi & Apache Kylin Online Meetup, China + ## 文章 1. ["The Case for incremental processing on Hadoop"](https://www.oreilly.com/ideas/ubers-case-for-incremental-processing-on-hadoop) - O'reilly Ideas article by Vinoth Chandar diff --git a/docs/_docs/1_4_powered_by.md b/docs/_docs/1_4_powered_by.md index 761876e..229150e 100644 --- a/docs/_docs/1_4_powered_by.md +++ b/docs/_docs/1_4_powered_by.md @@ -63,6 +63,10 @@ Using Hudi at Yotpo for several usages. Firstly, integrated Hudi as a writer in 10. ["Building Robust CDC Pipeline With Apache Hudi And Debezium"](https://www.slideshare.net/SyedKather/building-robust-cdc-pipeline-with-apache-hudi-and-debezium) - By Pratyaksh, Purushotham, Syed and Shaik December 2019, Hadoop Summit Bangalore, India +11. ["Using Apache Hudi to build the next-generation data lake and its application in medical big data"](https://drive.google.com/open?id=1dmH2kWJF69PNdifPp37QBgjivOHaSLDn) - By JingHuang & Leesf March 2020, Apache Hudi & Apache Kylin Online Meetup, China + +12. 
["Building a near real-time, high-performance data warehouse based on Apache Hudi and Apache Kylin"](https://drive.google.com/open?id=1Pk_WdFxfEZxMMfAOn0R8-m3ALkcN6G9e) - By ShaoFeng Shi March 2020, Apache Hudi & Apache Kylin Online Meetup, China + ## Articles 1. ["The Case for incremental processing on Hadoop"](https://www.oreilly.com/ideas/ubers-case-for-incremental-processing-on-hadoop) - O'reilly Ideas article by Vinoth Chandar
[incubator-hudi] branch release-0.5.2 updated: Change version from 0.5.2-incubating-rc2 to 0.5.2-incubating
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch release-0.5.2 in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/release-0.5.2 by this push: new 41202da Change version from 0.5.2-incubating-rc2 to 0.5.2-incubating 41202da is described below commit 41202da7788193da77f1ae4b784127bb93eaae2c Author: yanghua AuthorDate: Wed Mar 25 17:10:06 2020 +0800 Change version from 0.5.2-incubating-rc2 to 0.5.2-incubating --- docker/hoodie/hadoop/base/pom.xml | 2 +- docker/hoodie/hadoop/datanode/pom.xml | 2 +- docker/hoodie/hadoop/historyserver/pom.xml| 2 +- docker/hoodie/hadoop/hive_base/pom.xml| 2 +- docker/hoodie/hadoop/namenode/pom.xml | 2 +- docker/hoodie/hadoop/pom.xml | 2 +- docker/hoodie/hadoop/prestobase/pom.xml | 2 +- docker/hoodie/hadoop/spark_base/pom.xml | 2 +- docker/hoodie/hadoop/sparkadhoc/pom.xml | 2 +- docker/hoodie/hadoop/sparkmaster/pom.xml | 2 +- docker/hoodie/hadoop/sparkworker/pom.xml | 2 +- hudi-cli/pom.xml | 2 +- hudi-client/pom.xml | 2 +- hudi-common/pom.xml | 2 +- hudi-hadoop-mr/pom.xml| 2 +- hudi-hive/pom.xml | 2 +- hudi-integ-test/pom.xml | 2 +- hudi-spark/pom.xml| 2 +- hudi-timeline-service/pom.xml | 2 +- hudi-utilities/pom.xml| 2 +- packaging/hudi-hadoop-mr-bundle/pom.xml | 2 +- packaging/hudi-hive-bundle/pom.xml| 2 +- packaging/hudi-presto-bundle/pom.xml | 2 +- packaging/hudi-spark-bundle/pom.xml | 2 +- packaging/hudi-timeline-server-bundle/pom.xml | 2 +- packaging/hudi-utilities-bundle/pom.xml | 2 +- pom.xml | 2 +- 27 files changed, 27 insertions(+), 27 deletions(-) diff --git a/docker/hoodie/hadoop/base/pom.xml b/docker/hoodie/hadoop/base/pom.xml index 5c05aa4..4d70b51 100644 --- a/docker/hoodie/hadoop/base/pom.xml +++ b/docker/hoodie/hadoop/base/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating 4.0.0 pom diff --git a/docker/hoodie/hadoop/datanode/pom.xml 
b/docker/hoodie/hadoop/datanode/pom.xml index f04c70c..0aaaf55 100644 --- a/docker/hoodie/hadoop/datanode/pom.xml +++ b/docker/hoodie/hadoop/datanode/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating 4.0.0 pom diff --git a/docker/hoodie/hadoop/historyserver/pom.xml b/docker/hoodie/hadoop/historyserver/pom.xml index 293cbac..a4c0c5a 100644 --- a/docker/hoodie/hadoop/historyserver/pom.xml +++ b/docker/hoodie/hadoop/historyserver/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating 4.0.0 pom diff --git a/docker/hoodie/hadoop/hive_base/pom.xml b/docker/hoodie/hadoop/hive_base/pom.xml index 67aec60..7099f96 100644 --- a/docker/hoodie/hadoop/hive_base/pom.xml +++ b/docker/hoodie/hadoop/hive_base/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating 4.0.0 pom diff --git a/docker/hoodie/hadoop/namenode/pom.xml b/docker/hoodie/hadoop/namenode/pom.xml index 89c1764..841e851 100644 --- a/docker/hoodie/hadoop/namenode/pom.xml +++ b/docker/hoodie/hadoop/namenode/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating 4.0.0 pom diff --git a/docker/hoodie/hadoop/pom.xml b/docker/hoodie/hadoop/pom.xml index 8cb31a0..f6be901 100644 --- a/docker/hoodie/hadoop/pom.xml +++ b/docker/hoodie/hadoop/pom.xml @@ -19,7 +19,7 @@ hudi org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating ../../../pom.xml 4.0.0 diff --git a/docker/hoodie/hadoop/prestobase/pom.xml b/docker/hoodie/hadoop/prestobase/pom.xml index e9d3f33..20fdbaa 100644 --- a/docker/hoodie/hadoop/prestobase/pom.xml +++ b/docker/hoodie/hadoop/prestobase/pom.xml @@ -22,7 +22,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating 4.0.0 pom diff --git a/docker/hoodie/hadoop/spark_base/pom.xml b/docker/hoodie/hadoop/spark_base/pom.xml index 4e67030..718ea93 100644 --- a/docker/hoodie/hadoop/spark_base/pom.xml +++ 
b/docker/hoodie/hadoop/spark_base/pom.xml @@ -19,7 +19,7 @@ hudi-hadoop-docker org.apache.hudi -0.5.2-incubating-rc2 +0.5.2-incubating 4.0.0 pom diff --git a/docker/hoodie/hadoop/sparkadhoc/pom.xml b/docker/hoodie/hadoop/sparkadhoc/pom.xml index 313d2f1..57d45d0 100644 --- a/docker/hoodie/hadoop/sparkadhoc/pom.xml +++ b/docker
svn commit: r38617 - in /release/incubator/hudi/hudi-0.5.2-incubating: ./ hudi-0.5.2-incubating.src.tgz hudi-0.5.2-incubating.src.tgz.asc hudi-0.5.2-incubating.src.tgz.sha512
Author: vinoyang
Date: Wed Mar 25 09:30:11 2020
New Revision: 38617

Log:
Upload source code for hudi-0.5.2-incubating

Added:
    release/incubator/hudi/hudi-0.5.2-incubating/
    release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz   (with props)
    release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.asc
    release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.sha512

Added: release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz
==============================================================================
Binary file - no diff available.

Propchange: release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Added: release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.asc
==============================================================================
--- release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.asc (added)
+++ release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.asc Wed Mar 25 09:30:11 2020
@@ -0,0 +1,11 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQEzBAABCAAdFiEEw6lux3FJVxron4J2TIZoTQR94DwFAl57IE8ACgkQTIZoTQR9
+4DyU1AgAjZDLXw3d8Y+S+QUGiDjclQOeA+p73Mc1Osr5gZbRYG8XJDw2RY3rtc4a
+cDzyh/Or8a8jYdqMYaEBzWWLkDVT3L3ka7XSJewCqMSAp2TNxyRv+HV949ow8n29
+JgrQonu7aA4//WEGBOd9d0xxC9Q+rPUFTa+mxcj4B9o6UPDbve80beBek+9hDEjS
+eHOhd8WKR4HXcaDybefZxblSnk9PmQk4p6lJ21nJRk0kuDEn7sBosne7KDTFzrzc
+x2o8mzckDYDaFTGHijf/x6EkrJnShCx8mRtGoeptETkQRAkUjl2kyXKGIZuFpidl
+90EFAoHNTEUWfw7kQxBHRANL7zOs6Q==
+=npq5
+-----END PGP SIGNATURE-----

Added: release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.sha512
==============================================================================
--- release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.sha512 (added)
+++ release/incubator/hudi/hudi-0.5.2-incubating/hudi-0.5.2-incubating.src.tgz.sha512 Wed Mar 25 09:30:11 2020
@@ -0,0 +1 @@
+fda1b268c22510579eab3541643e99ef7705f18ddf49b96cc23d332008e5fdfb5131929b58a7313119af9cc61d77d635229188d3f076da5fed835b63bbcc4de3  hudi-0.5.2-incubating.src.tgz
svn commit: r38618 - in /release/incubator/hudi: 0.5.2-incubating/ hudi-0.5.2-incubating/
Author: vinoyang
Date: Wed Mar 25 09:39:42 2020
New Revision: 38618

Log:
rename hudi-0.5.2-incubating to 0.5.2-incubating

Added:
    release/incubator/hudi/0.5.2-incubating/
      - copied from r38617, release/incubator/hudi/hudi-0.5.2-incubating/
Removed:
    release/incubator/hudi/hudi-0.5.2-incubating/
[incubator-hudi] branch master updated: [HUDI-814] Migrate hudi-client tests to JUnit 5 (#1570)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 69b1630 [HUDI-814] Migrate hudi-client tests to JUnit 5 (#1570) 69b1630 is described below commit 69b16309c8c46f831c8b9be42de8b2e29c74f03e Author: Raymond Xu <2701446+xushi...@users.noreply.github.com> AuthorDate: Tue Apr 28 22:57:28 2020 -0700 [HUDI-814] Migrate hudi-client tests to JUnit 5 (#1570) --- .../hudi/common/config/TestHoodieWriteConfig.java | 8 ++-- .../bloom/TestBucketizedBloomCheckPartitioner.java | 24 +- .../hudi/index/bloom/TestKeyRangeLookupTree.java | 4 +- .../strategy/TestHoodieCompactionStrategy.java | 54 +++--- 4 files changed, 44 insertions(+), 46 deletions(-) diff --git a/hudi-client/src/test/java/org/apache/hudi/common/config/TestHoodieWriteConfig.java b/hudi-client/src/test/java/org/apache/hudi/common/config/TestHoodieWriteConfig.java index 3516a6a..a1904a5 100644 --- a/hudi-client/src/test/java/org/apache/hudi/common/config/TestHoodieWriteConfig.java +++ b/hudi-client/src/test/java/org/apache/hudi/common/config/TestHoodieWriteConfig.java @@ -22,7 +22,7 @@ import org.apache.hudi.config.HoodieCompactionConfig; import org.apache.hudi.config.HoodieWriteConfig; import org.apache.hudi.config.HoodieWriteConfig.Builder; -import org.junit.Test; +import org.junit.jupiter.api.Test; import java.io.ByteArrayInputStream; import java.io.ByteArrayOutputStream; @@ -32,7 +32,7 @@ import java.util.HashMap; import java.util.Map; import java.util.Properties; -import static org.junit.Assert.assertEquals; +import static org.junit.jupiter.api.Assertions.assertEquals; public class TestHoodieWriteConfig { @@ -52,8 +52,8 @@ public class TestHoodieWriteConfig { inputStream.close(); } HoodieWriteConfig config = builder.build(); -assertEquals(config.getMaxCommitsToKeep(), 5); -assertEquals(config.getMinCommitsToKeep(), 
2); +assertEquals(5, config.getMaxCommitsToKeep()); +assertEquals(2, config.getMinCommitsToKeep()); } private ByteArrayOutputStream saveParamsIntoOutputStream(Map params) throws IOException { diff --git a/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestBucketizedBloomCheckPartitioner.java b/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestBucketizedBloomCheckPartitioner.java index 3ad5a99..e946450 100644 --- a/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestBucketizedBloomCheckPartitioner.java +++ b/hudi-client/src/test/java/org/apache/hudi/index/bloom/TestBucketizedBloomCheckPartitioner.java @@ -20,7 +20,7 @@ package org.apache.hudi.index.bloom; import org.apache.hudi.common.util.collection.Pair; -import org.junit.Test; +import org.junit.jupiter.api.Test; import java.util.HashMap; import java.util.List; @@ -28,9 +28,9 @@ import java.util.Map; import java.util.stream.Collectors; import java.util.stream.IntStream; -import static org.junit.Assert.assertArrayEquals; -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertTrue; +import static org.junit.jupiter.api.Assertions.assertArrayEquals; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertTrue; public class TestBucketizedBloomCheckPartitioner { @@ -45,12 +45,12 @@ public class TestBucketizedBloomCheckPartitioner { }; BucketizedBloomCheckPartitioner p = new BucketizedBloomCheckPartitioner(4, fileToComparisons, 10); Map> assignments = p.getFileGroupToPartitions(); -assertEquals("f1 should have 4 buckets", 4, assignments.get("f1").size()); -assertEquals("f2 should have 4 buckets", 4, assignments.get("f2").size()); -assertEquals("f3 should have 2 buckets", 2, assignments.get("f3").size()); -assertArrayEquals("f1 spread across 3 partitions", new Integer[] {0, 0, 1, 3}, assignments.get("f1").toArray()); -assertArrayEquals("f2 spread across 3 partitions", new Integer[] {1, 2, 2, 0}, 
assignments.get("f2").toArray()); -assertArrayEquals("f3 spread across 2 partitions", new Integer[] {3, 1}, assignments.get("f3").toArray()); +assertEquals(4, assignments.get("f1").size(), "f1 should have 4 buckets"); +assertEquals(4, assignments.get("f2").size(), "f2 should have 4 buckets"); +assertEquals(2, assignments.get("f3").size(), "f3 should have 2 buckets"); +assertArrayEquals(new Integer[] {0, 0, 1, 3}, assignments.get("f1").toArray(), "f1 spread across 3 partitions"); +assertArrayEquals(new Integer[] {1, 2, 2, 0}, assignments.get("f2").toArray(), "f2 s
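The bulk of this migration is mechanical: `org.junit.Assert.assertEquals(message, expected, actual)` becomes `org.junit.jupiter.api.Assertions.assertEquals(expected, actual, message)`, i.e. the failure message moves from the first parameter to the last. A minimal sketch of the two call shapes, using hypothetical stand-in methods rather than the real JUnit classes so it runs without any test dependency:

```java
public class AssertOrderDemo {

    // JUnit 4 shape: assertEquals(String message, Object expected, Object actual)
    static void junit4Style(String message, Object expected, Object actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError(message + ": expected " + expected + " but was " + actual);
        }
    }

    // JUnit 5 shape: assertEquals(Object expected, Object actual, String message)
    static void junit5Style(Object expected, Object actual, String message) {
        if (!expected.equals(actual)) {
            throw new AssertionError(message + ": expected " + expected + " but was " + actual);
        }
    }

    public static void main(String[] args) {
        // Same assertion written both ways; only the parameter order differs.
        junit4Style("f1 should have 4 buckets", 4, 4);
        junit5Style(4, 4, "f1 should have 4 buckets");
        System.out.println("both call shapes passed");
    }
}
```

One reason the Jupiter API put the message last is that it can then also be supplied lazily as a `Supplier<String>`, so expensive message construction only happens on failure.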
[incubator-hudi] branch hudi_test_suite_refactor updated (908e57c -> 7313a22)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 from 908e57c  [HUDI-397]Normalize log print statement (#1224)
 add  7313a22  trigger rebuild

No new revisions were added by this update.

Summary of changes:
[incubator-hudi] branch master updated: [HUDI-809] Migrate CommonTestHarness to JUnit 5 (#1530)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 6e15eeb [HUDI-809] Migrate CommonTestHarness to JUnit 5 (#1530) 6e15eeb is described below commit 6e15eebd81da41b1076179a8ddcedcf07d1c9043 Author: Raymond Xu <2701446+xushi...@users.noreply.github.com> AuthorDate: Tue Apr 21 23:10:25 2020 -0700 [HUDI-809] Migrate CommonTestHarness to JUnit 5 (#1530) --- .../common/table/TestHoodieTableMetaClient.java| 54 ++-- .../hudi/common/table/TestTimelineLayout.java | 24 +- .../table/view/TestHoodieTableFileSystemView.java | 335 ++--- .../table/view/TestRocksDbBasedFileSystemView.java | 4 +- .../testutils/HoodieCommonTestHarnessJunit5.java | 52 .../apache/hudi/common/util/TestFileIOUtils.java | 20 +- hudi-hadoop-mr/pom.xml | 6 - .../apache/hudi/hadoop/InputFormatTestUtil.java| 67 ++--- .../hudi/hadoop/TestHoodieParquetInputFormat.java | 81 +++-- .../hudi/hadoop/TestHoodieROTablePathFilter.java | 26 +- .../realtime/TestHoodieCombineHiveInputFormat.java | 52 ++-- .../realtime/TestHoodieRealtimeRecordReader.java | 127 .../hudi/utilities/TestHoodieSnapshotCopier.java | 22 +- .../TestKafkaConnectHdfsProvider.java | 20 +- 14 files changed, 460 insertions(+), 430 deletions(-) diff --git a/hudi-common/src/test/java/org/apache/hudi/common/table/TestHoodieTableMetaClient.java b/hudi-common/src/test/java/org/apache/hudi/common/table/TestHoodieTableMetaClient.java index e1279d1..5e307bd 100644 --- a/hudi-common/src/test/java/org/apache/hudi/common/table/TestHoodieTableMetaClient.java +++ b/hudi-common/src/test/java/org/apache/hudi/common/table/TestHoodieTableMetaClient.java @@ -18,41 +18,41 @@ package org.apache.hudi.common.table; -import org.apache.hudi.common.HoodieCommonTestHarness; import org.apache.hudi.common.model.HoodieTestUtils; import 
org.apache.hudi.common.table.timeline.HoodieActiveTimeline; import org.apache.hudi.common.table.timeline.HoodieInstant; import org.apache.hudi.common.table.timeline.HoodieTimeline; +import org.apache.hudi.common.testutils.HoodieCommonTestHarnessJunit5; import org.apache.hudi.common.util.Option; -import org.junit.Before; -import org.junit.Test; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; import java.io.IOException; -import static org.junit.Assert.assertArrayEquals; -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertFalse; -import static org.junit.Assert.assertNotEquals; -import static org.junit.Assert.assertNotNull; -import static org.junit.Assert.assertTrue; +import static org.junit.jupiter.api.Assertions.assertArrayEquals; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNotEquals; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertTrue; + /** * Tests hoodie table meta client {@link HoodieTableMetaClient}. 
*/ -public class TestHoodieTableMetaClient extends HoodieCommonTestHarness { +public class TestHoodieTableMetaClient extends HoodieCommonTestHarnessJunit5 { - @Before + @BeforeEach public void init() throws IOException { initMetaClient(); } @Test public void checkMetadata() { -assertEquals("Table name should be raw_trips", HoodieTestUtils.RAW_TRIPS_TEST_NAME, -metaClient.getTableConfig().getTableName()); -assertEquals("Basepath should be the one assigned", basePath, metaClient.getBasePath()); -assertEquals("Metapath should be ${basepath}/.hoodie", basePath + "/.hoodie", metaClient.getMetaPath()); +assertEquals(HoodieTestUtils.RAW_TRIPS_TEST_NAME, metaClient.getTableConfig().getTableName(), "Table name should be raw_trips"); +assertEquals(basePath, metaClient.getBasePath(), "Basepath should be the one assigned"); +assertEquals(basePath + "/.hoodie", metaClient.getMetaPath(), "Metapath should be ${basepath}/.hoodie"); } @Test @@ -67,16 +67,15 @@ public class TestHoodieTableMetaClient extends HoodieCommonTestHarness { commitTimeline.saveAsComplete(instant, Option.of("test-detail".getBytes())); commitTimeline = commitTimeline.reload(); HoodieInstant completedInstant = HoodieTimeline.getCompletedInstant(instant); -assertEquals("Commit should be 1 and completed", completedInstant, commitTimeline.getInstants().findFirst().get()); -assertArrayEquals("Commit value should be \"test-detail\"", "test-detail".getBytes(), -commitTimeline.
[incubator-hudi] branch hudi_test_suite_refactor updated (e7b1474 -> 908e57c)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 discard e7b1474  [HUDI-397]Normalize log print statement (#1224)
 omit    da3232e  Testing running 3 builds to limit total build time
 omit    c13e885  [HUDI-394] Provide a basic implementation of test suite
 add     ddd105b  [HUDI-772] Make UserDefinedBulkInsertPartitioner configurable for DataSource (#1500)
 add     2a2f31d  [MINOR] Remove reduntant code and fix typo in HoodieDefaultTimeline (#1535)
 add     332072b  [HUDI-371] Supporting hive combine input format for realtime tables (#1503)
 add     84dd904  [HUDI-789]Adjust logic of upsert in HDFSParquetImporter (#1511)
 add     62bd3e7  [HUDI-757] Added hudi-cli command to export metadata of Instants.
 add     2a56f82  [HUDI-821] Fixing JCommander param parsing in deltastreamer (#1525)
 add     6e15eeb  [HUDI-809] Migrate CommonTestHarness to JUnit 5 (#1530)
 add     26684f5  [HUDI-816] Fixed MAX_MEMORY_FOR_MERGE_PROP and MAX_MEMORY_FOR_COMPACTION_PROP do not work due to HUDI-678 (#1536)
 add     aea7c16  [HUDI-795] Handle auto-deleted empty aux folder (#1515)
 add     19cc15c  [MINOR]: Fix cli docs for DeltaStreamer (#1547)
 add     0c75316  [HUDI-394] Provide a basic implementation of test suite
 add     7ab93b0  Testing running 3 builds to limit total build time
 add     908e57c  [HUDI-397]Normalize log print statement (#1224)

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O (e7b1474)
            \
             N -- N -- N refs/heads/hudi_test_suite_refactor (908e57c)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B.
Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. No new revisions were added by this update. Summary of changes: .../apache/hudi/cli/commands/ExportCommand.java| 231 .../apache/hudi/client/utils/SparkConfigUtils.java | 10 +- .../org/apache/hudi/config/HoodieWriteConfig.java | 10 + .../apache/hudi/table/HoodieCommitArchiveLog.java | 24 +- .../hudi/client/utils/TestSparkConfigUtils.java| 65 +++ .../hudi/common/HoodieMergeOnReadTestUtils.java| 4 +- .../java/org/apache/hudi/avro/HoodieAvroUtils.java | 29 + .../table/timeline/HoodieDefaultTimeline.java | 6 +- .../table/timeline/TimelineMetadataUtils.java | 4 + .../hudi/common/util/collection/ArrayUtils.java| 62 ++ .../common/table/TestHoodieTableMetaClient.java| 54 +- .../hudi/common/table/TestTimelineLayout.java | 24 +- .../table/view/TestHoodieTableFileSystemView.java | 335 ++- .../table/view/TestRocksDbBasedFileSystemView.java | 4 +- .../HoodieCommonTestHarnessJunit5.java}| 33 +- .../apache/hudi/common/util/TestFileIOUtils.java | 20 +- hudi-hadoop-mr/pom.xml | 8 +- .../hadoop/hive/HoodieCombineHiveInputFormat.java | 626 - .../hive/HoodieCombineRealtimeFileSplit.java | 169 ++ .../hive/HoodieCombineRealtimeHiveSplit.java | 27 +- .../realtime/AbstractRealtimeRecordReader.java | 3 + .../HoodieCombineRealtimeRecordReader.java | 103 .../realtime/HoodieParquetRealtimeInputFormat.java | 2 +- .../realtime/HoodieRealtimeRecordReader.java | 1 + .../realtime/RealtimeUnmergedRecordReader.java | 22 +- .../apache/hudi/hadoop/InputFormatTestUtil.java| 165 -- .../hudi/hadoop/TestHoodieParquetInputFormat.java | 99 ++-- .../hudi/hadoop/TestHoodieROTablePathFilter.java | 26 +- .../realtime/TestHoodieCombineHiveInputFormat.java | 156 + .../realtime/TestHoodieRealtimeRecordReader.java | 206 +++ .../main/java/org/apache/hudi/DataSourceUtils.java | 27 +- hudi-spark/src/test/java/DataSourceTestUtils.java | 13 + hudi-spark/src/test/java/DataSourceUtilsTest.java | 
86 +++ .../apache/hudi/utilities/HDFSParquetImporter.java | 22 +- .../deltastreamer/HoodieDeltaStreamer.java | 18 +- .../HoodieMultiTableDeltaStreamer.java | 5 +- .../hudi/utilities/TestHDFSParquetImporter.java| 255 +++-- .../hudi/utilities/TestHoodieSnapshotCopier.java | 22 +- .../TestKafkaConnectHdfsProvider.java | 20 +- 39 files changed, 2125 insertions(+), 871 deletions(-) create mode 100644 hudi-cli/src/main/java/org/apache/hudi/cli/commands/ExportCommand.java create mode 100644 hudi-client/src/test/java/org/apache/hudi/client/utils/TestSparkConfigUtils.java
[incubator-hudi] branch hudi_test_suite_refactor updated (da3232e -> e7b1474)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 from da3232e  Testing running 3 builds to limit total build time
 add  e7b1474  [HUDI-397]Normalize log print statement (#1224)

No new revisions were added by this update.

Summary of changes:
 .../apache/hudi/testsuite/dag/nodes/BulkInsertNode.java  | 2 +-
 .../org/apache/hudi/testsuite/dag/nodes/CleanNode.java   | 2 +-
 .../org/apache/hudi/testsuite/dag/nodes/CompactNode.java | 2 +-
 .../org/apache/hudi/testsuite/dag/nodes/DagNode.java     | 6 +++---
 .../apache/hudi/testsuite/dag/nodes/HiveQueryNode.java   | 6 +++---
 .../apache/hudi/testsuite/dag/nodes/HiveSyncNode.java    | 2 +-
 .../org/apache/hudi/testsuite/dag/nodes/InsertNode.java  | 6 +++---
 .../apache/hudi/testsuite/dag/nodes/RollbackNode.java    | 4 ++--
 .../hudi/testsuite/dag/nodes/ScheduleCompactNode.java    | 4 ++--
 .../hudi/testsuite/dag/nodes/SparkSQLQueryNode.java      | 4 ++--
 .../org/apache/hudi/testsuite/dag/nodes/UpsertNode.java  | 4 ++--
 .../hudi/testsuite/dag/scheduler/DagScheduler.java       | 12 ++--
 .../apache/hudi/testsuite/generator/DeltaGenerator.java  | 6 +++---
 .../generator/GenericRecordFullPayloadGenerator.java     | 10 +-
 .../apache/hudi/testsuite/job/HoodieTestSuiteJob.java    | 10 +-
 .../testsuite/reader/DFSHoodieDatasetInputReader.java    | 16
 .../hudi/testsuite/writer/AvroDeltaInputWriter.java      | 7 ---
 .../reader/TestDFSHoodieDatasetInputReader.java          | 1 +
 18 files changed, 53 insertions(+), 51 deletions(-)
[incubator-hudi] branch master updated: [HUDI-813] Migrate hudi-utilities tests to JUnit 5 (#1589)
This is an automated email from the ASF dual-hosted git repository. vinoyang pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git The following commit(s) were added to refs/heads/master by this push: new 096f7f5 [HUDI-813] Migrate hudi-utilities tests to JUnit 5 (#1589) 096f7f5 is described below commit 096f7f55b2553265c0b72f42a1eb7f291e5626ad Author: Raymond Xu <2701446+xushi...@users.noreply.github.com> AuthorDate: Sun May 3 21:43:42 2020 -0700 [HUDI-813] Migrate hudi-utilities tests to JUnit 5 (#1589) --- .../TestAWSDatabaseMigrationServiceSource.java | 16 ++-- .../hudi/utilities/TestHDFSParquetImporter.java| 33 .../hudi/utilities/TestHiveIncrementalPuller.java | 15 ++-- .../hudi/utilities/TestHoodieDeltaStreamer.java| 88 ++ .../TestHoodieMultiTableDeltaStreamer.java | 78 +-- .../utilities/TestJdbcbasedSchemaProvider.java | 12 +-- .../hudi/utilities/TestSchedulerConfGenerator.java | 16 ++-- .../utilities/TestTimestampBasedKeyGenerator.java | 16 ++-- .../org/apache/hudi/utilities/TestUtilHelpers.java | 48 ++-- .../apache/hudi/utilities/UtilitiesTestBase.java | 16 ++-- .../utilities/inline/fs/TestParquetInLining.java | 10 +-- .../sources/AbstractDFSSourceTestBase.java | 22 +++--- .../hudi/utilities/sources/TestCsvDFSSource.java | 4 +- .../hudi/utilities/sources/TestJsonDFSSource.java | 4 +- .../hudi/utilities/sources/TestKafkaSource.java| 20 ++--- .../utilities/sources/TestParquetDFSSource.java| 4 +- .../transform/TestFlatteningTransformer.java | 4 +- 17 files changed, 192 insertions(+), 214 deletions(-) diff --git a/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestAWSDatabaseMigrationServiceSource.java b/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestAWSDatabaseMigrationServiceSource.java index d015a42..1fb45f0 100644 --- a/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestAWSDatabaseMigrationServiceSource.java +++ 
b/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestAWSDatabaseMigrationServiceSource.java @@ -28,29 +28,29 @@ import org.apache.spark.api.java.JavaSparkContext; import org.apache.spark.sql.Dataset; import org.apache.spark.sql.Row; import org.apache.spark.sql.SparkSession; -import org.junit.AfterClass; -import org.junit.BeforeClass; -import org.junit.Test; +import org.junit.jupiter.api.AfterAll; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.Test; import java.io.IOException; import java.io.Serializable; import java.util.Arrays; -import static org.junit.Assert.assertFalse; -import static org.junit.Assert.assertTrue; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertTrue; public class TestAWSDatabaseMigrationServiceSource { private static JavaSparkContext jsc; private static SparkSession spark; - @BeforeClass + @BeforeAll public static void setupTest() { jsc = UtilHelpers.buildSparkContext("aws-dms-test", "local[2]"); spark = SparkSession.builder().config(jsc.getConf()).getOrCreate(); } - @AfterClass + @AfterAll public static void tearDownTest() { if (jsc != null) { jsc.stop(); @@ -99,7 +99,7 @@ public class TestAWSDatabaseMigrationServiceSource { new Record("2", 3433L)), Record.class); Dataset outputFrame = transformer.apply(jsc, spark, inputFrame, null); -assertTrue(Arrays.asList(outputFrame.schema().fields()).stream() +assertTrue(Arrays.stream(outputFrame.schema().fields()) .map(f -> f.name()).anyMatch(n -> n.equals(AWSDmsAvroPayload.OP_FIELD))); assertTrue(outputFrame.select(AWSDmsAvroPayload.OP_FIELD).collectAsList().stream() .allMatch(r -> r.getString(0).equals(""))); diff --git a/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java b/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java index a4711b5..cf6cf75 100644 --- a/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java 
+++ b/hudi-utilities/src/test/java/org/apache/hudi/utilities/TestHDFSParquetImporter.java @@ -41,12 +41,11 @@ import org.apache.spark.api.java.JavaSparkContext; import org.apache.spark.sql.Dataset; import org.apache.spark.sql.Row; import org.apache.spark.sql.SQLContext; - -import org.junit.After; -import org.junit.AfterClass; -import org.junit.Before; -import org.junit.BeforeClass; -import org.junit.Test; +import org.junit.jupiter.api.AfterAll; +import org.junit.jupiter.api.AfterEach; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; import java.io.IOException; import java.io.Serializable; @@ -61,8 +60,8 @@ import java.util.concurrent.TimeUnit; im
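Besides the annotation renames (`@Before` → `@BeforeEach`, `@BeforeClass` → `@BeforeAll`, and so on), this diff also carries a small non-mechanical cleanup: `Arrays.asList(x).stream()` is replaced by `Arrays.stream(x)`, which streams the array directly instead of wrapping it in a `List` first. A small illustration (the field names below are made up for the example):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class ArrayStreamDemo {
    public static void main(String[] args) {
        String[] fields = {"Op", "id", "ts"}; // hypothetical schema field names

        // Old style: wrap the array in a List, then stream the List.
        List<String> viaList = Arrays.asList(fields).stream()
                .map(String::toLowerCase)
                .collect(Collectors.toList());

        // New style: stream the array directly -- same result, one fewer wrapper.
        List<String> direct = Arrays.stream(fields)
                .map(String::toLowerCase)
                .collect(Collectors.toList());

        System.out.println(viaList.equals(direct)); // prints "true"
    }
}
```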
[incubator-hudi] branch hudi_test_suite_refactor updated (7db66af -> 33590b7)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 from 7db66af  [HUDI-394] Provide a basic implementation of test suite
 add  33590b7  [MINOR] Code cleanup for DeltaConfig

No new revisions were added by this update.

Summary of changes:
 .../hudi/testsuite/configuration/DeltaConfig.java | 23 +++---
 1 file changed, 11 insertions(+), 12 deletions(-)
[incubator-hudi] branch master updated (404c7e8 -> 32ea4c7)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

 from 404c7e8  [HUDI-884] Shade avro and parquet-avro in hudi-hive-sync-bundle (#1618)
 add  32ea4c7  [HUDI-869] Add support for alluxio (#1608)

No new revisions were added by this update.

Summary of changes:
 .../src/main/java/org/apache/hudi/common/fs/StorageSchemes.java      | 4 +++-
 .../test/java/org/apache/hudi/common/storage/TestStorageSchemes.java | 1 +
 2 files changed, 4 insertions(+), 1 deletion(-)
[incubator-hudi] branch master updated: [HUDI-705] Add unit test for RollbacksCommand (#1611)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 57132f7  [HUDI-705] Add unit test for RollbacksCommand (#1611)
57132f7 is described below

commit 57132f79bb2dad6cfb215480b435a778714a442d
Author: hongdd
AuthorDate: Mon May 18 14:04:06 2020 +0800

    [HUDI-705] Add unit test for RollbacksCommand (#1611)
---
 .../apache/hudi/cli/HoodieTableHeaderFields.java   |  10 ++
 .../apache/hudi/cli/commands/RollbacksCommand.java |  19 ++-
 .../hudi/cli/commands/TestRollbacksCommand.java    | 182 +
 3 files changed, 204 insertions(+), 7 deletions(-)

diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieTableHeaderFields.java b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieTableHeaderFields.java
index 5e31e5c..4fc41a1 100644
--- a/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieTableHeaderFields.java
+++ b/hudi-cli/src/main/java/org/apache/hudi/cli/HoodieTableHeaderFields.java
@@ -23,6 +23,7 @@ package org.apache.hudi.cli;
  */
 public class HoodieTableHeaderFields {
   public static final String HEADER_PARTITION = "Partition";
+  public static final String HEADER_INSTANT = "Instant";
   public static final String HEADER_PARTITION_PATH = HEADER_PARTITION + " Path";
   public static final String HEADER_FILE_ID = "FileId";
   public static final String HEADER_BASE_INSTANT = "Base-Instant";
@@ -81,4 +82,13 @@ public class HoodieTableHeaderFields {
   public static final String HEADER_HOODIE_PROPERTY = "Property";
   public static final String HEADER_OLD_VALUE = "Old Value";
   public static final String HEADER_NEW_VALUE = "New Value";
+
+  /**
+   * Fields of Rollback.
+   */
+  public static final String HEADER_ROLLBACK_INSTANT = "Rolledback " + HEADER_INSTANT;
+  public static final String HEADER_TIME_TOKEN_MILLIS = "Time taken in millis";
+  public static final String HEADER_TOTAL_PARTITIONS = "Total Partitions";
+  public static final String HEADER_DELETED_FILE = "Deleted File";
+  public static final String HEADER_SUCCEEDED = "Succeeded";
 }

diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java
index 70b34bc..4feb4c1 100644
--- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java
+++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/RollbacksCommand.java
@@ -21,6 +21,7 @@ package org.apache.hudi.cli.commands;
 import org.apache.hudi.avro.model.HoodieRollbackMetadata;
 import org.apache.hudi.cli.HoodieCLI;
 import org.apache.hudi.cli.HoodiePrintHelper;
+import org.apache.hudi.cli.HoodieTableHeaderFields;
 import org.apache.hudi.cli.TableHeader;
 import org.apache.hudi.common.table.HoodieTableMetaClient;
 import org.apache.hudi.common.table.timeline.HoodieActiveTimeline;
@@ -56,8 +57,7 @@ public class RollbacksCommand implements CommandMarker {
       @CliOption(key = {"sortBy"}, help = "Sorting Field", unspecifiedDefaultValue = "") final String sortByField,
       @CliOption(key = {"desc"}, help = "Ordering", unspecifiedDefaultValue = "false") final boolean descending,
       @CliOption(key = {"headeronly"}, help = "Print Header Only",
-          unspecifiedDefaultValue = "false") final boolean headerOnly)
-      throws IOException {
+          unspecifiedDefaultValue = "false") final boolean headerOnly) {
     HoodieActiveTimeline activeTimeline = new RollbackTimeline(HoodieCLI.getTableMetaClient());
     HoodieTimeline rollback = activeTimeline.getRollbackTimeline().filterCompletedInstants();
@@ -79,9 +79,11 @@ public class RollbacksCommand implements CommandMarker {
         e.printStackTrace();
       }
     });
-    TableHeader header = new TableHeader().addTableHeaderField("Instant").addTableHeaderField("Rolledback Instant")
-        .addTableHeaderField("Total Files Deleted").addTableHeaderField("Time taken in millis")
-        .addTableHeaderField("Total Partitions");
+    TableHeader header = new TableHeader().addTableHeaderField(HoodieTableHeaderFields.HEADER_INSTANT)
+        .addTableHeaderField(HoodieTableHeaderFields.HEADER_ROLLBACK_INSTANT)
+        .addTableHeaderField(HoodieTableHeaderFields.HEADER_TOTAL_FILES_DELETED)
+        .addTableHeaderField(HoodieTableHeaderFields.HEADER_TIME_TOKEN_MILLIS)
+        .addTableHeaderField(HoodieTableHeaderFields.HEADER_TOTAL_PARTITIONS);
     return HoodiePrintHelper.print(header, new HashMap<>(), sortByField, descending, limit, headerOnly, rows);
   }
@@ -112,8 +114,11 @@ public class
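The commit above replaces inline column-name literals in RollbacksCommand with shared constants from HoodieTableHeaderFields, so the command and its new unit test reference the same strings. A minimal, self-contained sketch of that pattern (the classes below are simplified stand-ins, not the real Hudi sources):

```java
// Sketch of the constant-extraction pattern in this commit: column names move
// out of the command into shared constants. Simplified stand-ins, not the
// actual org.apache.hudi.cli classes.
import java.util.ArrayList;
import java.util.List;

class HeaderFields {
    static final String HEADER_INSTANT = "Instant";
    static final String HEADER_ROLLBACK_INSTANT = "Rolledback " + HEADER_INSTANT;
    static final String HEADER_TIME_TOKEN_MILLIS = "Time taken in millis";
}

class TableHeader {
    private final List<String> fields = new ArrayList<>();

    // Fluent builder, mirroring addTableHeaderField in the diff above.
    TableHeader addTableHeaderField(String field) {
        fields.add(field);
        return this;
    }

    List<String> getFields() {
        return fields;
    }
}

public class RollbackHeaderSketch {
    public static void main(String[] args) {
        TableHeader header = new TableHeader()
                .addTableHeaderField(HeaderFields.HEADER_INSTANT)
                .addTableHeaderField(HeaderFields.HEADER_ROLLBACK_INSTANT)
                .addTableHeaderField(HeaderFields.HEADER_TIME_TOKEN_MILLIS);
        System.out.println(header.getFields()); // [Instant, Rolledback Instant, Time taken in millis]
    }
}
```

With the strings centralized, a test can assert on the same constants instead of duplicating literals, which is what makes the TestRollbacksCommand added here robust to wording changes.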
[incubator-hudi] branch hudi_test_suite_refactor updated (33590b7 -> 6f4547d)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch hudi_test_suite_refactor
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from 33590b7  [MINOR] Code cleanup for DeltaConfig
     add 6f4547d  [MINOR] Code cleanup for dag package

No new revisions were added by this update.

Summary of changes:
 .../src/main/java/org/apache/hudi/testsuite/dag/DagUtils.java       | 6 +++---
 .../src/main/java/org/apache/hudi/testsuite/dag/nodes/DagNode.java  | 6 +++---
 .../java/org/apache/hudi/testsuite/dag/nodes/SparkSQLQueryNode.java | 2 +-
 3 files changed, 7 insertions(+), 7 deletions(-)
[incubator-hudi] branch master updated: [HUDI-701] Add unit test for HDFSParquetImportCommand (#1574)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 3a2fe13  [HUDI-701] Add unit test for HDFSParquetImportCommand (#1574)
3a2fe13 is described below

commit 3a2fe13fcb7c168f8ff023e3bdb6ae482b400316
Author: hongdd
AuthorDate: Thu May 14 19:15:49 2020 +0800

    [HUDI-701] Add unit test for HDFSParquetImportCommand (#1574)
---
 hudi-cli/pom.xml                                   |   7 +
 .../cli/commands/HDFSParquetImportCommand.java     |   8 +-
 .../org/apache/hudi/cli/commands/SparkMain.java    |  19 +--
 .../cli/integ/ITTestHDFSParquetImportCommand.java  | 186 +
 .../functional/TestHDFSParquetImporter.java        |   8 +-
 5 files changed, 209 insertions(+), 19 deletions(-)

diff --git a/hudi-cli/pom.xml b/hudi-cli/pom.xml
index fed2bf9..dbb4463 100644
--- a/hudi-cli/pom.xml
+++ b/hudi-cli/pom.xml
@@ -194,6 +194,13 @@
       test
       test-jar
+
+      org.apache.hudi
+      hudi-utilities_${scala.binary.version}
+      ${project.version}
+      test
+      test-jar
+

diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HDFSParquetImportCommand.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HDFSParquetImportCommand.java
index 0f1db50..a31f310 100644
--- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HDFSParquetImportCommand.java
+++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/HDFSParquetImportCommand.java
@@ -18,7 +18,6 @@
 package org.apache.hudi.cli.commands;

-import org.apache.hudi.cli.HoodieCLI;
 import org.apache.hudi.cli.commands.SparkMain.SparkCommand;
 import org.apache.hudi.cli.utils.InputStreamConsumer;
 import org.apache.hudi.cli.utils.SparkUtil;
@@ -57,6 +56,7 @@ public class HDFSParquetImportCommand implements CommandMarker {
       @CliOption(key = "schemaFilePath", mandatory = true, help = "Path for Avro schema file") final String schemaFilePath,
       @CliOption(key = "format", mandatory = true, help = "Format for the input data") final String format,
+      @CliOption(key = "sparkMaster", unspecifiedDefaultValue = "", help = "Spark Master") String master,
       @CliOption(key = "sparkMemory", mandatory = true, help = "Spark executor memory") final String sparkMemory,
       @CliOption(key = "retry", mandatory = true, help = "Number of retries") final String retry,
       @CliOption(key = "propsFilePath", help = "path to properties file on localfs or dfs with configurations for hoodie client for importing",
@@ -66,8 +66,6 @@ public class HDFSParquetImportCommand implements CommandMarker {
     (new FormatValidator()).validate("format", format);

-    boolean initialized = HoodieCLI.initConf();
-    HoodieCLI.initFS(initialized);
     String sparkPropertiesPath =
         Utils.getDefaultPropertiesFile(JavaConverters.mapAsScalaMapConverter(System.getenv()).asScala());
@@ -78,8 +76,8 @@ public class HDFSParquetImportCommand implements CommandMarker {
       cmd = SparkCommand.UPSERT.toString();
     }

-    sparkLauncher.addAppArgs(cmd, srcPath, targetPath, tableName, tableType, rowKeyField, partitionPathField,
-        parallelism, schemaFilePath, sparkMemory, retry, propsFilePath);
+    sparkLauncher.addAppArgs(cmd, master, sparkMemory, srcPath, targetPath, tableName, tableType, rowKeyField,
+        partitionPathField, parallelism, schemaFilePath, retry, propsFilePath);
     UtilHelpers.validateAndAddProperties(configs, sparkLauncher);
     Process process = sparkLauncher.launch();
     InputStreamConsumer.captureOutput(process);

diff --git a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
index 5d8972d..be9d7dd 100644
--- a/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
+++ b/hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
@@ -82,17 +82,17 @@ public class SparkMain {
         break;
       case IMPORT:
       case UPSERT:
-        assert (args.length >= 12);
+        assert (args.length >= 13);
         String propsFilePath = null;
-        if (!StringUtils.isNullOrEmpty(args[11])) {
-          propsFilePath = args[11];
+        if (!StringUtils.isNullOrEmpty(args[12])) {
+          propsFilePath = args[12];
         }
         List configs = new ArrayList<>();
-        if (args.length > 12) {
-          configs.addAll(Arrays.asList(args).subList(12, args.length));
+        if (args.length > 13) {
+          configs.addAll(Arrays.asList(args).subList(13, args.length));
         }
-        returnCode = dataLoad(jsc, command, args[1], args[2], args[3], args[4], args[5
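The SparkMain hunk above shows why adding the new sparkMaster option is invasive: arguments are passed positionally, so every index after the insertion point shifts by one (propsFilePath moves from args[11] to args[12], and the extra-config tail from index 12 to 13). A simplified sketch of the post-change layout (names mirror the diff; this is an illustration, not the actual SparkMain code):

```java
// Sketch of the positional-argument shift in this commit. Post-change layout:
// cmd, master, sparkMemory, srcPath, targetPath, tableName, tableType,
// rowKeyField, partitionPathField, parallelism, schemaFilePath, retry,
// propsFilePath, [extra hoodie configs...]. Illustration only.
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class ArgShiftSketch {
    // propsFilePath now lives at index 12 (was 11 before sparkMaster existed).
    static String propsFilePath(String[] args) {
        return (args.length > 12 && !args[12].isEmpty()) ? args[12] : null;
    }

    // Anything past the 13 fixed arguments is treated as extra configs.
    static List<String> extraConfigs(String[] args) {
        return args.length > 13
                ? Arrays.asList(args).subList(13, args.length)
                : Collections.emptyList();
    }

    public static void main(String[] args) {
        String[] cli = {"IMPORT", "local[2]", "4g", "src", "tgt", "tbl",
                "COPY_ON_WRITE", "key", "partition", "2", "schema.avsc",
                "1", "props.file", "hoodie.a=b"};
        System.out.println(propsFilePath(cli)); // props.file
        System.out.println(extraConfigs(cli)); // [hoodie.a=b]
    }
}
```

This fragility is exactly what the `assert (args.length >= 13)` in the diff guards: the launcher and the parser must agree on the argument order, so both sides of the hand-off changed in the same commit.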
[incubator-hudi] branch master updated: [HUDI-725] Remove init log in the constructor of DeltaSync (#1425)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new eeab532  [HUDI-725] Remove init log in the constructor of DeltaSync (#1425)
eeab532 is described below

commit eeab532d794426115f839e6ee11a9fc1314698fe
Author: Mathieu
AuthorDate: Fri Mar 20 17:47:59 2020 +0800

    [HUDI-725] Remove init log in the constructor of DeltaSync (#1425)
---
 .../src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java | 1 -
 1 file changed, 1 deletion(-)

diff --git a/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java b/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java
index 3073dfa..b8524ba 100644
--- a/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java
+++ b/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/DeltaSync.java
@@ -168,7 +168,6 @@ public class DeltaSync implements Serializable {
     this.tableType = tableType;
     this.onInitializingHoodieWriteClient = onInitializingHoodieWriteClient;
     this.props = props;
-    LOG.info("Creating delta streamer with configs : " + props.toString());
     this.schemaProvider = schemaProvider;

     refreshTimeline();
[incubator-hudi] branch master updated: [HUDI-726]Delete unused method in HoodieDeltaStreamer (#1426)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new 21c45e1  [HUDI-726]Delete unused method in HoodieDeltaStreamer (#1426)
21c45e1 is described below

commit 21c45e1051b593f0e1023a84cb96658320046dae
Author: Mathieu
AuthorDate: Fri Mar 20 17:44:16 2020 +0800

    [HUDI-726]Delete unused method in HoodieDeltaStreamer (#1426)
---
 .../org/apache/hudi/utilities/deltastreamer/HoodieDeltaStreamer.java | 4 
 1 file changed, 4 deletions(-)

diff --git a/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/HoodieDeltaStreamer.java b/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/HoodieDeltaStreamer.java
index 01ab1cc..bff2b41 100644
--- a/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/HoodieDeltaStreamer.java
+++ b/hudi-utilities/src/main/java/org/apache/hudi/utilities/deltastreamer/HoodieDeltaStreamer.java
@@ -576,8 +576,4 @@ public class HoodieDeltaStreamer implements Serializable {
       }, executor)).toArray(CompletableFuture[]::new)), executor);
     }
   }
-
-  public DeltaSyncService getDeltaSyncService() {
-    return deltaSyncService;
-  }
 }
[incubator-hudi] branch master updated: [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles (#1417)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

The following commit(s) were added to refs/heads/master by this push:
     new c5030f7  [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles (#1417)
c5030f7 is described below

commit c5030f77a0e63f609ed2c674bea00201b97d8bb6
Author: vinoyang
AuthorDate: Sat Mar 21 10:54:04 2020 +0800

    [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles (#1417)

    * [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles
---
 LICENSE |   9 
 NOTICE  | 146 +++-
 2 files changed, 154 insertions(+), 1 deletion(-)

diff --git a/LICENSE b/LICENSE
index 28dfacd..ed8458a 100644
--- a/LICENSE
+++ b/LICENSE
@@ -321,3 +321,12 @@ Copyright (c) 2005, European Commission project OneLab under contract 034819 (ht
 License: http://www.apache.org/licenses/LICENSE-2.0
 ---
+
+ This product includes code from Apache commons-lang
+
+ * org.apache.hudi.common.util.collection.Pair adapted from org.apache.commons.lang3.tuple.Pair
+
+ Copyright 2001-2020 The Apache Software Foundation
+
+ Home page: https://commons.apache.org/proper/commons-lang/
+ License: http://www.apache.org/licenses/LICENSE-2.0

diff --git a/NOTICE b/NOTICE
index ecd4479..59e56b4 100644
--- a/NOTICE
+++ b/NOTICE
@@ -1,5 +1,149 @@
 Apache Hudi (incubating)
-Copyright 2019 and onwards The Apache Software Foundation
+Copyright 2019-2020 The Apache Software Foundation

 This product includes software developed at
 The Apache Software Foundation (http://www.apache.org/).
+
+
+
+This product includes code from Apache Hive, which includes the following in
+its NOTICE file:
+
+  Apache Hive
+  Copyright 2008-2018 The Apache Software Foundation
+
+  This product includes software developed by The Apache Software
+  Foundation (http://www.apache.org/).
+
+  This project includes software licensed under the JSON license.
+
+
+
+This product includes code from Apache SystemML, which includes the following in
+its NOTICE file:
+
+  Apache SystemML
+  Copyright [2015-2018] The Apache Software Foundation
+
+  This product includes software developed at
+  The Apache Software Foundation (http://www.apache.org/).
+
+
+
+This product includes code from Apache Spark, which includes the following in
+its NOTICE file:
+
+  Apache Spark
+  Copyright 2014 and onwards The Apache Software Foundation.
+
+  This product includes software developed at
+  The Apache Software Foundation (http://www.apache.org/).
+
+
+  Export Control Notice
+  -
+
+  This distribution includes cryptographic software. The country in which you currently reside may have
+  restrictions on the import, possession, use, and/or re-export to another country, of encryption software.
+  BEFORE using any encryption software, please check your country's laws, regulations and policies concerning
+  the import, possession, or use, and re-export of encryption software, to see if this is permitted. See
+  <http://www.wassenaar.org/> for more information.
+
+  The U.S. Government Department of Commerce, Bureau of Industry and Security (BIS), has classified this
+  software as Export Commodity Control Number (ECCN) 5D002.C.1, which includes information security software
+  using or performing cryptographic functions with asymmetric algorithms. The form and manner of this Apache
+  Software Foundation distribution makes it eligible for export under the License Exception ENC Technology
+  Software Unrestricted (TSU) exception (see the BIS Export Administration Regulations, Section 740.13) for
+  both object code and source code.
+
+  The following provides more details on the included cryptographic software:
+
+  This software uses Apache Commons Crypto (https://commons.apache.org/proper/commons-crypto/) to
+  support authentication, and encryption and decryption of data sent across the network between
+  services.
+
+
+  Metrics
+  Copyright 2010-2013 Coda Hale and Yammer, Inc.
+
+  This product includes software developed by Coda Hale and Yammer, Inc.
+
+  This product includes code derived from the JSR-166 project (ThreadLocalRandom, Striped64,
+  LongAdder), which was released with the following comments:
+
+  Written by Doug Lea with assistance from members of JCP JSR-166
+  Expert Group and released to the public domain, as explained at
[incubator-hudi] branch release-0.5.2 updated (3639241 -> 5029888)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a change to branch release-0.5.2
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git.

    from 3639241  [HUDI-676] Address issues towards removing use of WIP Disclaimer (#1386)
     add a355c76  [HUDI-688] Paring down the NOTICE file to minimum required notices (#1391)
     new 71b5d92  [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles (#1417)
     new 5029888  Bumping release candidate number 2

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.

The revisions listed as "add" were already present in the repository and
have only been added to this reference.

Summary of changes:
 LICENSE                                       |  11 +-
 NOTICE                                        | 179 +-
 docker/hoodie/hadoop/base/pom.xml             |   2 +-
 docker/hoodie/hadoop/datanode/pom.xml         |   2 +-
 docker/hoodie/hadoop/historyserver/pom.xml    |   2 +-
 docker/hoodie/hadoop/hive_base/pom.xml        |   2 +-
 docker/hoodie/hadoop/namenode/pom.xml         |   2 +-
 docker/hoodie/hadoop/pom.xml                  |   2 +-
 docker/hoodie/hadoop/prestobase/pom.xml       |   2 +-
 docker/hoodie/hadoop/spark_base/pom.xml       |   2 +-
 docker/hoodie/hadoop/sparkadhoc/pom.xml       |   2 +-
 docker/hoodie/hadoop/sparkmaster/pom.xml      |   2 +-
 docker/hoodie/hadoop/sparkworker/pom.xml      |   2 +-
 hudi-cli/pom.xml                              |   2 +-
 hudi-client/pom.xml                           |   2 +-
 hudi-common/pom.xml                           |   2 +-
 hudi-hadoop-mr/pom.xml                        |   2 +-
 hudi-hive/pom.xml                             |   2 +-
 hudi-integ-test/pom.xml                       |   2 +-
 hudi-spark/pom.xml                            |   2 +-
 hudi-timeline-service/pom.xml                 |   2 +-
 hudi-utilities/pom.xml                        |   2 +-
 packaging/hudi-hadoop-mr-bundle/pom.xml       |   2 +-
 packaging/hudi-hive-bundle/pom.xml            |   2 +-
 packaging/hudi-presto-bundle/pom.xml          |   2 +-
 packaging/hudi-spark-bundle/pom.xml           |   2 +-
 packaging/hudi-timeline-server-bundle/pom.xml |   2 +-
 packaging/hudi-utilities-bundle/pom.xml       |   2 +-
 pom.xml                                       |   2 +-
 29 files changed, 158 insertions(+), 86 deletions(-)
[incubator-hudi] 01/02: [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles (#1417)
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch release-0.5.2
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

commit 71b5d92a15844ef1847721f2fac21b8a2956a8b7
Author: vinoyang
AuthorDate: Sat Mar 21 10:54:04 2020 +0800

    [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles (#1417)

    * [HUDI-720] NOTICE file needs to add more content based on the NOTICE files of the ASF projects that hudi bundles
---
 LICENSE |   9 
 NOTICE  | 146 +++-
 2 files changed, 154 insertions(+), 1 deletion(-)

diff --git a/LICENSE b/LICENSE
index 28dfacd..ed8458a 100644
--- a/LICENSE
+++ b/LICENSE
@@ -321,3 +321,12 @@ Copyright (c) 2005, European Commission project OneLab under contract 034819 (ht
 License: http://www.apache.org/licenses/LICENSE-2.0
 ---
+
+ This product includes code from Apache commons-lang
+
+ * org.apache.hudi.common.util.collection.Pair adapted from org.apache.commons.lang3.tuple.Pair
+
+ Copyright 2001-2020 The Apache Software Foundation
+
+ Home page: https://commons.apache.org/proper/commons-lang/
+ License: http://www.apache.org/licenses/LICENSE-2.0

diff --git a/NOTICE b/NOTICE
index ecd4479..59e56b4 100644
--- a/NOTICE
+++ b/NOTICE
@@ -1,5 +1,149 @@
 Apache Hudi (incubating)
-Copyright 2019 and onwards The Apache Software Foundation
+Copyright 2019-2020 The Apache Software Foundation

 This product includes software developed at
 The Apache Software Foundation (http://www.apache.org/).
+
+
+
+This product includes code from Apache Hive, which includes the following in
+its NOTICE file:
+
+  Apache Hive
+  Copyright 2008-2018 The Apache Software Foundation
+
+  This product includes software developed by The Apache Software
+  Foundation (http://www.apache.org/).
+
+  This project includes software licensed under the JSON license.
+
+
+
+This product includes code from Apache SystemML, which includes the following in
+its NOTICE file:
+
+  Apache SystemML
+  Copyright [2015-2018] The Apache Software Foundation
+
+  This product includes software developed at
+  The Apache Software Foundation (http://www.apache.org/).
+
+
+
+This product includes code from Apache Spark, which includes the following in
+its NOTICE file:
+
+  Apache Spark
+  Copyright 2014 and onwards The Apache Software Foundation.
+
+  This product includes software developed at
+  The Apache Software Foundation (http://www.apache.org/).
+
+
+  Export Control Notice
+  -
+
+  This distribution includes cryptographic software. The country in which you currently reside may have
+  restrictions on the import, possession, use, and/or re-export to another country, of encryption software.
+  BEFORE using any encryption software, please check your country's laws, regulations and policies concerning
+  the import, possession, or use, and re-export of encryption software, to see if this is permitted. See
+  <http://www.wassenaar.org/> for more information.
+
+  The U.S. Government Department of Commerce, Bureau of Industry and Security (BIS), has classified this
+  software as Export Commodity Control Number (ECCN) 5D002.C.1, which includes information security software
+  using or performing cryptographic functions with asymmetric algorithms. The form and manner of this Apache
+  Software Foundation distribution makes it eligible for export under the License Exception ENC Technology
+  Software Unrestricted (TSU) exception (see the BIS Export Administration Regulations, Section 740.13) for
+  both object code and source code.
+
+  The following provides more details on the included cryptographic software:
+
+  This software uses Apache Commons Crypto (https://commons.apache.org/proper/commons-crypto/) to
+  support authentication, and encryption and decryption of data sent across the network between
+  services.
+
+
+  Metrics
+  Copyright 2010-2013 Coda Hale and Yammer, Inc.
+
+  This product includes software developed by Coda Hale and Yammer, Inc.
+
+  This product includes code derived from the JSR-166 project (ThreadLocalRandom, Striped64,
+  LongAdder), which was released with the following comments:
+
+  Written by Doug Lea with assistance from members of JCP JSR-166
+  Expert Group and released to the public domain, as explained at
+  http://creativecommons.org/publicdomain/zero/1.0/
+
+
+
+Portions of this software were developed at
+Twitter, Inc (https://twitt
[incubator-hudi] 02/02: Bumping release candidate number 2
This is an automated email from the ASF dual-hosted git repository.

vinoyang pushed a commit to branch release-0.5.2
in repository https://gitbox.apache.org/repos/asf/incubator-hudi.git

commit 5029888db024898e928bf301da2a78a7432ba198
Author: yanghua
AuthorDate: Sat Mar 21 16:23:18 2020 +0800

    Bumping release candidate number 2
---
 docker/hoodie/hadoop/base/pom.xml             | 2 +-
 docker/hoodie/hadoop/datanode/pom.xml         | 2 +-
 docker/hoodie/hadoop/historyserver/pom.xml    | 2 +-
 docker/hoodie/hadoop/hive_base/pom.xml        | 2 +-
 docker/hoodie/hadoop/namenode/pom.xml         | 2 +-
 docker/hoodie/hadoop/pom.xml                  | 2 +-
 docker/hoodie/hadoop/prestobase/pom.xml       | 2 +-
 docker/hoodie/hadoop/spark_base/pom.xml       | 2 +-
 docker/hoodie/hadoop/sparkadhoc/pom.xml       | 2 +-
 docker/hoodie/hadoop/sparkmaster/pom.xml      | 2 +-
 docker/hoodie/hadoop/sparkworker/pom.xml      | 2 +-
 hudi-cli/pom.xml                              | 2 +-
 hudi-client/pom.xml                           | 2 +-
 hudi-common/pom.xml                           | 2 +-
 hudi-hadoop-mr/pom.xml                        | 2 +-
 hudi-hive/pom.xml                             | 2 +-
 hudi-integ-test/pom.xml                       | 2 +-
 hudi-spark/pom.xml                            | 2 +-
 hudi-timeline-service/pom.xml                 | 2 +-
 hudi-utilities/pom.xml                        | 2 +-
 packaging/hudi-hadoop-mr-bundle/pom.xml       | 2 +-
 packaging/hudi-hive-bundle/pom.xml            | 2 +-
 packaging/hudi-presto-bundle/pom.xml          | 2 +-
 packaging/hudi-spark-bundle/pom.xml           | 2 +-
 packaging/hudi-timeline-server-bundle/pom.xml | 2 +-
 packaging/hudi-utilities-bundle/pom.xml       | 2 +-
 pom.xml                                       | 2 +-
 27 files changed, 27 insertions(+), 27 deletions(-)

diff --git a/docker/hoodie/hadoop/base/pom.xml b/docker/hoodie/hadoop/base/pom.xml
index 3a61697..5c05aa4 100644
--- a/docker/hoodie/hadoop/base/pom.xml
+++ b/docker/hoodie/hadoop/base/pom.xml
@@ -19,7 +19,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff --git a/docker/hoodie/hadoop/datanode/pom.xml b/docker/hoodie/hadoop/datanode/pom.xml
index 745c637..f04c70c 100644
--- a/docker/hoodie/hadoop/datanode/pom.xml
+++ b/docker/hoodie/hadoop/datanode/pom.xml
@@ -19,7 +19,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff --git a/docker/hoodie/hadoop/historyserver/pom.xml b/docker/hoodie/hadoop/historyserver/pom.xml
index 69be132..293cbac 100644
--- a/docker/hoodie/hadoop/historyserver/pom.xml
+++ b/docker/hoodie/hadoop/historyserver/pom.xml
@@ -19,7 +19,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff --git a/docker/hoodie/hadoop/hive_base/pom.xml b/docker/hoodie/hadoop/hive_base/pom.xml
index edee9b1..67aec60 100644
--- a/docker/hoodie/hadoop/hive_base/pom.xml
+++ b/docker/hoodie/hadoop/hive_base/pom.xml
@@ -19,7 +19,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff --git a/docker/hoodie/hadoop/namenode/pom.xml b/docker/hoodie/hadoop/namenode/pom.xml
index c299707..89c1764 100644
--- a/docker/hoodie/hadoop/namenode/pom.xml
+++ b/docker/hoodie/hadoop/namenode/pom.xml
@@ -19,7 +19,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff --git a/docker/hoodie/hadoop/pom.xml b/docker/hoodie/hadoop/pom.xml
index a749480..8cb31a0 100644
--- a/docker/hoodie/hadoop/pom.xml
+++ b/docker/hoodie/hadoop/pom.xml
@@ -19,7 +19,7 @@
   hudi
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   ../../../pom.xml
   4.0.0

diff --git a/docker/hoodie/hadoop/prestobase/pom.xml b/docker/hoodie/hadoop/prestobase/pom.xml
index e2ca512..e9d3f33 100644
--- a/docker/hoodie/hadoop/prestobase/pom.xml
+++ b/docker/hoodie/hadoop/prestobase/pom.xml
@@ -22,7 +22,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff --git a/docker/hoodie/hadoop/spark_base/pom.xml b/docker/hoodie/hadoop/spark_base/pom.xml
index 439dd73..4e67030 100644
--- a/docker/hoodie/hadoop/spark_base/pom.xml
+++ b/docker/hoodie/hadoop/spark_base/pom.xml
@@ -19,7 +19,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff --git a/docker/hoodie/hadoop/sparkadhoc/pom.xml b/docker/hoodie/hadoop/sparkadhoc/pom.xml
index fccd5f3..313d2f1 100644
--- a/docker/hoodie/hadoop/sparkadhoc/pom.xml
+++ b/docker/hoodie/hadoop/sparkadhoc/pom.xml
@@ -19,7 +19,7 @@
   hudi-hadoop-docker
   org.apache.hudi
-  0.5.2-incubating-rc1
+  0.5.2-incubating-rc2
   4.0.0
   pom

diff
svn commit: r38576 - in /dev/incubator/hudi/hudi-0.5.2-incubating-rc2: ./ hudi-0.5.2-incubating-rc2.src.tgz hudi-0.5.2-incubating-rc2.src.tgz.asc hudi-0.5.2-incubating-rc2.src.tgz.sha512
Author: vinoyang
Date: Sat Mar 21 09:04:21 2020
New Revision: 38576

Log:
hudi 0.5.2 incubating rc2

Added:
    dev/incubator/hudi/hudi-0.5.2-incubating-rc2/
    dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz   (with props)
    dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.asc
    dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.sha512

Added: dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz
==
Binary file - no diff available.

Propchange: dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz
--
    svn:mime-type = application/octet-stream

Added: dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.asc
==
--- dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.asc (added)
+++ dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.asc Sat Mar 21 09:04:21 2020
@@ -0,0 +1,11 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQEzBAABCAAdFiEEw6lux3FJVxron4J2TIZoTQR94DwFAl510D0ACgkQTIZoTQR9
+4DxxOggAyeI6xJvnEZinr71o7yFuGck7tIDgqe8BFx0u98aRG1YqkaicxZjwPqLp
+4C4Y1ybePlMt/j75XgfeT6GjyR5EtRaGu+P3VknthADrvJKuQoo6raHElJjpLd6B
+TummGjAuns4Cipx0DolhJKLmJKDZWUC8/qMCezxf4Z2MGVRhDxJVpkVyv1oVaN8e
+w6vEyxo86tjTvqLrKwrbaqGIfZerNl1sZwZUQ6jMhvyS048Eh/REA67U8q74B0w6
+VJqM5X4WhxEpXDTlFAEDdbcPhRyK3kuEXrzzsT+a1DWVZCAn3HBjRfaWhIkRFJBT
++7WbRodKQApD3i1zGkhsEpvWhGYMSQ==
+=xi+Y
+-----END PGP SIGNATURE-----

Added: dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.sha512
==
--- dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.sha512 (added)
+++ dev/incubator/hudi/hudi-0.5.2-incubating-rc2/hudi-0.5.2-incubating-rc2.src.tgz.sha512 Sat Mar 21 09:04:21 2020
@@ -0,0 +1 @@
+fed753810de419955359c6b0b57651920da37ae4e8b04c4b0a8cbe43960103b947ae1ae2352721e70548d3165f850635588169a4c7cf90ccd3b762ef65c85b23  hudi-0.5.2-incubating-rc2.src.tgz
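Release artifacts like the tarball above ship with a detached GPG signature (.asc) and a SHA-512 digest (.sha512). A downstream verifier can recompute the digest with the JDK's MessageDigest and compare it against the published hex string; this sketch shows the digest side only (in practice `sha512sum -c` and `gpg --verify` do the same job from the shell):

```java
// Sketch: recomputing the SHA-512 digest that the .sha512 sidecar above records,
// using the JDK's MessageDigest. A real check would stream the tarball from disk
// and compare against the published 128-character hex line.
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Sha512Sketch {
    static String sha512Hex(byte[] data) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-512");
            StringBuilder hex = new StringBuilder();
            for (byte b : md.digest(data)) {
                hex.append(String.format("%02x", b)); // two lowercase hex chars per byte
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-512 is mandatory in every JRE
        }
    }

    public static void main(String[] args) {
        // Prints 128 hex characters, the same width as the published .sha512 entry.
        System.out.println(sha512Hex("hudi".getBytes()));
    }
}
```

The digest only guards against corruption; the .asc signature is what ties the artifact to the release manager's key.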