nsivabalan opened a new pull request #5065:
URL: https://github.com/apache/hudi/pull/5065
## What is the purpose of the pull request
- Added multi-writer test support to the integ test framework. With this
support, two jobs (one Deltastreamer and one Spark datasource writer) can run
concurrently against the same Hudi table. Each job performs localized
validation, i.e. it validates only the data it ingested itself.
## Brief change log
- Added a new job, HoodieMultiWriterTestSuiteJob, which can be used to
trigger multiple concurrent jobs.
- Fixed the spark datasource test nodes in the integ test to decouple them
from Deltastreamer checkpointing.
- Added properties and YAML files for the multi-writer tests.
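
The coordination described above can be sketched roughly as follows. This is a simplified illustration only; `MultiWriterSketch` and `writerJob` are hypothetical stand-ins, not the actual `HoodieMultiWriterTestSuiteJob` internals:

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MultiWriterSketch {
    // Hypothetical stand-in for one writer job (Deltastreamer or Spark datasource).
    static Callable<String> writerJob(String name, String propsPath, String yamlPath) {
        return () -> {
            // In the real test suite job, this would ingest data into the shared
            // table and then run localized validation on its own ingested data.
            return name + " finished using " + propsPath + " and " + yamlPath;
        };
    }

    public static void main(String[] args) throws Exception {
        // One thread per writer, so both jobs run concurrently against the same table.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        List<Future<String>> results = pool.invokeAll(List.of(
            writerJob("deltastreamer", "multi-writer-1.properties", "multi-writer-1-ds.yaml"),
            writerJob("spark-datasource", "multi-writer-2.properties", "multi-writer-2-sds.yaml")));
        for (Future<String> f : results) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }
}
```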
## Verify this pull request
- Verified by running the job locally:
```
spark-submit \
  --packages org.apache.spark:spark-avro_2.11:2.4.0 \
  --conf spark.task.cpus=3 \
  --conf spark.executor.cores=3 \
  --conf spark.task.maxFailures=100 \
  --conf spark.memory.fraction=0.4 \
  --conf spark.rdd.compress=true \
  --conf spark.kryoserializer.buffer.max=2000m \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.memory.storageFraction=0.1 \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.sql.hive.convertMetastoreParquet=false \
  --conf spark.driver.maxResultSize=12g \
  --conf spark.executor.heartbeatInterval=120s \
  --conf spark.network.timeout=600s \
  --conf spark.yarn.max.executor.failures=10 \
  --conf spark.sql.catalogImplementation=hive \
  --conf spark.driver.extraClassPath=/var/demo/jars/* \
  --conf spark.executor.extraClassPath=/var/demo/jars/* \
  --class org.apache.hudi.integ.testsuite.HoodieMultiWriterTestSuiteJob \
  /opt/hudi-integ-test-bundle-0.11.0-SNAPSHOT.jar \
  --source-ordering-field test_suite_source_ordering_field \
  --use-deltastreamer \
  --target-base-path /user/hive/warehouse/hudi-integ-test-suite/output \
  --input-base-paths "/user/hive/warehouse/hudi-integ-test-suite/input1,/user/hive/warehouse/hudi-integ-test-suite/input2" \
  --target-table table1 \
  --props-paths "multi-writer-1.properties,multi-writer-2.properties" \
  --schemaprovider-class org.apache.hudi.integ.testsuite.schema.TestSuiteFileBasedSchemaProvider \
  --source-class org.apache.hudi.utilities.sources.AvroDFSSource \
  --input-file-size 125829120 \
  --workload-yaml-paths "file:/opt/multi-writer-1-ds.yaml,file:/opt/multi-writer-2-sds.yaml" \
  --workload-generator-classname org.apache.hudi.integ.testsuite.dag.WorkflowDagGenerator \
  --table-type COPY_ON_WRITE \
  --compact-scheduling-minshare 1 \
  --input-base-path "dummyValue" \
  --workload-yaml-path "dummyValue" \
  --props "dummyValue" \
  --use-hudi-data-to-generate-updates
```
Parameters of interest (i.e., what differs from a regular test suite job):
- `--input-base-paths "/user/hive/warehouse/hudi-integ-test-suite/input1,/user/hive/warehouse/hudi-integ-test-suite/input2"`
- `--props-paths "multi-writer-1.properties,multi-writer-2.properties"`
- `--workload-yaml-paths "file:/opt/multi-writer-1-ds.yaml,file:/opt/multi-writer-2-sds.yaml"`
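
Since concurrent writers in Hudi rely on optimistic concurrency control, each per-writer properties file would be expected to carry lock-provider settings along the lines of the sketch below. The config keys are standard Hudi multi-writer settings, but the ZooKeeper values are illustrative placeholders, not the actual contents of `multi-writer-*.properties`:

```properties
# Enable optimistic concurrency control so both writers can commit to the same table.
hoodie.write.concurrency.mode=optimistic_concurrency_control
# Failed writes must be cleaned up lazily when multiple writers are active.
hoodie.cleaner.policy.failed.writes=LAZY
# Lock provider used to coordinate concurrent commits (ZooKeeper shown as an example).
hoodie.write.lock.provider=org.apache.hudi.client.transaction.lock.ZookeeperBasedLockProvider
hoodie.write.lock.zookeeper.url=localhost
hoodie.write.lock.zookeeper.port=2181
hoodie.write.lock.zookeeper.lock_key=table1
hoodie.write.lock.zookeeper.base_path=/hudi-locks
```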
## Committer checklist
- [ ] Has a corresponding JIRA in PR title & commit
- [ ] Commit message is descriptive of the change
- [ ] CI is green
- [ ] Necessary doc changes done or have another open PR
- [ ] For large changes, please consider breaking it into sub-tasks under
an umbrella JIRA.