rangareddy commented on issue #12070: URL: https://github.com/apache/hudi/issues/12070#issuecomment-2401802358
Hi @dubeyx, it looks like the Spark application was submitted with incorrect `spark-submit` syntax: the application jar must come after all Spark options, and the HoodieDeltaStreamer arguments (such as `--table-type`) must come after the jar. Please try the following command and let me know how it goes.

```sh
spark-submit \
  --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  --packages org.apache.hudi:hudi-spark3.4-bundle_2.12:0.14.0 \
  --jars /data/debezium/spark/jars/hudi-common-0.14.0.jar,/data/debezium/spark/jars/mbk-transformer-0.0.1-SNAPSHOT.jar \
  --properties-file spark-config.properties \
  --master 'local[*]' \
  --executor-memory 5g \
  --driver-memory 10g \
  --conf "spark.driver.extraJavaOptions=-verbose:class" \
  --conf "spark.executor.extraJavaOptions=-verbose:class" \
  /data/debezium/spark/jars/hudi-utilities-slim-bundle_2.12-0.14.0.jar \
  --table-type COPY_ON_WRITE \
  --target-base-path /data/debezium/spark/data/hudini \
  --target-table hudi_memberbase \
  --source-ordering-field CreatedAt \
  --source-class org.apache.hudi.utilities.sources.debezium.MysqlDebeziumSource \
  --payload-class org.apache.hudi.common.model.debezium.MySqlDebeziumAvroPayload \
  --transformer-class com.example.mbk_transformer.hudi.DayTransform \
  --op UPSERT \
  --continuous \
  --source-limit 400 \
  --min-sync-interval-seconds 20 \
  --hoodie-conf bootstrap.servers=localhost:19092,localhost:29092,localhost:39092 \
  --hoodie-conf schema.registry.url=http://localhost:8081 \
  --hoodie-conf hoodie.deltastreamer.schemaprovider.registry.url=http://localhost:8081/subjects/dummy.mobinew.member_base-value/versions/latest \
  --hoodie-conf hoodie.deltastreamer.source.kafka.value.deserializer.class=io.confluent.kafka.serializers.KafkaAvroDeserializer \
  --hoodie-conf hoodie.deltastreamer.source.kafka.topic=dummy.mobinew.member_base \
  --hoodie-conf auto.offset.reset=earliest \
  --hoodie-conf hoodie.datasource.write.recordkey.field=id \
  --hoodie-conf hoodie.datasource.write.partitionpath.field=day \
  --hoodie-conf hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.SimpleKeyGenerator \
  --hoodie-conf hoodie.datasource.write.hive_style_partitioning=true \
  --hoodie-conf hoodie.datasource.write.precombine.field=ts_ms
```
