rubenssoto commented on issue #2515:
URL: https://github.com/apache/hudi/issues/2515#issuecomment-772997018
Thank you so much for helping me....
My spark-submit command:
```
spark-submit --deploy-mode cluster \
  --conf spark.executor.cores=5 \
  --conf spark.executor.memoryOverhead=2000 \
  --conf spark.executor.memory=32g \
  --conf spark.yarn.maxAppAttempts=1 \
  --conf spark.dynamicAllocation.maxExecutors=4 \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --packages org.apache.spark:spark-avro_2.12:2.4.4,org.apache.hudi:hudi-spark-bundle_2.12:0.7.0 \
  --jars s3://dl/lib/spark-daria_2.12-0.38.2.jar \
  --class TableProcessorWrapper \
  s3://dl/code/projects/data_projects/batch_processor_engine/batch-processor-engine_2.12-3.0.1_0.5.jar \
  courier_api_group01
```
My spark session:
```scala
import org.apache.spark.sql.SparkSession

trait SparkSessionWrapper extends Serializable {

  val spark: SparkSession = {
    SparkSession.builder
      .appName("Batch Processor Engine")
      .config(
        "spark.jars.packages",
        "org.apache.spark:spark-avro_2.12:2.4.4,org.apache.hudi:hudi-spark-bundle_2.12:0.7.0"
      )
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .enableHiveSupport()
      .getOrCreate()
  }
}
```
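For context, the trait above would be mixed into the job's entry point roughly like this. This is a hypothetical sketch, not my actual job code: the object name `TableProcessorWrapper` and the `courier_api_group01` argument come from the spark-submit command above, and the body is only illustrative.

```scala
import org.apache.spark.sql.SparkSession

// Same trait as above, abbreviated: it lazily builds the shared session.
trait SparkSessionWrapper extends Serializable {
  val spark: SparkSession =
    SparkSession.builder
      .appName("Batch Processor Engine")
      .enableHiveSupport()
      .getOrCreate()
}

// Entry point named in `--class TableProcessorWrapper` on spark-submit.
object TableProcessorWrapper extends SparkSessionWrapper {
  def main(args: Array[String]): Unit = {
    // args(0) is the table group passed as the last spark-submit argument,
    // e.g. "courier_api_group01"; the processing step here is hypothetical.
    val tableGroup = args(0)
    spark.sql(s"SHOW TABLES").show()
    spark.stop()
  }
}
```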
Am I doing anything wrong?