BalaMahesh commented on issue #5494:
URL: https://github.com/apache/hudi/issues/5494#issuecomment-1118079346

   @alexeykudinkin Here is the full command and the logs:
   
   ```
   ./spark-submit --jars packaging/hudi-spark-bundle/target/hudi-spark3.2-bundle_2.12-0.11.0.jar \
     --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
     packaging/hudi-utilities-slim-bundle/target/hudi-utilities-slim-bundle_2.12-0.11.0.jar \
     --props file:////hudi/properties/gl_started.properties \
     --schemaprovider-class org.apache.hudi.utilities.schema.FilebasedSchemaProvider \
     --source-class org.apache.hudi.utilities.sources.JsonKafkaSource \
     --target-base-path gs://xx/hudi/gl_cow/ \
     --target-table hudi.gl_cow \
     --op INSERT \
     --table-type COPY_ON_WRITE \
     --source-ordering-field time \
     --continuous \
     --transformer-class org.apache.hudi.utilities.transform.AddDateHourColumnTransformer \
     --source-limit 150
   ```
   ```
   22/05/04 10:21:25 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
   log4j:WARN No appenders could be found for logger 
(org.apache.hudi.utilities.deltastreamer.SchedulerConfGenerator).
   log4j:WARN Please initialize the log4j system properly.
   log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for 
more info.
   Using Spark's default log4j profile: 
org/apache/spark/log4j-defaults.properties
   22/05/04 10:21:26 INFO SparkContext: Running Spark version 3.2.1
   22/05/04 10:21:26 INFO ResourceUtils: 
==============================================================
   22/05/04 10:21:26 INFO ResourceUtils: No custom resources configured for 
spark.driver.
   22/05/04 10:21:26 INFO ResourceUtils: 
==============================================================
   22/05/04 10:21:26 INFO SparkContext: Submitted application: 
delta-streamer-hudi.gl_cow
   22/05/04 10:21:26 INFO ResourceProfile: Default ResourceProfile created, 
executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , 
memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: 
offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: 
cpus, amount: 1.0)
   22/05/04 10:21:26 INFO ResourceProfile: Limiting resource is cpu
   22/05/04 10:21:26 INFO ResourceProfileManager: Added ResourceProfile id: 0
   22/05/04 10:21:26 INFO SecurityManager: Changing view acls to: xx
   22/05/04 10:21:26 INFO SecurityManager: Changing modify acls to: xx
   22/05/04 10:21:26 INFO SecurityManager: Changing view acls groups to: 
   22/05/04 10:21:26 INFO SecurityManager: Changing modify acls groups to: 
   22/05/04 10:21:26 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users  with view permissions: Set(xx); groups with 
view permissions: Set(); users  with modify permissions: Set(xx); groups with 
modify permissions: Set()
   22/05/04 10:21:26 INFO deprecation: mapred.output.compression.codec is 
deprecated. Instead, use mapreduce.output.fileoutputformat.compress.codec
   22/05/04 10:21:26 INFO deprecation: mapred.output.compress is deprecated. 
Instead, use mapreduce.output.fileoutputformat.compress
   22/05/04 10:21:26 INFO deprecation: mapred.output.compression.type is 
deprecated. Instead, use mapreduce.output.fileoutputformat.compress.type
   22/05/04 10:21:26 INFO Utils: Successfully started service 'sparkDriver' on 
port 49423.
   22/05/04 10:21:26 INFO SparkEnv: Registering MapOutputTracker
   22/05/04 10:21:26 INFO SparkEnv: Registering BlockManagerMaster
   22/05/04 10:21:26 INFO BlockManagerMasterEndpoint: Using 
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
   22/05/04 10:21:26 INFO BlockManagerMasterEndpoint: 
BlockManagerMasterEndpoint up
   22/05/04 10:21:26 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
   22/05/04 10:21:26 INFO DiskBlockManager: Created local directory at 
/private/var/folders/6w/4y9hyhmj4d15hdqlnd74rp5c0000gp/T/blockmgr-2c16d6dd-7a31-44da-b728-bc8bfe8c11e7
   22/05/04 10:21:26 INFO MemoryStore: MemoryStore started with capacity 366.3 
MiB
   22/05/04 10:21:26 INFO SparkEnv: Registering OutputCommitCoordinator
   22/05/04 10:21:27 INFO Utils: Successfully started service 'SparkUI' on port 
8090.
   22/05/04 10:21:27 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at 
http://xx-x0-2:8090
   22/05/04 10:21:27 INFO SparkContext: Added JAR 
file://packaging/hudi-spark-bundle/target/hudi-spark3.2-bundle_2.12-0.11.0.jar 
at spark://xxx-x0-2:49423/jars/hudi-spark3.2-bundle_2.12-0.11.0.jar with 
timestamp 1651639886168
   22/05/04 10:21:27 INFO SparkContext: Added JAR 
file:packaging/hudi-utilities-slim-bundle/target/hudi-utilities-slim-bundle_2.12-0.11.0.jar
 at spark://xx-x0-2:49423/jars/hudi-utilities-slim-bundle_2.12-0.11.0.jar with 
timestamp 1651639886168
   22/05/04 10:21:27 INFO Executor: Starting executor ID driver on host xx-x0-2
   22/05/04 10:21:27 INFO Executor: Fetching 
spark://xx-x0-2:49423/jars/hudi-spark3.2-bundle_2.12-0.11.0.jar with timestamp 
1651639886168
   22/05/04 10:21:27 INFO TransportClientFactory: Successfully created 
connection to xx-x0-2/192.168.0.188:49423 after 36 ms (0 ms spent in bootstraps)
   22/05/04 10:21:27 INFO Utils: Fetching 
spark://xx-x0-2:49423/jars/hudi-spark3.2-bundle_2.12-0.11.0.jar to 
/private/var/folders/6w/4y9hyhmj4d15hdqlnd74rp5c0000gp/T/spark-0552e0b7-1da6-4a5e-b1a7-3e440743cddb/userFiles-f4a62876-1c52-41f9-91e9-a0a8efa42b86/fetchFileTemp2640123694101165597.tmp
   22/05/04 10:21:27 INFO Executor: Adding 
file:/private/var/folders/6w/4y9hyhmj4d15hdqlnd74rp5c0000gp/T/spark-0552e0b7-1da6-4a5e-b1a7-3e440743cddb/userFiles-f4a62876-1c52-41f9-91e9-a0a8efa42b86/hudi-spark3.2-bundle_2.12-0.11.0.jar
 to class loader
   22/05/04 10:21:27 INFO Executor: Fetching 
spark://xx-x0-2:49423/jars/hudi-utilities-slim-bundle_2.12-0.11.0.jar with 
timestamp 1651639886168
   22/05/04 10:21:27 INFO Utils: Fetching 
spark://xx-x0-2:49423/jars/hudi-utilities-slim-bundle_2.12-0.11.0.jar to 
/private/var/folders/6w/4y9hyhmj4d15hdqlnd74rp5c0000gp/T/spark-0552e0b7-1da6-4a5e-b1a7-3e440743cddb/userFiles-f4a62876-1c52-41f9-91e9-a0a8efa42b86/fetchFileTemp7928452199573881385.tmp
   22/05/04 10:21:27 INFO Executor: Adding 
file:/private/var/folders/6w/4y9hyhmj4d15hdqlnd74rp5c0000gp/T/spark-0552e0b7-1da6-4a5e-b1a7-3e440743cddb/userFiles-f4a62876-1c52-41f9-91e9-a0a8efa42b86/hudi-utilities-slim-bundle_2.12-0.11.0.jar
 to class loader
   22/05/04 10:21:27 INFO Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 49425.
   22/05/04 10:21:27 INFO NettyBlockTransferService: Server created on 
xx-x0-2:49425
   22/05/04 10:21:27 INFO BlockManager: Using 
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication 
policy
   22/05/04 10:21:28 INFO BlockManagerMaster: Registering BlockManager 
BlockManagerId(driver, xx-x0-2, 49425, None)
   22/05/04 10:21:28 INFO BlockManagerMasterEndpoint: Registering block manager 
xx-x0-2:49425 with 366.3 MiB RAM, BlockManagerId(driver, xx-x0-2, 49425, None)
   22/05/04 10:21:28 INFO BlockManagerMaster: Registered BlockManager 
BlockManagerId(driver, xx-x0-2, 49425, None)
   22/05/04 10:21:28 INFO BlockManager: Initialized BlockManager: 
BlockManagerId(driver, xx-x0-2, 49425, None)
   22/05/04 10:21:28 WARN DFSPropertiesConfiguration: Cannot find 
HUDI_CONF_DIR, please set it as the dir of hudi-defaults.conf
   22/05/04 10:21:28 WARN DFSPropertiesConfiguration: Properties file 
file:/etc/hudi/conf/hudi-defaults.conf not found. Ignoring to load props file
   22/05/04 10:21:34 INFO SparkUI: Stopped Spark web UI at http://xx-x0-2:8090
   22/05/04 10:21:34 INFO MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!
   22/05/04 10:21:34 INFO MemoryStore: MemoryStore cleared
   22/05/04 10:21:34 INFO BlockManager: BlockManager stopped
   22/05/04 10:21:34 INFO BlockManagerMaster: BlockManagerMaster stopped
   22/05/04 10:21:34 INFO 
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
OutputCommitCoordinator stopped!
   22/05/04 10:21:34 INFO SparkContext: Successfully stopped SparkContext
   Exception in thread "main" java.lang.NoSuchFieldError: DROP_PARTITION_COLUMNS
        at 
org.apache.hudi.DataSourceWriteOptions$.<init>(DataSourceOptions.scala:488)
        at 
org.apache.hudi.DataSourceWriteOptions$.<clinit>(DataSourceOptions.scala)
        at 
org.apache.hudi.DataSourceWriteOptions.RECONCILE_SCHEMA(DataSourceOptions.scala)
        at 
org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.combineProperties(HoodieDeltaStreamer.java:160)
        at 
org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.<init>(HoodieDeltaStreamer.java:130)
        at 
org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.<init>(HoodieDeltaStreamer.java:115)
        at 
org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:549)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
        at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   22/05/04 10:21:34 INFO ShutdownHookManager: Shutdown hook called
   22/05/04 10:21:34 INFO ShutdownHookManager: Deleting directory 
/private/var/folders/6w/4y9hyhmj4d15hdqlnd74rp5c0000gp/T/spark-629bf96a-39b1-4764-9c9b-8a3075506818
   22/05/04 10:21:34 INFO ShutdownHookManager: Deleting directory 
/private/var/folders/6w/4y9hyhmj4d15hdqlnd74rp5c0000gp/T/spark-0552e0b7-1da6-4a5e-b1a7-3e440743cddb
   ```

