hanrongMan opened a new issue, #9827:
URL: https://github.com/apache/hudi/issues/9827

   OS : MacOS 14.0 (23A344)
   Chip : Apple M1
   Docker Env: colima version 0.5.5 + Docker version 20.10.17, build 100c70180f
    
   All my steps follow this guide: https://hudi.apache.org/cn/docs/0.13.0/docker_demo#testing-hudi-in-local-docker-environment
   1. I pulled the master branch, then cd'd into the docker directory.
   2. I ran `./setup_demo.sh --mac-aarch64`; the logs show:
   ```
   Copying spark default config and setting up configs
   23/10/06 14:00:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   23/10/06 14:00:11 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   23/10/06 14:00:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   ```
   but every Docker container is running normally.
   
   3. I then ran:
   ```
   cat docker/demo/data/batch_1.json | kcat -b kafkabroker -t stock_ticks -P
   kcat -b kafkabroker -L -J | jq .
   ```
   The displayed results are consistent with those in the guidance document.
   
   4. I ran `docker exec -it adhoc-2 /bin/bash`.
   5. I ran:
   ```
   spark-submit \
     --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer $HUDI_UTILITIES_BUNDLE \
     --table-type COPY_ON_WRITE \
     --source-class org.apache.hudi.utilities.sources.JsonKafkaSource \
     --source-ordering-field ts \
     --target-base-path /user/hive/warehouse/stock_ticks_cow \
     --target-table stock_ticks_cow \
     --props /var/demo/config/kafka-source.properties \
     --schemaprovider-class org.apache.hudi.utilities.schema.FilebasedSchemaProvider
   ```
   
   An error occurs; here are the logs:
   ```
   23/10/06 14:02:13 INFO spark.SparkContext: Successfully stopped SparkContext
   Exception in thread "main" java.io.IOException: Could not load schema provider class org.apache.hudi.utilities.schema.FilebasedSchemaProvider
           at org.apache.hudi.utilities.UtilHelpers.createSchemaProvider(UtilHelpers.java:168)
           at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer$DeltaSyncService.<init>(HoodieDeltaStreamer.java:678)
           at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.<init>(HoodieDeltaStreamer.java:148)
           at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.<init>(HoodieDeltaStreamer.java:121)
           at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:573)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: org.apache.hudi.exception.HoodieException: Unable to instantiate class org.apache.hudi.utilities.schema.FilebasedSchemaProvider
           at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:79)
           at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:106)
           at org.apache.hudi.utilities.UtilHelpers.createSchemaProvider(UtilHelpers.java:166)
           ... 16 more
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:77)
           ... 18 more
   Caused by: org.apache.hudi.exception.HoodieNotSupportedException: Required property hoodie.deltastreamer.schemaprovider.source.schema.file is missing
           at org.apache.hudi.DataSourceUtils.lambda$checkRequiredProperties$1(DataSourceUtils.java:162)
           at java.util.Collections$SingletonList.forEach(Collections.java:4822)
           at org.apache.hudi.DataSourceUtils.checkRequiredProperties(DataSourceUtils.java:160)
           at org.apache.hudi.utilities.schema.FilebasedSchemaProvider.<init>(FilebasedSchemaProvider.java:55)
           ... 23 more
   23/10/06 14:02:13 INFO util.ShutdownHookManager: Shutdown hook called
   23/10/06 14:02:13 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-309f1e07-912e-4048-b715-fd9157fa456d
   23/10/06 14:02:13 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-83f4e447-c562-4216-85b2-cf2679a30221
   ```
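   The root cause says the schema-provider property is missing from the props file I passed. For comparison, this is what I understand the relevant lines of `/var/demo/config/kafka-source.properties` should look like, based on the 0.13.0 docker demo docs (the exact paths here are my assumption, not the verified contents of my container):

   ```properties
   # Assumed entries per the 0.13.0 docker demo docs -- worth diffing
   # against the actual file inside the adhoc-2 container.
   hoodie.deltastreamer.source.kafka.topic=stock_ticks
   hoodie.deltastreamer.schemaprovider.source.schema.file=/var/demo/config/schema.avsc
   hoodie.deltastreamer.schemaprovider.target.schema.file=/var/demo/config/schema.avsc
   bootstrap.servers=kafkabroker:9092
   ```

   If the file inside the container lacks the `source.schema.file` line, that alone would explain the exception.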
   
   Please help me check whether my understanding of the guide is correct and where the problem lies.
   On an M1 chip, does the demo need to be run from the master branch? Do I need to compile first? Compilation on the M1 chip produces an error.
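   For context, the bottom-most cause in the trace is just a required-property check (`DataSourceUtils.checkRequiredProperties`). A minimal sketch of that pattern — the class name, exception type, and message wording below are illustrative, not Hudi's actual code:

   ```java
   import java.util.Collections;
   import java.util.List;
   import java.util.Properties;

   // Hypothetical re-creation of the required-property check the stack
   // trace points at; Hudi throws its own exception type instead.
   public class RequiredPropsCheck {

       static void checkRequiredProperties(Properties props, List<String> required) {
           required.forEach(key -> {
               if (!props.containsKey(key)) {
                   throw new IllegalStateException("Required property " + key + " is missing");
               }
           });
       }

       public static void main(String[] args) {
           // Simulate a kafka-source.properties that lacks the schema file entry.
           Properties props = new Properties();
           try {
               checkRequiredProperties(props, Collections.singletonList(
                   "hoodie.deltastreamer.schemaprovider.source.schema.file"));
           } catch (IllegalStateException e) {
               System.out.println(e.getMessage());
           }
       }
   }
   ```

   So the failure is purely about the props file contents, not about class loading itself — the `IOException: Could not load schema provider class` wrapper is misleading.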
   
   
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
