weimingdiit commented on code in PR #8640:
URL: https://github.com/apache/hudi/pull/8640#discussion_r1190676870


##########
hudi-client/hudi-spark-client/src/main/java/org/apache/hudi/table/action/bootstrap/SparkBootstrapCommitActionExecutor.java:
##########
@@ -326,6 +326,8 @@ private Map<BootstrapMode, List<Pair<String, List<HoodieFileStatus>>>> listAndPr
       if (!(selector instanceof FullRecordBootstrapModeSelector)) {
         FullRecordBootstrapModeSelector fullRecordBootstrapModeSelector = new FullRecordBootstrapModeSelector(config);
         result.putAll(fullRecordBootstrapModeSelector.select(folders));
+      } else {
+        result.putAll(selector.select(folders));
       }

Review Comment:
   @danny0405 Hi Danny, I posted the detailed information in the JIRA ticket earlier; below are the command I used and the full error log.
   
   https://issues.apache.org/jira/browse/HUDI-6107
   
   An ORC- or Parquet-format Hive table reports an error when using bootstrap to convert it into a Hudi table.
   
   My command:
   
   spark-submit \
   --queue root.default_queue \
   --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
   --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer hudijar/hudi-utilities-bundle_2.12-0.13.0.jar \
   --run-bootstrap \
   --target-base-path /user/prod_datalake_test/datalake_test/hive/datalake_test/wm_test_bootstrap_hudi01 \
   --target-table wm_test_bootstrap_hudi01 \
   --table-type COPY_ON_WRITE \
   --base-file-format PARQUET \
   --hoodie-conf hoodie.bootstrap.base.path=/user/prod_datalake_test/datalake_test_dev/hive/datalake_test_dev/wm_test_bootstrap_hudi01 \
   --hoodie-conf hoodie.datasource.write.recordkey.field=account \
   --hoodie-conf hoodie.datasource.write.partitionpath.field=dt \
   --hoodie-conf hoodie.datasource.write.precombine.field=account \
   --hoodie-conf hoodie.bootstrap.keygen.class=org.apache.hudi.keygen.SimpleKeyGenerator \
   --hoodie-conf hoodie.bootstrap.full.input.provider=org.apache.hudi.bootstrap.SparkOrcBootstrapDataProvider \
   --hoodie-conf hoodie.bootstrap.mode.selector=org.apache.hudi.client.bootstrap.selector.FullRecordBootstrapModeSelector \
   --hoodie-conf hoodie.bootstrap.mode.selector.regex.mode=FULL_RECORD \
   --hoodie-conf hoodie.datasource.write.hive_style_partitioning=true \
   --enable-sync \
   --hoodie-conf hoodie.datasource.hive_sync.mode=HMS \
   --hoodie-conf hoodie.datasource.hive_sync.database=datalake_test \
   --hoodie-conf hoodie.datasource.hive_sync.auto_create_database=true \
   --hoodie-conf hoodie.datasource.hive_sync.create_managed_table=true \
   --hoodie-conf hoodie.datasource.hive_sync.table=wm_test_bootstrap_hudi01 \
   --hoodie-conf hoodie.datasource.hive_sync.partition_fields=dt \
   --hoodie-conf hoodie.datasource.hive_sync.partition_extractor_class=org.apache.hudi.hive.MultiPartKeysValueExtractor
    
   
   Error log:
   
   23/04/20 14:12:43 ERROR ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException
   java.lang.IllegalArgumentException
       at org.apache.hudi.common.util.ValidationUtils.checkArgument(ValidationUtils.java:31)
       at org.apache.hudi.table.action.bootstrap.SparkBootstrapCommitActionExecutor.listAndProcessSourcePartitions(SparkBootstrapCommitActionExecutor.java:337)
       at org.apache.hudi.table.action.bootstrap.SparkBootstrapCommitActionExecutor.execute(SparkBootstrapCommitActionExecutor.java:134)
       at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.bootstrap(HoodieSparkCopyOnWriteTable.java:187)
       at org.apache.hudi.client.SparkRDDWriteClient.bootstrap(SparkRDDWriteClient.java:131)
       at org.apache.hudi.utilities.deltastreamer.BootstrapExecutor.execute(BootstrapExecutor.java:167)
       at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.sync(HoodieDeltaStreamer.java:189)
       at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:573)
       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
       at java.lang.reflect.Method.invoke(Method.java:498)
       at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:748)
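   The failure mode behind this trace can be illustrated with a minimal, self-contained Java sketch. Note the types and the partition-count check below are hypothetical stand-ins for Hudi's real classes (`FullRecordBootstrapModeSelector`, `ValidationUtils.checkArgument`), used only to show the control flow: before the patch, when the configured selector already is the full-record selector, neither branch copies anything into `result`, so the subsequent validation fails.

   ```java
   import java.util.ArrayList;
   import java.util.Arrays;
   import java.util.HashMap;
   import java.util.List;
   import java.util.Map;

   public class BootstrapSelectorSketch {

     interface ModeSelector {
       Map<String, List<String>> select(List<String> folders);
     }

     // Stand-in for FullRecordBootstrapModeSelector: assigns every folder FULL_RECORD mode.
     static class FullRecordSelector implements ModeSelector {
       public Map<String, List<String>> select(List<String> folders) {
         Map<String, List<String>> m = new HashMap<>();
         m.put("FULL_RECORD", new ArrayList<>(folders));
         return m;
       }
     }

     static Map<String, List<String>> listAndProcess(ModeSelector selector,
                                                     List<String> folders,
                                                     boolean patched) {
       Map<String, List<String>> result = new HashMap<>();
       if (!(selector instanceof FullRecordSelector)) {
         result.putAll(new FullRecordSelector().select(folders));
       } else if (patched) {
         // The added else-branch from the diff: use the configured selector's own output.
         result.putAll(selector.select(folders));
       }
       // Hypothetical stand-in for the checkArgument that throws at line 337:
       // every source partition must have been assigned a bootstrap mode.
       long assigned = result.values().stream().mapToLong(List::size).sum();
       if (assigned != folders.size()) {
         throw new IllegalArgumentException("not every partition got a bootstrap mode");
       }
       return result;
     }

     public static void main(String[] args) {
       List<String> folders = Arrays.asList("dt=2023-04-01", "dt=2023-04-02");
       ModeSelector selector = new FullRecordSelector();
       try {
         listAndProcess(selector, folders, false);
         System.out.println("unpatched: no error");
       } catch (IllegalArgumentException e) {
         System.out.println("unpatched: IllegalArgumentException"); // reproduced
       }
       Map<String, List<String>> r = listAndProcess(selector, folders, true);
       System.out.println("patched: " + r.get("FULL_RECORD").size() + " partitions selected");
     }
   }
   ```

   Running the sketch reproduces the IllegalArgumentException without the else-branch and succeeds with it, which matches the behavior reported in HUDI-6107.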
   


