rangareddy commented on issue #13752:
URL: https://github.com/apache/hudi/issues/13752#issuecomment-3218893256

   Hi @YinChunGuang 
   
   I was able to reproduce this issue. The problem is the single quotes in the `SET` command: spark-sql stores them as part of the value, so Hudi receives `'upsert'` instead of `upsert` and cannot resolve the write operation type. The insert executed successfully after the quotes were removed. We will update the documentation shortly.
   
   ```sql
   CREATE TABLE hudi_table_t2722 (
   application_id BIGINT,
   application_time1 STRING,
   application_time2 STRING,
   application_time3 STRING,
   application_time4 STRING,
   application_time5 STRING,
   application_time6 STRING,
   application_time7 STRING,
   application_time8 STRING,
   application_time9 STRING
   ) USING HUDI
   TBLPROPERTIES (
   type = 'cow',
   primaryKey = 'application_id',
   'hoodie.datasource.write.precombine.field' = 'application_time1',
   'hoodie.datasource.write.hive_style_partitioning' = 'true',
   'hoodie.cleaner.policy' = 'KEEP_LATEST_BY_HOURS',
   'hoodie.cleaner.hours.retained' = '1',
   'hoodie.datasource.write.table.type' = 'COPY_ON_WRITE',
   'hoodie.datasource.write.operation' = 'upsert',
   'hoodie.datasource.meta.sync.enable' = 'true',
   'hoodie.datasource.hive_sync.mode' = 'hms',
   'hoodie.datasource.hive_sync.table' = 'hudi_table_t2722'
   );
   
   CREATE TABLE hudi_table_t2721 (
   application_id BIGINT,
   application_time1 STRING,
   application_time2 STRING,
   application_time3 STRING,
   application_time4 STRING,
   application_time5 STRING,
   application_time6 STRING,
   application_time7 STRING,
   application_time8 STRING,
   application_time9 STRING
   ) USING PARQUET;
   
   spark-sql (default)> set hoodie.datasource.write.operation;
   key  value
   hoodie.datasource.write.operation    <undefined>
   
   spark-sql (default)> set hoodie.datasource.write.operation='upsert';
   key  value
   hoodie.datasource.write.operation    'upsert'
   
   spark-sql (default)> insert into hudi_table_t2722 select * from hudi_table_t2721 limit 1;
   25/08/25 05:25:30 ERROR SparkSQLDriver: Failed in [insert into hudi_table_t2722 select * from hudi_table_t2721 limit 1]
   org.apache.hudi.exception.HoodieException: Invalid value of Type.
        at org.apache.hudi.common.model.WriteOperationType.fromValue(WriteOperationType.java:108)
        at org.apache.hudi.HoodieSparkSqlWriterInternal.deduceOperation(HoodieSparkSqlWriter.scala:545)
        at org.apache.hudi.HoodieSparkSqlWriterInternal.writeInternal(HoodieSparkSqlWriter.scala:257)
        at org.apache.hudi.HoodieSparkSqlWriterInternal.$anonfun$write$1(HoodieSparkSqlWriter.scala:191)
        at org.apache.hudi.HoodieSparkSqlWriterInternal.write(HoodieSparkSqlWriter.scala:209)
        at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:128)
        at org.apache.spark.sql.hudi.command.InsertIntoHoodieTableCommand$.run(InsertIntoHoodieTableCommand.scala:108)
        at org.apache.spark.sql.hudi.command.InsertIntoHoodieTableCommand.run(InsertIntoHoodieTableCommand.scala:68)
        at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:113)
        at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:111)
        at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:125)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:107)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:125)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:107)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:461)
   
   spark-sql (default)> set hoodie.datasource.write.operation=upsert;
   key  value
   hoodie.datasource.write.operation    upsert
   
   spark-sql (default)> insert into hudi_table_t2722 select * from hudi_table_t2721 limit 1;
   ...
   Response code
   Time taken: 22.207 seconds
   ```
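
   The failure mode can be sketched outside of Spark: `WriteOperationType.fromValue` (the top frame of the stack trace) only accepts the known operation names, and the extra quotes make the value unrecognizable. A minimal Python sketch of that kind of lookup, where the `from_value` helper and the operation set are illustrative rather than Hudi's actual code:

   ```python
   # Illustrative only: mimics a strict lookup like WriteOperationType.fromValue.
   # The operation names below mirror common Hudi write operations, but this is
   # not Hudi's actual implementation.
   OPERATIONS = {"insert", "upsert", "bulk_insert", "delete", "insert_overwrite"}

   def from_value(value: str) -> str:
       """Return the operation name, or raise like HoodieException does."""
       if value not in OPERATIONS:
           raise ValueError(f"Invalid value of Type: {value!r}")
       return value

   print(from_value("upsert"))   # the unquoted SET value resolves fine

   try:
       from_value("'upsert'")    # spark-sql kept the quotes as part of the value
   except ValueError as e:
       print(e)
   ```

   This is exactly the `'upsert'`-vs-`upsert` difference visible in the two `SET` outputs above, which is why only the unquoted form works.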

