Aiden-Dong opened a new issue, #5932:
URL: https://github.com/apache/paimon/issues/5932

   ### Search before asking
   
   - [x] I searched in the [issues](https://github.com/apache/paimon/issues) and found nothing similar.
   
   
   ### Paimon version
   
   master
   
   ### Compute Engine
   
   spark 3.3.1
   
   ### Minimal reproduce step
   
   Create the table:
   ```
   CREATE TABLE T( 
       f1 int, 
       f2 string, 
       f3 string, 
       f4 string) 
    TBLPROPERTIES (    
     'bucket'='1', 
     'primary-key'='f1', 
     'write-only'='true', 
     'merge-engine'='partial-update',   
     'file.format'='parquet', 
     'file.compression' = 'zstd'  
    )
   ```
   Insert values for only a subset of the columns:
   ```
   INSERT INTO T(f1, f4) VALUES(1, 'test')
   ```
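
   As a possible workaround until the analyzer handles column-subset inserts, the omitted columns can be padded with explicit NULLs so the projected column count matches the table schema (a sketch; it relies on the partial-update merge engine's default behavior of not overwriting existing data with NULL values):
   ```
   -- Pad omitted columns with typed NULLs to match the table schema.
   -- With 'merge-engine'='partial-update', NULLs do not overwrite by default.
   INSERT INTO T VALUES (1, CAST(NULL AS STRING), CAST(NULL AS STRING), 'test');
   ```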
   
   ### What doesn't meet your expectations?
   
   ```
   Cannot write incompatible data for the table `test.T`, the number of data columns don't match with the table schema's.
   java.lang.RuntimeException: Cannot write incompatible data for the table `test.T`, the number of data columns don't match with the table schema's.
        at org.apache.paimon.spark.catalyst.analysis.PaimonAnalysis.resolveQueryColumnsByPosition(PaimonAnalysis.scala:135)
        at org.apache.paimon.spark.catalyst.analysis.PaimonAnalysis.org$apache$paimon$spark$catalyst$analysis$PaimonAnalysis$$resolveQueryColumns(PaimonAnalysis.scala:78)
        at org.apache.paimon.spark.catalyst.analysis.PaimonAnalysis$$anonfun$apply$1.applyOrElse(PaimonAnalysis.scala:43)
        at org.apache.paimon.spark.catalyst.analysis.PaimonAnalysis$$anonfun$apply$1.applyOrElse(PaimonAnalysis.scala:41)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$2(AnalysisHelper.scala:170)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:176)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$1(AnalysisHelper.scala:170)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning(AnalysisHelper.scala:168)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning$(AnalysisHelper.scala:164)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDownWithPruning(LogicalPlan.scala:30)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:160)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:159)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:30)
        at org.apache.paimon.spark.catalyst.analysis.PaimonAnalysis.apply(PaimonAnalysis.scala:41)
        at org.apache.paimon.spark.catalyst.analysis.PaimonAnalysis.apply(PaimonAnalysis.scala:39)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:211)
        at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
        at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
        at scala.collection.immutable.List.foldLeft(List.scala:91)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:208)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:200)
   ```
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [x] I'm willing to submit a PR!

