3216670078 opened a new issue, #7063:
URL: https://github.com/apache/paimon/issues/7063

   ### Search before asking
   
   - [x] I searched in the [issues](https://github.com/apache/paimon/issues) 
and found nothing similar.
   
   
   ### Paimon version
   
   1.3.1
   
   ### Compute Engine
   
   Hive 3.1.3
   
   ### Minimal reproduce step
   
   ```sql
   CREATE TABLE test_paimon (
       id INT,
       name STRING,
       salary DOUBLE,
       department STRING
   )
   STORED BY 'org.apache.paimon.hive.PaimonStorageHandler'
   TBLPROPERTIES (
       'primary-key' = 'id',
       'bucket' = '-1',
       'file.format' = 'parquet'
   );

   INSERT INTO TABLE test_paimon
   VALUES
   (1, 'Alice', 5000.0, 'IT'),
   (2, 'Bob', 6000.0, 'HR'),
   (3, 'Charlie', 5500.0, 'IT');
   ```
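
   For context on the reproduce step: the table sets `'bucket' = '-1'`, which puts Paimon in dynamic bucket mode. In that mode a bucket must be assigned by the writing engine, and the error below suggests the Hive writer (`PaimonRecordWriter`) writes rows without a bucket. As a workaround sketch only (an assumption on my part, not a verified fix), the same DDL with a fixed bucket count avoids the dynamic-bucket code path:

   ```sql
   -- Workaround sketch (assumption, not verified): a fixed bucket count
   -- instead of '-1' so the Hive writer never needs dynamic bucket assignment.
   CREATE TABLE test_paimon_fixed (
       id INT,
       name STRING,
       salary DOUBLE,
       department STRING
   )
   STORED BY 'org.apache.paimon.hive.PaimonStorageHandler'
   TBLPROPERTIES (
       'primary-key' = 'id',
       'bucket' = '2',          -- fixed bucket count, chosen arbitrarily here
       'file.format' = 'parquet'
   );
   ```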
   
   ### What doesn't meet your expectations?
   
   Diagnostic Messages for this Task:

   ```
   Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)
           at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
           at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
           at org.apache.hadoop.hive.ql.exec.mr.ExecMapRunner.run(ExecMapRunner.java:37)
           at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:465)
           at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
           at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
           at java.security.AccessController.doPrivileged(Native Method)
           at javax.security.auth.Subject.doAs(Subject.java:422)
           at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
           at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
   Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)
           at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:568)
           at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:154)
           ... 9 more
   Caused by: java.lang.RuntimeException: java.lang.IllegalArgumentException: Can't extract bucket from row in dynamic bucket mode, you should use 'TableWrite.write(InternalRow row, int bucket)' method.
           at org.apache.paimon.hive.mapred.PaimonRecordWriter.write(PaimonRecordWriter.java:69)
           at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:987)
           at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
           at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
           at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:928)
           at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
           at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
           at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
           at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:928)
           at org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
           at org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
           at org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
           at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
           at org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116)
           at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
           at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
           at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:928)
           at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
           at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
           at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
           at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:125)
           at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:153)
           at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:555)
           ... 10 more
   Caused by: java.lang.IllegalArgumentException: Can't extract bucket from row in dynamic bucket mode, you should use 'TableWrite.write(InternalRow row, int bucket)' method.
           at org.apache.paimon.table.sink.DynamicBucketRowKeyExtractor.bucket(DynamicBucketRowKeyExtractor.java:44)
           at org.apache.paimon.table.sink.TableWriteImpl.toSinkRecord(TableWriteImpl.java:213)
           at org.apache.paimon.table.sink.TableWriteImpl.writeAndReturn(TableWriteImpl.java:190)
           at org.apache.paimon.table.sink.TableWriteImpl.writeAndReturn(TableWriteImpl.java:179)
           at org.apache.paimon.table.sink.TableWriteImpl.write(TableWriteImpl.java:157)
           at org.apache.paimon.hive.mapred.PaimonRecordWriter.write(PaimonRecordWriter.java:67)
           ... 32 more

   2026-01-16 12:29:42,178 ERROR [Thread-35] exec.Task (SessionState.java:printError(1250)) -
   ```
   
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [ ] I'm willing to submit a PR!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]