gianm commented on pull request #10267:
URL: https://github.com/apache/druid/pull/10267#issuecomment-738621133


   > I didn't see the ClassCastException in the log for the failed batch test though.
   
   I finally ran this locally and was able to extract the full error. I think 
it's happening because `__time` is getting added to the dimensions list of an 
input row. (This makes a string dimension named `__time` get created, which 
violates all sorts of assumptions.)
   
   We'll need to prevent that from happening somehow. I'll look into it.
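   As a sketch of one possible guard (names here are illustrative, not Druid's actual API): filter reserved column names such as `__time` out of an incoming dimensions list before any dimension indexers get created for them.

   ```java
   import java.util.Arrays;
   import java.util.List;
   import java.util.stream.Collectors;

   // Hypothetical sketch only: drop reserved column names (like __time) from a
   // dimensions list so no string dimension indexer is ever built for them.
   public class ReservedColumnFilter
   {
     public static final String TIME_COLUMN = "__time";

     public static List<String> withoutReservedColumns(List<String> dimensions)
     {
       return dimensions.stream()
                        .filter(dim -> !TIME_COLUMN.equals(dim))
                        .collect(Collectors.toList());
     }

     public static void main(String[] args)
     {
       List<String> dims = Arrays.asList("__time", "page", "user");
       System.out.println(withoutReservedColumns(dims)); // prints [page, user]
     }
   }
   ```

   Whether the filtering belongs in the input-row parsing path or in the dimensions-spec handling is exactly the open question above.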
   
   ```
   Caused by: java.lang.ClassCastException: org.apache.druid.segment.incremental.IncrementalIndexRowHolder cannot be cast to org.apache.druid.segment.DimensionSelector
           at org.apache.druid.segment.StringDimensionIndexer.convertUnsortedValuesToSorted(StringDimensionIndexer.java:793) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.incremental.IncrementalIndexRowIterator.lambda$makeRowPointer$0(IncrementalIndexRowIterator.java:78) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_232]
           at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[?:1.8.0_232]
           at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[?:1.8.0_232]
           at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_232]
           at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_232]
           at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:546) ~[?:1.8.0_232]
           at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260) ~[?:1.8.0_232]
           at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:505) ~[?:1.8.0_232]
           at org.apache.druid.segment.incremental.IncrementalIndexRowIterator.makeRowPointer(IncrementalIndexRowIterator.java:80) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.incremental.IncrementalIndexRowIterator.<init>(IncrementalIndexRowIterator.java:54) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.incremental.IncrementalIndexAdapter.getRows(IncrementalIndexAdapter.java:162) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.IndexMergerV9.makeMergedTimeAndDimsIterator(IndexMergerV9.java:1106) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.IndexMergerV9.makeIndexFiles(IndexMergerV9.java:255) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.IndexMergerV9.merge(IndexMergerV9.java:999) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.IndexMergerV9.persist(IndexMergerV9.java:864) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.IndexMergerV9.persist(IndexMergerV9.java:833) ~[druid-processing-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.persistHydrant(AppenderatorImpl.java:1334) ~[druid-server-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl.access$100(AppenderatorImpl.java:102) ~[druid-server-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
           at org.apache.druid.segment.realtime.appenderator.AppenderatorImpl$2.call(AppenderatorImpl.java:547) ~[druid-server-0.21.0-SNAPSHOT.jar:0.21.0-SNAPSHOT]
   ```


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


