KevinyhZou opened a new issue, #7633:
URL: https://github.com/apache/incubator-gluten/issues/7633

   ### Backend
   
   CH (ClickHouse)
   
   ### Bug description
   
   Test table schema: `test_tbl1(id bigint, d1 array<string>, d2 array<string>)`
   
   Test SQL:
   ```sql
   select dc from (
   select id, d2
   from test_tbl1 lateral view explode(d1) as content
   ) lateral view explode(d2) as dc;
   ```
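
   A minimal DDL sketch to set up the table (the issue states only the schema, not the `CREATE TABLE` statement or any sample data, so the storage format and values below are assumptions):
   ```sql
   -- Hypothetical setup; only the column names/types come from the issue.
   CREATE TABLE test_tbl1 (
     id BIGINT,
     d1 ARRAY<STRING>,
     d2 ARRAY<STRING>
   ) USING parquet;

   INSERT INTO test_tbl1 VALUES
     (1, array('a', 'b'), array('x', 'y'));
   ```
   With this data, the query should produce one `dc` row per element of `d2` for each element of `d1`; on the CH backend it instead fails with the exception below.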
   
   Exception:
   ```
   Caused by: org.apache.gluten.exception.GlutenException: org.apache.gluten.exception.GlutenException: Not found column assumeNotNull(ifNull(d1,[]_26)) in block. There are only columns: assumeNotNull(ifNull(d2,[]_27)): While executing ArrayJoinTransform
   0. Poco::Exception::Exception(String const&, int) @ 0x0000000013dfc739
   1. DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 
0x000000000ba67bb9
   2. DB::Exception::Exception(PreformattedMessage&&, int) @ 0x000000000652d84c
   3. DB::Exception::Exception<String const&, String>(int, 
FormatStringHelperImpl<std::type_identity<String const&>::type, 
std::type_identity<String>::type>, String const&, String&&) @ 0x00000000065f7a4b
   4. DB::Block::getByName(String const&, bool) const @ 0x000000000ec0d20e
   5. DB::ArrayJoinResultIterator::ArrayJoinResultIterator(DB::ArrayJoinAction 
const*, DB::Block) @ 0x000000000f59bbdf
   6. std::__unique_if<DB::ArrayJoinResultIterator>::__unique_single 
std::make_unique[abi:v15007]<DB::ArrayJoinResultIterator, DB::ArrayJoinAction*, 
DB::Block>(DB::ArrayJoinAction*&&, DB::Block&&) @ 0x000000000f59f0f7
   7. DB::ArrayJoinAction::execute(DB::Block) @ 0x000000000f59b9fd
   8. DB::ArrayJoinTransform::consume(DB::Chunk) @ 0x00000000117012a9
   9. DB::IInflatingTransform::work() @ 0x00000000115c7c4a
   10. DB::ExecutionThreadContext::executeTask() @ 0x000000001145cda2
   11. DB::PipelineExecutor::executeStepImpl(unsigned long, std::atomic<bool>*) 
@ 0x00000000114511df
   12. DB::PipelineExecutor::executeStep(std::atomic<bool>*) @ 
0x0000000011450d49
   13. DB::PullingPipelineExecutor::pull(DB::Chunk&) @ 0x0000000011465254
   14. DB::PullingPipelineExecutor::pull(DB::Block&) @ 0x00000000114653b9
   15. local_engine::LocalExecutor::hasNext() @ 0x000000000bee9711
   16. Java_org_apache_gluten_vectorized_BatchIterator_nativeHasNext @ 
0x0000000006512e37
   
           at 
org.apache.gluten.iterator.ClosableIterator.hasNext(ClosableIterator.java:38)
           at 
org.apache.gluten.backendsapi.clickhouse.CollectMetricIterator.hasNext(CHIteratorApi.scala:346)
           at 
org.apache.gluten.vectorized.CloseableCHColumnBatchIterator.$anonfun$hasNext$1(CloseableCHColumnBatchIterator.scala:42)
           at 
scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
           at 
org.apache.gluten.metrics.GlutenTimeMetric$.withNanoTime(GlutenTimeMetric.scala:41)
           at 
org.apache.gluten.vectorized.CloseableCHColumnBatchIterator.hasNext(CloseableCHColumnBatchIterator.scala:42)
           at 
org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
           at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
           at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:364)
           at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:890)
           at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:890)
           at 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
           at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92)
           at org.apache.spark.scheduler.Task.run(Task.scala:136)
           at 
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
           at 
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
   ```
   
   ### Spark version
   
   Spark-3.3.x
   
   ### Spark configurations
   
   _No response_
   
   ### System information
   
   _No response_
   
   ### Relevant logs
   
   _No response_

