wForget opened a new issue, #2338: URL: https://github.com/apache/datafusion-comet/issues/2338
### Describe the bug

Got an error during fuzz testing: https://github.com/wForget/fuzz-test-spark-native/issues/271

```
[ERROR] Query failed in native engine comet: Job aborted due to stage failure: Task 0 in stage 294.0 failed 1 times, most recent failure: Lost task 0.0 in stage 294.0 (TID 207) (pkrvm7jw40e0xgp.lwu44u4eecjufa4b0wdo1cfjyf.cx.internal.cloudapp.net executor driver): org.apache.comet.CometNativeException: internal error: entered unreachable code: CharacterLengthFunc
        at comet::errors::init::{{closure}}(__internal__:0)
        at std::panicking::rust_panic_with_hook(__internal__:0)
        at std::panicking::begin_panic_handler::{{closure}}(__internal__:0)
        at std::sys::backtrace::__rust_end_short_backtrace(__internal__:0)
        at __rustc::rust_begin_unwind(__internal__:0)
        at core::panicking::panic_fmt(__internal__:0)
        at <datafusion_functions::unicode::character_length::CharacterLengthFunc as datafusion_expr::udf::ScalarUDFImpl>::invoke_with_args(__internal__:0)
        at datafusion_expr::udf::ScalarUDF::invoke_with_args(__internal__:0)
        at <datafusion_physical_expr::scalar_function::ScalarFunctionExpr as datafusion_physical_expr_common::physical_expr::PhysicalExpr>::evaluate(__internal__:0)
        at <core::iter::adapters::GenericShunt<I,R> as core::iter::traits::iterator::Iterator>::next(__internal__:0)
        at <datafusion_physical_plan::projection::ProjectionStream as futures_core::stream::Stream>::poll_next(__internal__:0)
        at comet::execution::jni_api::Java_org_apache_comet_Native_executePlan::{{closure}}::{{closure}}(__internal__:0)
        at Java_org_apache_comet_Native_executePlan(__internal__:0)
        at <unknown>(__internal__:0)
        at org.apache.comet.Native.executePlan(Native Method)
        at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$2(CometExecIterator.scala:165)
        at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$2$adapted(CometExecIterator.scala:164)
        at org.apache.comet.vector.NativeUtil.getNextBatch(NativeUtil.scala:157)
        at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$1(CometExecIterator.scala:164)
        at org.apache.comet.Tracing$.withTrace(Tracing.scala:31)
        at org.apache.comet.CometExecIterator.getNextBatch(CometExecIterator.scala:162)
        at org.apache.comet.CometExecIterator.hasNext(CometExecIterator.scala:213)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
        at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
        at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
        at org.apache.spark.util.random.SamplingUtils$.reservoirSampleAndCount(SamplingUtils.scala:41)
        at org.apache.spark.RangePartitioner$.$anonfun$sketch$1(Partitioner.scala:322)
        at org.apache.spark.RangePartitioner$.$anonfun$sketch$1$adapted(Partitioner.scala:320)
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:910)
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:910)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)
        at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
        at org.apache.spark.scheduler.Task.run(Task.scala:141)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:621)
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:624)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:840)

Driver stacktrace:
```

### Steps to reproduce

```scala
test("test char length") {
  withTable("t1") {
    sql("create table t1 using parquet as select cast(id as binary) as c1 from range(10)")
    sql("SELECT c1, length(c1) AS x FROM t1 ORDER BY c1;").show()
  }
}
```

### Expected behavior

_No response_

### Additional context

_No response_
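My reading of the stack trace (an assumption, not a confirmed root cause): Spark's `length` on a BINARY input returns the byte count, which corresponds to DataFusion's `octet_length`, whereas DataFusion's `character_length` counts characters and only supports string types, so routing `length(c1)` on a binary column to `CharacterLengthFunc` hits its `unreachable!`. A minimal sketch of the expected Spark semantics, runnable in `spark-shell` with Comet disabled via the standard `spark.comet.enabled` flag:

```scala
// Sketch only -- illustrates the Spark semantics that Comet needs to match.
// Disable Comet so Spark's built-in implementation evaluates length().
spark.conf.set("spark.comet.enabled", "false")

// On BINARY, length() returns the byte count: 'é' encodes to 2 bytes in
// UTF-8, so this should return 6 even though the string has 5 characters.
spark.sql("SELECT length(cast('héllo' as binary)) AS byte_len").show()

// On STRING, length() returns the character count (5). Character counting
// is what DataFusion's character_length implements, but only for string
// types -- any other input type hits the unreachable!() seen in the trace.
spark.sql("SELECT length('héllo') AS char_len").show()
```

If that reading is right, a likely direction for a fix would be mapping `length` on binary inputs to DataFusion's `octet_length` (or falling back to Spark) rather than `character_length`.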
