bersprockets opened a new pull request, #39518:
URL: https://github.com/apache/spark/pull/39518

   ### What changes were proposed in this pull request?
   
   Change `CheckOverflowInTableInsert` to accept a `Cast` wrapped by an 
`ExpressionProxy` as a child.
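
   For context, a rough sketch of the direction of the change (illustrative only, not the actual diff; the helper name `underlyingCast` is a placeholder, everything lives in `org.apache.spark.sql.catalyst.expressions`): the child is declared as a plain `Expression`, and the underlying `Cast` is looked up through a possible `ExpressionProxy` when needed.
   ```scala
   // Sketch only: relax the child type so tree rewrites with a proxy child succeed.
   case class CheckOverflowInTableInsert(child: Expression, columnName: String)
       extends UnaryExpression {

     override protected def withNewChildInternal(newChild: Expression): Expression =
       copy(child = newChild)  // no longer assumes newChild is a Cast

     // Look through an ExpressionProxy when the underlying Cast is required.
     private def underlyingCast: Option[Cast] = child match {
       case c: Cast => Some(c)
       case p: ExpressionProxy => p.child match {
         case c: Cast => Some(c)
         case _ => None
       }
       case _ => None
     }

     // remaining members elided
   }
   ```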
   
   ### Why are the changes needed?
   
   This insert statement fails:
   ```
   drop table if exists tbl1;
   create table tbl1 (a int, b int) using parquet;
   
   set spark.sql.codegen.wholeStage=false;
   set spark.sql.codegen.factoryMode=NO_CODEGEN;
   
   insert into tbl1
   select id as a, id as b
   from range(1, 5);
   ```
   It fails with the following exception:
   ```
   java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.ExpressionProxy cannot be cast to org.apache.spark.sql.catalyst.expressions.Cast
        at org.apache.spark.sql.catalyst.expressions.CheckOverflowInTableInsert.withNewChildInternal(Cast.scala:2514)
        at org.apache.spark.sql.catalyst.expressions.CheckOverflowInTableInsert.withNewChildInternal(Cast.scala:2512)
   ```
   The query produces 2 bigint values, but the table's schema expects 2 int 
values, so Spark wraps each output field with a `Cast`.
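
   In other words, each output column conceptually ends up as an expression tree like the following (an illustrative construction, not actual analyzer output; `idAttr` stands in for the `id` attribute produced by `range`):
   ```scala
   // Illustrative: the bigint `id` is cast to the table's int column, and the
   // cast is wrapped so that overflow raises an error rather than truncating.
   CheckOverflowInTableInsert(Cast(idAttr, IntegerType), "a")
   CheckOverflowInTableInsert(Cast(idAttr, IntegerType), "b")
   ```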
   
   Later, in `InterpretedUnsafeProjection`, `prepareExpressions` wraps each of the two `Cast` expressions in an `ExpressionProxy`. However, the parent of each `Cast` is a `CheckOverflowInTableInsert` expression, which requires a `Cast` child and therefore fails when handed an `ExpressionProxy`.
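
   The `ClassCastException` comes from the existing `withNewChildInternal` override, which downcasts the new child to `Cast` because the child field is declared with that type. Roughly the pre-fix shape (a simplified sketch, not a verbatim excerpt):
   ```scala
   // Pre-fix sketch: the child is typed as Cast, so replacing it with an
   // ExpressionProxy during tree transformation triggers the downcast failure
   // seen in the stack trace above.
   case class CheckOverflowInTableInsert(child: Cast, columnName: String)
       extends UnaryExpression {
     override protected def withNewChildInternal(newChild: Expression): Expression =
       copy(child = newChild.asInstanceOf[Cast])  // throws when newChild is an ExpressionProxy
     // remaining members elided
   }
   ```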
   
   ### Does this PR introduce _any_ user-facing change?
   
   No.
   
   ### How was this patch tested?
   
   New unit tests.
   


