[ https://issues.apache.org/jira/browse/SPARK-23316?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16350188#comment-16350188 ]

Bogdan Raducanu edited comment on SPARK-23316 at 2/2/18 11:29 AM:
------------------------------------------------------------------

I think it's related to {{In.checkInputTypes}}: in 2.2 it does not check 
nullability, while in 2.3 it does, by using {{DataType.equalsStructurally}}.
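For illustration, a minimal spark-shell sketch (assuming the shell's {{spark}} session on an affected 2.3.0 build; the {{id2}} alias is only for this example) that prints the nullability of each side of the IN. It is consistent with the error below showing identical types on both sides yet still reporting a mismatch: only the nullability differs.

{code:scala}
// Left side of the IN: both tuple elements come from range(), so they are
// non-nullable bigint columns.
spark.range(10).selectExpr("id", "id as id2").schema
  .foreach(f => println(s"${f.name}: ${f.dataType}, nullable=${f.nullable}"))

// Right side (the subquery output): the NULL literal produces a column that is
// nullable (and initially untyped), so a structural comparison that also looks
// at nullability sees the two sides as different.
spark.sql("select id, null from range(3)").schema
  .foreach(f => println(s"${f.name}: ${f.dataType}, nullable=${f.nullable}"))
{code}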


> AnalysisException after max iteration reached for IN query
> ----------------------------------------------------------
>
>                 Key: SPARK-23316
>                 URL: https://issues.apache.org/jira/browse/SPARK-23316
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Bogdan Raducanu
>            Priority: Major
>
> Query to reproduce:
> {code:scala}
> spark.range(10).where("(id,id) in (select id, null from range(3))").show
> {code}
> {code}
> 18/02/02 11:32:31 WARN BaseSessionStateBuilder$$anon$1: Max iterations (100) 
> reached for batch Resolution
> org.apache.spark.sql.AnalysisException: cannot resolve '(named_struct('id', 
> `id`, 'id', `id`) IN (listquery()))' due to data type mismatch:
> The data type of one or more elements in the left hand side of an IN subquery
> is not compatible with the data type of the output of the subquery
> Mismatched columns:
> []
> Left side:
> [bigint, bigint].
> Right side:
> [bigint, bigint].;;
> {code}
> The error message includes the last plan, which contains ~100 useless Projects.
> This does not happen in branch-2.2.
> It has something to do with TypeCoercion, which is making a futile attempt to
> change nullability.
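
A quick way to confirm the failure mode described above (a sketch, assuming the spark-shell {{spark}} session on an affected 2.3.0 build; printing only the first few lines is an arbitrary choice to avoid dumping the whole embedded plan):

{code:scala}
import org.apache.spark.sql.AnalysisException

try {
  spark.range(10).where("(id,id) in (select id, null from range(3))").show
} catch {
  case e: AnalysisException =>
    // On an affected build the exception message embeds the last analyzed plan
    // (the ~100 Project nodes mentioned above), so print only its head.
    println(e.getMessage.split("\n").take(5).mkString("\n"))
}
{code}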


