[ https://issues.apache.org/jira/browse/SPARK-20295?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16256570#comment-16256570 ]

Liang-Chi Hsieh commented on SPARK-20295:
-----------------------------------------

Btw, from the partial query plan you posted, it looks like coordinators 1, 2, and 3 
each have only one exchange (numExchanges = 1), so I'm not sure whether you posted 
the correct query plan.

> when spark.sql.adaptive.enabled is enabled, it conflicts with Exchange Reuse
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-20295
>                 URL: https://issues.apache.org/jira/browse/SPARK-20295
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle, SQL
>    Affects Versions: 2.1.0
>            Reporter: Ruhui Wang
>
> When running tpcds-q95 with spark.sql.adaptive.enabled = true, the physical 
> plan is initially:
> Sort
> :  +- Exchange(coordinator id: 1)
> :     +- Project***
> :        :-Sort **
> :        :  +- Exchange(coordinator id: 2)
> :        :     :- Project ***
> :        +- Sort
> :        :  +- Exchange(coordinator id: 3)
> When spark.sql.exchange.reuse is also enabled, the physical plan becomes:
> Sort
> :  +- Exchange(coordinator id: 1)
> :     +- Project***
> :        :-Sort **
> :        :  +- Exchange(coordinator id: 2)
> :        :     :- Project ***
> :        +- Sort
> :        :  +- ReusedExchange  Exchange(coordinator id: 2)
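> For reference, a minimal sketch of how the two settings involved could be enabled 
> when running tpcds-q95; this session setup is an assumption, not taken from the 
> report:
>
>   import org.apache.spark.sql.SparkSession
>
>   // Hypothetical reproduction setup; the actual session code is not shown above.
>   val spark = SparkSession.builder()
>     .appName("tpcds-q95-adaptive-reuse")
>     .config("spark.sql.adaptive.enabled", "true")  // attaches ExchangeCoordinators to shuffle exchanges
>     .config("spark.sql.exchange.reuse", "true")    // lets the ReuseExchange rule substitute ReusedExchange nodes
>     .getOrCreate()
>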
> When spark.sql.adaptive.enabled = true, the call stack is 
> ShuffleExchange#doExecute --> postShuffleRDD --> doEstimationIfNecessary. 
> In that function, assert(exchanges.length == numExchanges) fails, because the 
> left side has only one element while the right side is equal to 2.
> Is this a bug in the interaction between spark.sql.adaptive.enabled and exchange reuse?
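> To illustrate the failure mode, here is a simplified sketch (not the actual Spark 
> source) of the registration logic in ExchangeCoordinator under the assumption that 
> each ShuffleExchange registers itself with its coordinator before the post-shuffle 
> RDD is computed, while a ReusedExchange never registers:
>
>   import scala.collection.mutable.ArrayBuffer
>
>   // Simplified stand-in for ExchangeCoordinator, only to show the count check.
>   class SimplifiedCoordinator(numExchanges: Int) {
>     private val exchanges = ArrayBuffer.empty[String]
>
>     // Called by each ShuffleExchange that participates in this coordinator.
>     def registerExchange(name: String): Unit = synchronized {
>       exchanges += name
>     }
>
>     // Called from postShuffleRDD; expects every planned exchange to be registered.
>     def doEstimationIfNecessary(): Unit = synchronized {
>       assert(exchanges.length == numExchanges,
>         s"registered ${exchanges.length} exchange(s), expected $numExchanges")
>     }
>   }
>
>   object ReproSketch extends App {
>     val coordinator = new SimplifiedCoordinator(numExchanges = 2)
>     coordinator.registerExchange("Exchange(coordinator id: 2)")
>     // The second planned exchange was replaced by ReusedExchange, so it never registers.
>     coordinator.doEstimationIfNecessary() // AssertionError: registered 1 exchange(s), expected 2
>   }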


