[ https://issues.apache.org/jira/browse/SPARK-23985?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16625876#comment-16625876 ]
Yuming Wang edited comment on SPARK-23985 at 9/25/18 2:19 PM:
--------------------------------------------------------------

[~uzadude] It seems we should not push down the predicate. Please see these test cases:
https://github.com/apache/spark/blob/2c73d2a948bdde798aaf0f87c18846281deb05fd/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/FilterPushdownSuite.scala#L1086-L1144

was (Author: q79969786):
Thanks [~uzadude], I will dig into it.

> predicate push down doesn't work with simple compound partition spec
> --------------------------------------------------------------------
>
>                 Key: SPARK-23985
>                 URL: https://issues.apache.org/jira/browse/SPARK-23985
>             Project: Spark
>          Issue Type: Improvement
>      Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Ohad Raviv
>            Priority: Minor
>
> While predicate push down works with this query:
> {code:sql}
> select * from (
>   select *, row_number() over (partition by a order by b) from t1
> ) z
> where a > 1
> {code}
> it doesn't work with:
> {code:sql}
> select * from (
>   select *, row_number() over (partition by concat(a, 'lit') order by b) from t1
> ) z
> where a > 1
> {code}
>
> I added a test to FilterPushdownSuite which I think reproduces the problem:
> {code}
> test("Window: predicate push down -- ohad") {
>   val winExpr = windowExpr(count('b),
>     windowSpec(Concat('a :: Nil) :: Nil, 'b.asc :: Nil, UnspecifiedFrame))
>   val originalQuery = testRelation.select('a, 'b, 'c, winExpr.as('window)).where('a > 1)
>   val correctAnswer = testRelation
>     .where('a > 1).select('a, 'b, 'c)
>     .window(winExpr.as('window) :: Nil, 'a :: Nil, 'b.asc :: Nil)
>     .select('a, 'b, 'c, 'window).analyze
>   comparePlans(Optimize.execute(originalQuery.analyze), correctAnswer)
> }
> {code}
> Will try to create a PR with a correction.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
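The behavior reported above traces back to a guard in Catalyst's predicate-pushdown rule: a Filter is only moved below a Window when every partition expression is a bare attribute reference, so `partition by concat(a, 'lit')` blocks the pushdown even though the predicate only touches `a`. The following is a minimal, self-contained Scala sketch of that check; the case classes and the `canPush` helper here are illustrative stand-ins, not Catalyst's actual classes.

```scala
// Illustrative model only -- these case classes mimic (but are not)
// Catalyst's expression tree.
sealed trait Expression { def references: Set[String] }
case class AttributeReference(name: String) extends Expression {
  def references: Set[String] = Set(name)
}
case class Literal(value: String) extends Expression {
  def references: Set[String] = Set.empty
}
case class Concat(children: Seq[Expression]) extends Expression {
  def references: Set[String] = children.flatMap(_.references).toSet
}

object WindowPushdown {
  // A filter may move below a window only when every partition expression
  // is a bare attribute AND the filter references only those attributes.
  def canPush(partitionSpec: Seq[Expression], filterRefs: Set[String]): Boolean =
    partitionSpec.forall(_.isInstanceOf[AttributeReference]) &&
      filterRefs.subsetOf(partitionSpec.flatMap(_.references).toSet)
}

object Demo extends App {
  // partition by a: predicate on 'a' can be pushed below the window
  println(WindowPushdown.canPush(Seq(AttributeReference("a")), Set("a")))
  // partition by concat(a, 'lit'): blocked, since the partition
  // expression is not a bare attribute reference
  println(WindowPushdown.canPush(
    Seq(Concat(Seq(AttributeReference("a"), Literal("lit")))), Set("a")))
}
```

As the FilterPushdownSuite cases linked in the comment suggest, relaxing this guard for compound partition expressions is not always safe, which is why the comment argues against pushing the predicate rather than treating it as a straightforward optimizer gap.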