[ https://issues.apache.org/jira/browse/SPARK-7114?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan updated SPARK-7114:
-------------------------------
    Description: 
DataFrame.filter has 2 overloaded versions. One of them accepts a String parameter representing a condition expression.
{code}
val df = ... // df has 2 columns: key, value
val agg = df.groupBy("key").count()
agg.filter(df("count") > 1) // this succeeds
agg.filter("count > 1")     // this fails
{code}
the error message is:
{code}
[1.7] failure: ``('' expected but `>' found

count > 1
      ^
java.lang.RuntimeException: [1.7] failure: ``('' expected but `>' found

count > 1
      ^
{code}

  was:
DataFrame.filter has many overloaded versions. One of it accept String parameter to represent condition expression.
{code}
val df = ... // df has 2 columns: key, value
val agg = df.groupBy("key").count()
agg.filter(df("count") > 1) // this success
agg.filter("count > 1") // this failed
{code}
the error message is:
{code}
[1.7] failure: ``('' expected but `>' found

count > 1
      ^
java.lang.RuntimeException: [1.7] failure: ``('' expected but `>' found

count > 1
      ^
{code}


> parse error for DataFrame.filter after aggregate
> ------------------------------------------------
>
>                 Key: SPARK-7114
>                 URL: https://issues.apache.org/jira/browse/SPARK-7114
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Wenchen Fan
>
> DataFrame.filter has 2 overloaded versions. One of them accepts a String parameter representing a condition expression.
> {code}
> val df = ... // df has 2 columns: key, value
> val agg = df.groupBy("key").count()
> agg.filter(df("count") > 1) // this succeeds
> agg.filter("count > 1")     // this fails
> {code}
> the error message is:
> {code}
> [1.7] failure: ``('' expected but `>' found
>
> count > 1
>       ^
> java.lang.RuntimeException: [1.7] failure: ``('' expected but `>' found
>
> count > 1
>       ^
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
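A minimal sketch of a reproduction and workaround, assuming the Column-based overload behaves as the report shows (it uses the modern SparkSession entry point for illustration; the issue was filed against an older API, and the root-cause comment about the string parser treating `count` as an aggregate-function keyword is an inference, not stated in the report):

{code}
// Hypothetical sketch; session setup and column names are illustrative.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("SPARK-7114-repro").getOrCreate()
import spark.implicits._

val df = Seq(("a", 1), ("a", 2), ("b", 3)).toDF("key", "value")
val agg = df.groupBy("key").count()

// Fails in the affected versions: the string expression parser appears to
// read `count` as the aggregate-function keyword and then expects `(`.
// agg.filter("count > 1")

// Workaround: use the Column-based overload, which bypasses the string parser.
agg.filter(agg("count") > 1).show()
{code}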