[ 
https://issues.apache.org/jira/browse/SPARK-7732?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen resolved SPARK-7732.
------------------------------
    Resolution: Invalid

This is more of a question at this point, not a clear issue report. I'd ask on 
user@ first.
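
(For reference: the final "Caused by: ... Task not serializable" in the quoted trace usually means the closure passed to `sqlContext.udf.register` captured an outer, non-serializable instance, e.g. whatever object `udf.extract` belongs to. A minimal sketch of the usual fix, assuming a hypothetical `DateUtils` helper that stands in for the reporter's code, is:)

```scala
// Hypothetical sketch only: `DateUtils` re-creates the reporter's helper,
// it is not Spark's API. When the registered lambda calls a method on a
// non-serializable enclosing instance, Spark must serialize that whole
// instance into the task closure and fails. A standalone serializable
// object avoids capturing anything non-serializable:
object DateUtils extends Serializable {
  def extract(dateUnit: String, date: String): Int = dateUnit.toLowerCase match {
    case "year"  => date.take(4).toInt      // "2015-02-03" -> 2015
    case "month" => date.slice(5, 7).toInt  // "2015-02-03" -> 2
    case "day"   => date.slice(8, 10).toInt // "2015-02-03" -> 3
    case other   => sys.error(s"unsupported date unit: $other")
  }
}

// Registration then closes over the serializable singleton only:
//   sqlContext.udf.register("extract",
//     (dateUnit: String, date: String) => DateUtils.extract(dateUnit, date))
```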

> org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-7732
>                 URL: https://issues.apache.org/jira/browse/SPARK-7732
>             Project: Spark
>          Issue Type: Story
>    Affects Versions: 1.3.0
>         Environment: Windows Eclipse Luna 
>            Reporter: Prem singh Bist
>
> Hi,
> I am trying to register a UDF named extract in the Eclipse Scala IDE (Luna).
> The UDF registers successfully with the command below:
>   sqlContext.udf.register("extract", (dateUnit: String, date: String) => udf.extract(dateUnit, date))
> The query runs successfully in the Spark Scala shell, but when I run the SQL
> query Select * from abc where year < extract('year', '2015-02-03') from the
> Eclipse Scala IDE, it throws the exception below:
> org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
> Aggregate false, [], [Coalesce(SUM(PartialCount#30L),0) AS count#28L]
>  Aggregate true, [], [COUNT(1) AS PartialCount#30L]
>   Project []
>    Limit 10
>     Filter (CAST(d_year#6, DoubleType) < CAST(scalaUDF(YEAR,2005-02-15), DoubleType))
>      PhysicalRDD [d_date_sk#0,d_date_id#1,d_date#2,d_month_seq#3,d_week_seq#4,d_quarter_seq#5,d_year#6,d_dow#7,d_moy#8,d_dom#9,d_qoy#10,d_fy_year#11,d_fy_quarter_seq#12,d_fy_week_seq#13,d_day_name#14,d_quarter_name#15,d_holiday#16,d_weekend#17,d_following_holiday#18,d_first_dom#19,d_last_dom#20,d_same_day_ly#21,d_same_day_lq#22,d_current_day#23,d_current_week#24,d_current_month#25,d_current_quarter#26,d_current_year#27], JDBCRDD[0] at RDD at JDBCRDD.scala:205
>       at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:47)
>       at org.apache.spark.sql.execution.Aggregate.execute(Aggregate.scala:122)
>       at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:83)
>       at org.apache.spark.sql.DataFrame.collect(DataFrame.scala:815)
>       at org.apache.spark.sql.DataFrame.count(DataFrame.scala:827)
>       at org.spark.sql.SQLWrapper.ProxySQL.displayTeraResult(ProxySQL.scala:60)
>       at org.spark.sql.SQLWrapper.ProxyMain$.main(ProxyMain.scala:21)
>       at org.spark.sql.SQLWrapper.ProxyMain.main(ProxyMain.scala)
> Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
> Aggregate true, [], [COUNT(1) AS PartialCount#30L]
>  Project []
>   Limit 10
>    Filter (CAST(d_year#6, DoubleType) < CAST(scalaUDF(YEAR,2005-02-15), DoubleType))
>     PhysicalRDD [d_date_sk#0,d_date_id#1,d_date#2,d_month_seq#3,d_week_seq#4,d_quarter_seq#5,d_year#6,d_dow#7,d_moy#8,d_dom#9,d_qoy#10,d_fy_year#11,d_fy_quarter_seq#12,d_fy_week_seq#13,d_day_name#14,d_quarter_name#15,d_holiday#16,d_weekend#17,d_following_holiday#18,d_first_dom#19,d_last_dom#20,d_same_day_ly#21,d_same_day_lq#22,d_current_day#23,d_current_week#24,d_current_month#25,d_current_quarter#26,d_current_year#27], JDBCRDD[0] at RDD at JDBCRDD.scala:205
>       at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:47)
>       at org.apache.spark.sql.execution.Aggregate.execute(Aggregate.scala:122)
>       at org.apache.spark.sql.execution.Aggregate$$anonfun$execute$1.apply(Aggregate.scala:124)
>       at org.apache.spark.sql.execution.Aggregate$$anonfun$execute$1.apply(Aggregate.scala:123)
>       at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:46)
>       ... 7 more
> Caused by: org.apache.spark.SparkException: Task not serializable



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
