Even when grouping by name only, the issue (ClassCastException) is still there.
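
A possible workaround (just a sketch, not tested against Spark SQL 1.3.1; `dataFrame`, `sqlContext`, and `monitor` are taken from the original snippet) is to cast the numeric column to double inside the query, so the aggregation runs on doubles and the JDBC-produced java.math.BigDecimal never hits Spark's internal Decimal cast:

      // Hypothetical workaround: cast the numeric(10, 2) column to double
      // in SQL so sum() aggregates doubles instead of Spark's internal Decimal.
      dataFrame.registerTempTable("Employees")

      val emps = sqlContext.sql(
        "select name, sum(cast(salary as double)) from Employees group by name")

      monitor {
        emps.take(10)
          .map(row => (row.getString(0), row.getDouble(1)))   // getDouble, not getDecimal
          .foreach(println)
      }

Note the caveat: casting to double can lose precision for a numeric(10, 2) column, so this only sidesteps the cast error; it does not fix the underlying decimal handling in the JDBC data source.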

----- Original Message -----
From: ayan guha <guha.a...@gmail.com>
To: doovs...@sina.com
Cc: user <user@spark.apache.org>
Subject: Re: Spark SQL 1.3.1: java.lang.ClassCastException is thrown
Date: April 25, 2015, 22:33

Sorry if I am looking at the wrong issue, but your query is wrong: you should group by name only.
On Sat, Apr 25, 2015 at 11:59 PM,  <doovs...@sina.com> wrote:
Hi all,

When I query Postgresql based on Spark SQL like this:

      dataFrame.registerTempTable("Employees")

      val emps = sqlContext.sql(
        "select name, sum(salary) from Employees group by name, salary")

      monitor {
        emps.take(10)
          .map(row => (row.getString(0), row.getDecimal(1)))
          .foreach(println)
      }



The type of salary column in data table is numeric(10, 2).



It throws the following exception:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to 
stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost 
task 0.0 in stage 0.0 (TID 0, localhost): java.lang.ClassCastException: 
java.math.BigDecimal cannot be cast to org.apache.spark.sql.types.Decimal



Does anyone know about this issue and how to solve it? Thanks.



Regards,

Yi

---------------------------------------------------------------------

To unsubscribe, e-mail: user-unsubscr...@spark.apache.org

For additional commands, e-mail: user-h...@spark.apache.org





-- 
Best Regards,
Ayan Guha