Hi all,
When I query PostgreSQL via Spark SQL like this:
      dataFrame.registerTempTable("Employees")
      val emps = sqlContext.sql(
        "select name, sum(salary) from Employees group by name, salary")
      monitor {  // monitor is a local helper wrapper, defined elsewhere
        emps.take(10)
          .map(row => (row.getString(0), row.getDecimal(1)))
          .foreach(println)
      }

The salary column in the PostgreSQL table is of type numeric(10, 2).
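
For context, dataFrame comes from the standard JDBC data source, roughly as below (assuming the DataFrameReader API of recent Spark versions; the URL, credentials, and table name are placeholders, not my real settings):

      // Placeholder connection details; the real URL/credentials differ.
      val dataFrame = sqlContext.read
        .format("jdbc")
        .option("url", "jdbc:postgresql://localhost:5432/mydb")
        .option("driver", "org.postgresql.Driver")
        .option("dbtable", "employees")
        .option("user", "user")
        .option("password", "password")
        .load()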

It throws the following exception:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.ClassCastException: java.math.BigDecimal cannot be cast to org.apache.spark.sql.types.Decimal
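
In case it helps narrow things down: would casting the aggregate to double be a reasonable workaround? A sketch of what I mean (untested on my side):

      // Untested workaround sketch: cast the sum to double so the row
      // carries a Double instead of a Decimal, then read it with getDouble.
      val emps = sqlContext.sql(
        "select name, cast(sum(salary) as double) from Employees group by name, salary")
      emps.take(10)
        .map(row => (row.getString(0), row.getDouble(1)))
        .foreach(println)

I would still prefer to keep the numeric type if the ClassCastException can be fixed directly.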

Does anyone know about this issue and how to solve it? Thanks.

Regards,
Yi