Hi guys

I have Parquet data written by Impala:
Server version: impalad version 2.1.2-cdh5 RELEASE (build
36aad29cee85794ecc5225093c30b1e06ffb68d3)

When using Spark SQL 1.3.0 (spark-assembly-1.3.0-hadoop2.4.0), I get the
error below when running this code:

val correlatedEventData = sqlCtx.sql(
  s"""
    |SELECT
    |id,
    |action_id,
    |adv_saleamount
    |FROM ir_correlated_event_t
    |""".stripMargin)

correlatedEventData.take(10).foreach(println)
println("correlatedEventData count: " + correlatedEventData.count)

// This is the line that throws:
println("Decimal value: " + correlatedEventData.first().getDecimal(2))

Exception in thread "main" java.lang.ClassCastException:
scala.runtime.BoxedUnit cannot be cast to java.math.BigDecimal

It also fails when I use getFloat(2), getDouble(2), getAs[Float](2),
get(2).asInstanceOf[Float], and so on.
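
In case it helps anyone reproduce or diagnose this, here is a rough sketch of
how the table gets loaded and how I can dump the schema Spark infers for the
Parquet files. The HDFS path is just a placeholder, and in my real setup the
table comes from the metastore, so treat the parquetFile/registerTempTable
lines as an approximation of what I am actually doing:

// Load the Impala-written Parquet files and register them as a temp table
// (the path is a placeholder for my actual warehouse location)
val df = sqlCtx.parquetFile("/user/hive/warehouse/ir_correlated_event_t")
df.registerTempTable("ir_correlated_event_t")

// Print how Spark SQL maps each Parquet column to a data type.
// I would expect adv_saleamount to show up as a DecimalType here.
df.printSchema()
df.schema.fields.foreach(f => println(f.name + " -> " + f.dataType))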

Any assistance will be appreciated.

Regards


