Alex Liu created SPARK-12985:
--------------------------------

             Summary: Spark Hive thrift server big decimal data issue
                 Key: SPARK-12985
                 URL: https://issues.apache.org/jira/browse/SPARK-12985
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.6.0
            Reporter: Alex Liu
            Priority: Minor


I tested the trial version of the Simba JDBC driver, and it works for simple queries. 
But there is an issue with data type mapping. e.g.
{code}
java.sql.SQLException: [Simba][SparkJDBCDriver](500312) Error in fetching data rows: java.math.BigDecimal cannot be cast to org.apache.hadoop.hive.common.type.HiveDecimal;
        at com.simba.spark.hivecommon.api.HS2Client.buildExceptionFromTStatus(Unknown Source)
        at com.simba.spark.hivecommon.api.HS2Client.fetchNRows(Unknown Source)
        at com.simba.spark.hivecommon.api.HS2Client.fetchRows(Unknown Source)
        at com.simba.spark.hivecommon.dataengine.BackgroundFetcher.run(Unknown Source)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
Caused by: com.simba.spark.support.exceptions.GeneralException: [Simba][SparkJDBCDriver](500312) Error in fetching data rows: java.math.BigDecimal cannot be cast to org.apache.hadoop.hive.common.type.HiveDecimal;
        ... 8 more

{code}



To fix it, apply the following change
{code}
       case DecimalType() =>
-        to += from.getDecimal(ordinal)
+        to += HiveDecimal.create(from.getDecimal(ordinal))
{code}
to 
https://github.com/apache/spark/blob/master/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala#L87
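For context, the root cause is that the statement operation appends the raw java.math.BigDecimal returned by the row accessor, while the Hive thrift serialization layer later casts each row value to HiveDecimal, producing the ClassCastException above. A minimal sketch of the mismatch and the wrapping fix (HiveDecimalStub is a hypothetical stand-in for org.apache.hadoop.hive.common.type.HiveDecimal, which lives in hive-common and is not on a plain classpath):

```scala
import java.math.{BigDecimal => JBigDecimal}
import scala.collection.mutable.ArrayBuffer

// Hypothetical stand-in for org.apache.hadoop.hive.common.type.HiveDecimal:
// like HiveDecimal.create, it wraps a java.math.BigDecimal.
final case class HiveDecimalStub(value: JBigDecimal)
object HiveDecimalStub {
  def create(bd: JBigDecimal): HiveDecimalStub = HiveDecimalStub(bd)
}

// Before the fix, the raw BigDecimal goes into the row buffer, so the
// serialization layer's cast to HiveDecimal blows up. Wrapping the value
// first, as the proposed patch does, lets the downstream cast succeed.
val to = ArrayBuffer[Any]()
val fromDecimal = new JBigDecimal("12345.6789")
to += HiveDecimalStub.create(fromDecimal)  // patched: wrap before appending
```

The same pattern applies to any Spark SQL type whose internal representation differs from the type Hive's ColumnValue serialization expects.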



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
