[ 
https://issues.apache.org/jira/browse/SPARK-5456?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14386628#comment-14386628
 ] 

Karthik Gorthi commented on SPARK-5456:
---------------------------------------

One workaround we followed is to convert all Decimal-typed columns to integer 
(truncating them), e.g. with a cast in the query; see the sketch below.
But this is not possible when the application needs to connect to third-party 
databases, where the data types are obviously not under our control.

So this is a serious bug, IMHO, which prevents using Spark in such cases, unless 
there is a workaround?
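
For reference, a minimal sketch of the truncating-cast workaround mentioned above. 
It assumes the Spark 1.3 DataFrame API (sqlContext.table / selectExpr) and the same 
sqlContext fixture as the JavaAPISuite test quoted below; the table and column names 
simply mirror that reproduction, and the cast silently truncates, so this is only 
usable where losing the fractional part is acceptable.

{code:title=DecimalCastWorkaroundSketch.java}
  @Test
  public void decimalCastWorkaroundTest() {
    // Illustration only: assumes the "foo" temp table with DecimalType columns
    // "a" and "b" from the reproduction below is already registered, and that this
    // runs inside JavaAPISuite with its sqlContext fixture (Spark 1.3 DataFrame API).
    DataFrame foo = sqlContext.table("foo");
    // Replace the DecimalType columns with truncated IntegerType copies.
    DataFrame truncated = foo.selectExpr("cast(a as int) as a", "cast(b as int) as b");
    truncated.registerTempTable("foo_int");
    // The filter now compares integers, so the BigDecimal/Decimal cast is never hit.
    List<Row> result = sqlContext.sql("select * from foo_int where a > 0").collectAsList();
    Assert.assertEquals(2, result.size());
  }
{code}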

> Decimal Type comparison issue
> -----------------------------
>
>                 Key: SPARK-5456
>                 URL: https://issues.apache.org/jira/browse/SPARK-5456
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0, 1.3.0
>            Reporter: Kuldeep
>
> Not quite able to figure this out but here is a junit test to reproduce this, 
> in JavaAPISuite.java
> {code:title=DecimalBug.java}
>   @Test
>   public void decimalQueryTest() {
>     List<Row> decimalTable = new ArrayList<Row>();
>     decimalTable.add(RowFactory.create(new BigDecimal("1"), new BigDecimal("2")));
>     decimalTable.add(RowFactory.create(new BigDecimal("3"), new BigDecimal("4")));
>     JavaRDD<Row> rows = sc.parallelize(decimalTable);
>     List<StructField> fields = new ArrayList<StructField>(7);
>     fields.add(DataTypes.createStructField("a", DataTypes.createDecimalType(), true));
>     fields.add(DataTypes.createStructField("b", DataTypes.createDecimalType(), true));
>     sqlContext.applySchema(rows.rdd(), DataTypes.createStructType(fields)).registerTempTable("foo");
>     Assert.assertEquals(sqlContext.sql("select * from foo where a > 0").collectAsList(), decimalTable);
>   }
> {code}
> Fails with:
> java.lang.ClassCastException: java.math.BigDecimal cannot be cast to org.apache.spark.sql.types.Decimal


