Al M created SPARK-14532:
----------------------------
Summary: Spark SQL IF/ELSE does not handle Double correctly
Key: SPARK-14532
URL: https://issues.apache.org/jira/browse/SPARK-14532
Project: Spark
Issue Type: Bug
Affects Versions: 1.6.1
Reporter: Al M
I am using Spark SQL to add new columns to my data. Below is an example
snippet in Scala:
{code}
myDF.withColumn("newcol", new Column(SqlParser.parseExpression(sparkSqlExpr))).show
{code}
*What Works*
If sparkSqlExpr = "IF(1=1, 1, 0)" then I see 1 in the result, as expected.
If sparkSqlExpr = "IF(1=1, 1.0, 1.5)" then I see 1.0 in the result, as expected.
If sparkSqlExpr = "IF(1=1, 'A', 'B')" then I see 'A' in the result, as expected.
*What Does Not Work*
If sparkSqlExpr = "IF(1=1, 1.0, 0.0)" then I see error
org.apache.spark.sql.AnalysisException: cannot resolve 'if ((1 = 1)) 1.0 else
0.0' due to data type mismatch: differing types in 'if ((1 = 1)) 1.0 else 0.0'
(decimal(2,1) and decimal(1,1)).;
If sparkSqlExpr = "IF(1=1, 1.0, 10.0)" then I see error
org.apache.spark.sql.AnalysisException: cannot resolve 'if ((1 = 1)) 1.0 else
10.0' due to data type mismatch: differing types in 'if ((1 = 1)) 1.0 else
10.0' (decimal(2,1) and decimal(3,1)).;
If sparkSqlExpr = "IF(1=1, 1.1, 1.11)" then I see error
org.apache.spark.sql.AnalysisException: cannot resolve 'if ((1 = 1)) 1.1 else
1.11' due to data type mismatch: differing types in 'if ((1 = 1)) 1.1 else
1.11' (decimal(2,1) and decimal(3,2)).;
It looks like Spark SQL is not treating these literals as doubles at all: each
decimal literal is typed as a DecimalType whose precision and scale are derived
from the number of digits before and after the decimal point, so two literals
with different digit counts end up as distinct, incompatible types.
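The precision/scale pairs in the errors above can be reproduced with plain java.math.BigDecimal, which (as far as I can tell; an assumption, not confirmed from the Spark source) follows the same digit-counting rule Spark uses when typing decimal literals:

```scala
// Sketch (assumption): Spark 1.6 types each decimal literal as
// DecimalType(precision, scale) taken from the literal's digits,
// matching java.math.BigDecimal's precision/scale semantics.
val a = new java.math.BigDecimal("1.0") // unscaled value 10 -> precision 2, scale 1
val b = new java.math.BigDecimal("0.0") // unscaled value 0  -> precision 1, scale 1
// These mirror the decimal(2,1) and decimal(1,1) reported in the error.
println(s"decimal(${a.precision},${a.scale}) vs decimal(${b.precision},${b.scale})")
```

This would explain why "IF(1=1, 1.0, 1.5)" works (both branches are decimal(2,1)) while "IF(1=1, 1.0, 0.0)" does not.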
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)