BTW, I am seeing this issue in Spark 1.1.1. 

On Sunday, January 4, 2015 7:29 PM, RK <prk...@yahoo.com.INVALID> wrote:
When I use a single IF statement like "select IF(col1 != "", col1+'$'+col3, col2+'$'+col3) from my_table", it works fine.
However, when I use a nested IF like "select IF(col1 != "", col1+'$'+col3, IF(col2 != "", col2+'$'+col3, '$')) from my_table", I get the following exception:
Exception in thread "main" 
org.apache.spark.sql.catalyst.errors.package$TreeNodeException: Unresolved 
attributes: if (NOT (col1#1 = )) (CAST((CAST(col1#1, DoubleType) + CAST($, 
DoubleType)), DoubleType) + CAST(col3#3, DoubleType)) else if (NOT (col2#2 = )) 
(CAST((CAST(col2#2, DoubleType) + CAST($, DoubleType)), DoubleType) + 
CAST(col3#3, DoubleType)) else $ AS c0#4, tree:Project [if (NOT (col1#1 = )) 
(CAST((CAST(col1#1, DoubleType) + CAST($, DoubleType)), DoubleType) + 
CAST(col3#3, DoubleType)) else if (NOT (col2#2 = )) (CAST((CAST(col2#2, 
DoubleType) + CAST($, DoubleType)), DoubleType) + CAST(col3#3, DoubleType)) 
else $ AS c0#4] Subquery my_table  SparkLogicalPlan (ExistingRdd 
[DB#0,col1#1,col2#2,col3#3], MappedRDD[97] at getCallSite at DStream.scala:294)
Does Spark SQL not support nested IF queries, or is my query incorrect?
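For reference, the per-row result the nested IF is meant to produce can be sketched in plain Python (this is just an illustration of the intended logic, not Spark code; `nested_if` and the sample rows are hypothetical):

```python
# Sketch of the nested IF's intended per-row logic:
# use col1 if non-empty, else col2 if non-empty, else just the '$' separator.
def nested_if(col1, col2, col3):
    if col1 != "":
        return col1 + "$" + col3
    elif col2 != "":
        return col2 + "$" + col3
    else:
        return "$"

rows = [("a", "b", "x"), ("", "b", "x"), ("", "", "x")]
print([nested_if(*r) for r in rows])  # ['a$x', 'b$x', '$']
```

One thing the stack trace hints at: the repeated CAST(..., DoubleType) suggests `+` is being parsed as arithmetic addition rather than string concatenation, so even the single-IF version may not be concatenating as intended. Rewriting the branches with `concat(col1, '$', col3)` might be worth a try, though I haven't verified that against 1.1.1.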
Thanks,
RK
