I tried this, but it throws an error that the method "when" is not
applicable.
I am doing this in Java instead of Scala.
Note: I am using Spark 1.6.1.
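
Is something along these lines the right way to write it in Java? This is
just a sketch of what I am attempting, assuming the Spark 1.6 Java API,
where when() is a static method of org.apache.spark.sql.functions and
columns are referenced with col("...") rather than the Scala symbol
syntax. The class name and the sample data are only for illustration,
copied from your example.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.when;

public class GenderCodeExample {
  public static void main(String[] args) {
    // local master only so the sketch can be run standalone for testing
    JavaSparkContext sc = new JavaSparkContext(
        new SparkConf().setAppName("GenderCodeExample").setMaster("local[*]"));
    SQLContext sqlContext = new SQLContext(sc);

    // Same sample data as in your example
    StructType schema = DataTypes.createStructType(Arrays.asList(
        DataTypes.createStructField("name", DataTypes.StringType, false),
        DataTypes.createStructField("genderCode", DataTypes.IntegerType, false)));
    DataFrame rows = sqlContext.createDataFrame(
        sc.parallelize(Arrays.asList(
            RowFactory.create("bob", 1),
            RowFactory.create("lucy", 2),
            RowFactory.create("pat", 3))),
        schema);

    // when()/otherwise() via the static functions; col() instead of Scala symbols
    rows.withColumn("genderString",
            when(col("genderCode").equalTo(1), "male")
                .when(col("genderCode").equalTo(2), "female")
                .otherwise("unknown"))
        .show();

    sc.stop();
  }
}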

-----Original Message-----
From: Stuart White [mailto:stuart.whi...@gmail.com] 
Sent: Monday, November 28, 2016 10:26 AM
To: Hitesh Goyal
Cc: user@spark.apache.org
Subject: Re: if conditions

Use the when() and otherwise() functions.  For example:

import org.apache.spark.sql.functions._

val rows = Seq(("bob", 1), ("lucy", 2), ("pat", 3)).toDF("name", "genderCode") 
rows.show

+----+----------+
|name|genderCode|
+----+----------+
| bob|         1|
|lucy|         2|
| pat|         3|
+----+----------+

rows
  .withColumn("genderString",
    when('genderCode === 1, "male")
      .otherwise(when('genderCode === 2, "female").otherwise("unknown")))
  .show

+----+----------+------------+
|name|genderCode|genderString|
+----+----------+------------+
| bob|         1|        male|
|lucy|         2|      female|
| pat|         3|     unknown|
+----+----------+------------+

On Sun, Nov 27, 2016 at 10:45 PM, Hitesh Goyal <hitesh.go...@nlpcaptcha.com> 
wrote:
> Hi team,
>
> I am using Apache Spark 1.6.1. I am writing Spark SQL queries, and I
> found two ways of writing them: one with plain SQL syntax and the
> other with Spark DataFrame functions.
>
> I need to express if conditions using DataFrame functions. Please
> tell me how I can do that.
>
>
>
> Regards,
>
> Hitesh Goyal
>
> Simpli5d Technologies
>
> Cont No.: 9996588220
>
>

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
