Never mind, found the solution.

Spark 2.0+:

import org.apache.spark.sql.functions.{lit, map}

val df1 = df.withColumn("newcol", map(lit("field1"), lit("fieldName1")))

scala> df1.show()
+---+--------------------+
|  a|              newcol|
+---+--------------------+
|  1|Map(field1 -> fie...|
|  2|Map(field1 -> fie...|
|  3|Map(field1 -> fie...|
|  4|Map(field1 -> fie...|
+---+--------------------+
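
The map values get truncated in the output above; passing false to show (same df1 as above) prints them in full:

scala> df1.show(false)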

Thanks for creating such useful functions, Spark devs :)

Best
Ayan
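
P.S. If I am reading the API docs right, Spark 2.2+ also offers typedLit, which takes a Scala Map directly, so the same column could be built in one call (untested sketch, df2 is just a fresh name for the result):

import org.apache.spark.sql.functions.typedLit

// typedLit embeds the Scala Map as a literal MapType column
val df2 = df.withColumn("newcol", typedLit(Map("field1" -> "fieldName1")))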

On Thu, Jul 27, 2017 at 2:30 PM, ayan guha <guha.a...@gmail.com> wrote:

> Hi
>
> I want to create a static Map Type column to a dataframe.
>
> How I am doing now:
>
>       val fieldList = spark.sparkContext.parallelize(
>         Array(Row(Map("field1" -> "someField"))))
>
>       val fieldListSchemaBase = new StructType()
>
>       val f = StructField("encrypted_field_list", MapType(StringType, StringType))
>
>       val fieldListSchema = fieldListSchemaBase.add(f)
>
>       val fieldListDF = spark.createDataFrame(fieldList,fieldListSchema)
>
>
> val saveFinalWithFieldList = saveFinal.join(fieldListDF)
>
>
> But it requires me to turn on Cartesian joins (which this join effectively
> is). Is there any simpler way to achieve this, perhaps using the withColumn
> API?
>
> I saw a similar post here
> <https://stackoverflow.com/questions/44223751/how-to-add-empty-map-type-column-to-dataframe>,
> but I was not able to use the trick Jacek suggested.
> --
> Best Regards,
> Ayan Guha
>



-- 
Best Regards,
Ayan Guha
