Is this the Scala syntax? Yes, in Scala I know how to do it by converting the DataFrame to a Dataset. How can I do it in PySpark?
Thanks.

On 2022/2/9 10:24, oliver dd wrote:
df.flatMap(row => row.getAs[String]("value").split(" "))