Something like this?
val json = myRDD.map(map_obj => new JSONObject(map_obj))
Here map_obj will be a Map containing the values (e.g. Map("name" -> "Akhil",
"mail" -> "xyz@xyz")).
Performance wasn't so good with this one though.
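For completeness, an end-to-end sketch of this approach (assuming JSONObject here is scala.util.parsing.json.JSONObject, that myRDD is an RDD[Map[String, Any]], and that the output path is just a placeholder), writing one JSON object per line:

import scala.util.parsing.json.JSONObject

// Render each Map as a JSON string, then write the strings out as text,
// which gives a file where every line is one JSON object.
val jsonLines = myRDD.map(map_obj => new JSONObject(map_obj).toString())
jsonLines.saveAsTextFile("hdfs:///tmp/json-output")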
Thanks
Best Regards
On Wed, Nov 5, 2014 at 3:02 AM, Yin Huai wrote:
Hello Andrejs,
For now, you need to use a JSON lib to serialize records of your datasets
as JSON strings. In the future, we will add a method to SchemaRDD to let you
write a SchemaRDD in JSON format (I have created
https://issues.apache.org/jira/browse/SPARK-4228 to track it).
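Until that lands, a minimal sketch of serializing each record with a JSON lib, here using json4s on an RDD of a case class; the case class, field values, and output path below are just examples:

import org.json4s.NoTypeHints
import org.json4s.jackson.Serialization

case class Person(name: String, mail: String)

val people = sc.parallelize(Seq(Person("Akhil", "xyz@xyz")))
val jsonLines = people.mapPartitions { iter =>
  // Build the formats once per partition instead of shipping them in the closure.
  implicit val formats = Serialization.formats(NoTypeHints)
  iter.map(p => Serialization.write(p))
}
jsonLines.saveAsTextFile("hdfs:///tmp/people-json")   // one JSON object per line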
Thanks,
Yin
On Tue, Nov 4, 2014, Andrejs wrote:
Hi,
Can someone please suggest the best way to output Spark data as a JSON file
(a file where each line is a JSON object)?
Cheers,
Andrejs