Hey, I have a few examples: https://github.com/jgperrin/net.jgp.labs.spark. I recently worked on similar problems, so there's definitely a solution there, or I'll be happy to write one for you.
Look in l250 map...

jg

> On Sep 10, 2017, at 20:51, ayan guha <guha.a...@gmail.com> wrote:
>
> Sorry for side-line question, but for Python, isn't the following the easiest:
>
> >>> import json
> >>> df1 = df.rdd.map(lambda r: json.dumps(r.asDict()))
> >>> df1.take(10)
> ['{"id": 1}', '{"id": 2}', '{"id": 3}', '{"id": 4}', '{"id": 5}']
>
>> On Mon, Sep 11, 2017 at 4:22 AM, Riccardo Ferrari <ferra...@gmail.com> wrote:
>> Hi Kant,
>>
>> You can check the getValuesMap. I found this post useful; it is in Scala but
>> should be a good starting point.
>> An alternative approach is to combine the 'struct' and 'to_json' functions. I
>> have not tested this in Java, but I am using it in Python.
>>
>> Best,
>>
>>> On Sun, Sep 10, 2017 at 1:45 AM, kant kodali <kanth...@gmail.com> wrote:
>>> toJSON on Row object.
>>>
>>>> On Sat, Sep 9, 2017 at 4:18 PM, Felix Cheung <felixcheun...@hotmail.com> wrote:
>>>> toJSON on Dataset/DataFrame?
>>>>
>>>> From: kant kodali <kanth...@gmail.com>
>>>> Sent: Saturday, September 9, 2017 4:15:49 PM
>>>> To: user @spark
>>>> Subject: How to convert Row to JSON in Java?
>>>>
>>>> Hi All,
>>>>
>>>> How to convert Row to JSON in Java? It would be nice to have a .toJson()
>>>> method in the Row class.
>>>>
>>>> Thanks,
>>>> kant
>
> --
> Best Regards,
> Ayan Guha
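For what it's worth, the core of the Python approach quoted above is just `json.dumps` applied to the plain dict that `Row.asDict()` returns, once the RDD map hands each row to the lambda. Here is a minimal sketch of that step without a Spark session, using hypothetical plain dicts to stand in for the dicts produced by `Row.asDict()` (the `rows` data below is made up for illustration):

```python
import json

# Hypothetical stand-ins for the dicts that Row.asDict() would return
rows = [{"id": 1}, {"id": 2}, {"id": 3}]

# Same per-row transformation as df.rdd.map(lambda r: json.dumps(r.asDict())),
# just applied with a list comprehension instead of an RDD map
json_lines = [json.dumps(r) for r in rows]

print(json_lines)  # ['{"id": 1}', '{"id": 2}', '{"id": 3}']
```

On a real DataFrame the same one-liner from the thread applies unchanged; the sketch only isolates the serialization step so it can be read (and run) on its own.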