Not exactly... I was not going to flatMap the RDD.
In the end I amended my approach to the problem and managed to get the
flatMap working directly on the Dataset.
Thanks for answering.
Kind regards
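For the archives: the "missing parameter type" error in the original question most likely comes from passing the Encoder as a second argument in the same parameter list. Dataset.flatMap takes the function and the Encoder in separate (curried) parameter lists, with the Encoder marked implicit, so when both are passed together the call does not match the signature and the compiler cannot infer the lambda's type. With spark.implicits._ in scope the Encoder can normally be left implicit. A minimal sketch (the session setup and sample data are illustrative; as far as I know Spark ships no implicit Encoder[Char], hence each Char is mapped to a String here):

```scala
import org.apache.spark.sql.SparkSession

object FlatMapDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("flatmap-demo")
      .getOrCreate()
    import spark.implicits._

    val tplDataSet = Seq(("f1", "Apple")).toDS()

    // flatMap's Encoder lives in its own implicit parameter list;
    // with spark.implicits._ imported it is resolved automatically.
    // There is no implicit Encoder[Char], hence the .map(_.toString).
    val expanded = tplDataSet.flatMap(tpl => tpl._2.toList.map(_.toString))

    expanded.show() // one row per character of "Apple"
    spark.stop()
  }
}
```

If you do want to pass the Encoder explicitly, it goes in its own argument list, e.g. tplDataSet.flatMap(tpl => ...)(myEncoder), not as a second argument to flatMap.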

On Sep 16, 2017 4:53 PM, "Akhil Das" <ak...@hacked.work> wrote:

> scala> case class Fruit(price: Double, name: String)
> defined class Fruit
>
> scala> val ds = Seq(Fruit(10.0,"Apple")).toDS()
> ds: org.apache.spark.sql.Dataset[Fruit] = [price: double, name: string]
>
> scala> ds.rdd.flatMap(f => f.name.toList).collect
> res8: Array[Char] = Array(A, p, p, l, e)
>
>
> Is this what you want to do?
>
> On Fri, Sep 15, 2017 at 4:21 AM, Marco Mistroni <mmistr...@gmail.com>
> wrote:
>
>> Hi all,
>> could anyone assist, please?
>> I am trying to flatMap a DataSet[(String, String)] and I am getting
>> errors in Eclipse.
>> The errors are more Scala-related than Spark-related, but I was
>> wondering if someone has come across
>> a similar situation.
>>
>> Here's what I've got: a DataSet of (String, String), out of which I am using
>> flatMap to get a List[Char] from the second element of the tuple.
>>
>> val tplDataSet = < DataSet[(String, String)] >
>>
>> val expanded = tplDataSet.flatMap(tpl  => tpl._2.toList,
>> Encoders.product[(String, String)])
>>
>>
>> Eclipse complains that 'tpl' in the above function is missing a parameter
>> type....
>>
>> What am I missing? Or perhaps I am using the wrong approach?
>>
>> w/kindest regards
>>  Marco
>>
>
>
>
> --
> Cheers!
>
>
