Any specific reason you would like to use collectAsMap here? Since your RDD
holds single values rather than key-value pairs, you could probably use a
normal collect on the RDD instead of treating it as a Pair RDD.
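To illustrate, here is a minimal sketch of both options. The sample data stands in for the contents of the CSV file (the original path is a placeholder), and the `line.length` values in option 2 are purely illustrative; any key/value scheme that fits your data would do.

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setAppName("dict-example").setMaster("local[*]"))

// Stand-in for sc.textFile("path/to/csv/file"): a plain RDD[String]
val dict = sc.parallelize(Seq("apple", "banana", "cherry"))

// Option 1: an RDD[String] has no key/value structure, so collect
// into an Array and broadcast that
val dictArrayBroadcast = sc.broadcast(dict.collect())

// Option 2: map each line to a (key, value) pair first; the implicit
// conversion to PairRDDFunctions then makes collectAsMap available
val dictMap = dict.map(line => (line, line.length)).collectAsMap()
val dictMapBroadcast = sc.broadcast(dictMap)
```

Option 1 is the simpler fix if you only need membership lookups; option 2 is only worth it when each line genuinely carries a key and a value.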


On Monday, March 21, 2016, Mark Hamstra <m...@clearstorydata.com> wrote:

> You're not getting what Ted is telling you.  Your `dict` is an RDD[String]
>  -- i.e. it is a collection of a single value type, String.  But
> `collectAsMap` is only defined for PairRDDs that have key-value pairs for
> their data elements.  Both a key and a value are needed to collect into a
> Map[K, V].
>
> On Sun, Mar 20, 2016 at 8:19 PM, Shishir Anshuman <
> shishiranshu...@gmail.com> wrote:
>
>> Yes, I have included that class in my code.
>> I guess it's something to do with the RDD format. Not able to figure out
>> the exact reason.
>>
>> On Fri, Mar 18, 2016 at 9:27 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> It is defined in:
>>> core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
>>>
>>> On Thu, Mar 17, 2016 at 8:55 PM, Shishir Anshuman <
>>> shishiranshu...@gmail.com> wrote:
>>>
>>>> I am using the following code snippet in Scala:
>>>>
>>>> val dict: RDD[String] = sc.textFile("path/to/csv/file")
>>>> val dict_broadcast = sc.broadcast(dict.collectAsMap())
>>>>
>>>> On compiling, it generates this error:
>>>>
>>>> scala:42: value collectAsMap is not a member of
>>>> org.apache.spark.rdd.RDD[String]
>>>>
>>>> val dict_broadcast = sc.broadcast(dict.collectAsMap())
>>>>                                        ^
>>>>
>>>
>>>
>>
>
