You're not getting what Ted is telling you.  Your `dict` is an RDD[String]
 -- i.e. it is a collection of a single value type, String.  But
`collectAsMap` is defined only for pair RDDs, whose data elements are
key-value tuples.  Both a key and a value are needed to collect into a
Map[K, V].
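A minimal sketch of the usual fix: parse each line into a (key, value)
tuple first, which makes `collectAsMap` available through the implicit
conversion to PairRDDFunctions.  This assumes each CSV line holds exactly
two comma-separated columns -- the actual layout of your file isn't shown
in the thread, so adjust the split accordingly:

```scala
import org.apache.spark.rdd.RDD

// dict is an RDD of raw lines -- collectAsMap is not defined here
val dict: RDD[String] = sc.textFile("path/to/csv/file")

// Assumed format: "key,value" per line; split(",", 2) keeps any extra
// commas inside the value instead of dropping them
val pairs: RDD[(String, String)] = dict.map { line =>
  val fields = line.split(",", 2)
  (fields(0), fields(1))
}

// pairs is an RDD[(String, String)], so collectAsMap now compiles
val dict_broadcast = sc.broadcast(pairs.collectAsMap())
```

Note that `collectAsMap` pulls the whole RDD to the driver, so this only
works when the dictionary fits in driver memory -- which is also the
precondition for broadcasting it.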

On Sun, Mar 20, 2016 at 8:19 PM, Shishir Anshuman <shishiranshu...@gmail.com
> wrote:

> Yes, I have included that class in my code.
> I guess it's something to do with the RDD format. Not able to figure out
> the exact reason.
>
> On Fri, Mar 18, 2016 at 9:27 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> It is defined in:
>> core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
>>
>> On Thu, Mar 17, 2016 at 8:55 PM, Shishir Anshuman <
>> shishiranshu...@gmail.com> wrote:
>>
>>> I am using following code snippet in scala:
>>>
>>>
>>> val dict: RDD[String] = sc.textFile("path/to/csv/file")
>>> val dict_broadcast = sc.broadcast(dict.collectAsMap())
>>>
>>> On compiling, it generates this error:
>>>
>>> scala:42: value collectAsMap is not a member of
>>> org.apache.spark.rdd.RDD[String]
>>>
>>> val dict_broadcast=sc.broadcast(dict.collectAsMap())
>>>                   ^
>>>
>>
>>
>