You have to add "import StreamingContext._" to your code.
That import brings in the implicit conversions that allow reduceByKey() to be
applied to DStreams of key-value pairs.
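A minimal sketch of the fix, following the word-count example from the programming guide (the socket source and batch interval here are illustrative; Spark 0.9-era API assumed). The key line is the StreamingContext companion-object import, which supplies the implicit conversion from DStream[(K, V)] to the pair-DStream operations that define reduceByKey():

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}
// This import provides the implicit conversion that adds
// reduceByKey() and friends to DStream[(K, V)]:
import org.apache.spark.streaming.StreamingContext._

// Illustrative setup: local mode, 1-second batches, text from a socket
val ssc = new StreamingContext("local[2]", "WordCount", Seconds(1))
val lines = ssc.socketTextStream("localhost", 9999)

val words = lines.flatMap(_.split(" "))
val pairs = words.map(word => (word, 1))

// With the import in scope, this now compiles:
val wordCounts = pairs.reduceByKey(_ + _)
wordCounts.print()
```

In the shell, the SparkContext/StreamingContext setup differs, but the same import line resolves the "value reduceByKey is not a member" error.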

TD




On Tue, Feb 18, 2014 at 12:03 PM, bethesda <[email protected]> wrote:

> I am getting this error when trying to run the code from the following page
> in the shell:
>
>
> http://spark.incubator.apache.org/docs/latest/streaming-programming-guide.html
>
> I believe that DStream[(String, Int)] is precisely the class that should
> have this function so I am confused.
>
> Thanks!
>
> scala> val words = lines.flatMap(_.split(" "))
> words: org.apache.spark.streaming.dstream.DStream[String] =
> org.apache.spark.streaming.dstream.FlatMappedDStream@56731b34
>
> scala> val pairs = words.map(word => (word, 1))
> pairs: org.apache.spark.streaming.dstream.DStream[(String, Int)] =
> org.apache.spark.streaming.dstream.MappedDStream@3145888
>
> scala> val wordCounts = pairs.reduceByKey(_ + _)
> <console>:26: error: value reduceByKey is not a member of
> org.apache.spark.streaming.dstream.DStream[(String, Int)]
>        val wordCounts = pairs.reduceByKey(_ + _)
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/reduceByKey-is-not-a-member-of-org-apache-spark-streaming-dstream-DStream-String-Int-tp1694.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
