You have to import the implicit conversions from SparkContext; that import is what makes the PairRDDFunctions methods such as join available on an RDD of pairs (the spark-shell adds this import automatically, which is why the same code works in the REPL):

import org.apache.spark.SparkContext._
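
For example, here is a minimal standalone sketch of what the compiled code needs to look like. The keys and values are placeholders, not taken from your actual job:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._  // implicit RDD[(K, V)] -> PairRDDFunctions conversion

object JoinExample {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "JoinExample")

    // Both RDDs are keyed by an (Int, Int) pair; the values are arbitrary placeholders.
    val aDay = sc.parallelize(Seq(((1, 2), (3, 4, 0.5, 6, 0.7))))
    val seg  = sc.parallelize(Seq(((1, 2), "segment data")))

    // With the implicits in scope, join resolves on the keyed RDDs.
    val allSegs = aDay.join(seg)
    allSegs.collect().foreach(println)

    sc.stop()
  }
}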



On Wed, Nov 6, 2013 at 5:44 PM, Shay Seng <[email protected]> wrote:

> Hi,
> I'm having some trouble getting a piece of code that runs fine in the REPL
> to compile.
>
> val aDay = day.map( n=>
>    ...
>   ((aInt,bInt),(cInt,dInt,eDbl,fInt,gDbl))
> )
>
> val seg = segments.map( n =>
>  ...
>  ((aInt,bInt), (......))
> )
>
> val allSegs = aDay.join(seg)
>
> error: value join is not a member of org.apache.spark.rdd.RDD[((Int, Int),
> (Int, Int, Double, Int, Double))]
>
> How do I make aDay a PairRDDFunctions?
>
