Do people usually import o.a.spark.rdd._ ? Also, in order to maintain source and binary compatibility, we would need to keep both, right?
On Thu, Nov 6, 2014 at 3:12 AM, Shixiong Zhu <zsxw...@gmail.com> wrote:
> I saw many people asked how to convert an RDD to a PairRDDFunctions. I would
> like to ask a question about it. Why not put the following implicit into
> "package object rdd" or "object rdd"?
>
> implicit def rddToPairRDDFunctions[K, V](rdd: RDD[(K, V)])
>     (implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null) = {
>   new PairRDDFunctions(rdd)
> }
>
> If so, the conversion would be automatic, with no need to
> import org.apache.spark.SparkContext._
>
> I tried to search for some discussion but found nothing.
>
> Best Regards,
> Shixiong Zhu
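For anyone following along, here is a minimal, self-contained sketch of the pattern Shixiong is proposing: an implicit conversion placed in the companion object of the source type sits in the compiler's implicit scope, so call sites pick it up without any import. The names below (Rdd, PairOps, toPairOps) are stand-ins for illustration, not Spark's actual classes.

```scala
import scala.language.implicitConversions

object Rdds {
  // Stand-in for Spark's RDD[T]
  class Rdd[T](val data: Seq[T])

  object Rdd {
    // Because this lives in the companion object of Rdd, the compiler
    // finds it via implicit scope whenever an Rdd[(K, V)] is used where
    // PairOps methods are needed -- no import required at the call site.
    implicit def toPairOps[K, V](rdd: Rdd[(K, V)]): PairOps[K, V] =
      new PairOps(rdd)
  }

  // Stand-in for PairRDDFunctions: extra methods for pair collections
  class PairOps[K, V](rdd: Rdd[(K, V)]) {
    def keys: Seq[K] = rdd.data.map(_._1)
  }
}

object Demo extends App {
  val pairs = new Rdds.Rdd(Seq(("a", 1), ("b", 2)))
  // keys is defined on PairOps; the conversion is applied implicitly.
  println(pairs.keys)
}
```

By contrast, an implicit defined in an unrelated object (like SparkContext) is only found when it is explicitly imported into lexical scope, which is why the `import org.apache.spark.SparkContext._` is currently required.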