Thank you for clarification, Sean.
2015-02-13 14:16 GMT+04:00 Sean Owen :
It is a wrapper whose API is logically the same, but whose method
signatures make more sense in Java. You can call the Scala API from Java
without too much trouble, but it gets messy when you have to manually
grapple with ClassTag from Java, for example.
There is not an implicit conversion since it is
Thanks for the reply. I solved the problem by importing
org.apache.spark.SparkContext._, per Imran Rashid's suggestion.
For the sake of interest, is JavaPairRDD intended for use from Java? What
is the purpose of this class? Is my RDD implicitly converted to it in
some circumstances?
2015-02-12 19:42 GM
Thank you. That worked.
2015-02-12 20:03 GMT+04:00 Imran Rashid :
You need to import the implicit conversions to PairRDDFunctions with
import org.apache.spark.SparkContext._
(note that this requirement will go away in 1.3:
https://issues.apache.org/jira/browse/SPARK-4397)
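The mechanism behind this fix can be sketched with a self-contained toy model (hypothetical names, not Spark's real classes): an implicit conversion adds key/value methods to a generic container, exactly the way importing org.apache.spark.SparkContext._ adds reduceByKey to an RDD[(K, V)] in Spark 1.x.

```scala
// Toy stand-in for RDD[T]: just wraps a local sequence.
class MiniRDD[T](val data: Seq[T])

// Toy stand-in for PairRDDFunctions: methods that only make sense
// when the elements are (key, value) pairs.
class MiniPairFunctions[K, V](self: MiniRDD[(K, V)]) {
  // Merge all values that share a key, like Spark's reduceByKey.
  def reduceByKey(f: (V, V) => V): Map[K, V] =
    self.data.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
}

object MiniImplicits {
  // In Spark 1.x the analogous conversion lives in object SparkContext;
  // `import org.apache.spark.SparkContext._` is what puts it in scope.
  implicit def toPairFunctions[K, V](rdd: MiniRDD[(K, V)]): MiniPairFunctions[K, V] =
    new MiniPairFunctions(rdd)
}
```

Without the import in scope, calling reduceByKey on a MiniRDD does not compile, which mirrors the error Vladimir hit before importing SparkContext._.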
On Thu, Feb 12, 2015 at 9:36 AM, Vladimir Protsenko
wrote:
You can use JavaPairRDD which has:
  override def wrapRDD(rdd: RDD[(K, V)]): JavaPairRDD[K, V] =
    JavaPairRDD.fromRDD(rdd)
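The wrapper pattern behind JavaPairRDD can be sketched with a toy model (assumed names, not Spark's actual classes): the wrapper holds the underlying Scala object, re-exposes operations with Java-friendly signatures (no ClassTags or implicit parameters), and a wrapRDD/fromRDD-style factory rebuilds the right wrapper type after each operation.

```scala
// Toy stand-in for a Scala-side pair RDD.
class ScalaPairs[K, V](val data: Seq[(K, V)])

class JavaFriendlyPairs[K, V](underlying: ScalaPairs[K, V]) {
  // Like JavaPairRDD.wrapRDD: rebuild the wrapper around a new underlying
  // value, so each operation can return the correct wrapper type.
  private def wrap(p: ScalaPairs[K, V]): JavaFriendlyPairs[K, V] =
    new JavaFriendlyPairs(p)

  // A pair-specific operation exposed with a plain, Java-callable signature.
  def filterByKey(k: K): JavaFriendlyPairs[K, V] =
    wrap(new ScalaPairs(underlying.data.filter(_._1 == k)))

  // Return a Java collection, the way JavaPairRDD methods do.
  def collect(): java.util.List[(K, V)] = {
    val list = new java.util.ArrayList[(K, V)]()
    underlying.data.foreach(e => list.add(e))
    list
  }
}

object JavaFriendlyPairs {
  // Analogue of JavaPairRDD.fromRDD: an explicit entry point, used instead
  // of an implicit conversion.
  def fromPairs[K, V](p: ScalaPairs[K, V]): JavaFriendlyPairs[K, V] =
    new JavaFriendlyPairs(p)
}
```

The explicit fromPairs factory is the design choice Sean describes: Java callers get an ordinary static-style method rather than a Scala implicit conversion they could not trigger anyway.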
Cheers
On Thu, Feb 12, 2015 at 7:36 AM, Vladimir Protsenko
wrote:
> Hi. I am stuck with how to save a file to HDFS from Spark.
>
> I have written MyOutputFormat extends FileO