Sent: Sunday, May 10, 2015 10:44 PM
To: donhoff_h165612...@qq.com;
Cc: useruser@spark.apache.org;
Subject: Re: Does NullWritable can not be used in Spark?
Looking at ./core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala :

 * Load an RDD saved as a SequenceFile containing serialized objects, with NullWritable keys and
 * BytesWritable values that contain a serialized partition. This is still an experimental storage
...
def
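For what it's worth, here is a minimal sketch of writing a SequenceFile with NullWritable keys in the same style the scaladoc above describes. The app name, master, and output path are placeholders; the key point is that NullWritable.get() is called inside the map on the executors, since NullWritable is a Writable singleton and not java.io.Serializable, so capturing an instance from the driver in a closure can cause serialization exceptions:

```scala
import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.spark.{SparkConf, SparkContext}

object NullWritableSequenceFileDemo {
  def main(args: Array[String]): Unit = {
    // Local master and output path are placeholders for illustration.
    val sc = new SparkContext(
      new SparkConf().setAppName("nullwritable-demo").setMaster("local[*]"))

    val data = sc.parallelize(Seq("a", "b", "c"))

    // Obtain the NullWritable inside the map, on the executor side,
    // rather than capturing one from the driver in the closure.
    data.map(x => (NullWritable.get(), new Text(x)))
        .saveAsSequenceFile("/tmp/nullwritable-demo")

    sc.stop()
  }
}
```

This mirrors what saveAsObjectFile does internally (NullWritable keys, BytesWritable values), so NullWritable keys are supported as such; the exceptions usually come from where the NullWritable instance is created.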
Hi, experts.
I wrote a Spark program to write a SequenceFile. I found that if I used
NullWritable as the key class of the SequenceFile, the program threw
exceptions, but if I used BytesWritable or Text as the key class, it did not.
Does Spark not