I just misread the API doc and forgot to pass the type information when calling this method.
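
For anyone who hits the same exception, here is a minimal sketch of the
corrected call. It assumes TextOutputFormat is the output format you want
(that class is my assumption; the RDD and path are the ones from the session
quoted below):

import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat

// Without an explicit type argument, the implicit ClassTag[F] is most likely
// resolved for Nothing, which cannot be instantiated -- hence the
// InstantiationException in the trace below. Supplying the OutputFormat type
// explicitly avoids that:
a.saveAsNewAPIHadoopFile[TextOutputFormat[String, String]]("/Users/nanzhu/code/output_rdd")

// Equivalently, the overload that takes the key/value/format classes directly:
a.saveAsNewAPIHadoopFile(
  "/Users/nanzhu/code/output_rdd",
  classOf[String],
  classOf[String],
  classOf[TextOutputFormat[String, String]])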

Best, 

-- 
Nan Zhu


On Monday, February 24, 2014 at 8:17 AM, Mridul Muralidharan wrote:

> Curious, what was the issue ?
> 
> - Mridul
> 
> On Sun, Feb 23, 2014 at 11:41 PM, Nan Zhu <zhunanmcg...@gmail.com> wrote:
> > OK, I know where I was wrong
> > 
> > 
> > Best,
> > 
> > --
> > Nan Zhu
> > 
> > 
> > On Sunday, February 23, 2014 at 12:50 PM, Nan Zhu wrote:
> > 
> > > String for both; they should be obtained via the following helper functions:
> > > 
> > > private[spark] def getKeyClass() = implicitly[ClassTag[K]].runtimeClass
> > > 
> > > private[spark] def getValueClass() = implicitly[ClassTag[V]].runtimeClass
> > > 
> > > and this is what I run
> > > 
> > > scala> val a = sc.textFile("/Users/nanzhu/code/incubator-spark/LICENSE", 2).map(line => ("a", "b"))
> > > 
> > > scala> a.saveAsNewAPIHadoopFile("/Users/nanzhu/code/output_rdd")
> > > java.lang.InstantiationException
> > > at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
> > > at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> > > at java.lang.Class.newInstance(Class.java:374)
> > > at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:632)
> > > at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:590)
> > > at $iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
> > > at $iwC$$iwC$$iwC.<init>(<console>:20)
> > > at $iwC$$iwC.<init>(<console>:22)
> > > at $iwC.<init>(<console>:24)
> > > at <init>(<console>:26)
> > > at .<init>(<console>:30)
> > > at .<clinit>(<console>)
> > > at .<init>(<console>:7)
> > > at .<clinit>(<console>)
> > > at $print(<console>)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:606)
> > > at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:774)
> > > at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1042)
> > > at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:611)
> > > at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:642)
> > > at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:606)
> > > at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:790)
> > > at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:835)
> > > at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:747)
> > > at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:595)
> > > at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:602)
> > > at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:605)
> > > at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:928)
> > > at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:878)
> > > at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:878)
> > > at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> > > at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:878)
> > > at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:970)
> > > at org.apache.spark.repl.Main$.main(Main.scala:31)
> > > at org.apache.spark.repl.Main.main(Main.scala)
> > > 
> > > --
> > > Nan Zhu
> > > 
> > > 
> > > On Sunday, February 23, 2014 at 11:06 AM, Nick Pentreath wrote:
> > > 
> > > > Hi
> > > > 
> > > > What KeyClass and ValueClass are you trying to save as the keys/values of your dataset?
> > > > 
> > > > 
> > > > 
> > > > On Sun, Feb 23, 2014 at 10:48 AM, Nan Zhu <zhunanmcg...@gmail.com> wrote:
> > > > 
> > > > > Hi, all
> > > > > 
> > > > > I found something weird about saveAsNewAPIHadoopFile in
> > > > > PairRDDFunctions.scala while working on another issue:
> > > > > 
> > > > > saveAsNewAPIHadoopFile throws java.lang.InstantiationException every
> > > > > time it is called.
> > > > > 
> > > > > I checked the commit history of the file, and the API seems to have
> > > > > existed for a long time. Has no one else run into this? (That's why
> > > > > I'm confused.)
> > > > > 
> > > > > Best,
> > > > > 
> > > > > --
> > > > > Nan Zhu
> > > > > 
> > > > 
> > > > 
> > > 
> > > 
> > 
> > 
> 
> 
> 

