Oops, the exception is below.
In local mode it works, which makes sense: TorrentBroadcast only performs the
actual broadcast behind its !isLocal check. It really seems as if the Kryo
wrapper didn't kick in for some reason. Do we have a unit test that exercises
Kryo serialization that I could give a try?
Thanks,
Ron

Exception in thread "Driver" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:180)
Caused by: java.io.NotSerializableException: org.apache.avro.generic.GenericData$Record
    - custom writeObject data (class "scala.collection.mutable.HashMap")


On Friday, August 8, 2014 10:16 AM, Reynold Xin <r...@databricks.com> wrote:
Looks like you didn't actually paste the exception message. Do you mind
doing that?




On Fri, Aug 8, 2014 at 10:14 AM, Reynold Xin <r...@databricks.com> wrote:

> Pasting a better formatted trace:
>
>
>
> at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
> at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
> at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:137)
> at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:135)
> at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
> at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
> at scala.collection.mutable.HashTable$class.serializeTo(HashTable.scala:124)
> at scala.collection.mutable.HashMap.serializeTo(HashMap.scala:39)
> at scala.collection.mutable.HashMap.writeObject(HashMap.scala:135)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
> at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
> at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
> at org.apache.spark.util.Utils$.serialize(Utils.scala:64)
> at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:232)
> at org.apache.spark.broadcast.TorrentBroadcast.sendBroadcast(TorrentBroadcast.scala:85)
> at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:66)
> at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
> at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
> at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
> at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
>
>
> On Fri, Aug 8, 2014 at 10:12 AM, Ron Gonzalez <
> zlgonza...@yahoo.com.invalid> wrote:
>
>> Hi,
>> I have a Spark app running against the released 1.0.1. I recently decided to
>> try upgrading to the trunk version. Interestingly enough, after building the
>> 1.1.0-SNAPSHOT assembly, swapping it in as my app's assembly caused errors.
>> In particular, it seems Kryo serialization isn't taking effect. Going back to
>> the 1.0.1 assembly gets it working again.
>>
>> Any thoughts? Is this a known issue?
>>
>> Thanks,
>> Ron
>>
>> at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
>> at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
>> at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:137)
>> at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:135)
>> at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>> at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>> at scala.collection.mutable.HashTable$class.serializeTo(HashTable.scala:124)
>> at scala.collection.mutable.HashMap.serializeTo(HashMap.scala:39)
>> at scala.collection.mutable.HashMap.writeObject(HashMap.scala:135)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
>> at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
>> at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
>> at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
>> at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
>> at org.apache.spark.util.Utils$.serialize(Utils.scala:64)
>> at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:232)
>> at org.apache.spark.broadcast.TorrentBroadcast.sendBroadcast(TorrentBroadcast.scala:85)
>> at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:66)
>> at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
>> at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
>> at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
>> at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)
>
>
>
