Sorry, that was the wrong error; here is the correct one. I don't think it
has to do with Kryo...

java.lang.OutOfMemoryError: Java heap space
        at it.unimi.dsi.fastutil.bytes.ByteArrays.trim(ByteArrays.java:189)
        at it.unimi.dsi.fastutil.io.FastByteArrayOutputStream.trim(FastByteArrayOutputStream.java:84)
        at org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:825)
        at org.apache.spark.storage.MemoryStore.putValues(MemoryStore.scala:78)
        at org.apache.spark.storage.BlockManager.liftedTree1$1(BlockManager.scala:552)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:546)
        at org.apache.spark.storage.BlockManager.put(BlockManager.scala:477)
        at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:76)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:224)
        at org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:29)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:237)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:226)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:34)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:237)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:226)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:159)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:100)
        at org.apache.spark.scheduler.Task.run(Task.scala:53)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:215)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:50)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
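
For reference, the trace shows the OOM happening inside
BlockManager.dataSerialize, i.e. while a whole cached partition is being
serialized into a single in-memory byte buffer. A minimal sketch of one
possible workaround, assuming the RDD is currently cached at the default
MEMORY_ONLY level (the master URL and input path below are hypothetical,
just for illustration):

    import org.apache.spark.SparkContext
    import org.apache.spark.storage.StorageLevel

    object KMeansCachingSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local[4]", "kmeans-caching-sketch")
        // Hypothetical input: one point per line, space-separated doubles.
        val points = sc.textFile("hdfs:///path/to/points.txt")
          .map(_.split(' ').map(_.toDouble))
          // MEMORY_AND_DISK spills partitions that don't fit in the heap
          // to disk instead of failing the task with an OOM.
          .persist(StorageLevel.MEMORY_AND_DISK)
        println(points.count()) // force the cache to materialize
        sc.stop()
      }
    }

Raising spark.executor.memory is the other obvious knob, but allowing
spill to disk at least keeps the job alive when a partition is too big
to serialize in memory.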
