Hello,

I am new to Spark. I have downloaded Spark 1.1.0 and am trying to run the
TallSkinnySVD.scala example with different input data sizes. I started with a
1000x1000 matrix and then a 5000x5000 matrix as input.
I initially ran into Java heap space issues, so I added the following
parameters to "spark-defaults.conf":
            spark.driver.memory              5g
            spark.executor.memory               6g
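
For reference, the example can be launched with spark-submit roughly like this
(the jar name and input path below are approximate); spark-submit picks up the
memory settings above from spark-defaults.conf:

            ./bin/spark-submit \
              --class org.apache.spark.examples.mllib.TallSkinnySVD \
              --master "local[4]" \
              lib/spark-examples-1.1.0-hadoop2.4.0.jar \
              data/input_matrix.txt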

Now I am trying a 7000x7000 input matrix, but it fails with an OutOfMemoryError.
I tried raising the executor memory to 8g, but that didn't help.
I also tried persisting the rows at the MEMORY_AND_DISK storage level, but no luck:
          rows.persist(StorageLevel.MEMORY_AND_DISK)
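
For context, the change looks roughly like this (I am paraphrasing the example
code from memory; only the persist line is my modification):

          import org.apache.spark.mllib.linalg.distributed.RowMatrix
          import org.apache.spark.storage.StorageLevel

          // rows: RDD[Vector] parsed from the input file, as in the shipped example
          rows.persist(StorageLevel.MEMORY_AND_DISK)
          val mat = new RowMatrix(rows)
          // this is the call that ends in the OOM shown below
          val svd = mat.computeSVD(mat.numCols().toInt)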

Below is the tail of the log, including the exception stack trace.


14/09/24 15:40:34 INFO BlockManager: Removing block taskresult_56
14/09/24 15:40:34 INFO MemoryStore: Block taskresult_56 of size 196986164 dropped from memory (free 1669359430)
14/09/24 15:40:34 INFO BlockManagerInfo: Removed taskresult_56 on CONSB2A.cnw.co.nz:53593 in memory (size: 187.9 MB, free: 1592.1 MB)
14/09/24 15:40:34 INFO BlockManagerMaster: Updated info of block taskresult_56
14/09/24 15:40:34 INFO TaskSetManager: Finished task 3.0 in stage 2.0 (TID 56) in 10948 ms on localhost (4/4)
14/09/24 15:40:34 INFO DAGScheduler: Stage 2 (reduce at RDDFunctions.scala:111) finished in 45.899 s
14/09/24 15:40:34 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
14/09/24 15:40:34 INFO SparkContext: Job finished: reduce at RDDFunctions.scala:111, took 330.708589452 s
14/09/24 15:40:38 INFO ContextCleaner: Cleaned shuffle 0
14/09/24 15:40:38 INFO BlockManager: Removing broadcast 4
14/09/24 15:40:38 INFO BlockManager: Removing block broadcast_4
14/09/24 15:40:38 INFO MemoryStore: Block broadcast_4 of size 2568 dropped from memory (free 1669361998)
14/09/24 15:40:38 INFO ContextCleaner: Cleaned broadcast 4
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at breeze.linalg.svd$Svd_DM_Impl$.apply(svd.scala:48)
        at breeze.linalg.svd$Svd_DM_Impl$.apply(svd.scala:32)
        at breeze.generic.UFunc$class.apply(UFunc.scala:48)
        at breeze.linalg.svd$.apply(svd.scala:17)
        at org.apache.spark.mllib.linalg.distributed.RowMatrix.computeSVD(RowMatrix.scala:231)
        at org.apache.spark.mllib.linalg.distributed.RowMatrix.computeSVD(RowMatrix.scala:171)
        at org.apache.spark.examples.mllib.TallSkinnySVD$.main(TallSkinnySVD.scala:74)
        at org.apache.spark.examples.mllib.TallSkinnySVD.main(TallSkinnySVD.scala)
14/09/24 15:40:39 INFO ShuffleBlockManager: Deleted all files for shuffle 0




