Re: Spark submit OutOfMemory Error in local mode

2017-08-29 Thread muthu
Are you getting the OutOfMemory on the driver or on the executor? A typical cause of OOM in Spark is too few tasks for a job, which leaves each task holding too large a slice of the data.
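
Not part of the original reply, but a minimal Scala sketch of that advice, assuming a spark-shell session where sc exists; the input path and the partition count of 200 are illustrative assumptions:

    val lines = sc.textFile("hdfs:///path/to/input")  // hypothetical input
    // Too few partitions means each task materializes a large slice of the
    // data; repartition() shuffles once but yields more, smaller tasks.
    val repartitioned = lines.repartition(200)        // 200 is illustrative
    println(repartitioned.partitions.length)          // verify the new task count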

Re: Spark submit OutOfMemory Error in local mode

2017-08-22 Thread Naga G
> On 22.08.2017 at 20:16, shitijkuls wrote:
>> Any help here will be appreciated.

Re: Spark submit OutOfMemory Error in local mode

2017-08-22 Thread u...@moosheimer.com
…ce etc. Best regards, Kay-Uwe Moosheimer
> On 22.08.2017 at 20:16, shitijkuls wrote:
>> Any help here will be appreciated.

Re: Spark submit OutOfMemory Error in local mode

2017-08-22 Thread shitijkuls
Any help here will be appreciated.

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-09 Thread Tóth Zoltán
Thanks Zoltan. So far I got to a full repro which works both in Docker and on a bigger real-world cluster. Also, the whole thing only happens in `cluster` mode. I filed a ticket for it. Any thoughts? https://issues.apache.org/jira/browse/SPARK-10487
> On Mon, Sep 7, 2015 at 7:59 PM, Zsolt Tóth wrote: …

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zsolt Tóth
Hi, I ran your example on Spark 1.4.1 and 1.5.0-rc3. It succeeds on 1.4.1 but throws the OOM on 1.5.0. Do any of you know which PR introduced this issue? Zsolt
> On 2015-09-07 16:33 GMT+02:00, Zoltán Zvara wrote:
>> Hey, I'd try to debug and profile ResolvedDataSource. As far as I know, your write will b…

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zoltán Zvara
Hey, I'd try to debug and profile ResolvedDataSource. As far as I know, your write will be performed by the JVM.
> On Mon, Sep 7, 2015 at 4:11 PM, Tóth Zoltán wrote:
>> Unfortunately I'm getting the same error. The other interesting things are that: the Parquet files actually got written to HDFS…

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Tóth Zoltán
Unfortunately I'm getting the same error. The other interesting things are that:
- the Parquet files actually got written to HDFS (also with .write.parquet())
- the application gets stuck in the RUNNING state for good even after the error is thrown
15/09/07 10:01:10 INFO spark.ContextCleaner: C…

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread boci
Hi, can you try using the save method instead of write? e.g. out_df.save("path", "parquet") b0c1
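
As a sketch of the two call styles on a Spark 1.x DataFrame (the table name and path are hypothetical; save(path, source) is the older API, deprecated in 1.4 in favor of the write builder):

    val out_df = sqlContext.table("some_table")         // placeholder for the poster's DataFrame
    out_df.write.parquet("hdfs:///tmp/out.parquet")     // 1.4+ writer API
    out_df.save("hdfs:///tmp/out.parquet", "parquet")   // older API boci suggests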

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zoltán Tóth
Aaand, the error! :)
Exception in thread "org.apache.hadoop.hdfs.PeerCache@4e000abf" Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "org.apache.hadoop.hdfs.PeerCache@4e000abf"
Exception in thread "Thread-7" Exception: java.lang.OutOfMemoryError thrown from…

OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zoltán Tóth
Hi, when I execute the Spark ML Logistic Regression example in pyspark I run into an OutOfMemory exception. I'm wondering if any of you have experienced the same, or have a hint about how to fix this. The interesting bit is that I only get the exception when I try to write the result DataFrame into a file…
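
The thread used pyspark; as a rough Scala sketch of the same pattern (assuming a spark-shell where sc exists and the sample data file shipped with Spark), the write on the last line is where the exception reportedly appears:

    import org.apache.spark.ml.classification.LogisticRegression
    import org.apache.spark.mllib.util.MLUtils

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    val training = MLUtils.loadLibSVMFile(sc, "data/mllib/sample_libsvm_data.txt").toDF()
    val model = new LogisticRegression().setMaxIter(10).fit(training)
    val result = model.transform(training)
    result.write.parquet("hdfs:///tmp/logreg-result.parquet")  // hypothetical output path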

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-23 Thread Sourav Chandra
…at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:236)
at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readObject$1.apply$mcV…

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-23 Thread Sourav Chandra
…at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1866)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectIn…

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-22 Thread Tathagata Das
…at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(…

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-22 Thread Sourav Chandra
…at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
15/04/21 15:51:23 ERROR ExecutorUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-1,5,main]
> On Wed, Apr 22, 2015 at 1:32 AM, Olivier Girardot …

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-22 Thread Sourav Chandra
> sourav.chan...@livestream.com wrote:
>> Hi, we are building a Spark Streaming application which reads from Kafka, does updateStateByKey based on the received message type, and finally stores into Redis.
>> After running for…

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-21 Thread Olivier Girardot
> …stores into Redis. After running for a few seconds the executor process gets killed by an OutOfMemory error. The code snippet is below:
> NoOfReceiverInstances = 1
> val kafkaStreams = (1 to NoOfReceiverInstances).map( _ => K…

Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-21 Thread Sourav Chandra
Hi, we are building a Spark Streaming application which reads from Kafka, does updateStateByKey based on the received message type, and finally stores the result into Redis. After running for a few seconds the executor process gets killed by an OutOfMemory error. The code snippet is below:
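
The snippet is cut off in the archive; below is a minimal sketch of the described shape, with a socket source standing in for the Kafka receivers and the Redis sink left out. One generic caution with this pattern: if the update function never returns None, per-key state accumulates indefinitely and can exhaust executor memory.

    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val ssc = new StreamingContext(sc, Seconds(1))
    ssc.checkpoint("hdfs:///tmp/checkpoints")        // updateStateByKey requires checkpointing

    // socket source stands in for KafkaUtils.createStream(...) here
    val messages = ssc.socketTextStream("localhost", 9999).map(m => (m, 1L))

    def updateFunc(newValues: Seq[Long], state: Option[Long]): Option[Long] = {
      val sum = newValues.sum + state.getOrElse(0L)
      if (sum == 0L) None else Some(sum)             // returning None releases the key's state
    }

    val counts = messages.updateStateByKey(updateFunc _)
    counts.print()                                   // the real app wrote to Redis instead
    ssc.start()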

Re: OutOfMemory error in Spark Core

2015-01-15 Thread Akhil Das
> …with incremental data in Avro,
> 3. doing timestamp-based duplicate removal (including partitioning in reduceByKey), and
> 4. joining a couple of MySQL tables using JdbcRDD
> Of late, we are seeing major instabilities where the app crashes on a lost executor which itself f…
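
For reference, a sketch of the JdbcRDD usage mentioned in point 4; the connection URL, query, and bounds are hypothetical, and the SQL must contain exactly two ? placeholders for the partition range:

    import java.sql.{DriverManager, ResultSet}
    import org.apache.spark.rdd.JdbcRDD

    val rows = new JdbcRDD(
      sc,
      () => DriverManager.getConnection("jdbc:mysql://host/db", "user", "pass"),
      "SELECT id, name FROM users WHERE id >= ? AND id <= ?",
      1L, 1000000L,                                 // key range split across partitions
      10,                                           // partitions, i.e. bounded queries
      (rs: ResultSet) => (rs.getLong(1), rs.getString(2)))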

OutOfMemory error in Spark Core

2015-01-15 Thread Anand Mohan
…instabilities where the app crashes on a lost executor, which itself failed due to an OutOfMemory error as below. This looks almost identical to https://issues.apache.org/jira/browse/SPARK-4885, even though we are seeing this error in Spark 1.1.
2015-01-15 20:12:51,653 [handle-message-exec…

RE: OutOfMemory Error

2014-08-20 Thread Shao, Saisai
…/configuration.html Thanks, Jerry
> From: MEETHU MATHEW [mailto:meethu2...@yahoo.co.in], Sent: Wednesday, August 20, 2014 4:48 PM, To: Akhil Das; Ghousia, Cc: user@spark.apache.org, Subject: Re: OutOfMemory Error
> Hi, how do I increase the heap size? What is the difference between spark executor memory and heap…
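
To the quoted question: spark.executor.memory is the executor JVM's heap size (its -Xmx), so raising it is how the heap gets increased. A sketch with illustrative values:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.executor.memory", "4g")  // executor heap (-Xmx of each executor JVM)
      .set("spark.driver.memory", "2g")    // driver heap; in client/local mode this only
                                           // takes effect if set before the driver JVM
                                           // starts, e.g. spark-submit --driver-memory 2g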

Re: OutOfMemory Error

2014-08-20 Thread MEETHU MATHEW
> On Mon, Aug 18, 2014 at 10:40 AM, Ghousia Taj wrote:
>> Hi, I am trying to implement machine learning algorithms on Spark. I am working on a 3 node cluster, with each node having 5GB of memory. Whenever I am working with slightly…

Re: OutOfMemory Error

2014-08-19 Thread Ghousia
…a new huge value, resulting in an OutOfMemory error.
> On Mon, Aug 18, 2014 at 12:34 PM, Akhil Das wrote:
>> I believe spark.shuffle.memoryFraction is the one you are looking for.
>> spark.shuffle.memoryFraction: fraction of Java heap to use for…
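
A sketch of tuning that 1.x-era knob; the defaults were 0.2 for shuffle and 0.6 for storage, and the values below are illustrative only:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.shuffle.memoryFraction", "0.4")  // heap share for shuffle-side aggregation
      .set("spark.storage.memoryFraction", "0.4")  // heap share for cached RDD blocks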

Re: OutOfMemory Error

2014-08-18 Thread Ghousia
But this would be applicable only to operations that have a shuffle phase. It might not help a simple map operation where a record is mapped to a new huge value, resulting in an OutOfMemory error.
> On Mon, Aug 18, 2014 at 12:34 PM, Akhil Das wrote: >> I beli…
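
A runnable sketch of that distinction (sizes and counts are illustrative, assuming a spark-shell where sc exists): mapping each record to one huge object concentrates memory in a single allocation per element, while flatMap-ing the same bytes into smaller chunks keeps every allocation bounded; no shuffle setting is involved either way.

    val ids = sc.parallelize(1 to 8, 4)
    // risky shape: one ~64 MB array per record
    val huge = ids.map(_ => Array.fill(8 * 1024 * 1024)(0L))
    // safer shape: the same bytes as 64 separate ~1 MB chunks per record
    val chunked = ids.flatMap(_ => Iterator.fill(64)(Array.fill(128 * 1024)(0L)))
    println(chunked.count())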

Re: OutOfMemory Error

2014-08-18 Thread Akhil Das
> Thanks, Best Regards
> On Mon, Aug 18, 2014 at 10:40 AM, Ghousia Taj wrote:
>> Hi, I am trying to implement machine learning algorithms on Spark. I am working…

Re: OutOfMemory Error

2014-08-17 Thread Ghousia
> …On Mon, Aug 18, 2014 at 10:40 AM, Ghousia Taj wrote:
>> Hi, I am trying to implement machine learning algorithms on Spark. I am working on a 3 node cluster, with each node having 5GB of memory. Whenever I am working with a slightly larger number of records, I end up wi…

Re: OutOfMemory Error

2014-08-17 Thread Akhil Das
> …working on a 3 node cluster, with each node having 5GB of memory. Whenever I am working with a slightly larger number of records, I end up with an OutOfMemory error. The problem is, even if the number of records is only slightly high, the intermediate result from a transformation is huge and this resu…

OutOfMemory Error

2014-08-17 Thread Ghousia Taj
Hi, I am trying to implement machine learning algorithms on Spark. I am working on a 3 node cluster, with each node having 5GB of memory. Whenever I am working with a slightly larger number of records, I end up with an OutOfMemory error. The problem is, even if the number of records is only slightly high, the…
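
One common mitigation for large intermediates on 5GB nodes, sketched below; the textFile/map pipeline is a stand-in for whatever transformation produces the big result: persist it serialized and spillable instead of keeping it all on-heap.

    import org.apache.spark.storage.StorageLevel

    // stand-in for the transformation whose output blows up in memory
    val intermediate = sc.textFile("hdfs:///path/in").map(_.length)
    intermediate.persist(StorageLevel.MEMORY_AND_DISK_SER)  // serialized, spills to disk
    println(intermediate.count())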