Re: Spark Memory Issues

2014-08-05 Thread Sunny Khatri
Yeah, ran it on yarn-cluster mode.

On Tue, Aug 5, 2014 at 12:17 PM, Akhil Das wrote:
> Are you sure that you were not running SparkPi in local mode?
>
> Thanks
> Best Regards
>
> On Wed, Aug 6, 2014 at 12:43 AM, Sunny Khatri wrote:
>> Well, I was able to run SparkPi successfully, and it does similar stuff.
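[For context, the distinction being confirmed here is where the driver and executors run. A minimal sketch of how the master setting controls this; the setMaster call is illustrative, since in practice the master is usually passed to spark-submit rather than hard-coded:]

    import org.apache.spark.SparkConf

    // local mode: everything runs in a single JVM, so YARN never has to
    // grant containers and the "has not accepted any resources" warning
    // cannot occur.
    val localConf = new SparkConf().setAppName("PiCheck").setMaster("local[*]")

    // yarn-cluster mode: driver and executors run in YARN containers, so
    // tasks only start once YARN can satisfy the resource request.
    // (Normally selected via: spark-submit --master yarn-cluster ...)
    val clusterConf = new SparkConf().setAppName("PiCheck")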

Re: Spark Memory Issues

2014-08-05 Thread Akhil Das
Are you sure that you were not running SparkPi in local mode?

Thanks
Best Regards

On Wed, Aug 6, 2014 at 12:43 AM, Sunny Khatri wrote:
> Well, I was able to run SparkPi successfully, and it does similar stuff.
>
> On Tue, Aug 5, 2014 at 11:52 AM, Akhil Das wrote:
>> For that UI to have some values, your process should be doing some work ...

Re: Spark Memory Issues

2014-08-05 Thread Sunny Khatri
Well, I was able to run SparkPi successfully, and it does similar stuff.

On Tue, Aug 5, 2014 at 11:52 AM, Akhil Das wrote:
> For that UI to have some values, your process should be doing some work,
> which is not happening here:
> ( 14/08/05 18:03:13 WARN YarnClusterScheduler: Initial job has not accepted
> any resources; check your cluster UI to ensure that workers are registered
> and have sufficient memory )
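[For reference, SparkPi exercises the same scheduling path as any other job: it parallelizes tasks across executors and reduces the result. A minimal sketch of that kind of sanity check; the object name and sample count are illustrative, not from the thread:]

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of a SparkPi-style sanity job: if this completes on
    // yarn-cluster, executors are registering and accepting tasks.
    object PiCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("PiCheck"))
        val n = 100000
        // Count random points that land inside the unit circle.
        val inside = sc.parallelize(1 to n).map { _ =>
          val x = math.random * 2 - 1
          val y = math.random * 2 - 1
          if (x * x + y * y < 1) 1 else 0
        }.reduce(_ + _)
        println(s"Pi is roughly ${4.0 * inside / n}")
        sc.stop()
      }
    }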

Re: Spark Memory Issues

2014-08-05 Thread Akhil Das
For that UI to have some values, your process should be doing some work, which is not happening here:

    14/08/05 18:03:13 WARN YarnClusterScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

Can you open up a ...
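[For context, this warning generally means the executors requested from YARN never registered, most often because the memory request exceeds what YARN containers can grant. The usual fix is to lower --executor-memory on spark-submit, or equivalently in code; a minimal sketch, where the app name and the 2g value are assumptions for illustration only:]

    import org.apache.spark.SparkConf

    // Sketch: request memory that fits within the YARN container limit
    // (yarn.scheduler.maximum-allocation-mb); if the request can never be
    // satisfied, executors never register and the warning repeats forever.
    val conf = new SparkConf()
      .setAppName("KMeansJob")               // hypothetical app name
      .set("spark.executor.memory", "2g")    // reduced from 3g for illustration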

Re: Spark Memory Issues

2014-08-05 Thread Sunny Khatri
The only UI I have currently is the Application Master (cluster mode), with the following executor node status:

Executors (3)
- Memory: 0.0 B Used (3.7 GB Total)
- Disk: 0.0 B Used

Executor ID | Address | RDD Blocks | Memory Used | Disk Used | Active Tasks | Failed Tasks | Complete Tasks | Total Tasks | Task Time ...

Re: Spark Memory Issues

2014-08-05 Thread Akhil Das
Are you able to see the job on the web UI (port 8080)? If yes, how much memory is it showing specifically for this job?

[image: Inline image 1]

Here you can see I have 11.8 GB RAM on both workers and my app is using 11 GB.

1. What memory values are you seeing in your case?
2. Make sure ...
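[One quick way to cross-check the numbers the UI reports against what the application actually asked for is to log the setting from the driver; a small sketch, assuming a live SparkContext named sc:]

    // Sketch: print what the application actually requested, to compare
    // against the per-executor memory shown in the web UI.
    println("executor memory: " + sc.getConf.get("spark.executor.memory", "not set"))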

Spark Memory Issues

2014-08-05 Thread Sunny Khatri
Hi,

I'm trying to run a Spark application with executor-memory set to 3G, but I'm running into the following error:

    14/08/05 18:02:58 INFO DAGScheduler: Submitting Stage 0 (MappedRDD[5] at map at KMeans.scala:123), which has no missing parents
    14/08/05 18:02:58 INFO DAGScheduler: Submitting 1 missing ...
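[For context, the log points at MLlib's KMeans. A minimal sketch of the kind of job involved; the input path, k = 2, and 10 iterations are hypothetical values chosen only for illustration:]

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.clustering.KMeans
    import org.apache.spark.mllib.linalg.Vectors

    object KMeansCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("KMeansCheck"))
        // Hypothetical input: one whitespace-separated numeric vector per line.
        val data = sc.textFile("hdfs:///tmp/kmeans_data.txt")
          .map(line => Vectors.dense(line.trim.split(' ').map(_.toDouble)))
          .cache()
        // Train with 2 clusters and 10 iterations (illustrative values).
        val model = KMeans.train(data, 2, 10)
        model.clusterCenters.foreach(println)
        sc.stop()
      }
    }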