Re: Spark timeout issue

2015-04-27 Thread Akhil Das
You need to look more deeply into your worker logs; if you look closely you
may find GC errors, IO exceptions, etc. that are triggering the timeout.
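
A minimal sketch of one way to get GC activity into those logs, assuming a
Spark 1.x job configured through SparkConf (the app name is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    // Make each executor JVM log its garbage collection pauses; the output
    // ends up in the executor stdout log under the worker's work/ directory.
    val conf = new SparkConf()
      .setAppName("large-file-job") // placeholder
      .set("spark.executor.extraJavaOptions",
        "-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps")
    val sc = new SparkContext(conf)

If long full-GC pauses show up around the time of the heartbeat warning, that
would explain the timeout.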

Thanks
Best Regards

On Mon, Apr 27, 2015 at 3:18 AM, Deepak Gopalakrishnan dgk...@gmail.com wrote:

 Hello Patrick,

 Sure. I've posted this on user as well. It would be great to get a response.

 Thanks
 Deepak

 On Mon, Apr 27, 2015 at 2:58 AM, Patrick Wendell pwend...@gmail.com wrote:

 Hi Deepak - please direct this to the user@ list. This list is for
 development of Spark itself.

 On Sun, Apr 26, 2015 at 12:42 PM, Deepak Gopalakrishnan dgk...@gmail.com wrote:
  Hello All,
 
  I'm trying to process a 3.5GB file in standalone mode using Spark. I could
  run my Spark job successfully on a 100MB file, and it works as expected.
  But when I try to run it on the 3.5GB file, I run into the error below:
 
 
  15/04/26 12:45:50 INFO BlockManagerMaster: Updated info of block taskresult_83
  15/04/26 12:46:46 WARN AkkaUtils: Error sending message [message = Heartbeat(2,[Lscala.Tuple2;@790223d3,BlockManagerId(2, master.spark.com, 39143))] in 1 attempts
  java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
      at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
      at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
      at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
      at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
      at scala.concurrent.Await$.result(package.scala:107)
      at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:195)
      at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:427)
  15/04/26 12:47:15 INFO MemoryStore: ensureFreeSpace(26227673) called with curMem=265897, maxMem=5556991426
  15/04/26 12:47:15 INFO MemoryStore: Block taskresult_92 stored as bytes in memory (estimated size 25.0 MB, free 5.2 GB)
  15/04/26 12:47:16 INFO MemoryStore: ensureFreeSpace(26272879) called with curMem=26493570, maxMem=5556991426
  15/04/26 12:47:16 INFO MemoryStore: Block taskresult_94 stored as bytes in memory (estimated size 25.1 MB, free 5.1 GB)
  15/04/26 12:47:18 INFO MemoryStore: ensureFreeSpace(26285327) called with curMem=52766449, maxMem=5556991426
 
 
  and the job fails.
 
 
  I'm on AWS and have opened all ports. Also, since the 100MB file works, it
  should not be a connection issue. I have one r3.xlarge and two m3.large
  instances.
 
  Can anyone suggest a way to fix this?
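
The [30 seconds] in the trace matches the default Akka ask timeout in Spark
1.x, so one short-term workaround, assuming 1.x defaults, is to give the
heartbeat more slack. A sketch (the values are illustrative, not tuned):

    import org.apache.spark.SparkConf

    // Spark 1.x-era keys; treat the names and defaults as assumptions and
    // verify them against the configuration page for your exact version.
    val conf = new SparkConf()
      .set("spark.akka.askTimeout", "120")              // ask timeout, seconds (default 30)
      .set("spark.akka.timeout", "300")                 // node-to-node timeout, seconds
      .set("spark.executor.heartbeatInterval", "30000") // heartbeat interval, ms

Also worth noting: the taskresult_* blocks are ~25 MB each, above the default
10 MB spark.akka.frameSize, which is likely why they travel through the
BlockManager instead of inline. That is expected behavior, but it hints that a
lot of result data is flowing back to the driver while the executor is also
trying to heartbeat.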




 --
 Regards,
 *Deepak Gopalakrishnan*
 *Mobile*:+918891509774
 *Skype* : deepakgk87
 http://myexps.blogspot.com




Re: Spark timeout issue

2015-04-26 Thread Patrick Wendell
Hi Deepak - please direct this to the user@ list. This list is for
development of Spark itself.

On Sun, Apr 26, 2015 at 12:42 PM, Deepak Gopalakrishnan dgk...@gmail.com wrote:
 Hello All,

 I'm trying to process a 3.5GB file in standalone mode using Spark. I could
 run my Spark job successfully on a 100MB file, and it works as expected.
 But when I try to run it on the 3.5GB file, I run into the error below:


 15/04/26 12:45:50 INFO BlockManagerMaster: Updated info of block taskresult_83
 15/04/26 12:46:46 WARN AkkaUtils: Error sending message [message = Heartbeat(2,[Lscala.Tuple2;@790223d3,BlockManagerId(2, master.spark.com, 39143))] in 1 attempts
 java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
     at scala.concurrent.Await$.result(package.scala:107)
     at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:195)
     at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:427)
 15/04/26 12:47:15 INFO MemoryStore: ensureFreeSpace(26227673) called with curMem=265897, maxMem=5556991426
 15/04/26 12:47:15 INFO MemoryStore: Block taskresult_92 stored as bytes in memory (estimated size 25.0 MB, free 5.2 GB)
 15/04/26 12:47:16 INFO MemoryStore: ensureFreeSpace(26272879) called with curMem=26493570, maxMem=5556991426
 15/04/26 12:47:16 INFO MemoryStore: Block taskresult_94 stored as bytes in memory (estimated size 25.1 MB, free 5.1 GB)
 15/04/26 12:47:18 INFO MemoryStore: ensureFreeSpace(26285327) called with curMem=52766449, maxMem=5556991426


 and the job fails.


 I'm on AWS and have opened all ports. Also, since the 100MB file works, it
 should not be a connection issue. I have one r3.xlarge and two m3.large
 instances.

 Can anyone suggest a way to fix this?

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Spark timeout issue

2015-04-26 Thread Deepak Gopalakrishnan
Hello Patrick,

Sure. I've posted this on user as well. It would be great to get a response.

Thanks
Deepak

On Mon, Apr 27, 2015 at 2:58 AM, Patrick Wendell pwend...@gmail.com wrote:

 Hi Deepak - please direct this to the user@ list. This list is for
 development of Spark itself.

 On Sun, Apr 26, 2015 at 12:42 PM, Deepak Gopalakrishnan dgk...@gmail.com wrote:
  Hello All,
 
  I'm trying to process a 3.5GB file in standalone mode using Spark. I could
  run my Spark job successfully on a 100MB file, and it works as expected.
  But when I try to run it on the 3.5GB file, I run into the error below:
 
 
  15/04/26 12:45:50 INFO BlockManagerMaster: Updated info of block taskresult_83
  15/04/26 12:46:46 WARN AkkaUtils: Error sending message [message = Heartbeat(2,[Lscala.Tuple2;@790223d3,BlockManagerId(2, master.spark.com, 39143))] in 1 attempts
  java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
      at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
      at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
      at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
      at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
      at scala.concurrent.Await$.result(package.scala:107)
      at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:195)
      at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:427)
  15/04/26 12:47:15 INFO MemoryStore: ensureFreeSpace(26227673) called with curMem=265897, maxMem=5556991426
  15/04/26 12:47:15 INFO MemoryStore: Block taskresult_92 stored as bytes in memory (estimated size 25.0 MB, free 5.2 GB)
  15/04/26 12:47:16 INFO MemoryStore: ensureFreeSpace(26272879) called with curMem=26493570, maxMem=5556991426
  15/04/26 12:47:16 INFO MemoryStore: Block taskresult_94 stored as bytes in memory (estimated size 25.1 MB, free 5.1 GB)
  15/04/26 12:47:18 INFO MemoryStore: ensureFreeSpace(26285327) called with curMem=52766449, maxMem=5556991426
 
 
  and the job fails.
 
 
  I'm on AWS and have opened all ports. Also, since the 100MB file works, it
  should not be a connection issue. I have one r3.xlarge and two m3.large
  instances.
 
  Can anyone suggest a way to fix this?
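
One more angle, since raising timeouts only treats the symptom: if the job's
final step collects everything back to the driver, those ~25 MB-per-task
results plus the GC they cause are a classic trigger for exactly this
heartbeat timeout. A hypothetical sketch (the paths and the map function are
placeholders, and `sc` is assumed to be an existing SparkContext):

    import org.apache.spark.SparkContext

    def run(sc: SparkContext): Unit = {
      val processed = sc.textFile("hdfs:///tmp/input") // placeholder path
        .map(line => line.toUpperCase)                 // stand-in for the real work
      // processed.collect()             // pulls every task result to the driver
      processed.saveAsTextFile("hdfs:///tmp/output")   // keeps results on the cluster
    }

Whether this applies depends on what the job actually does with its output.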




-- 
Regards,
*Deepak Gopalakrishnan*
*Mobile*:+918891509774
*Skype* : deepakgk87
http://myexps.blogspot.com