[ https://issues.apache.org/jira/browse/SPARK-7236?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14519862#comment-14519862 ]

Bryan Cutler commented on SPARK-7236:
-------------------------------------

According to git blame, it looks like the default of {{retryInterval = 
Int.Max}} was added with SPARK-3822 
https://github.com/apache/spark/commit/1df05a40ebf3493b0aff46d18c0f30d2d5256c7b.

Maybe [~andrewor14] can comment if this was done for a specific reason?
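
For illustration, here is a hedged sketch of a retry loop shaped like {{askWithReply}} (the names and structure are hypothetical, not the actual AkkaUtils source); it shows why a timeout on the final attempt still triggers the full {{retryInterval}} sleep:

```scala
import java.util.concurrent.TimeoutException

// Hypothetical sketch of a retry loop shaped like AkkaUtils.askWithReply
// (illustrative only, not the actual Spark source).
def askWithRetry[T](maxAttempts: Int, retryIntervalMs: Int)(ask: () => T): T = {
  var attempt = 0
  var lastException: Exception = null
  while (attempt < maxAttempts) {
    attempt += 1
    try {
      return ask()
    } catch {
      case e: TimeoutException =>
        lastException = e
        // Problem: this sleep runs even when no attempts remain, so with
        // retryIntervalMs = Int.MaxValue the thread blocks for roughly
        // 24 days before the exception finally propagates.
        Thread.sleep(retryIntervalMs)
    }
  }
  throw lastException
}
```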

> AkkaUtils askWithReply sleeps indefinitely when a timeout exception is thrown
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-7236
>                 URL: https://issues.apache.org/jira/browse/SPARK-7236
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Bryan Cutler
>            Priority: Trivial
>              Labels: quickfix
>         Attachments: SparkLongSleepAfterTimeout.scala
>
>
> When {{AkkaUtils.askWithReply}} gets a {{TimeoutException}}, the default 
> parameters {{maxAttempts = 1}} and {{retryInterval = Int.Max}} cause the 
> thread to sleep for {{Int.MaxValue}} milliseconds, roughly 24 days.
> I noticed this issue when testing for SPARK-6980 and using this function 
> without invoking Spark jobs, so perhaps it acts differently in another 
> context.
> If this function fails on its final attempt to ask, it should return 
> immediately instead of sleeping first.  Also, perhaps a better default 
> {{retryInterval}} would be {{0}}.
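
A minimal sketch of the behavior suggested above (again with hypothetical names, not an actual patch): only sleep between attempts, so a failure on the final attempt propagates immediately:

```scala
import java.util.concurrent.TimeoutException

// Hypothetical fixed variant: the sleep only runs *between* attempts,
// so the last failure is rethrown without the long pause.
def askWithRetryFixed[T](maxAttempts: Int, retryIntervalMs: Int)(ask: () => T): T = {
  var attempt = 0
  var lastException: Exception = null
  while (attempt < maxAttempts) {
    attempt += 1
    try {
      return ask()
    } catch {
      case e: TimeoutException =>
        lastException = e
        // Skip the sleep when no attempts remain.
        if (attempt < maxAttempts) Thread.sleep(retryIntervalMs)
    }
  }
  throw lastException
}
```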



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
