GitHub user rkrzr opened a pull request:

    https://github.com/apache/spark/pull/20607

    Don't block on cleanup tasks by default

    This PR sets `"spark.cleaner.referenceTracking.blocking"` to `false` by
    default. It had originally been set to `true` as a workaround for
    SPARK-3015.
    
    However, that issue was resolved on 16/Aug/2014.
    I would therefore think that it is safe to make this non-blocking by
    default, which should help in cases where the cleanup thread cannot
    keep up anymore.
    
    If there are other reasons why this should stay blocking by default, I'd
    be interested to learn about them. In that case, the comment should
    probably be updated as well.
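
    Even with the default flipped, the setting can still be overridden per
    job. A minimal sketch of restoring the old blocking behavior for a
    single application via `spark-submit` (the application class and jar
    names below are placeholders, not part of this PR):

    ```shell
    # Re-enable the previous blocking cleanup behavior for one job,
    # overriding the non-blocking default proposed in this PR.
    # com.example.MyApp and my-app.jar are placeholder names.
    spark-submit \
      --conf spark.cleaner.referenceTracking.blocking=true \
      --class com.example.MyApp \
      my-app.jar
    ```

    The same key can also be set cluster-wide in `conf/spark-defaults.conf`.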


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/channable/spark dont_block_on_cleanup_tasks

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20607.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20607
    
----
commit f9e2f546b9b9beb136bb7110d0e4303365939927
Author: Robert Kreuzer <robert@...>
Date:   2018-02-14T10:43:46Z

    Don't block on cleanup tasks by default
    
    This PR sets `"spark.cleaner.referenceTracking.blocking"` to false by
    default. It had originally been set to true as a workaround for
    SPARK-3015.
    
    However, that issue was resolved on 16/Aug/2014.
    I would therefore think that it is safe to make this non-blocking by
    default, which should help in cases where the cleanup thread cannot
    keep up anymore.
    
    If there are other reasons why this should stay blocking by default, I'd
    be interested to learn about them. In that case, the comment should
    probably be updated as well.

----


---
