GitHub user a-roberts opened a pull request:

    https://github.com/apache/spark/pull/15094

    [SPARK-17534] [TESTS] Increase timeouts for DirectKafkaStreamSuite tests

    ## What changes were proposed in this pull request?
    There are two tests in this suite that are particularly flaky on the following hardware:
    
    2x Intel(R) Xeon(R) CPU E5-2697 v2 @ 2.70GHz and 16 GB of RAM, 1 TB HDD
    
    This simple PR increases the timeouts and the batch duration so the tests pass reliably.
    
    ## How was this patch tested?
    Existing unit tests, run on the two-CPU box described above where I was often seeing the failures.
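
    For context, flaky streaming assertions of this kind typically poll a condition until a deadline, and on slower boxes the deadline and polling cadence need headroom. The sketch below is a hypothetical illustration of that pattern, not the actual diff; the helper name, durations, and structure are assumptions made for the example.

```scala
// Hypothetical sketch of a poll-until-deadline helper of the kind such
// suites rely on. All names and durations here are illustrative only.
object TimeoutSketch {
  // Poll `condition` every `intervalMs` until it holds or `timeoutMs` elapses.
  // Returns the final value of the condition.
  def eventually(timeoutMs: Long, intervalMs: Long)(condition: => Boolean): Boolean = {
    val deadline = System.nanoTime() + timeoutMs * 1000000L
    while (System.nanoTime() < deadline) {
      if (condition) return true
      Thread.sleep(intervalMs)
    }
    condition
  }

  def main(args: Array[String]): Unit = {
    val start = System.currentTimeMillis()
    // This condition only becomes true after ~300 ms. With a 100 ms deadline
    // the check would flake; a generous deadline, in the spirit of this PR,
    // lets slower hardware catch up.
    val ok = eventually(timeoutMs = 2000, intervalMs = 50) {
      System.currentTimeMillis() - start > 300
    }
    println(s"passed=$ok")
  }
}
```

    Raising only the deadline (not the assertion itself) keeps the test's meaning intact while making it tolerant of scheduling jitter on busy or slower machines.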


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/a-roberts/spark patch-6

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15094.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #15094
    
----
commit 9ddecc6338f17b65c7b74d0e8b1854eb7f36b945
Author: Adam Roberts <arobe...@uk.ibm.com>
Date:   2016-09-14T13:15:18Z

    [SPARK-17534] [TESTS] Increase timeouts for DirectKafkaStreamSuite tests
    
    There are two tests in this suite that are particularly flaky on the following hardware:
    
    2x Intel(R) Xeon(R) CPU E5-2697 v2 @ 2.70GHz and 16 GB of RAM, 1 TB HDD
    
    This simple PR increases the timeouts and the batch duration so the tests pass reliably.

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
