GitHub user skanjila opened a pull request:

    https://github.com/apache/spark/pull/15848

    [SPARK-9487v2] Use the same num. worker threads in Scala/Python unit tests

    ## What changes were proposed in this pull request?
    
    Changed the Scala/Java test code to use local[4] instead of local[2] as the
master when creating SparkConf, so that these unit tests run with the same number
of worker threads as the Python unit tests.
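    
    For illustration, a minimal sketch of the kind of change this PR makes (the
    setup and app name shown here are hypothetical, not taken from the diff):
    
    ```scala
    import org.apache.spark.{SparkConf, SparkContext}
    
    // Before this change, test suites typically created a context backed by
    // two local worker threads:
    //   val conf = new SparkConf().setMaster("local[2]").setAppName("test")
    
    // After the change, four worker threads are used so that the Scala/Java
    // suites run with the same parallelism as the Python suites:
    val conf = new SparkConf().setMaster("local[4]").setAppName("test")
    val sc = new SparkContext(conf)
    ```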
    
    ## How was this patch tested?
    
    Ran unit tests across core, mllib, external, and streaming without issues. Note
that this is the second version of the pull request; the first one was broken by an
incorrect rebase. Once this gets committed I will work on the Python pieces next.
    
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/skanjila/spark spark-9487v2

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15848.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #15848
    
----
commit cc9cbdc74a5769c165aea61a2a2786abbf1e7a4e
Author: saikan <[email protected]>
Date:   2016-11-11T00:40:51Z

    changes for using local[4] in all parts of java/scala codebase only

----


