Github user tdas commented on the pull request:

    https://github.com/apache/spark/pull/3141#issuecomment-62087166
  
    Why is this change needed? If you run the example through bin/run-example, 
the SparkConf should be automatically populated with a master of "local[*]", 
which sets the number of worker threads to the number of cores on the local 
system, and your system very likely has more than one core. Is there a 
situation where this is not working?
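    For context on the "local[*]" behavior described above: the thread count 
comes from the number of processors the JVM reports. A minimal Java sketch of 
that lookup (independent of Spark; the class name is illustrative, not from 
the PR):

    ```java
    public class LocalStarThreads {
        public static void main(String[] args) {
            // "local[*]" sizes its worker pool to the cores the JVM can see,
            // which is what Runtime.availableProcessors() returns
            int cores = Runtime.getRuntime().availableProcessors();
            System.out.println("local[*] would use " + cores + " worker threads");
        }
    }
    ```

    So unless the machine reports a single processor, the example already runs 
with multiple worker threads by default.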


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
