Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/926#issuecomment-44901237
  
    @vanzin I was just copying and pasting the substance of the main() method 
from an example in order to modify it. You are right that most examples tell you 
to use `run-example`, and that works. One asks you to set a `master` argument 
directly, but it's just that one.
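
    For concreteness, that older style is roughly the sketch below; the object 
name is just illustrative, not one of the actual examples:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MasterArgExample {
  def main(args: Array[String]): Unit = {
    // Older style: the caller passes the master explicitly,
    // e.g. `MasterArgExample local[2]`.
    if (args.length < 1) {
      System.err.println("Usage: MasterArgExample <master>")
      System.exit(1)
    }
    val conf = new SparkConf().setMaster(args(0)).setAppName("MasterArgExample")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).reduce(_ + _))
    sc.stop()
  }
}
```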
    
    Nothing wrong with that per se, but the example code configures Spark almost 
completely, except for the master. At a glance, I thought that wasn't intentional 
and that `spark.master` was supposed to default to `local[2]`.
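
    By contrast, the `run-example` style I was copying looks roughly like this 
(again just a sketch, not one of the real examples): everything is set up in the 
code except the master, which is expected to come from the launcher.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RunExampleStyle {
  def main(args: Array[String]): Unit = {
    // Everything is configured here except the master, which is expected to
    // be supplied externally, e.g. by run-example or spark-submit --master.
    val conf = new SparkConf().setAppName("RunExampleStyle")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).reduce(_ + _))
    sc.stop()
  }
}
```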
    
    I don't quite like my change -- a global default is much simpler, but it 
perhaps has wider implications. I am not really wedded to inserting a default.
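
    If we went with a per-example fallback instead of a global default, it could 
be as small as something like this (a hypothetical helper, not what this PR 
actually does):

```scala
import org.apache.spark.SparkConf

object DefaultMasterSketch {
  // Hypothetical: fall back to local[2] only when no master has been
  // supplied via run-example / spark-submit.
  def withLocalFallback(conf: SparkConf): SparkConf =
    if (conf.contains("spark.master")) conf else conf.setMaster("local[2]")
}
```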
    
    I could instead make sure every example notes in its javadoc that 
`run-example` should be used. (Really, this all started because I also wanted to 
suggest a few touch-ups to `KafkaInputDStream`, and I wouldn't mind slipping 
those in.)

