GitHub user dianacarroll commented on the pull request:

    https://github.com/apache/spark/pull/1246#issuecomment-47537578
  
    Why did you change the doc to recommend setting sc.master programmatically? AFAIK we are actively discouraging that, now that it can be set in the spark-submit script.
    
    I disagree that this is a doc bug.  If SparkContext(sparkConf) works in 
Scala, it should also work in Python.
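
    For reference, a minimal PySpark sketch of the usage under discussion (the app name is illustrative, not from the PR): build a SparkConf without hard-coding the master and pass it to the SparkContext constructor, leaving the master to be supplied by spark-submit, mirroring the Scala pattern of new SparkContext(conf).

        from pyspark import SparkConf, SparkContext

        # No setMaster() here: spark-submit supplies --master at launch time.
        conf = SparkConf().setAppName("example-app")
        # Pass the conf to the constructor, analogous to new SparkContext(conf) in Scala.
        sc = SparkContext(conf=conf)

    The application would then be launched with something like spark-submit --master <master-url> my_app.py, so the same code runs unmodified in local mode or on a cluster.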

