Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/86#issuecomment-38740468
  
    Looks like the issue with the missing Client class is due to 
https://spark-project.atlassian.net/browse/SPARK-1330, not this PR. Once that 
was fixed, I was able to run both cluster and client mode on YARN.
    
    Another thing I noticed is that the spark-submit script uses --arg and the 
spark-class script uses --args.  Not a big deal; I just want to make sure we 
want --arg vs. --args. I don't have a strong opinion on it, but if people are 
used to using spark-class, it's just a change for them.
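    To make the difference concrete, the two invocation styles look roughly like 
this (a sketch only — the jar path and class name below are illustrative 
placeholders, not taken from this PR):

    ```shell
    # Old style: spark-class with the YARN Client takes app arguments via --args
    # (my-app.jar and com.example.MyApp are hypothetical)
    ./bin/spark-class org.apache.spark.deploy.yarn.Client \
      --jar my-app.jar \
      --class com.example.MyApp \
      --args yarn-cluster

    # New style: spark-submit passes each application argument with --arg
    ./bin/spark-submit \
      --class com.example.MyApp \
      --arg yarn-cluster \
      my-app.jar
    ```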
    
    It is a bit unfortunate that we still have to specify the first arg as 
yarn-client or yarn-cluster for the Spark examples so they can pass it to 
SparkContext, but I guess there isn't much we can do about that: if it were 
real user code, the user could have that hardcoded or could accept it as any 
argument (not just the first one).
    
    Great work, Sandy!  It's nice to have this easier interface.

