GitHub user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/772#issuecomment-43175575
  
    Hey @ScrapCodes - this looks like a good start. In terms of how to pass 
options, I think the nicest behavior would be to make it identical to Maven:
    
    ```
    sbt/sbt -Pyarn -Dspark.hadoop.version=1.2.3 ~compile
    ```
    
    Since we control the `sbt/sbt` script, we could do this (I think) by 
capturing those arguments and exposing them - e.g. via an environment 
variable - so they can be read in the sbt build.
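    One way the argument capture could look (a minimal sketch, not the actual 
`sbt/sbt` script - the `SBT_MAVEN_*` variable names and the `set --` demo 
arguments are illustrative assumptions):
    
    ```shell
    # Demo arguments, mirroring the command line above (assumed for illustration):
    set -- -Pyarn -Dspark.hadoop.version=1.2.3 ~compile
    
    SBT_MAVEN_PROFILES=""
    SBT_MAVEN_PROPERTIES=""
    SBT_ARGS=""
    for arg in "$@"; do
      case "$arg" in
        # Collect Maven-style profile flags, stripping the -P prefix
        -P*) SBT_MAVEN_PROFILES="$SBT_MAVEN_PROFILES${arg#-P} " ;;
        # Collect -D system properties verbatim
        -D*) SBT_MAVEN_PROPERTIES="$SBT_MAVEN_PROPERTIES$arg " ;;
        # Everything else is passed through to sbt unchanged
        *)   SBT_ARGS="$SBT_ARGS$arg " ;;
      esac
    done
    # Export so the sbt build (e.g. via sys.env) can pick these up
    export SBT_MAVEN_PROFILES SBT_MAVEN_PROPERTIES
    
    echo "profiles: $SBT_MAVEN_PROFILES"
    echo "properties: $SBT_MAVEN_PROPERTIES"
    ```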
    
    It would also be good, for backwards compatibility, to do some automatic 
conversions:
    
    ```
    if SPARK_YARN is set           ==> set 'yarn' profile
    if SPARK_GANGLIA_LGPL is set   ==> set 'spark-ganglia-lgpl' profile
    if SPARK_HADOOP_VERSION is set ==> set spark.hadoop.version property
    ```
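    Those conversions could be a few lines of shell in the launcher (a hedged 
sketch - the `SBT_MAVEN_*` names are assumptions carried over from the idea 
above, and the legacy values here are set inline only for illustration):
    
    ```shell
    SBT_MAVEN_PROFILES=""
    SBT_MAVEN_PROPERTIES=""
    
    # Demo legacy environment settings (assumed for illustration):
    SPARK_YARN=true
    SPARK_GANGLIA_LGPL=true
    SPARK_HADOOP_VERSION=1.2.3
    
    # Translate legacy variables into profiles/properties before launching sbt
    [ -n "$SPARK_YARN" ]           && SBT_MAVEN_PROFILES="$SBT_MAVEN_PROFILES yarn"
    [ -n "$SPARK_GANGLIA_LGPL" ]   && SBT_MAVEN_PROFILES="$SBT_MAVEN_PROFILES spark-ganglia-lgpl"
    [ -n "$SPARK_HADOOP_VERSION" ] && SBT_MAVEN_PROPERTIES="$SBT_MAVEN_PROPERTIES -Dspark.hadoop.version=$SPARK_HADOOP_VERSION"
    
    echo "profiles:$SBT_MAVEN_PROFILES"
    echo "properties:$SBT_MAVEN_PROPERTIES"
    ```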
