GitHub user zjffdu commented on the issue:

    https://github.com/apache/spark/pull/14639
  
    @shivaram @felixcheung @sun-rui  My previous commit didn't resolve the 
issue; it only succeeded because Spark had already been downloaded to the cache dir.  
    I've pushed another commit to fix the issue. Overall, this PR resolves 3 things.
    * Cache dir issue on Mac OS X
    * Don't download Spark in cluster mode. The key point is that we 
should pass the SparkConf from the JVM to R; otherwise `master` and `deployMode` in 
`sparkR.session` will always be empty.  
    * `appName` should be empty by default; otherwise we cannot override it in 
spark-submit through the `--name` argument.
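    A sketch of the `--name` point (the app and script names below are hypothetical): with `appName` left empty in `sparkR.session()`, the name supplied on the spark-submit command line is the one that takes effect. This is a command-line fragment, not a runnable script:

```shell
# Hypothetical invocation: appName is unset in the R script's sparkR.session(),
# so the --name value passed here becomes the application name.
spark-submit --master yarn --deploy-mode cluster --name MySparkRApp my_script.R
```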
