GitHub user sun-rui commented on the pull request:

    https://github.com/apache/spark/pull/6743#issuecomment-117859704
  
    @andrewor14,
    I have tested this patch on a real YARN cluster.

    An R program can source other R files and call functions defined in
    them, or it can call functions exposed by other R packages. Sourcing R
    files is already supported via the existing --files option (refer to
    SPARK-6833). For dependent R packages, SparkR currently only supports
    passing the list of package names to the workers, not distributing the
    packages themselves; the packages are expected to already be installed
    in the R environment on each worker node. I am not sure whether we
    should support distributing dependent R packages. @shivaram, what do
    you think?
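
    To make the --files path concrete, here is a minimal sketch. The file
    names utils.R and app.R and the function helper() are hypothetical;
    the point is that files listed in --files are staged into each
    container's working directory on YARN, so the R program can source
    them by relative path:

        # Hypothetical submission; utils.R is shipped alongside the app:
        #   spark-submit --files utils.R app.R

        # app.R -- utils.R lands in the working directory, so a relative
        # source() call finds it on the driver and worker nodes:
        source("utils.R")      # assumed to define helper()
        result <- helper(42)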

