Github user oscaroboto commented on the pull request: https://github.com/apache/spark/pull/5096#issuecomment-83934965 Additionally, one could think of setting sparkRLibDir as a way to remotely set .libPaths() on all of the workers. This is most useful when .libPaths() on the driver is different than on the workers. Note that it is implicitly assumed that all workers are configured identically; otherwise, setting sparkRLibDir will not work. If the function passed to parLapplyLB(cl, X, fun) calls other R packages, the workers will look for those libraries in sparkRLibDir, which by default is SPARK_HOME. Therein lies the problem: SPARK_HOME is the Spark home directory, not the path to the R packages on the workers.
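To make the mechanism concrete, here is a minimal sketch using base R's `parallel` package (standing in for SparkR's worker processes). The path `workerLibDir` is an assumption for illustration, not an actual SparkR default; the point is that prepending it to `.libPaths()` on every worker is what setting sparkRLibDir would need to accomplish before any `parLapplyLB()` call that loads packages.

```r
# Sketch: aligning worker-side library paths with where packages actually live.
# Uses base R's `parallel` cluster as a stand-in for SparkR workers; the
# directory below is a hypothetical example, not a real SparkR setting.
library(parallel)

cl <- makeCluster(2)

# If .libPaths() on the workers differs from the driver's, point every
# worker at the directory that actually holds the R packages there.
workerLibDir <- "/usr/lib64/R/library"  # assumed path on the workers
clusterCall(cl, function(libDir) {
  .libPaths(c(libDir, .libPaths()))  # prepend, keeping existing paths
  invisible(NULL)
}, workerLibDir)

# Functions passed to parLapplyLB() can now resolve library() calls
# against workerLibDir first, independent of SPARK_HOME.
res <- parLapplyLB(cl, 1:4, function(x) {
  # library(somePkg)  # would now search workerLibDir before the defaults
  x^2
})
stopCluster(cl)
```

This mirrors the assumption in the comment above: it only works if every worker has its packages at the same path, since a single sparkRLibDir value is broadcast to all of them.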