Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10836#discussion_r50225325
--- Diff: R/README.md ---
@@ -1,6 +1,16 @@
# R on Spark
SparkR is an R package that provides a light-weight frontend to use Spark
from R.
+### Installing SparkR
+
+The SparkR libraries need to be created in `$SPARK_HOME/R/lib`. This can be
+done by running the script `$SPARK_HOME/R/install-dev.sh`.
--- End diff ---
ok, I think I get your point now.
I guess we are saying this README.md is more for developers, so I'm ok with
what you have here.
There are users who are not building Spark from source and are running the
binary release; in that case `$SPARK_HOME/R/lib` is already present and they
would not need to install the SparkR package. Similarly, when running SparkR
with a cluster manager, SparkR would not need to be installed on the worker
nodes either. I agree these cases are probably outside the scope of this file.
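The distinction above (source build vs. binary release) could be sketched as a quick shell check. This is a hypothetical snippet, not part of the PR: the `/opt/spark` default is an assumption, and only the `$SPARK_HOME/R/install-dev.sh` script comes from the diff.

```shell
# Hypothetical check for whether SparkR still needs to be installed.
# /opt/spark is an assumed default; adjust SPARK_HOME for your setup.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"

if [ -d "$SPARK_HOME/R/lib/SparkR" ]; then
  # Binary releases ship with SparkR already under $SPARK_HOME/R/lib,
  # so no extra installation step is needed.
  echo "SparkR found under $SPARK_HOME/R/lib; nothing to do."
else
  # Source builds need the libraries created first.
  echo "SparkR not found; run $SPARK_HOME/R/install-dev.sh to build it."
fi
```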
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]