I'm sorry, I have no idea why it is failing on your side. I have been using
this for a while now and it works fine. All I can say is to use version
1.4.0, though I don't think it will make a big difference. This is the
snippet I use; a/b are my directories:

Sys.setenv(SPARK_HOME = "/a/b/spark-1.4.0")  # point this at your Spark installation
# put SparkR's R library on the library path before loading it
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
sc <- sparkR.init(master = "local")   # creates the Spark context
sqlContext <- sparkRSQL.init(sc)      # creates the SQL context
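
Once that runs without errors, you can sanity-check both contexts with
something like this (faithful is just one of R's built-in data frames):

df <- createDataFrame(sqlContext, faithful)  # convert a local data.frame to a Spark DataFrame
head(df)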

Well, I'm going to ask another basic question: did you try some other
version before, e.g. from the amplab GitHub?
Can you remove the package with remove.packages("SparkR"), run
install-dev.sh from the R folder of your SPARK_HOME, and then try again to
see if it works? Hopefully it should.
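
For example (just a sketch of the steps above; adjust the path to your own
SPARK_HOME):

# in R: drop the stale package first
remove.packages("SparkR")

# then, in a shell, rebuild and reinstall SparkR from the Spark source tree:
#   cd /a/b/spark-1.4.0/R
#   ./install-dev.sh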


