[ https://issues.apache.org/jira/browse/SPARK-8596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14604328#comment-14604328 ]

Vincent Warmerdam commented on SPARK-8596:
------------------------------------------

RStudio generally shouldn't be run as the root user. RStudio doesn't ship with 
https out of the box, so sniffing the port becomes a real security risk if 
somebody can use the web UI to gain root access. 

Instead, you can just add a new user: 

$ useradd analyst 
$ passwd analyst
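
For a scripted (non-interactive) EC2 setup, the same two steps can be collapsed 
into something like the following sketch; `chpasswd` reads `user:password` pairs 
from stdin, and the password here is just a placeholder: 

```shell
# Run as root. Create the analyst user and set a placeholder password
# non-interactively; replace CHANGE_ME before using this for real.
useradd analyst
echo 'analyst:CHANGE_ME' | chpasswd
```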

This user will then be able to log in (RStudio Server authenticates against 
system accounts). Note that in order to reach RStudio, you will also need to 
edit the security group for the master node to allow inbound TCP connections 
on this port. 
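
For reference, opening that port can also be scripted with the AWS CLI; a 
minimal sketch, assuming RStudio Server's default port 8787 and a placeholder 
security group ID: 

```shell
# sg-0123456789 is a placeholder: substitute the security group attached
# to the master node. Restricting --cidr to your own IP range is safer
# than 0.0.0.0/0, since RStudio has no https out of the box.
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789 \
    --protocol tcp \
    --port 8787 \
    --cidr 0.0.0.0/0
```

After that, RStudio should be reachable at http://<master-public-dns>:8787. 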

I'd love to help out and spend some time on these issues, by the way. I've got a 
small tutorial .md file ready; can I share it via Jira? I'd like to double-check 
it with you because I may be using a dirty trick: to keep RStudio from throwing 
errors, I remove a line of code from a shell script (because the new user is 
not root, it cannot run `ulimit` commands). 

Dirty trick (run as root): 

sed -e 's/^ulimit/#ulimit/g' /root/spark/conf/spark-env.sh > /root/spark/conf/spark-env2.sh
mv /root/spark/conf/spark-env2.sh /root/spark/conf/spark-env.sh
ulimit -n 1000000
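
For what it's worth, the same edit can be done in place with sed's `-i` flag, 
which keeps a backup and avoids the temporary file: 

```shell
# Comment out every line starting with "ulimit"; the original file is
# kept as spark-env.sh.bak.
sed -i.bak 's/^ulimit/#ulimit/' /root/spark/conf/spark-env.sh
```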

Installing the right R version has always been a bit tricky. I will follow the 
other Jira ticket as well. 


> Install and configure RStudio server on Spark EC2
> -------------------------------------------------
>
>                 Key: SPARK-8596
>                 URL: https://issues.apache.org/jira/browse/SPARK-8596
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2, SparkR
>            Reporter: Shivaram Venkataraman
>
> This will make it convenient for R users to use SparkR from their browsers 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
