Thanks for your help! I will try it, and I will let you know if it works for me!
2015-10-12 23:17 GMT+02:00 moon soo Lee <m...@apache.org>:

> Hi,
>
> Yes, of course. Zeppelin can run on a different machine.
> It is recommended to install Spark on the machine that runs Zeppelin and
> point to the Spark installation path in conf/zeppelin-env.sh using the
> SPARK_HOME env variable (in the case of 0.6.0-SNAPSHOT).
>
> One thing you need to take care of is that your Spark workers need to
> connect to the Spark driver, i.e. Zeppelin. So you'll need to configure
> your EC2 instance to be able to communicate with the EMR cluster, and
> vice versa.
>
> Hope this helps.
> moon
>
>
> On Mon, 12 Oct 2015 at 9:56 PM Pablo Torre <pablotorr...@gmail.com> wrote:
>
>> Hi guys,
>>
>> I was wondering if you could help me with this scenario: I am using
>> Amazon AWS, and I have an EMR Hadoop cluster running. I want to install
>> and configure Apache Zeppelin, but on a different machine (an EC2
>> instance), so that I can use Zeppelin to visualize information that is
>> available in HDFS on the cluster.
>>
>> Can I configure Zeppelin on a different machine? Do I need to install
>> Apache Spark on any machine?
>>
>> I appreciate your help.
>>
>> Thanks.
>>
>>
>> --
>> Pablo Torre.
>> Freelance software engineer and Ruby on Rails developer.
>> Oleiros (Coruña)
>> Personal site <http://www.pablotorrerodriguez.com>
>> My blog <http://www.aboutechnologies.com>

--
Pablo Torre.
Freelance software engineer and Ruby on Rails developer.
Oleiros (Coruña)
Personal site <http://www.pablotorrerodriguez.com>
My blog <http://www.aboutechnologies.com>
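
For readers landing on this thread later: a minimal conf/zeppelin-env.sh on the
EC2 instance, following moon's advice above, might look something like the
sketch below. The paths, the yarn-client master, and the HADOOP_CONF_DIR
setting are assumptions for illustration only, not values given in the thread.

    # conf/zeppelin-env.sh -- sketch only; adjust paths for your own EC2 instance
    export SPARK_HOME=/usr/lib/spark          # assumed path to the local Spark install
    export MASTER=yarn-client                 # assumption: driver runs in Zeppelin on EC2, executors on the EMR cluster via YARN
    export HADOOP_CONF_DIR=/etc/hadoop/conf   # assumption: a copy of the EMR cluster's Hadoop/YARN client configuration

In yarn-client mode the Spark driver lives inside Zeppelin on the EC2 instance,
which is why, as moon notes, the EMR workers must be able to reach that instance
as well as the other way around (e.g. security groups open in both directions).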