Solved.

The problem is the following: the underlying Akka driver uses the internal interface address on the Amazon instance (the ones that start with 10.x.y.z) to present itself to the master; it does not use the external (public) IP!
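For anyone hitting the same thing: Spark lets you override the address the driver advertises. A sketch, assuming the `spark.driver.host` property and the EC2 instance-metadata endpoint; the fixed driver port here is a hypothetical choice, check the configuration docs for your Spark version:

```shell
# Fetch this instance's public IPv4 from the EC2 metadata service,
# then tell the driver to advertise that address instead of the
# internal 10.x.y.z one.
PUBLIC_IP=$(curl -s http://169.254.169.254/latest/meta-data/public-ipv4)

./bin/spark-shell \
  --conf spark.driver.host=$PUBLIC_IP \
  --conf spark.driver.port=50000   # assumed fixed port so it can be opened in the security group
```

The port must then be reachable through the instance's security group for the master and executors to connect back.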

Ognen

On 9/7/2014 3:21 PM, Sean Owen wrote:
Also keep in mind there is a non-trivial amount of traffic between the
driver and cluster. It's not something I would do by default, running
the driver so remotely. With enough ports open it should work though.

On Sun, Sep 7, 2014 at 7:05 PM, Ognen Duzlevski
<ognen.duzlev...@gmail.com> wrote:
Horacio,

Thanks, I have not tried that, however, I am not after security right now -
I am just wondering why something so obvious won't work ;)

Ognen


On 9/7/2014 12:38 PM, Horacio G. de Oro wrote:
Have you tried with ssh? It will be much more secure (only 1 port open),
and you'll be able to run spark-shell over the network. I'm using that
approach in my project (https://github.com/data-tsunami/smoke) with good
results.

I can't make a try now, but something like this should work:

ssh -tt ec2-user@YOUR-EC2-IP /path/to/spark-shell SPARK-SHELL-OPTIONS

With this approach you are much more secure (without installing a VPN),
and you don't need Spark/Hadoop installed on your PC. You won't have access
to local files, but you haven't mentioned that as a requirement :-)
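If you also want the driver's web UI from your laptop, the same single ssh connection can forward it; a sketch using Spark's default UI port 4040 (host and shell options are placeholders, as above):

```shell
# Run spark-shell remotely over ssh and tunnel the web UI (port 4040)
# back to the local machine over the same connection.
ssh -tt -L 4040:localhost:4040 ec2-user@YOUR-EC2-IP \
  /path/to/spark-shell SPARK-SHELL-OPTIONS
# While the shell is running, browse http://localhost:4040 locally.
```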

Hope this helps you.

Horacio
--

        Web: http://www.data-tsunami.com
     Email: hgde...@gmail.com
        Cel: +54 9 3572 525359
   LinkedIn: https://www.linkedin.com/in/hgdeoro


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



