Yes, it is possible to submit jobs to a remote Spark cluster. Just make sure you follow the steps below.
1. Set spark.driver.host to your local IP (the machine where you run your code); it must be reachable from the cluster.
2. Make sure no firewall/router configuration is blocking or filtering the connection between your Windows machine and the cluster. The best way to test is to ping the Windows machine's public IP from the cluster. (If the ping works, make sure you are port-forwarding the required ports.)
3. Also set spark.driver.port if you don't want to open up all the ports on your Windows machine (the default is a random port, so stick to one port). A rough configuration sketch is at the bottom of this mail, below the quoted message.

Thanks
Best Regards

On Wed, Nov 26, 2014 at 5:49 AM, Yingkai Hu <yingka...@gmail.com> wrote:
> Hi All,
>
> I have Spark deployed to an EC2 cluster and was able to run jobs
> successfully when the driver resides within the cluster. However, the job was
> killed when I tried to submit it from local. My guess is the Spark cluster
> can't open a connection back to the driver since it is on my machine.
>
> I'm wondering if Spark actually supports submitting jobs from local? If so,
> would you please advise?
>
> Many thanks in advance!
>
> YK
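
For reference, here is a minimal Scala sketch of what the driver-side configuration could look like. The master URL spark://ec2-master:7077, the IP 203.0.113.5 and the port 51000 are placeholders for illustration only, not values from this thread; substitute your own cluster master, public IP and forwarded port.

import org.apache.spark.{SparkConf, SparkContext}

object RemoteSubmitCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("remote-driver-check")
      // Standalone master running on the EC2 cluster (placeholder host).
      .setMaster("spark://ec2-master:7077")
      // Step 1: local/public IP of the driver machine, reachable from the cluster (placeholder).
      .set("spark.driver.host", "203.0.113.5")
      // Step 3: pin the driver port so only one port needs to be forwarded (placeholder).
      .set("spark.driver.port", "51000")

    val sc = new SparkContext(conf)
    // Tiny job: if the executors can connect back to the driver, this prints 5050.0.
    println(sc.parallelize(1 to 100).sum())
    sc.stop()
  }
}

The same settings can also be passed on the command line, e.g. spark-submit --conf spark.driver.host=<your-ip> --conf spark.driver.port=<your-port>, which is often easier when you are still testing port-forwarding rules.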