Hi All,
I have Spark deployed to an EC2 cluster and was able to run jobs successfully
when the driver resides within the cluster. However, the job was killed when I
tried to submit it from my local machine. My guess is that the cluster can't open
a connection back to the driver, since the driver is on my machine.
I'm wondering whether submitting jobs to a remote Spark cluster is supported at all?
Yes, it is possible to submit jobs to a remote Spark cluster. Just make
sure you follow the steps below.
1. Set spark.driver.host to your local IP (where you run your code; it
must be reachable from the cluster).
2. Make sure no firewall/router configurations are blocking/filtering the
connections from the cluster nodes back to the driver. Pinning
spark.driver.port and spark.blockManager.port to fixed values makes it
easier to open just those ports.
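The steps above can be sketched as a spark-submit invocation. The master URL, IP
address, port numbers, and script name here are placeholders; substitute your
own values.

```shell
# Sketch only: ec2-master.example.com, 203.0.113.5, the ports, and
# your_job.py are hypothetical placeholders for your setup.
spark-submit \
  --master spark://ec2-master.example.com:7077 \
  --conf spark.driver.host=203.0.113.5 \
  --conf spark.driver.port=7001 \
  --conf spark.blockManager.port=7002 \
  your_job.py
```

With the driver port pinned like this, you only need to allow inbound traffic on
7001 and 7002 from the cluster's security group to your machine, rather than the
random ephemeral ports Spark would otherwise pick.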