Hi,
The ec2 launch script provided by Spark uses
https://github.com/mesos/spark-ec2 to download and configure all the tools
on the cluster (Spark, Hadoop, etc.), so you can point it at your own git
repository to achieve your goal. More precisely:
1. Build Spark locally and upload the resulting tarball to an S3 location
you control.
2. Fork the spark-ec2 repository and change its init script to download
your tarball from that S3 location instead of the default one, then pass
your fork's URL to the launch script.
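
As a rough sketch, assuming a hypothetical fork at
https://github.com/<your-user>/spark-ec2 whose init script already points
at your uploaded tarball, the launch could then look something like this
(the key pair, identity file, and cluster name are placeholders; the
--spark-ec2-git-repo option tells the launch script which repo to clone):

```shell
# Hypothetical example: key pair, identity file, fork URL, and cluster
# name are all placeholders to adapt to your setup.
./spark-ec2 -k my-keypair -i my-keypair.pem \
    --spark-ec2-git-repo=https://github.com/<your-user>/spark-ec2 \
    launch my-cluster
```

Since the cluster is then bootstrapped from your fork, the master should
fetch your pre-built tarball rather than compiling Spark from source at
launch time.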
Is there a way to use the ec2 launch script with a locally built version of
Spark? I launch and destroy clusters pretty frequently and would rather not
have to wait each time for the master instance to compile the source, as
happens when I set the -v flag to the latest git commit. To be clear,