Hi, Thanks for reporting this. Some relevant changes that address these issues:
- (1) is being fixed by Josh Rosen in https://github.com/mesos/spark-ec2/pull/22
- For (2), Patrick had a change that we discussed before at https://github.com/mesos/spark-ec2/pull/17. I think he is reworking that and will have another PR for this. You can file a bug in JIRA if you want to track this.

Thanks
Shivaram

On Fri, Oct 4, 2013 at 6:00 PM, Shay Seng <[email protected]> wrote:
> Hi,
>
> I've been trying to use the spark-ec2 launch scripts and have some comments
> on them; not sure if this is the best place to post...
>
> (1) On the AMI image, most of the modules' init.sh files have the following
> idiom:
>
>     if [ -d "spark" ]; then
>       echo "Spark seems to be installed. Exiting."
>       exit 0
>     else
>       # Github tag:
>       if [[ "$SPARK_VERSION" == *\|* ]]
>       then
>         ...
>
> I think the "exit 0" is an error: it causes the setup.sh script, which
> sources this init.sh script, to also exit, stopping the bootstrap at that
> point. This is certainly not what is wanted.
>
> You can see this by loading the AMI, installing Spark by hand, and using
> the new AMI as a source for spark-ec2.py: no other modules will be
> installed or set up.
>
> (2) I want to push my AWS secret key pair into the Hadoop core-site.xml
> file at startup. Currently I have to dive into the guts of the code and
> perform surgery. It would be nice to have a more generic way to add
> template variables etc.
>
> Currently I have to modify the deploy_templates.py on the AMI to enable a
> new template param.
>
> Cheers,
> shay
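For readers following along, the `exit`-in-a-sourced-script behavior described in (1) can be demonstrated in isolation. This is a minimal sketch, not the actual spark-ec2 files: the two generated `init_*.sh` scripts here are hypothetical stand-ins for a module's init.sh, and it shows that `exit 0` terminates the sourcing caller while `return 0` only leaves the sourced file.

```shell
#!/bin/bash
# Sketch: why "exit 0" inside a sourced init.sh stops the whole setup
# script, while "return 0" lets the caller continue.
workdir=$(mktemp -d)
cd "$workdir" || exit 1
mkdir spark                       # simulate an already-installed module

cat > init_exit.sh <<'EOF'
if [ -d "spark" ]; then
  echo "Spark seems to be installed."
  exit 0        # terminates whichever shell sourced this file
fi
EOF

cat > init_return.sh <<'EOF'
if [ -d "spark" ]; then
  echo "Spark seems to be installed."
  return 0      # leaves only this sourced file; the caller continues
fi
EOF

# A stand-in for setup.sh: source the module init, then do the next step.
out1=$(bash -c 'source ./init_exit.sh;   echo "other modules set up"')
out2=$(bash -c 'source ./init_return.sh; echo "other modules set up"')
echo "$out1"    # "other modules set up" never appears
echo "$out2"    # both lines appear
```

With `exit 0`, the first caller never reaches its second `echo`, which mirrors how the bootstrap stops after the first already-installed module; with `return 0`, setup continues normally.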
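As for (2), the kind of "generic template variable" mechanism being requested could look roughly like the following. This is purely an illustrative sketch, not the real deploy_templates.py logic: the `{{AWS_ACCESS_KEY_ID}}` placeholder syntax, the property snippet, and the example value are all hypothetical.

```shell
#!/bin/bash
# Sketch of generic template substitution: an environment variable whose
# name appears as {{NAME}} in a template is substituted into the output.
tmpl=$(mktemp)
cat > "$tmpl" <<'EOF'
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>{{AWS_ACCESS_KEY_ID}}</value>
</property>
EOF

AWS_ACCESS_KEY_ID="AKIAEXAMPLE"   # placeholder value, not a real key
rendered=$(sed "s|{{AWS_ACCESS_KEY_ID}}|$AWS_ACCESS_KEY_ID|g" "$tmpl")
echo "$rendered"
```

A scheme like this would let users add new template parameters by exporting a variable and referencing it in the template, without editing the deploy script itself.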
