I'm trying to launch a Spark cluster on AWS EC2 with a custom Ubuntu AMI using
the following command:

./ec2/spark-ec2 --key-pair=*** --identity-file='/home/***.pem'
--region=us-west-2 --zone=us-west-2b --spark-version=1.2.1 --slaves=2
--instance-type=t2.micro --ami=ami-29ebb519 --user=ubuntu launch
spark-ubuntu-cluster

Everything starts out OK and the instances are launched:

Found 1 master(s), 2 slaves
Waiting for all instances in cluster to enter 'ssh-ready' state.
Generating cluster's SSH key on master.

But then I get the following SSH error over and over until the script stops
retrying and quits:

bash: git: command not found
Connection to ***.us-west-2.compute.amazonaws.com closed.
Error executing remote command, retrying after 30 seconds: Command '['ssh',
'-o', 'StrictHostKeyChecking=no', '-i', '/home/***t.pem', '-o',
'UserKnownHostsFile=/dev/null', '-t', '-t',
u'ubuntu@***.us-west-2.compute.amazonaws.com', 'rm -rf spark-ec2 && git
clone https://github.com/mesos/spark-ec2.git -b v4']' returned non-zero exit
status 127
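
For what it's worth, exit status 127 from bash means "command not found", and the
remote command that fails is just trying to "git clone" the mesos/spark-ec2 repo on
the master, so my guess is that this Ubuntu AMI simply doesn't have git installed
(the default spark-ec2 AMIs presumably do, since that step normally works). What I'm
planning to try is roughly the sketch below; whether installing git alone is enough,
or whether other tools are missing too, is just my assumption:

# Install git on each node so the setup script's 'git clone' step can succeed
# (run over ssh as the ubuntu user, or bake it into the AMI before launching)
sudo apt-get update
sudo apt-get install -y git

# Then re-run the launcher against the already-running instances instead of
# starting from scratch (spark-ec2 supports --resume for this)
./ec2/spark-ec2 --key-pair=*** --identity-file='/home/***.pem' \
  --region=us-west-2 --zone=us-west-2b --spark-version=1.2.1 --slaves=2 \
  --instance-type=t2.micro --ami=ami-29ebb519 --user=ubuntu --resume \
  launch spark-ubuntu-cluster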

I know the Spark EC2 scripts aren't guaranteed to work with custom AMIs, but this
still seems like it should work... Any advice would be greatly appreciated!



