Try EMR 4.1.0: it runs Spark 1.5.0 on YARN. Replace subnet-xxx below with the correct subnet ID for your VPC.
$ aws emr create-cluster --name emr41_3 --release-label emr-4.1.0 \
    --instance-groups \
      InstanceCount=1,Name=sparkMaster,InstanceGroupType=MASTER,InstanceType=r3.2xlarge \
      InstanceCount=3,BidPrice=2.99,Name=sparkSlave,InstanceGroupType=CORE,InstanceType=r3.2xlarge \
    --applications Name=Spark \
    --ec2-attributes KeyName=spark,SubnetId=subnet-xxx \
    --region us-east-1 --tags Name=emr41_3 --use-default-roles \
    --configurations file:///tmp/emr41.json

/tmp/emr41.json:

[
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.driver.extraJavaOptions": "-Dfile.encoding=UTF-8",
      "spark.executor.extraJavaOptions": "-Dfile.encoding=UTF-8"
    }
  },
  {
    "Classification": "spark",
    "Properties": {
      "maximizeResourceAllocation": "true"
    }
  },
  {
    "Classification": "spark-log4j",
    "Properties": {
      "log4j.logger.com.amazon": "WARN",
      "log4j.logger.com.amazonaws": "WARN",
      "log4j.logger.amazon.emr": "WARN",
      "log4j.logger.akka": "WARN"
    }
  },
  {
    "Classification": "yarn-site",
    "Properties": {
      "yarn.nodemanager.pmem-check-enabled": "false",
      "yarn.nodemanager.vmem-check-enabled": "false"
    }
  }
]

On Fri, Nov 6, 2015 at 3:30 PM, Emaasit <daniel.emaa...@gmail.com> wrote:
> Hello,
> I followed the instructions for launching Spark 1.5.1 on my AWS EC2 but the
> script is not installing all the folders/files required to initialize
> Spark.
> Since the log message is long, I have created a gist here:
> https://gist.github.com/Emaasit/696145959bbbd989bfe1
>
> Please help. I have been going at this for more than 6 hours now to no
> success.
>
> -----
> Daniel Emaasit,
> Ph.D. Research Assistant
> Transportation Research Center (TRC)
> University of Nevada, Las Vegas
> Las Vegas, NV 89154-4015
> Cell: 615-649-2489
> www.danielemaasit.com
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/spark-ec2-script-doest-not-install-necessary-files-to-launch-spark-tp25311.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
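A typo in the --configurations file only surfaces once cluster provisioning is already underway, so it is worth checking that the JSON parses before running create-cluster. A minimal sketch, assuming you save the configuration above to /tmp/emr41.json and have python3 on the PATH:

```shell
# Write the configuration shown above to /tmp/emr41.json
cat > /tmp/emr41.json <<'EOF'
[
  {"Classification": "spark-defaults",
   "Properties": {"spark.driver.extraJavaOptions": "-Dfile.encoding=UTF-8",
                  "spark.executor.extraJavaOptions": "-Dfile.encoding=UTF-8"}},
  {"Classification": "spark",
   "Properties": {"maximizeResourceAllocation": "true"}},
  {"Classification": "spark-log4j",
   "Properties": {"log4j.logger.com.amazon": "WARN",
                  "log4j.logger.com.amazonaws": "WARN",
                  "log4j.logger.amazon.emr": "WARN",
                  "log4j.logger.akka": "WARN"}},
  {"Classification": "yarn-site",
   "Properties": {"yarn.nodemanager.pmem-check-enabled": "false",
                  "yarn.nodemanager.vmem-check-enabled": "false"}}
]
EOF

# json.tool parses the file and exits non-zero on malformed JSON
python3 -m json.tool /tmp/emr41.json > /dev/null && echo "emr41.json is valid JSON"
```

If the check fails, fix the JSON before invoking aws emr create-cluster; EMR will otherwise surface the error much later in the provisioning logs.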