I'm launching an AWS Spark cluster with the spark-ec2 script, and I'd like to use fixed-duration (Spot block) instances.

According to the AWS documentation:
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/spot-requests.html#fixed-duration-spot-instances

a Spot reservation with a fixed duration can be requested through the AWS CLI
and the web console, but I can't find an option that works with the spark-ec2
script.

My current script launches Spot requests successfully, but the duration
setting has no effect:
./spark-ec2 \
--key-pair=<my_key> \
--identity-file=<my_key_path> \
--instance-type=r3.8xlarge \
-s 2  \
--spot-price=0.75 \
--block-duration-minutes 120 \
launch spark_rstudio_h2o_cluster

I've tried --block-duration-minutes and --spot-block with no success. 
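For reference, here is the kind of request I'm trying to reproduce, written as a boto3 Spot request payload (spark-ec2 uses boto internally, so I assume something equivalent would have to be passed through). The AMI ID, key name, and region below are placeholders, and the actual API call is commented out since it needs AWS credentials:

```python
# Sketch of the parameters boto3's EC2 request_spot_instances call accepts.
# Per the AWS docs, BlockDurationMinutes must be a multiple of 60,
# from 60 up to 360 (1 to 6 hours).
spot_params = {
    "SpotPrice": "0.75",
    "InstanceCount": 2,
    "BlockDurationMinutes": 120,        # fixed-duration (Spot block) request
    "LaunchSpecification": {
        "ImageId": "ami-xxxxxxxx",      # placeholder AMI
        "InstanceType": "r3.8xlarge",
        "KeyName": "my_key",            # placeholder key-pair name
    },
}

# With credentials configured, the actual call would be something like:
# import boto3
# ec2 = boto3.client("ec2", region_name="us-east-1")
# response = ec2.request_spot_instances(**spot_params)

print(spot_params["BlockDurationMinutes"])
```

So the question is really whether spark-ec2 can be made to pass something like `BlockDurationMinutes` through to its underlying Spot request.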

Thanks. 



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Specifying-Fixed-Duration-Spot-Block-for-AWS-Spark-EC2-Cluster-tp27278.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.