Re: is there a master for spark cluster in ec2

2015-02-02 Thread Robin East
There is a file, $SPARK_HOME/conf/spark-env.sh, which comes pre-configured 
with the MASTER variable. So if you start pyspark or spark-shell from the EC2 
login machine, you will connect to the Spark master automatically.
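For reference, the generated file contains something roughly like the sketch 
below (<master-hostname> is a placeholder for your cluster's master public 
DNS, not a value from this thread; 7077 is the default standalone master port):

    # $SPARK_HOME/conf/spark-env.sh (excerpt, as generated by the EC2 scripts)
    export MASTER=spark://<master-hostname>:7077

If you prefer to pass the master explicitly rather than rely on spark-env.sh, 
the equivalent invocation would be:

    ./bin/pyspark --master spark://<master-hostname>:7077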


On 29 Jan 2015, at 01:11, Mohit Singh wrote:

> Hi,
>   Probably a naive question, but I am creating a Spark cluster on EC2 using 
> the ec2 scripts there.
> Is there a master param I need to set?
> ./bin/pyspark --master [ ] ?
> I don't yet fully understand the EC2 concepts, so I just wanted to confirm this.
> Thanks
> 
> -- 
> Mohit
> 
> "When you want success as badly as you want the air, then you will get it. 
> There is no other secret of success."
> -Socrates


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: is there a master for spark cluster in ec2

2015-01-29 Thread Arush Kharbanda
Hi Mohit,

You can set the master instance type with -m (--master-instance-type).

To set up a cluster you need to use the ec2/spark-ec2 script.

You need to create an AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in your
AWS web console under Security Credentials, and export them as environment
variables before running the script. Once you do that you should be able to
set up your cluster using the spark-ec2 options.
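
A minimal sketch (the key pair name, identity file path, instance type, and
cluster name below are placeholders, not values from this thread):

    # Credentials created under Security Credentials in the AWS console
    export AWS_ACCESS_KEY_ID=<your-access-key-id>
    export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>

    # Launch a cluster with 2 slaves and an m3.large master
    ./ec2/spark-ec2 -k <key-pair-name> -i <path/to/key.pem> \
        -s 2 -m m3.large launch my-spark-cluster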

Thanks
Arush



On Thu, Jan 29, 2015 at 6:41 AM, Mohit Singh wrote:

> Hi,
>   Probably a naive question, but I am creating a Spark cluster on EC2
> using the ec2 scripts there.
> Is there a master param I need to set?
> ./bin/pyspark --master [ ] ?
> I don't yet fully understand the EC2 concepts, so I just wanted to confirm
> this.
> Thanks
>
> --
> Mohit
>
> "When you want success as badly as you want the air, then you will get it.
> There is no other secret of success."
> -Socrates
>



-- 


*Arush Kharbanda* || Technical Teamlead

ar...@sigmoidanalytics.com || www.sigmoidanalytics.com