Re: Spark Expand Cluster

2015-12-01 Thread Alexander Pivovarov
…html#spark-dynamic-allocation) you can request Spark Dynamic Resource Allocation as the default configuration at cluster creation. Best regards, Christopher
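
A minimal sketch of the Spark properties behind that suggestion, assuming Spark on YARN (as on EMR) with the external shuffle service available on each node; the app name and executor bounds below are illustrative only:

    import org.apache.spark.{SparkConf, SparkContext}

    // Enable dynamic allocation so the application can grow onto newly
    // added nodes; requires the external shuffle service on each NodeManager.
    val conf = new SparkConf()
      .setAppName("expand-cluster-example")               // illustrative name
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "2")   // illustrative bounds
      .set("spark.dynamicAllocation.maxExecutors", "50")

    val sc = new SparkContext(conf)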

Re: Spark Expand Cluster

2015-11-24 Thread Dinesh Ranganathan
From: Dinesh Ranganathan [mailto:dineshranganat...@gmail.com] Sent: Monday, November 16, 2015 4:57 AM To: Sabarish Sasidharan Cc: user Subject: Re: Spark Expand Cluster. Hi Sab, I did not specify number of executors when I…
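
For context, a sketch of the fixed-size alternative: without dynamic allocation, the executor count is pinned for the lifetime of the application by spark.executor.instances (or --num-executors on spark-submit), so nodes added afterwards do not help a job that is already running. The values below are illustrative only:

    import org.apache.spark.{SparkConf, SparkContext}

    // Fixed executor count: set once at submission and never revisited,
    // regardless of how many nodes are later added to the cluster.
    val conf = new SparkConf()
      .setAppName("fixed-executors-example")   // illustrative name
      .set("spark.executor.instances", "10")   // illustrative count
      .set("spark.executor.cores", "4")        // illustrative sizing
      .set("spark.executor.memory", "8g")

    val sc = new SparkContext(conf)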

RE: Spark Expand Cluster

2015-11-20 Thread Bozeman, Christopher
…Allocation as the default configuration at cluster creation. Best regards, Christopher. From: Dinesh Ranganathan [mailto:dineshranganat...@gmail.com] Sent: Monday, November 16, 2015 4:57 AM To: Sabarish Sasidharan Cc: user Subject: Re: Spark Expand Cluster. Hi Sab, I did not specify number of executors…

Re: Spark Expand Cluster

2015-11-16 Thread Dinesh Ranganathan
…my cluster by adding more Core instances on AWS, I don't see Spark picking up the new instances dynamically. Is there anything I can do to tell Spark to pick up the newly added boxes?? Dan
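
One quick way to check whether executors actually landed on the new boxes is to ask the driver which hosts have registered executors, e.g. from spark-shell (assumes an existing SparkContext named sc; getExecutorMemoryStatus is part of the SparkContext API):

    // Hosts that currently have block managers registered with the driver.
    // Keys of getExecutorMemoryStatus are "host:port" strings; note the
    // driver's own block manager is included in this map.
    val hosts = sc.getExecutorMemoryStatus.keys
      .map(_.split(":").head)
      .toSet

    println(s"Block managers running on ${hosts.size} host(s):")
    hosts.foreach(println)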

Re: Spark Expand Cluster

2015-11-16 Thread Sabarish Sasidharan
…pick up the newly added boxes?? Dan

Spark Expand Cluster

2015-11-16 Thread dineshranganathan
Dan -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Expand-Cluster-tp25393.html Sent from the Apache Spark User List mailing list archive at Nabble.com.