.html#spark-dynamic-allocation)
>> you can request Spark Dynamic Resource Allocation as the default
>> configuration at cluster creation.
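[Editorial sketch, not part of the original mail: on Amazon EMR (release 4.x, current when this thread was written), requesting dynamic allocation "at cluster creation" is done through a configuration classification. A minimal illustration, assuming the standard `spark-defaults` classification; the property names are stock Spark settings:]

```json
[
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.dynamicAllocation.enabled": "true",
      "spark.shuffle.service.enabled": "true"
    }
  }
]
```

[Such a file would be passed at creation time, e.g. `aws emr create-cluster --configurations file://spark-config.json ...`.]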
>>
>>
>>
>> Best regards,
>>
>> Christopher
>>
> *From:* Dinesh Ranganathan [mailto:dineshranganat...@gmail.com]
> *Sent:* Monday, November 16, 2015 4:57 AM
> *To:* Sabarish Sasidharan
> *Cc:* user
> *Subject:* Re: Spark Expand Cluster
>
>
>
> Hi Sab,
>
>
>
> I did not specify number of executors when I expanded my cluster by
> adding more Core instances on AWS, and I don't see Spark picking up the
> new instances dynamically. Is there anything I can do to tell Spark to
> pick up the newly added boxes??
>
> Dan
>
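[Editorial sketch, not part of the original mail: Spark (1.2+) only grows or shrinks the executor count at runtime when dynamic allocation and the external shuffle service are both enabled; otherwise the executor count is fixed at application start and newly added nodes sit idle. The relevant `spark-defaults.conf` entries look roughly like this; the min/max values are illustrative, not recommendations:]

```
spark.dynamicAllocation.enabled          true
spark.shuffle.service.enabled            true
spark.dynamicAllocation.minExecutors     1
spark.dynamicAllocation.maxExecutors     50
```

[The same settings can also be supplied per application via `--conf` on `spark-submit`.]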
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Expand-Cluster-tp25393.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-uns