[
https://issues.apache.org/jira/browse/SPARK-12485?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15106820#comment-15106820
]
Mark Hamstra commented on SPARK-12485:
--------------------------------------
Actually, Sean, I'd argue that they are not the same and that a term like
"elastic scaling" should be reserved for a different concept. What Spark is
doing with dynamic allocation is re-allocating resources from an essentially
fixed pool of cluster resources -- a particular Application may be assigned a
different number of Executors, for example, but the total number of Worker
nodes remains the same. Elastic scaling, on the other hand, I would argue
should refer to dynamically changing the number of Worker nodes or other
cluster resources -- changing the size of the pool itself, not just
re-allocating from a constant-size pool.
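
For concreteness, a minimal sketch (not from this ticket) of how dynamic
allocation is typically turned on for a single application; the application
name and executor bounds below are illustrative. Only the per-application
Executor count varies here -- the Worker nodes backing the cluster stay
whatever size the cluster manager already provides:

    import org.apache.spark.SparkConf

    object DynamicAllocationSketch {
      def main(args: Array[String]): Unit = {
        // Sketch only: dynamic allocation lets this application's Executor
        // count grow and shrink within the bounds below, while the pool of
        // Worker nodes itself is never resized by Spark.
        val conf = new SparkConf()
          .setAppName("dynamic-allocation-sketch")           // hypothetical app name
          .set("spark.dynamicAllocation.enabled", "true")
          .set("spark.dynamicAllocation.minExecutors", "2")  // illustrative floor
          .set("spark.dynamicAllocation.maxExecutors", "20") // illustrative ceiling
          .set("spark.shuffle.service.enabled", "true")      // external shuffle service so idle executors can be released
      }
    }

Growing or shrinking the Worker pool itself -- what I'd call elastic scaling --
would instead happen at the cluster-manager level, outside the scope of these
settings.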
> Rename "dynamic allocation" to "elastic scaling"
> ------------------------------------------------
>
> Key: SPARK-12485
> URL: https://issues.apache.org/jira/browse/SPARK-12485
> Project: Spark
> Issue Type: Sub-task
> Components: Spark Core
> Reporter: Andrew Or
> Assignee: Andrew Or
>
> Fewer syllables, sounds more natural.