GitHub user markgrover commented on the pull request:

    https://github.com/apache/spark/pull/8998#issuecomment-146006957
  
    Pointing out the obvious here, but we should document this new property so 
someone doesn't have to read the source code to figure out the name of the 
property that enables dynamic allocation in streaming.
    
    Also, I personally prefer the name 
`spark.streaming.dynamicAllocation.allowed` over 
`spark.streaming.dynamicAllocation.enabled`, because one will still have to 
enable dynamic allocation for the entire Spark context separately. And as much 
as I dislike the idea of adding yet another property, I don't really see a 
better way to do this: essentially we want to let folks enable dynamic 
allocation for streaming and non-streaming workloads independently. Currently 
we use the same property for both. We could decouple them either by having two 
separate properties, one for streaming and one for the rest, or, alternatively, 
as is being suggested here, by adding a second property that allows/disallows 
dynamic allocation for streaming. The latter seems less hairy, so I am for it.
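
    For concreteness, here is a minimal sketch of how the two switches could 
be combined from an application. It assumes the property keeps the 
`spark.streaming.dynamicAllocation.enabled` name proposed in this PR and gates 
on top of the existing core `spark.dynamicAllocation.enabled` setting; the 
exact name and semantics are still under discussion here.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Core dynamic allocation for the SparkContext as a whole; the external
// shuffle service is required so executors can be removed safely.
val conf = new SparkConf()
  .setAppName("StreamingDynamicAllocationSketch")
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.shuffle.service.enabled", "true")
  // The second, streaming-specific switch being discussed in this PR
  // (name and behavior are assumptions, not final).
  .set("spark.streaming.dynamicAllocation.enabled", "true")

val ssc = new StreamingContext(conf, Seconds(10))
```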

