[ https://issues.apache.org/jira/browse/SPARK-20472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15997874#comment-15997874 ]
Shahbaz Hussain commented on SPARK-20472:
-----------------------------------------
Yes, the idea is to have a way by which we can persist configuration in
memory, for example the batch interval, SQL shuffle partitions, etc.; these
are primarily Spark-specific configurations.
JVM configuration is global and cannot be changed, so this request is not for
dynamic configuration of the JVM but for Spark-application-specific settings.
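As a minimal sketch of that distinction (assuming a running SparkSession on Spark 2.x; the object and app names below are illustrative), SQL-level settings such as spark.sql.shuffle.partitions can be set on the live session via spark.conf.set, whereas JVM-level options are fixed when the application is launched:

{code:scala}
import org.apache.spark.sql.SparkSession

object RuntimeConfExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("runtime-conf-example") // illustrative app name
      .getOrCreate()

    // Spark-application-specific SQL settings can be changed on the live session.
    spark.conf.set("spark.sql.shuffle.partitions", "50")

    // JVM-level settings (heap size, GC flags, executor memory, ...) are fixed
    // when the JVM is launched and are out of scope for this request.
    spark.stop()
  }
}
{code}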
> Support for Dynamic Configuration
> ---------------------------------
>
> Key: SPARK-20472
> URL: https://issues.apache.org/jira/browse/SPARK-20472
> Project: Spark
> Issue Type: Bug
> Components: Spark Submit
> Affects Versions: 2.1.0
> Reporter: Shahbaz Hussain
>
> Currently Spark configuration cannot be changed dynamically.
> The Spark job has to be killed and started again for a new configuration to
> take effect.
> This issue is to enhance Spark so that configuration changes can be applied
> dynamically without requiring an application restart.
> Ex: If the batch interval in a streaming job is 20 seconds and the user wants to
> reduce it to 5 seconds, this currently requires a re-submit of the job.
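To make the quoted example concrete, here is a minimal sketch (assuming Spark Streaming on Spark 2.x; the object name and the localhost:9999 source are illustrative) of why the batch interval cannot be changed in place: it is bound to the StreamingContext at construction time, so going from 20 seconds to 5 seconds means stopping the job and submitting it again with a new context.

{code:scala}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object BatchIntervalExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("batch-interval-example")

    // The batch interval is fixed when the StreamingContext is constructed.
    val ssc = new StreamingContext(conf, Seconds(20))

    // A trivial stream and output operation so the context can start.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()

    ssc.start()
    ssc.awaitTermination()

    // Reducing the interval to Seconds(5) requires stopping this context and
    // re-submitting the job with a new StreamingContext.
  }
}
{code}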