[ https://issues.apache.org/jira/browse/SPARK-20472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15984461#comment-15984461 ]

Sean Owen commented on SPARK-20472:
-----------------------------------

I don't think this is generally possible, because some configuration is global: it is needed and takes effect at startup, and cannot be changed even if you wanted to (think: JVM heap size). I doubt this is achievable as stated. You will probably have to narrow it down much further.

> Support for Dynamic Configuration
> ---------------------------------
>
>                 Key: SPARK-20472
>                 URL: https://issues.apache.org/jira/browse/SPARK-20472
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.1.0
>            Reporter: Shahbaz Hussain
>
> Currently Spark configuration cannot be changed dynamically.
> A Spark job must be killed and started again for a new configuration to
> take effect.
> This issue is to enhance Spark so that configuration changes can be
> applied dynamically, without requiring an application restart.
> Ex: If the batch interval in a streaming job is 20 seconds and the user wants to
> reduce it to 5 seconds, this currently requires a re-submit of the job.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
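
A minimal sketch of the batch-interval example from the description, assuming the Spark 2.1 DStream API (the application name and the socket source host/port are hypothetical, chosen only for illustration). It shows why a re-submit is needed: the interval is bound when the StreamingContext is constructed and cannot be changed on a running context.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object IntervalExample {
  def main(args: Array[String]): Unit = {
    // Hypothetical app name; configuration is read once at startup.
    val conf = new SparkConf().setAppName("interval-example")

    // The batch interval is fixed here, at construction time. Reducing it
    // from 20 seconds to 5 seconds means stopping this context and
    // submitting a new application with a different value.
    val ssc = new StreamingContext(conf, Seconds(20))

    // Hypothetical source, for illustration only.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This is exactly the restriction the issue asks to relax; as the comment notes, settings like JVM heap size are consumed even earlier, at process launch, so no in-process mechanism could change them.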