Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5929#discussion_r29729030
  
    --- Diff: streaming/src/main/scala/org/apache/spark/streaming/StreamingContext.scala ---
    @@ -563,13 +563,22 @@ class StreamingContext private[streaming] (
     
       /**
        * Stop the execution of the streams immediately (does not wait for all received data
    +   * to be processed). The underlying SparkContext will also be stopped. Note that this can
    +   * be configured using the SparkConf configuration spark.streaming.stopSparkContextByDefault.
    +   */
    +  def stop(): Unit = synchronized {
    +    stop(conf.getBoolean("spark.streaming.stopSparkContextByDefault", true), false)
    +  }
    +
    +  /**
    +   * Stop the execution of the streams immediately (does not wait for all received data
        * to be processed).
        *
        * @param stopSparkContext if true, stops the associated SparkContext. The underlying SparkContext
        *                         will be stopped regardless of whether this StreamingContext has been
        *                         started.
        */
    -  def stop(stopSparkContext: Boolean = true): Unit = synchronized {
    +  def stop(stopSparkContext: Boolean): Unit = synchronized {
    --- End diff --
    
    I thought it wasn't, but it seems to be. I will implement it differently.
    On May 5, 2015 6:16 PM, "Hari Shreedharan" <[email protected]> wrote:
    
    > In
    > streaming/src/main/scala/org/apache/spark/streaming/StreamingContext.scala
    > <https://github.com/apache/spark/pull/5929#discussion_r29728863>:
    >
    > >     * to be processed).
    > >     *
    > >     * @param stopSparkContext if true, stops the associated SparkContext. The underlying SparkContext
    > >     *                         will be stopped regardless of whether this StreamingContext has been
    > >     *                         started.
    > >     */
    > > -  def stop(stopSparkContext: Boolean = true): Unit = synchronized {
    > > +  def stop(stopSparkContext: Boolean): Unit = synchronized {
    >
    > Hmm, this is actually an API-incompatible change, no? I am wondering if
    > there is a workaround here which does not break compatibility.
    >
    > —
    > Reply to this email directly or view it on GitHub
    > <https://github.com/apache/spark/pull/5929/files#r29728863>.
    >
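
    For context, the pattern in the diff above replaces one method with a
    default argument by a pair of overloads: a no-arg `stop()` that reads a
    config flag and delegates to an explicit `stop(Boolean)`. A minimal
    sketch of that overload pattern follows; `OverloadDemo`, the config map,
    and the status strings are illustrative stand-ins, not Spark's actual
    code:

    ```scala
    // Hedged sketch (not Spark's real classes): a no-arg overload that reads
    // a configured default and delegates to the explicit Boolean variant.
    object OverloadDemo {
      // Stand-in for SparkConf; "true" mimics stopping the SparkContext by default.
      private val conf = Map("spark.streaming.stopSparkContextByDefault" -> "true")

      // No-arg overload: keeps source calls like stop() compiling after the
      // default argument is removed from the Boolean variant.
      def stop(): String =
        stop(conf.getOrElse("spark.streaming.stopSparkContextByDefault", "true").toBoolean)

      // Explicit variant, now without `= true`; returns a status string here
      // purely so the sketch has an observable result.
      def stop(stopSparkContext: Boolean): String =
        if (stopSparkContext) "stopped streaming and spark"
        else "stopped streaming only"
    }
    ```

    Note that even with the overload, code compiled against the old API can
    still break at the binary level: Scala compiles a default argument to a
    synthetic `stop$default$1` accessor, and removing the default removes
    that accessor, which appears to be the compatibility concern raised in
    the quoted message.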


