[ https://issues.apache.org/jira/browse/SPARK-6077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14342704#comment-14342704 ]

Nicholas Chammas commented on SPARK-6077:
-----------------------------------------

Please disregard the comments on SPARK-2463 and focus on its description. The
comments there veer off into an issue separate from the one put forward in the
description.

> Multiple spark streaming tabs on UI when reuse the same sparkcontext
> --------------------------------------------------------------------
>
>                 Key: SPARK-6077
>                 URL: https://issues.apache.org/jira/browse/SPARK-6077
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming, Web UI
>            Reporter: zhichao-li
>            Priority: Minor
>
> Currently we create a new streaming tab for each StreamingContext even if
> the same SparkContext already has one. This produces duplicate StreamingTab
> instances on the web UI, none of which works correctly.
> snapshot: 
> https://www.dropbox.com/s/t4gd6hqyqo0nivz/bad%20multiple%20streamings.png?dl=0
> How to reproduce:
> 1)
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
> import org.apache.spark.storage.StorageLevel
> val ssc = new StreamingContext(sc, Seconds(1))
> val lines = ssc.socketTextStream("localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER)
> val words = lines.flatMap(_.split(" "))
> val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
> wordCounts.print()
> ssc.start()
> .....
> 2)
> ssc.stop(stopSparkContext = false)  // stop the StreamingContext, keep the SparkContext
> val ssc = new StreamingContext(sc, Seconds(1))
> val lines = ssc.socketTextStream("localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER)
> val words = lines.flatMap(_.split(" "))
> val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
> wordCounts.print()
> ssc.start()
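The duplicate-tab condition above can be modeled without Spark. The sketch below is illustrative only: the names (UiModel, attachAlways, attachGuarded) are hypothetical stand-ins for StreamingTab attaching itself to the shared SparkContext's web UI, contrasting the current unguarded attach with a guarded one.

```scala
// Hypothetical model of the bug: each StreamingContext attaches a
// streaming tab to the single SparkContext's UI without checking
// whether one is already present. Not Spark's actual internals.
object UiModel {
  // Tabs currently attached to the (shared) web UI.
  private var tabs = List.empty[String]

  // Current behavior: always attach a new streaming tab.
  def attachAlways(): Unit = tabs = tabs :+ "streaming"

  // Guarded behavior: attach only if no streaming tab exists yet.
  def attachGuarded(): Unit =
    if (!tabs.contains("streaming")) tabs = tabs :+ "streaming"

  def streamingTabCount: Int = tabs.count(_ == "streaming")
  def reset(): Unit = tabs = List.empty
}

object Repro extends App {
  // Two StreamingContexts created against the same SparkContext:
  UiModel.attachAlways(); UiModel.attachAlways()
  println(UiModel.streamingTabCount)  // 2: duplicate tabs, as in the report
  UiModel.reset()
  UiModel.attachGuarded(); UiModel.attachGuarded()
  println(UiModel.streamingTabCount)  // 1: a single tab survives
}
```

With the unguarded attach, the second StreamingContext in the repro steps yields a second tab; a guard keyed on the existing UI would leave only one.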



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
