If local[2] is expected, then the streaming doc is actually misleading? 

as the given example is 

import org.apache.spark.api.java.function._
import org.apache.spark.streaming._
import org.apache.spark.streaming.api._
// Create a StreamingContext with a local master
val ssc = new StreamingContext("local", "NetworkWordCount", Seconds(1))
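
For comparison, the fix discussed in this thread amounts to giving the local master at least two threads, since a streaming receiver occupies one of them (this mirrors the quoted doc example with only the master string changed):

```scala
import org.apache.spark.streaming._

// "local[2]" gives the local mode two threads: one is taken by the
// network receiver, the other actually processes the batches.
// With plain "local" the single thread is consumed by the receiver,
// so no output is ever produced.
val ssc = new StreamingContext("local[2]", "NetworkWordCount", Seconds(1))
```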

http://spark.apache.org/docs/latest/streaming-programming-guide.html

I created a JIRA and a PR 

https://github.com/apache/spark/pull/924 

-- 
Nan Zhu


On Friday, May 30, 2014 at 1:53 PM, Patrick Wendell wrote:

> Yeah - Spark streaming needs at least two threads to run. I actually
> thought we warned the user if they only use one (@tdas?) but the
> warning might not be working correctly - or I'm misremembering.
> 
> On Fri, May 30, 2014 at 6:38 AM, Sean Owen <so...@cloudera.com 
> (mailto:so...@cloudera.com)> wrote:
> > Thanks Nan, that does appear to fix it. I was using "local". Can
> > anyone say whether that's to be expected or whether it could be a bug
> > somewhere?
> > 
> > On Fri, May 30, 2014 at 2:42 PM, Nan Zhu <zhunanmcg...@gmail.com 
> > (mailto:zhunanmcg...@gmail.com)> wrote:
> > > Hi, Sean
> > > 
> > > I was in the same problem
> > > 
> > > but when I changed MASTER="local" to MASTER="local[2]"
> > > 
> > > everything went back to normal
> > > 
> > > Hadn't gotten a chance to ask about it here
> > > 
> > > Best,
> > > 
> > > --
> > > Nan Zhu
> > > 
> > 
> > 