View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-streaming-and-the-spark-shell-tp3347p19296.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I'm working with Spark Streaming from the spark-shell, and I'm hoping folks can
answer a few questions I have.
I'm doing WordCount on a socket stream:
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._

// spark-shell already provides `sc`; use a 1-second batch interval
val ssc = new StreamingContext(sc, Seconds(1))
val lines = ssc.socketTextStream("localhost", 9999)
val wordCounts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
wordCounts.print()
ssc.start()
2. I notice that once I call ssc.start(), my stream starts processing and
continues indefinitely, even if I close the socket on the server end (I'm
using the unix command nc to mimic a server, as explained in the streaming
programming guide). Can I tell my stream to detect that it has lost its
connection?
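One way to get notified when the source goes away is a StreamingListener registered on the context. The sketch below is illustrative, not from the original thread, and the exact callbacks available depend on your Spark version; the hostname/port match the WordCount example above.

```scala
import org.apache.spark.streaming.scheduler._

// Hedged sketch: log receiver errors/stops so the driver can react
// (e.g. by calling ssc.stop()) when the socket server disappears.
ssc.addStreamingListener(new StreamingListener {
  override def onReceiverError(e: StreamingListenerReceiverError): Unit =
    println(s"Receiver error: ${e.receiverInfo.lastErrorMessage}")
  override def onReceiverStopped(s: StreamingListenerReceiverStopped): Unit =
    println("Receiver stopped; the socket may have been closed.")
})
```

Register the listener before ssc.start(); note that the socket receiver may simply keep retrying the connection rather than stopping, so the error callback is usually the one that fires.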
This looks like a Spark worker configuration problem. Either the worker has
not been given enough memory, or the fraction of memory allocated to RDD
storage needs to be adjusted. If configured correctly, the Spark workers
should not hit OOMs.
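Concretely, the two knobs the reply mentions map to settings like the following (the values are placeholders to tune for your cluster, and spark.storage.memoryFraction applies to the pre-unified memory model of that era):

```
# spark-defaults.conf (illustrative values only)
spark.executor.memory         4g     # heap given to each executor/worker JVM
spark.storage.memoryFraction  0.5    # share of the heap reserved for cached RDDs
```

Lowering the storage fraction leaves more headroom for task execution, which is often what prevents the OOMs described above.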
On Thu, Mar 27, 2014 at 2:52 PM, Evgeny