All the examples of Spark Streaming programming that I have seen assume streams of lines 
that are then tokenised and acted upon (like the WordCount example).
How do I process streams whose records span multiple lines? Are there examples that I 
can use? 
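To make the question concrete, here is a minimal sketch (plain Python rather than Spark code, and the record-start pattern is an assumption) of the kind of reassembly logic needed when one logical record, such as a stack trace in a log, spans several input lines:

```python
import re

# Assumption: each record begins with a timestamped line like
# "2024-01-01 12:00:00 ...", and subsequent lines are continuations.
RECORD_START = re.compile(r"^\d{4}-\d{2}-\d{2} ")

def reassemble(lines):
    """Group a stream of individual lines into multi-line records.

    A new record starts on each line matching RECORD_START;
    other lines are appended to the current record.
    """
    record = []
    for line in lines:
        if RECORD_START.match(line) and record:
            yield "\n".join(record)
            record = []
        record.append(line)
    if record:
        yield "\n".join(record)

lines = [
    "2024-01-01 12:00:00 ERROR something failed",
    "  at Foo.bar(Foo.scala:10)",
    "2024-01-01 12:00:01 INFO ok",
]
print(list(reassemble(lines)))
```

In a line-oriented streaming source the difficulty is that a record's continuation lines may arrive in a later batch than its first line, so plain per-line operators are not enough; some stateful grouping like the above has to carry the partial record across batch boundaries.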