There is a Spark Streaming example of the classic word count that uses the
Apache Kafka connector:

https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaKafkaWordCount.java

(maybe you already know it)
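
For reference, the gist of that word count in Scala is roughly the sketch
below. It is only a sketch against the Kafka 0.10 direct-stream integration,
not the exact code of the linked Java example, and the broker address, topic
name and group id are placeholders:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object KafkaWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaWordCount")
    val ssc = new StreamingContext(conf, Seconds(2))

    // Placeholder connection settings -- adjust to your own cluster.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "wordcount-example",
      "auto.offset.reset"  -> "latest"
    )

    // One DStream of ConsumerRecord[String, String] per batch interval.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("words"), kafkaParams)
    )

    // Classic word count over the message values.
    val counts = stream
      .flatMap(_.value.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}

To run it you would need the matching spark-streaming-kafka-0-10 artifact on
the classpath (e.g. via --packages with spark-submit).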

The point is: what are the benefits of using Kafka instead of a lighter
solution like yours? Maybe somebody here can help us. In any case, when I try
it out I'll give you feedback.

On the other hand, do you by any chance have the same script written in
Scala, Python, or Java?




