Hi, I'm running a job to collect analytics on Spark jobs by analyzing
their event logs. We write the event logs to a single HDFS folder and then
pick them up in another job. I'd like to differentiate between regular Spark
jobs and Spark Streaming jobs in the event logs. Is there an
event, property, or key that differs between the two?
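For context, here's roughly how we scan the logs today. This is just a minimal sketch of the heuristic I've been considering, assuming the event logs are JSON lines and that a streaming app would show some `spark.streaming.*` key in the `SparkListenerEnvironmentUpdate` event's Spark Properties; I'm not sure that's a reliable signal, which is why I'm asking:

```python
import json

def looks_like_streaming_app(event_log_path):
    # Scan a Spark event log (one JSON object per line) and guess whether
    # the app was a streaming job. Heuristic (assumption on my part):
    # streaming apps set at least one spark.streaming.* property, which
    # shows up in the SparkListenerEnvironmentUpdate event.
    with open(event_log_path) as f:
        for line in f:
            event = json.loads(line)
            if event.get("Event") == "SparkListenerEnvironmentUpdate":
                props = event.get("Spark Properties", {})
                return any(k.startswith("spark.streaming.")
                           for k in props)
    # No environment-update event found; can't tell.
    return False
```

If there's a more direct marker (a dedicated event type, or a property that's always present for streaming apps), that would be much better than this guess.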

Thanks!

Franklyn



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Differentiate-Spark-streaming-in-event-logs-tp25126.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
