Use Spark job server https://github.com/spark-jobserver/spark-jobserver
Additional:
1. You can also write your own job server with spray (a Scala REST
framework).
2. Create a Thrift server and pass the state of each job (as a Thrift
object) between your different jobs.
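For option 2, a minimal Thrift IDL for such a job-state object could look like the following. All names and fields here are illustrative placeholders, not something from an existing project:

```thrift
// Hypothetical Thrift definition for passing job state between jobs.
namespace java com.example.jobs

enum JobStatus {
  RUNNING = 1,
  SUCCEEDED = 2,
  FAILED = 3
}

struct JobState {
  1: required string jobId,
  2: required JobStatus status,
  3: optional string outputPath,     // where the job wrote its results
  4: optional i64 finishedAtMillis   // completion time, epoch millis
}
```

A downstream job can then read the `outputPath` of the upstream job's `JobState` to find its input.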
Flume could be interesting for you.
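On the Oozie side, a sketch of a workflow chaining a Spark job with a downstream batch job might look like this (all app names, classes, and jar paths are placeholders I made up for illustration). One caveat: a long-running Streaming job never terminates on its own, so in practice the streaming application is usually launched separately and only the batch steps are chained in the workflow:

```xml
<!-- Hypothetical Oozie workflow; names, classes and paths are placeholders. -->
<workflow-app name="spark-batch-pipeline" xmlns="uri:oozie:workflow:0.5">
    <start to="batch-step-1"/>

    <!-- First batch job: processes the data written by the Streaming job. -->
    <action name="batch-step-1">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn-cluster</master>
            <name>BatchStepOne</name>
            <class>com.example.BatchStepOne</class>
            <jar>${nameNode}/apps/batch-step-one.jar</jar>
        </spark>
        <ok to="batch-step-2"/>
        <error to="fail"/>
    </action>

    <!-- Second batch job: consumes the output of the first. -->
    <action name="batch-step-2">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn-cluster</master>
            <name>BatchStepTwo</name>
            <class>com.example.BatchStepTwo</class>
            <jar>${nameNode}/apps/batch-step-two.jar</jar>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Workflow failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

An Oozie coordinator can then trigger this workflow whenever the Streaming job has landed new data in the input directory.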
> On 19 Dec 2015, at 00:27, SRK wrote:
>
> Hi,
>
> How do we run multiple Spark jobs that take Spark Streaming data as the
> input, as a workflow in Oozie? We have to run our Streaming job first and
> then have a workflow of Spark Batch jobs to process the data.
> Any suggestions on this would be of great help.
> Thanks!