Thanks Jean, this will be quite useful. I am wondering whether this will also
require a new partitioning construct in the feed, such as micro-batches.
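For concreteness, a rough sketch of what such a feed might look like, built on
Falcon's existing feed XML. The catalog-table binding is real Falcon syntax;
the entity name, values, and the idea of a finer-grained micro-batch frequency
are hypothetical, just to illustrate where the new construct could plug in:

```xml
<!-- Hypothetical sketch: the <table> catalog URI is existing Falcon syntax;
     the feed name, values, and any micro-batch semantics are illustrative. -->
<feed name="clicks-stream" description="schema-aware streaming feed"
      xmlns="uri:falcon:feed:0.1">
  <!-- Falcon frequencies today are minutes/hours/days/months; a streaming
       feed would presumably need a micro-batch-sized interval here. -->
  <frequency>minutes(5)</frequency>
  <clusters>
    <cluster name="primary" type="source">
      <validity start="2015-02-01T00:00Z" end="2099-12-31T00:00Z"/>
      <retention limit="days(7)" action="delete"/>
    </cluster>
  </clusters>
  <!-- Schema-aware storage: the feed is bound to an HCatalog table, so the
       schema lives in the catalog service rather than in the feed itself. -->
  <table uri="catalog:default:clicks#ds=${YEAR}-${MONTH}-${DAY}"/>
  <ACL owner="falcon" group="users" permission="0755"/>
</feed>
```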

Sharad

On Wed, Feb 11, 2015 at 2:34 PM, Jean-Baptiste Onofré <[email protected]>
wrote:

> Hi Sharad,
>
> I sent an e-mail last week about supporting Spark (Spark Streaming) in
> workflows/processes. It's basically very close to what you propose.
>
> IMHO, it should be a new implementation of the workflow engine, or at least
> support for a new kind of process (that's what I have in mind).
>
> Regards
> JB
>
>
> On 02/11/2015 09:38 AM, Sharad Agarwal wrote:
>
>> I am looking for a generic schema-aware feed construct for streaming
>> workflows. The schema can be managed by a catalog service like HCatalog.
>> The streaming workflow executor would be a system like
>> Storm/Spark Streaming/Samza.
>>
>> I want to know if this is the right thing to support in Falcon and, if so,
>> what the plugin interface for it would be. Would this be a new
>> implementation of the workflow engine?
>>
>> Thanks
>> Sharad
>>
>>
> --
> Jean-Baptiste Onofré
> [email protected]
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>