Hi Srinath,

Yeah, the data is always cleaned up when the task is run. Basically, the
task reads all the data in the column family, sends each event to the target
stream, and at the same time deletes it from the data store.
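
To make that cycle concrete, here is a minimal sketch of the read / forward /
delete loop. This is illustrative only: a plain dict stands in for the
Cassandra column family, and the function and stream names are made up, not
the actual BAM implementation.

```python
# Sketch of the notification task's cycle: read every row from the
# "bam_notification_messages" CF, forward each row as an event to its
# target stream, and delete the row so it is never processed twice.
# A plain dict stands in for the Cassandra column family here.

def run_notification_task(column_family, send_event):
    """Forward every pending row to its stream, then purge it."""
    for row_key in list(column_family.keys()):
        columns = column_family[row_key]
        stream_id = columns["streamId"]
        # Every column except streamId maps to the event payload.
        payload = {k: v for k, v in columns.items() if k != "streamId"}
        send_event(stream_id, payload)
        del column_family[row_key]  # clean up in the same pass

# Example run with two pending notification rows.
cf = {
    "row1": {"streamId": "alerts:1.0", "message": "job finished"},
    "row2": {"streamId": "alerts:1.0", "message": "threshold hit"},
}
sent = []
run_notification_task(cf, lambda sid, p: sent.append((sid, p)))
```

After the run, both events have been forwarded and the store is empty, which
is why re-running the task is harmless.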

The notification is fully customizable by the user. What this mechanism does
is simply send arbitrary data to a stream, at the point where some
"insert" statement is executed from the Hive script, which can be at the
end of the script or anywhere else. After the event arrives at a stream, the
user can do anything with it: either run a CEP query against it, or pass it
through directly to some transport like email or SMS.
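
For the CEP side, the query can be as simple as a filter on the incoming
stream. A rough Siddhi-style sketch (the stream and attribute names here are
made up for illustration, not part of the feature):

```
from bam_notifications[status == 'FAILED']
select jobName, status
insert into EmailAlertStream;
```

Anything matching the condition lands in the output stream, where an output
adapter (email, SMS, etc.) can pick it up.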

Cheers,
Anjana.


On Tue, Oct 22, 2013 at 2:30 PM, Srinath Perera <[email protected]> wrote:

> Hi Anjana,
>
> Basically, we are polling the Cassandra location. I think it is OK. But we
> need to make sure we clean up these tasks when we detect that the job has
> finished.
>
> What does the notification say? Does it say the job has finished, or can the
> user give a condition for when to send the notification? We eventually need that.
>
> --Srinath
>
>
> On Tue, Oct 22, 2013 at 2:03 PM, Anjana Fernando <[email protected]> wrote:
>
>> Hi,
>>
>> For BAM notifications, the approach we have at the moment is to use CEP,
>> which we now ship by default with BAM. But there is another limitation,
>> where we cannot trigger any notifications from Hive scripts, which is what
>> is mostly used.
>>
>> So the requirement is that, somehow, we should be able to send messages from
>> Hive to a stream to send out notifications; that is, when messages come to
>> a stream, we can use CEP's message builders/formatters to send out
>> email/SMS etc. So I've implemented a simple mechanism to do this: when
>> Hive wants to send a message to a stream, it writes a data row
>> to a pre-defined Cassandra CF ("bam_notification_messages"), which has
>> a column named "streamId", plus other columns that map to the
>> "payload" section of the stream. Then, in the BAM server, a
>> scheduled task polls the data in that CF (at 5 second
>> intervals) to get the existing rows, reads the streamId and other
>> columns to generate an event to send to the target stream, and deletes
>> the processed rows. So with this approach, effectively, we can now send
>> events to a specific stream from Hive.
>>
>> I've tested this feature in BAM, and I hope this approach is fine for the
>> requirement.
>>
>> Cheers,
>> Anjana.
>>
>> --
>> *Anjana Fernando*
>> Technical Lead
>> WSO2 Inc. | http://wso2.com
>> lean . enterprise . middleware
>>
>
>
>
> --
> ============================
> Srinath Perera, Ph.D.
>   Director, Research, WSO2 Inc.
>   Visiting Faculty, University of Moratuwa
>   Member, Apache Software Foundation
>   Research Scientist, Lanka Software Foundation
>   Blog: http://srinathsview.blogspot.com/
>   Photos: http://www.flickr.com/photos/hemapani/
>    Phone: 0772360902
>



-- 
*Anjana Fernando*
Technical Lead
WSO2 Inc. | http://wso2.com
lean . enterprise . middleware
_______________________________________________
Architecture mailing list
[email protected]
https://mail.wso2.org/cgi-bin/mailman/listinfo/architecture
