Hi all,
I wrote a simple Drools rule and I'm trying to fire it, but when I call
fireAllRules nothing happens and no exception is thrown. Do I need to set up
any configuration?
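
For reference, this is roughly what I am doing (Transaction is just a
placeholder for my fact class, and the session name is assumed to come from a
kmodule.xml on the classpath):

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class RuleSmokeTest {
    public static void main(String[] args) {
        KieServices ks = KieServices.Factory.get();
        // picks up the kmodule.xml / .drl files on the classpath
        KieContainer container = ks.getKieClasspathContainer();
        KieSession session = container.newKieSession("ksession-rules");
        try {
            // rules only match facts that are in working memory,
            // so a missing insert() makes fireAllRules() do nothing
            session.insert(new Transaction("user-1", "SELL", 100.0));
            int fired = session.fireAllRules();
            System.out.println("Rules fired: " + fired); // 0 = no rule matched
        } finally {
            session.dispose();
        }
    }
}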

Thanks

A G

2015-05-22 12:22 GMT+02:00 Dibyendu Bhattacharya <
[email protected]>:

> Hi,
>
> Some time back I played with distributed rule processing by integrating
> Drools with HBase co-processors and invoking rules on any incoming data.
>
> https://github.com/dibbhatt/hbase-rule-engine
>
> You can get an idea of how to use Drools rules if you look at this
> RegionObserver coprocessor:
>
>
> https://github.com/dibbhatt/hbase-rule-engine/blob/master/src/main/java/hbase/rule/HBaseDroolObserver.java
>
>
> The idea is basically to create a stateless rule engine from the .drl file
> and fire the rules on the incoming data.
>
> Even though the code invokes rules on an HBase Put object, you can get the
> idea from it and modify it for Spark.
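>
> A stripped-down version of that pattern looks roughly like this (just a
> sketch; the class name, the resource name and the fact type are
> illustrative, not copied from the repo):
>
> import org.kie.api.KieBase;
> import org.kie.api.io.ResourceType;
> import org.kie.api.runtime.StatelessKieSession;
> import org.kie.internal.io.ResourceFactory;
> import org.kie.internal.utils.KieHelper;
>
> public class StatelessRuleEngine {
>     private final KieBase kieBase;
>
>     public StatelessRuleEngine(String drlOnClasspath) {
>         // compile the .drl once, up front
>         KieHelper helper = new KieHelper();
>         helper.addResource(
>                 ResourceFactory.newClassPathResource(drlOnClasspath), ResourceType.DRL);
>         this.kieBase = helper.build();
>     }
>
>     public void fireOn(Object fact) {
>         // a stateless session inserts the fact, fires all matching rules
>         // and cleans up in a single execute() call
>         StatelessKieSession session = kieBase.newStatelessKieSession();
>         session.execute(fact);
>     }
> }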
>
> Dibyendu
>
>
>
> On Fri, May 22, 2015 at 3:49 PM, Evo Eftimov <[email protected]>
> wrote:
>
>> I am not aware of existing examples but you can always “ask” Google
>>
>>
>>
>> Basically, from a Spark Streaming perspective Drools is a third-party
>> software library: you would invoke it in the same way as any other
>> third-party library from the tasks (maps, filters, etc.) within your
>> DAG job.
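>>
>> For example, something along these lines (RuleService is a hypothetical
>> helper wrapping the Drools session; the stream element type is assumed to
>> be String):
>>
>> JavaDStream<String> events = ...;   // whatever your receiver produces
>> // the rule evaluation runs inside the task on the executor,
>> // exactly like any other third-party library call
>> JavaDStream<String> flagged = events.filter(event -> RuleService.matchesAnyRule(event));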
>>
>>
>>
>> *From:* Antonio Giambanco [mailto:[email protected]]
>> *Sent:* Friday, May 22, 2015 11:07 AM
>> *To:* Evo Eftimov
>> *Cc:* [email protected]
>> *Subject:* Re: Spark Streaming and Drools
>>
>>
>>
>> Thanks a lot Evo,
>>
>> do you know where I can find some examples?
>>
>> Have a great one
>>
>>
>> A G
>>
>>
>>
>> 2015-05-22 12:00 GMT+02:00 Evo Eftimov <[email protected]>:
>>
>> You can deploy and invoke Drools as a Singleton on every Spark Worker
>> Node / Executor / Worker JVM
>>
>>
>>
>> You can invoke it from e.g. map, filter, etc. and use the result from the
>> rule to decide how to transform or filter an event/message.
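>>
>> A rough sketch of the per-JVM singleton idea (class and file names are
>> illustrative, not an existing API):
>>
>> import org.kie.api.KieBase;
>> import org.kie.api.io.ResourceType;
>> import org.kie.internal.io.ResourceFactory;
>> import org.kie.internal.utils.KieHelper;
>>
>> public final class RuleEngineHolder {
>>     // static field => at most one compiled KieBase per executor JVM
>>     private static KieBase kieBase;
>>
>>     public static synchronized KieBase get() {
>>         if (kieBase == null) {
>>             // built lazily the first time a task on this executor needs it
>>             KieHelper helper = new KieHelper();
>>             helper.addResource(
>>                     ResourceFactory.newClassPathResource("rules.drl"), ResourceType.DRL);
>>             kieBase = helper.build();
>>         }
>>         return kieBase;
>>     }
>> }
>>
>> Inside a map/filter you would then call something like
>> RuleEngineHolder.get().newStatelessKieSession().execute(event).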
>>
>>
>>
>> *From:* Antonio Giambanco [mailto:[email protected]]
>> *Sent:* Friday, May 22, 2015 9:43 AM
>> *To:* [email protected]
>> *Subject:* Spark Streaming and Drools
>>
>>
>>
>> Hi All,
>>
>> I'm deploying an architecture that uses Flume to send log
>> information to a sink.
>>
>> Spark Streaming reads from this sink (pull strategy) and processes all this
>> information; during this processing I would like to do some event
>> processing. For example:
>>
>> A log appender writes information about all transactions on my trading
>> platforms;
>>
>> if a platform user sells more than they buy during a week, I need to
>> receive an alert on an event dashboard.
>>
>> How can I implement this? Is it possible with Drools?
>>
>> Thanks so much
>>
>>
>>
>
>
