+1 for the idea, and looks very feasible!

Maybe we need to decide on a voting criterion, if we don't already have
one, such as whether to assign equal weights to all the classifiers, or
to assign weights based on their accuracy at the validation phase, etc.
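The two schemes could look something like this (a rough Python sketch; the labels and accuracy weights below are made-up illustrations, not WSO2 ML output):

```python
# Sketch of the two voting schemes: equal weights vs. accuracy-based
# weights. Labels and weight values are hypothetical.
from collections import defaultdict

def majority_vote(predictions, weights=None):
    """Return the label with the highest total weight.

    predictions: list of labels, one per classifier
    weights: optional per-classifier weights (e.g. validation accuracy);
             defaults to equal weights, i.e. a simple majority vote.
    """
    if weights is None:
        weights = [1.0] * len(predictions)
    totals = defaultdict(float)
    for label, weight in zip(predictions, weights):
        totals[label] += weight
    return max(totals, key=totals.get)

preds = ["spam", "ham", "spam"]

# Equal weights: plain majority vote -> "spam"
print(majority_vote(preds))

# Accuracy-based weights: the single, more accurate "ham" voter
# outweighs the two weaker "spam" voters -> "ham"
print(majority_vote(preds, weights=[0.55, 0.95, 0.30]))
```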

Thanks,
Supun


On Thu, Sep 24, 2015 at 11:54 AM, Nirmal Fernando <[email protected]> wrote:

> Hi All,
>
> In statistics and *machine learning*, *ensemble* methods use multiple
> *learning* algorithms to obtain better predictive performance than could
> be obtained from any of the constituent *learning* algorithms.
>
> We thought of implementing ensembling in the CEP-ML extension. The CEP-ML
> extension will be initialized with a list of ML model paths. When an event
> is received, the extension will obtain predictions from all the models and
> output the majority vote.
>
> We can implement the same in the ESB-ML extension.
>
> Thoughts are welcome!
>
>
> ---------- Forwarded message ----------
> From: Manorama Perera <[email protected]>
> Date: Thu, May 14, 2015 at 3:35 PM
> Subject: CEP Extension for Machine Learner Predictions
> To: architecture <[email protected]>
> Cc: Nirmal Fernando <[email protected]>, Srinath Perera <[email protected]>,
> Supun Sethunga <[email protected]>, Upul Bandara <[email protected]>,
> Sriskandarajah Suhothayan <[email protected]>, Maheshakya Wijewardena <
> [email protected]>
>
>
> Hi,
>
> We are in the process of implementing a CEP extension for Machine Learner
> Predictions. This extension allows the machine learning models generated by
> WSO2 ML to be used within CEP for predictions.
>
> To use this, the following ML features need to be installed in CEP.
>
>    - Machine Learner Core feature
>    - Machine Learner Commons feature
>    - Machine Learner Database Service feature
>
> This extension is implemented as a *StreamProcessor*.
>
> *The syntax :*
>
> There are two possible ways to use the extension.
>
> *<stream-name>#ml:predict(‘<path-to-ML-model>’) *
>
> *<stream-name>#ml:predict('<path-to-ML-model>', attribute 1, attribute 2,
> .......)*
>
> *path-to-ML-model*
>
> The machine learning model can be stored either in the registry or in the
> file system.
>
> If the model is stored in the registry, *path-to-ML-model* should have
> the prefix *registry:*. If the model is stored in the file system,
> *path-to-ML-model* should have the prefix *file:*.
>
> *attribute 1, attribute 2, ….*
>
> These are the attribute names of the stream. The values of these
> attributes are sent to the ML model as feature input values. When the
> attribute names are not explicitly given, the extension maps the
> attribute names of the stream to the feature names of the ML model.
>
> The output events will contain the attribute *prediction*, which holds
> the prediction result for that particular event.
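>
> Based on the syntax above, a usage sketch (the stream definition,
> attribute names, and model paths below are hypothetical, purely for
> illustration). The first query lets the extension map attributes to
> features by name; the second passes the attributes explicitly:
>
> ```
> define stream WeatherStream (temperature double, humidity double);
>
> from WeatherStream#ml:predict('file:/opt/models/rain-model')
> select prediction
> insert into PredictionStream;
>
> from WeatherStream#ml:predict('registry:/_system/ml/rain-model', temperature, humidity)
> select prediction
> insert into PredictionStream;
> ```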
>
> Thanks.
>
> --
> Manorama Perera
> Software Engineer
> WSO2, Inc.;  http://wso2.com/
> Mobile : +94716436216
>
>
>
> --
>
> Thanks & regards,
> Nirmal
>
> Team Lead - WSO2 Machine Learner
> Associate Technical Lead - Data Technologies Team, WSO2 Inc.
> Mobile: +94715779733
> Blog: http://nirmalfdo.blogspot.com/
>
>
>


-- 
*Supun Sethunga*
Software Engineer
WSO2, Inc.
http://wso2.com/
lean | enterprise | middleware
Mobile : +94 716546324
_______________________________________________
Architecture mailing list
[email protected]
https://mail.wso2.org/cgi-bin/mailman/listinfo/architecture
