Hi all,
I will update this thread once the documentation is updated with
information about the Spark scripts to be executed for this scenario.
Best Regards,
Rukshani.
On Wed, May 18, 2016 at 9:57 AM, Nirmal Fernando wrote:
> Hi Shavantha,
>
> Alerts are generated real-time of
Hi Shavantha,
Alerts are generated in real time, of course, and API subscribers can
control whether they want to receive alerts or not.
On Wed, May 18, 2016 at 9:49 AM, Shavantha Weerasinghe
wrote:
> Hi Nirmal
>
> Shouldn't we give the API subscriber the option to
Hi Shavantha,
Alerts will be generated as and when an abnormal event occurs, after the
learning period or after the initial script execution. After the initial
script execution, users will get an alert every time an alert-triggering
event occurs.
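To make the behaviour described above concrete, here is a minimal, hypothetical sketch (illustrative names only, not the actual WSO2 analytics code): alerts are suppressed until the learning period / initial script execution completes, and afterwards every abnormal event triggers an alert.

```python
# Hypothetical sketch of learning-phase gating for data-driven alerts.
# All names are illustrative; the real implementation lives in the
# analytics server and its Spark scripts.

class AlertGate:
    def __init__(self):
        self.learning_done = False  # flips to True after the initial script run
        self.threshold = None       # e.g. the learned percentile value

    def complete_learning(self, threshold):
        """Called once the initial Spark script execution has produced a threshold."""
        self.learning_done = True
        self.threshold = threshold

    def on_event(self, request_count):
        # No alerts during the learning period / before the initial script run.
        if not self.learning_done:
            return None
        # After learning, every alert-triggering event produces an alert.
        if request_count > self.threshold:
            return "ALERT: abnormal request count %d" % request_count
        return None
```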
On Wed, May 18, 2016 at 9:49 AM, Shavantha
Hi Nirmal
Shouldn't we give the API subscriber the option to decide/control the
timelines on which he/she wants the alerts generated?
Regards,
Shavantha Weerasinghe
Senior Software Engineer QA
WSO2, Inc.
lean.enterprise.middleware.
http://wso2.com
http://wso2.org
Tel : 94 11 214 5345
Fax :94 11
Hi Iranga,
Since these are data-driven alerts, IMO it is fine to send out alerts only
after some time period. In fact, we have some alert types which require a
learning phase, hence alerts will not get triggered during the learning
phase.
+1 for documenting the Spark script scheduling times.
On
Hi,
In API analytics, for alert generation for use cases such as abnormal API
request count, the Spark scripts need to be executed beforehand to get the
percentile calculations. The Spark scripts are by default scheduled for
12pm every day (in this scenario), meaning at the start alerts will
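The percentile calculation the scheduled script performs could be sketched roughly as follows (plain Python for illustration; the real job runs as a Spark script over the analytics tables, and the 95th-percentile choice here is an assumption, not the documented threshold):

```python
# Hypothetical sketch of a nearest-rank percentile over observed request
# counts, as the scheduled Spark script might compute before abnormal
# request-count alerts can fire.

import math

def percentile(values, p):
    """Nearest-rank percentile of request counts (0 < p <= 100)."""
    ordered = sorted(values)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

# Request counts per time window observed during the learning period.
counts = [120, 95, 130, 110, 500, 105, 98]

# Counts above this learned threshold would be treated as abnormal.
threshold = percentile(counts, 95)
```

Until this calculation has run at least once (by default at the 12pm schedule), there is no threshold to compare against, which is why no alerts can be generated at the start.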