That’s a great resource. Thanks Subutai. 

Nick
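
PS: For anyone skimming the thread, the raw anomaly score discussed below boils down to the fraction of currently active columns that were not predicted at the previous timestep. Here is a minimal sketch of that idea (names are illustrative; the real implementation is in the anomaly.py file linked in the thread):

```python
def compute_raw_anomaly_score(active_columns, predicted_columns):
    """Fraction of active columns that were not predicted.

    Returns a score in [0, 1]: 0.0 means every active column was
    predicted (no surprise), 1.0 means none were (maximal surprise).
    """
    active = set(active_columns)
    predicted = set(predicted_columns)
    if not active:
        # No active columns: nothing to be surprised about.
        return 0.0
    unpredicted = active - predicted
    return len(unpredicted) / len(active)
```

For example, if columns {1, 2, 3, 4} are active but only {2, 3} were predicted, the score is 0.5.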


> On Oct 21, 2014, at 7:34 PM, Subutai Ahmad <[email protected]> wrote:
> 
> 
> We don't have such a document unfortunately. Scott gave a good presentation 
> on this at the workshop, including the details. That video should be up some 
> time in the next few weeks.
> 
> Your second question is a good one. It is something we've struggled with as 
> well. Some companies have internally developed algorithms, but they don't 
> release them. You could look at Skyline, which is a pretty decent codebase 
> for anomaly detection.  We have also started an effort to create a good 
> benchmark for streaming anomaly detection. The current repo is here:
> 
> https://github.com/numenta/NAB
> 
> We discussed this briefly at the workshop. We are actively looking for people 
> to help us with NAB. In particular we want to collect a lot more data and 
> finish implementing the scoring mechanisms. If you want to participate that 
> would be great.
> 
> --Subutai
> 
> On Tue, Oct 21, 2014 at 8:44 AM, Nicholas Mitri <[email protected]> wrote:
> Thanks for the tip Subutai!
> The wiki page I’m reading doesn’t go into anomaly likelihood. Would you 
> happen to have a document similar to the one you posted about the CLA 
> classifier that I can dig more into for the mathematical formulation? 
> Something that reflects the work done in:
> https://github.com/numenta/nupic/blob/b6e5cf3c566e2d6ec60aeae24c4da4db27744138/nupic/algorithms/anomaly.py
> 
> I’d also be interested in which algorithms you think would be suitable to test 
> HTM’s temporal anomaly detection against. In my own research I haven’t come 
> across any algorithm that’s directly comparable. 
> 
> thanks,
> Nick
> 
> 
>> On Oct 21, 2014, at 6:15 PM, Subutai Ahmad <[email protected]> wrote:
>> 
>> Hi Nick,
>> 
>> At Numenta we use the difference between predicted and active columns, plus 
>> the anomaly likelihood calculations.  We've had very good results with that 
>> combination.  As mentioned on the wiki page, when we tried out confidences 
>> (over a year ago) we didn't get good results. 
>> 
>> --Subutai
>> 
>> On Tue, Oct 21, 2014 at 1:59 AM, Nicholas Mitri <[email protected]> wrote:
>> Thanks Scott. 
>> For the temporal anomaly detector, the wiki mentions using confidence 
>> parameters to calculate the anomaly score, but the actual code uses 
>> predictive states instead. Is the latter the final approach NuPIC is going 
>> with? Or should I be looking into reintroducing confidence-based anomaly 
>> scores?
>> 
>> thanks,
>> Nick
>> 
>>> On Oct 21, 2014, at 2:21 AM, Scott Purdy <[email protected]> wrote:
>>> 
>>> The algorithms are pretty geared around temporal data. If you have purely 
>>> spatial data like your chart then I wouldn't recommend using NuPIC. That 
>>> said, if you really wanted to use NuPIC, you could run the spatial pooler 
>>> and approximate an anomaly score from the average overlap of the active 
>>> columns with the input bits.
>>> 
>>> On Sun, Oct 19, 2014 at 12:28 PM, Nicholas Mitri <[email protected]> wrote:
>>> Hey all,
>>> 
>>> I was just reading the anomaly page on the wiki and was curious if there’s 
>>> an implementation of the non-temporal anomaly detection. 
>>> I’m running an older build of nupic and I can’t seem to find an anomaly.py 
>>> file like the one available in the current codebase. 
>>> 
>>> I’d like to try it out against other spatial anomaly detectors (Euclidean, 
>>> Manhattan, Mahalanobis, etc.) and see what kind of boundary it creates in a 
>>> 2D feature space. 
>>> The image below is the result of using 1-class SVM as a novelty detector 
>>> (from scikit-learn tutorials). I’d like to investigate what kind of 
>>> visualization the spatial pooler and the non-temporal detector would 
>>> produce. 
>>> 
>>> <figure_1.png>
>>> 
>>> best,
>>> Nick
>>> 
>> 
>> 
> 
> 

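A closing note on the "anomaly likelihood" Subutai refers to above: the basic idea is to model a rolling window of recent raw anomaly scores as a Gaussian and report how far into the upper tail the newest score falls. The sketch below is a simplified illustration of that idea (class name and window size are made up), not the actual NuPIC code:

```python
import math
from collections import deque


def gaussian_tail_probability(x, mean, std):
    """P(X >= x) for X ~ N(mean, std), via the complementary error function."""
    if std == 0:
        return 0.5 if x <= mean else 0.0
    z = (x - mean) / std
    return 0.5 * math.erfc(z / math.sqrt(2.0))


class AnomalyLikelihood:
    """Track a rolling window of raw anomaly scores and report how
    unlikely the latest score is under a Gaussian fit of recent history."""

    def __init__(self, window_size=100):
        self.history = deque(maxlen=window_size)

    def update(self, raw_score):
        self.history.append(raw_score)
        n = len(self.history)
        mean = sum(self.history) / n
        var = sum((s - mean) ** 2 for s in self.history) / n
        std = math.sqrt(var)
        # Values near 1.0 mean the new score is far in the upper tail,
        # i.e. much higher than the recent history would predict.
        return 1.0 - gaussian_tail_probability(raw_score, mean, std)
```

A raw score in line with recent history yields a likelihood near 0.5, while a sudden spike pushes it toward 1.0; thresholding the likelihood (rather than the raw score) is what makes the detector robust to noisy streams.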