Yep. Make sure you check whether the new search string starts with the old
search string. If it doesn't, filter from the original data; otherwise
filter from the last result set. And create new lists rather than removing
from old ones; removal is slow and costly with array-backed lists.
The trick is to use originally sorted lists, cached, and to also cache the
last filter result. Don't use compound queries.
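A minimal sketch of the idea in Python (the helper name, the closure-based cache, and the case-insensitive contains are my assumptions, not anything prescribed in this thread):

```python
def make_incremental_filter(items, key=lambda x: x):
    """Incremental substring filter that caches the last query and result.

    The source list is sorted once up front and cached; each call narrows
    the previous result set when the new query extends the old one, and
    otherwise falls back to the full cached list. A new list is built on
    every call -- nothing is removed in place.
    """
    state = {"query": "", "result": sorted(items, key=key)}
    base = state["result"]  # the cached, originally sorted full list

    def search(query):
        # If the new query extends the old one, every match must already
        # be in the last result set, so filter that instead of the base.
        source = state["result"] if query.startswith(state["query"]) else base
        state["result"] = [x for x in source if query.lower() in key(x).lower()]
        state["query"] = query
        return state["result"]

    return search


search = make_incremental_filter(["Foo", "FooBar", "Bar"])
search("foo")   # narrows the full list
search("foob")  # extends "foo", so narrows the previous result set
search("b")     # does not extend "foob", so restarts from the full list
```

Each keystroke that extends the query only scans the shrinking result set, which is where the perf win over re-filtering the whole list comes from.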

Oh, and also don't use data structures like a Trie: that would be a
complete waste of time if you are doing a substring search, as you'd still
have to search the entire tree.


|-----Original Message-----
|From: [email protected] [mailto:ozdotnet-
|[email protected]] On Behalf Of Tiang Cheng
|Sent: Monday, 17 May 2010 1:19 PM
|To: ozDotNet
|Subject: RE: Filtering algorithm strategies to mimic intellisense
|
|After you've applied the filter function on the first keystroke, save the
|result set and filter on the result set? Repeat for each keystroke.
|
|
|
|From: [email protected] [mailto:ozdotnet-
|[email protected]] On Behalf Of Winston Pang
|Sent: Monday, 17 May 2010 7:41 AM
|To: ozDotNet
|Subject: Filtering algorithm strategies to mimic intellisense
|
|
|
|Hey everyone,
|
|So I'm building an intellisense-like autocomplete. I've stumbled on some
|perf issues because it's literally just iterating over the list and
|re-applying a filter function to every item on every keystroke. The perf
|obviously degrades as more items are in the list.
|
|I was wondering, does anyone have any strategies or past experience with
|smarter ways to filter a list? The list is just a list of custom objects,
|with a particular display field that is a string. Filtering is based on a
|Contains check for each item, so it's essentially mimicking VS2010's
|contains-based intellisense.
|
|The VS2010 intellisense seems to be extremely speedy.
|
|Cheers,
|
|
|Winston

