Malte,
Beyond the ideas already presented, the only thing I can think of -
and this would be a bit of work - is that if there are particular
fields you know you will want to filter on, you could maintain a
*sorted* copy of dgdata. For example, if you had a copy of dgdata
sorted by name, you could filter on name very quickly using a binary
search.
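A rough sketch of what that could look like, assuming you keep a
return-delimited index sorted by name, where each line has the form
"name,key" (the handler and variable names here are only
illustrative):

function findFirstMatch pSortedIndex, pName
   local tLines, tLow, tHigh, tMid, tName
   -- split the index into a numerically keyed array, so that
   -- accessing line tMid stays cheap even for large lists
   put pSortedIndex into tLines
   split tLines by return
   put 1 into tLow
   put the number of lines of the keys of tLines into tHigh
   -- classic binary search over the sorted names
   repeat while tLow <= tHigh
      put (tLow + tHigh) div 2 into tMid
      put item 1 of tLines[tMid] into tName
      if tName < pName then
         put tMid + 1 into tLow
      else if tName > pName then
         put tMid - 1 into tHigh
      else
         return tMid -- found a match; neighbouring lines may match too
      end if
   end repeat
   return 0 -- no match
end findFirstMatch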
Hey all,
I'm trying to implement a live search on a datagrid. I was doing
this with the dgtext property, but that turns out to be too slow
on older machines when there are many records (30,000+). So now,
instead of setting the dgtext, I am trying to work with the dgdata,
which could speed the whole process up quite a lot, as the data
would not need to be turned into an array again by the data grid.
The problem: arrays cannot be filtered. So what I would like to
find is the quickest script that simulates array filtering in n
dimensions, if that is at all possible. My clumsy first try looks
like this. It only "filters" on the second level so far, so turning
it into a function that works n levels deep would be ideal. :)
on mouseUp
   local testarray, tprocess, test
   -- build 30,000 test records, each with a random name
   repeat with i = 1 to 30000
      put any item of "meier,müller,john,doe" into testarray[i]["name"]
   end repeat
   answer the number of lines of the keys of testarray
   put the millisecs into test
   -- snapshot the keys first, so deleting inside the loop is safe
   put the keys of testarray into tprocess
   repeat for each line theLine in tprocess
      -- drop every record whose name field matches
      if testarray[theLine]["name"] = "john" then
         delete variable testarray[theLine]
      end if
   end repeat
   answer the number of lines of the keys of testarray & cr & the millisecs - test
end mouseUp
This runs in 31 ms on my machine (first-generation Intel MacBook,
2.16 GHz). I would like to make it quicker if possible. I'd also
like to hear the runtime on your machines, especially on pre-Intel
Macs.
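What I have in mind for the n-level version is something like the
rough sketch below (untested; the handler name and the idea of
passing the criteria as an array of field/value pairs are just one
way it could look):

on filterRecords @pData, pMatch
   local tKey, tField, tMatches
   -- the keys list is evaluated once up front, so deleting
   -- records while looping over it is safe
   repeat for each line tKey in the keys of pData
      put true into tMatches
      -- a record is dropped only if every field/value pair matches
      repeat for each line tField in the keys of pMatch
         if pData[tKey][tField] is not pMatch[tField] then
            put false into tMatches
            exit repeat
         end if
      end repeat
      if tMatches then delete variable pData[tKey]
   end repeat
end filterRecords

Called as filterRecords testarray, tMatch after putting "john" into
tMatch["name"], this should do the same as the hard-coded loop above,
but for any combination of fields.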
Any thoughts highly appreciated.
All the best,
Malte