Just so I'm clear, I think the proper behavior would be:

1) unfiltered dataset RowCount of 2
2) filter() called; RowCount of 1
3) new() called; new record added to the filtered dataset even though it may not match the filter.
4) RowCount of 2
5) removeFilter[s](); RowCount of 3

The developer could filter() again after step 3 to make the new record disappear, if that's what the requirements of their app demand (although I can't see why anyone would ever want or expect this), but in the default case the new record is there as part of the current filtered dataset, whether or not it matches the filter, because that is what I believe you as a programmer or user would expect.
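The five steps above could be sketched with a toy bizobj-like class (hypothetical names and structure, not the actual Dabo API) to show the RowCount values I'd expect at each point:

```python
# Hypothetical sketch, not Dabo's implementation: a dataset where filter()
# narrows the visible rows, but new() keeps the new record in the current
# view even when it doesn't match the active filter.

class ToyBiz:
    def __init__(self, rows):
        self._rows = list(rows)       # full, unfiltered dataset
        self._filters = []            # stack of (field, value) filters
        self._visible = list(rows)    # rows currently visible

    @property
    def RowCount(self):
        return len(self._visible)

    def filter(self, field, value):
        # Push a filter and narrow the current view.
        self._filters.append((field, value))
        self._visible = [r for r in self._visible if r.get(field) == value]

    def new(self, **fields):
        rec = dict(fields)
        self._rows.append(rec)
        # Keep the new record visible regardless of the active filter.
        self._visible.append(rec)
        return rec

    def removeFilters(self):
        self._filters = []
        self._visible = list(self._rows)

biz = ToyBiz([{"color": "red"}, {"color": "blue"}])
assert biz.RowCount == 2      # 1) unfiltered
biz.filter("color", "red")
assert biz.RowCount == 1      # 2) filtered
biz.new(color="green")        # 3) new record; doesn't match the filter
assert biz.RowCount == 2      # 4) still shown in the filtered view
biz.removeFilters()
assert biz.RowCount == 3      # 5) full dataset, including the new record
```

The key design point is that new() appends to the visible set directly rather than re-running the filter, which is exactly why the record survives step 4.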

Similarly, say you filter() a biz, and then in one of the filtered records you change the value of the filtered field so it no longer matches the filter. I'd expect that record to stay in the current dataset instead of immediately disappearing.
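That expectation can be shown with a bare-bones sketch (plain dicts and lists, not the Dabo API): the filter runs once when applied, so a later edit doesn't eject the record from the view.

```python
# Hypothetical sketch: the filter is evaluated when filter() is called,
# not continuously, so editing a record out of the filter's criteria
# leaves it in the current view until something re-applies the filter.

rows = [{"color": "red"}, {"color": "blue"}]

# Apply a filter: keep only the "red" rows.
visible = [r for r in rows if r["color"] == "red"]

# Edit the filtered field so the record no longer matches.
visible[0]["color"] = "blue"

assert len(visible) == 1      # the record stays put
```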

Perhaps we could add a biz.refilter() method that reapplies the existing filter criteria without adding to the stack of filters (so the filter()/removeFilter() cycle stays in sync). Then the developer could refilter() whenever appropriate, and we have the best of both worlds.
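A refilter() along those lines might look like this (again a hypothetical sketch, not a patch against the real bizobj): it rebuilds the view from the full dataset using the current filter stack, and leaves the stack itself untouched.

```python
# Hypothetical sketch of the proposed refilter(): reapply the existing
# filter criteria without pushing a new filter, so the
# filter()/removeFilter() pairing stays balanced.

class ToyBiz:
    def __init__(self, rows):
        self._rows = list(rows)
        self._filters = []            # stack of (field, value) filters
        self._visible = list(rows)

    def filter(self, field, value):
        self._filters.append((field, value))
        self._visible = [r for r in self._visible if r.get(field) == value]

    def refilter(self):
        # Rebuild the view from the full dataset using the current
        # filter stack; the stack is not modified.
        self._visible = list(self._rows)
        for field, value in self._filters:
            self._visible = [r for r in self._visible
                             if r.get(field) == value]

biz = ToyBiz([{"color": "red"}, {"color": "blue"}])
biz.filter("color", "red")
biz._visible[0]["color"] = "blue"   # edited record stays visible...
assert len(biz._visible) == 1
biz.refilter()                      # ...until the developer refilters
assert len(biz._visible) == 0
assert len(biz._filters) == 1       # filter stack unchanged
```

Because refilter() never touches the stack, a later removeFilter() still pops exactly the filters the developer pushed, which is the "stays in sync" property described above.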

Paul


_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/dabo-dev
Searchable Archives: http://leafe.com/archives/search/dabo-dev
This message: http://leafe.com/archives/byMID/[email protected]
