On 13.09.2012, at 01:40, Thomas Mueller <[email protected]> wrote:

> Sounds good. I guess we could do both: always prefetch 20 nodes, and if
> there is still time, fetch more up to 0.1 seconds or so, or at most 200
> nodes. I guess 200 should be enough for a GUI to decide what to display
> (10 pages of 20 nodes each).

The idea with the timeout sounds good, but what should we recommend an 
application do if getSize() takes too long and returns -1? 

Imagine paging through search results: the query for the first page is fast 
enough (getSize() returns a value), but the second takes too long and now 
returns -1. Should the application give up and lose the page navigation, or 
count the results itself, which again takes ages?

In that case the Oak getSize(max) is probably a better fit. (BTW, there should 
probably be a way to distinguish search results == max from 
search results > max, for example by returning max + 1 in the latter case.)
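To illustrate the max + 1 convention, here is a minimal sketch (hypothetical 
helper, not an actual Oak API) of how a GUI could interpret a capped size: 
an exact count up to max, the sentinel max + 1 meaning "more than max", and 
-1 meaning the backend could not count at all:

```java
// Hypothetical sketch: interpreting a capped result size where
// getSize(max) returns max + 1 to signal "more than max results".
public class CappedSizeExample {

    /** Format a result count for display, assuming the max + 1 convention. */
    static String describe(long size, long max) {
        if (size < 0) {
            // backend gave up counting (analogous to getSize() == -1)
            return "unknown number of results";
        } else if (size > max) {
            // sentinel value max + 1: there are more results than max
            return "more than " + max + " results";
        } else {
            // exact count, size <= max
            return size + " results";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(42, 200));   // exact count
        System.out.println(describe(200, 200));  // exactly max, still exact
        System.out.println(describe(201, 200));  // sentinel: more than max
        System.out.println(describe(-1, 200));   // count unavailable
    }
}
```

With this convention the GUI can keep its page navigation for up to max pages 
and only fall back to "more than max" beyond that, instead of losing the count 
entirely.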

Cheers,
Alex
