--- Taras <[EMAIL PROTECTED]> wrote:
> Adam McDaniel wrote:
> > Taras wrote:
> > > I second that opinion. It is very annoying if you have pages that
> > > are basically long tables of contents and they get cut into 3-4
> > > pages. Makes scrolling to find the right word very painful.
> >
> > Let's say that you have an HTML document that's 200k long. When you
> > parse it, it gets split up into 6 pages in the viewer (~32k each).
> > To view just one of those pages there is a load time of 10 seconds
> > until you can actually begin to read and scroll down on that one
> > page.
> >
> > Would you be willing to instead have a load time of 60 seconds just
> > so the viewer can properly rebuild everything into one complete
> > page, just like the original?
>
> However, can't you do incremental loading? Somewhere the length of the
> document would be stored, and the scrollbar would be sized
> accordingly; then, as you get close enough to an unloaded segment, it
> would be loaded. I suspect that's how iSilo gets around the problem.
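The incremental-loading scheme Taras describes (store the total document length up front so the scrollbar can be sized correctly, then load segments only as the viewport approaches them) could be sketched roughly like this. This is a hypothetical illustration in Python, not actual Plucker or iSilo code; the segment size, prefetch margin, and loader callback are all assumptions for the sake of the example.

```python
# Hypothetical sketch of incremental document loading. The total length
# is known up front (so the scrollbar can be sized immediately), but
# segments are only loaded/decompressed when scrolling gets near them.

SEGMENT_SIZE = 32 * 1024     # ~32k per segment, as in the example above
PREFETCH_MARGIN = 4 * 1024   # start loading when this close to the next segment


class IncrementalDocument:
    def __init__(self, total_length, load_segment):
        # load_segment(index) returns the text of one segment; in a real
        # reader this would decompress a record from storage on demand.
        self.total_length = total_length   # stored up front in the document
        self.load_segment = load_segment
        self.segments = {}                 # index -> loaded segment text

    def scrollbar_extent(self):
        # The scrollbar is sized from the stored total length, even
        # though most of the document has not been loaded yet.
        return self.total_length

    def on_scroll(self, offset):
        # Ensure the segment under the viewport is loaded, and prefetch
        # the next one once we are within the margin of its boundary.
        index = offset // SEGMENT_SIZE
        self._ensure(index)
        if (offset % SEGMENT_SIZE) + PREFETCH_MARGIN >= SEGMENT_SIZE:
            self._ensure(index + 1)

    def _ensure(self, index):
        if index not in self.segments and index * SEGMENT_SIZE < self.total_length:
            self.segments[index] = self.load_segment(index)


# Usage: a fake 200k document split into ~32k segments.
doc = IncrementalDocument(200 * 1024, lambda i: "x" * SEGMENT_SIZE)
doc.on_scroll(0)           # loads segment 0 only
doc.on_scroll(31 * 1024)   # near the boundary: prefetches segment 1
print(sorted(doc.segments))  # prints [0, 1]
```

The key point is that only the document length has to be read eagerly; everything else is paid for lazily as the user scrolls, which is why the initial open can be fast even for a large file.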
I'm not a Palm OS programmer, just a user, but I wonder if anyone has looked at the source code used by Weasel Reader (GNU GPL). I can open The Three Musketeers (DOC format, on an SD card) in less than 30 seconds, and at top scroll speed (about 22,000 cpm) I notice no blips in the first 100k or so (I got bored after that). I realize that Plucker might not be able to achieve quite that speed, depending on the decompression algorithm, but, like the others who have commented, I think the front-end time spent is worth it in back-end usability.

Dave

_______________________________________________
plucker-list mailing list
[EMAIL PROTECTED]
http://lists.rubberchicken.org/mailman/listinfo/plucker-list

