Thanks, it's a good start. I'll look into the matter and let you know if I end up with something better. :-)
Mihai

On Sun, 14 Sep 2003, Hans de Graaff wrote:

> On Sat, 13 Sep 2003, Mihai T. Lazarescu wrote:
>
> >> > The 1-3 MB amount is chosen arbitrarily when doing it by hand, but
> >> > gtkg could automatically do a much better job by backtracking in,
> >> > say, 128 KB increments until it achieves a match. It would be nice
> >> > to have such a feature; it very often saves several hundred MB of
> >> > downloads.
> >>
> >> I have a patch for this approach lying around somewhere. I could
> >> dig it up and make it work with current CVS if people are
> >> interested.
>
> OK, here is the patch. It shows I was really lazy when I wrote it.
> Some things to note:
>
> * I wasn't sure where the boundary was in the d download structure, so
>   I cleared one more byte to avoid off-by-one errors. This may not be
>   needed.
>
> * The 10,000-byte number should be at least a #define in the file and
>   perhaps even an option. It might also be useful to make it a power
>   of 2. From what I've seen while testing the patch, it could be a bit
>   larger than 10,000 to be more effective. However, if it is too
>   large, then one rogue file will have too much impact on downloading.
>
> * The patch does not check for borderline cases (like the begin offset
>   being smaller than 0).
>
> * This patch does not solve the case where we've downloaded a swarming
>   part which does not match, because there is no 'continuity check'
>   at the end of a just-downloaded chunk (and because of this, the
>   patch did not work for me in the end).
>
> * Having Tiger Tree Hash support is a much better solution than trying
>   to patch things up this way.
>
> Hans

_______________________________________________
Gtk-gnutella-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/gtk-gnutella-devel
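For readers following the thread, the backtrack-and-match idea being discussed can be sketched roughly as follows. This is a minimal illustration only, not gtk-gnutella's actual code: the function name, the overlap window, and the step size are invented for the example (the thread talks about a ~10,000-byte overlap check and backtracking in e.g. 128 KB increments; tiny values are used here so the example is easy to follow).

```python
# Hypothetical sketch: before resuming a download from a new source,
# re-fetch a small overlap window just before the resume point and
# compare it with what we already have on disk. On mismatch, step the
# resume offset back by a fixed increment and try again, until the
# overlap matches or we reach the start of the file.

OVERLAP = 4          # bytes compared at each candidate offset (illustrative)
BACKTRACK_STEP = 8   # how far to step back on mismatch (illustrative)

def find_resume_offset(local_data: bytes, remote_data: bytes,
                       resume_at: int) -> int:
    """Return the largest offset <= resume_at at which the OVERLAP bytes
    preceding it agree between our local copy and the remote copy."""
    offset = resume_at
    while offset > 0:
        start = max(0, offset - OVERLAP)
        if local_data[start:offset] == remote_data[start:offset]:
            return offset          # overlap matches: safe to resume here
        offset = max(0, offset - BACKTRACK_STEP)
    return 0                       # no match found: restart from scratch

# Example: our local copy is corrupt in its last 8 bytes, so the check
# at the end fails and we back up one step to a matching offset.
good = bytes(range(32))                 # the remote (correct) file
local = good[:24] + b"\xff" * 8         # local copy, tail corrupted
print(find_resume_offset(local, good, 32))   # backtracks from 32 to 24
```

Note that this sketch shares the limitation Hans points out: it only validates the overlap region itself, so a mismatching swarmed chunk *inside* the already-downloaded range goes undetected, which is why per-chunk verification via Tiger Tree Hashes is the more robust fix.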
