On Tue, Feb 05, 2008 at 06:03:28PM +0300, Dmitry Kurochkin wrote:
> 2008/2/5, David Roundy <[EMAIL PROTECTED]>:
> > On Tue, Feb 05, 2008 at 03:30:23AM +0300, Dmitry Kurochkin wrote:
> > > Some fixes and cleanups for configure checks.
> > >
> > > And fix for copyRemotesNormal - if we use copyUrlFirst the first patch
> > > is downloaded last. So we lock in waitUrl and progress indication stops.
> >
> > Thanks for these changes! Applied.  (once they pass tests and all that)
> >
> > > Tue Feb  5 02:59:06 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
> > >   * Append CPPFLAGS to GHCFLAGS in autoconf.mk.in instead of configure.ac.
> > >
> > > Tue Feb  5 03:01:09 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
> > >   * Fix configure check for curl_multi_timeout.
> >
> > Very nice! I'd suspected this would reduce the amount of code considerably,
> > but I hadn't expected quite *this* much.  :)
> 
> Me too :)
> 
> >
> > > Tue Feb  5 03:07:17 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
> > >   * Cleanup curl pipelining configure check.
> > >
> > > Tue Feb  5 03:18:41 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
> > >   * Cleanup libwww configure check.
> > >
> > > Tue Feb  5 03:21:42 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
> > >   * Use copyUrl in copyRemotesNormal.
> >
> > The only downside of this fix is that it means that we download all
> > "speculatively" scheduled files before any of those that are needed
> > strictly.  I don't think this is a real problem, because copyRemotes is
> > rarely used, and I don't think ever follows speculative downloads.  But I
> > do think a more correct approach might be to do something like copyUrlFirst
> > on the reverse of the list.  Although that'll also have issues of getting
> > the files-already-downloaded count off by five or so...
> 
> What about doing waitUrl on the reversed list? I am not familiar with
> the progress-reporting code and I do not understand why the counter
> would be off by five...
> 
> How will speculateFileOrUrl be used? I do not see it used anywhere now.

It's used in Darcs.Repository.Prefs, and it's used heuristically when
usage patterns suggest that we may want to download a file soon (e.g. if
we download one patch file, it's likely we'll want the next few patches in
sequence, or if we download a directory listing from a pristine cache,
we'll probably want to download the file contents listed there).  I'd like
to increase the use of speculateFileOrUrl.  Just the few annotations I've
added increased the speed of the darcs get of a hashed repository on my
laptop (at home over wireless and a slow cable modem connection) by a
factor of three or four.  But there's a tradeoff that we also don't want to
do *too* much downloading if it's not needed (e.g. you don't want a "darcs
changes -s --last 1" to trigger downloading thousands of patch files).  I
suspect it's self-throttling, because we only feed more file requests into
the libwww pipeline when waitUrl is called, which means once we stop
needing files, we stop pipelining them.
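
The self-throttling behaviour described above can be sketched as a toy
queue model.  All the names here (speculate, waitUrl, DownloadState) and
the pipeline depth of 5 are hypothetical illustrations, not the actual
darcs/libwww code, which has side effects and differs in detail:

```haskell
-- Toy model of self-throttling speculation: speculating only queues a
-- request, and the bounded pipeline is topped up from that queue each
-- time waitUrl is called.  Once nothing is waited on, no new
-- speculative downloads enter the pipeline.
pipelineDepth :: Int
pipelineDepth = 5  -- made-up bound; the real pipeline depth may differ

data DownloadState = DownloadState
  { inFlight :: [String]  -- URLs currently in the pipeline
  , queued   :: [String]  -- speculative URLs not yet started
  } deriving Show

-- Queue a speculative download without starting it.
speculate :: DownloadState -> String -> DownloadState
speculate st u = st { queued = queued st ++ [u] }

-- Complete the oldest in-flight download (if any) and refill the
-- pipeline from the speculative queue, up to pipelineDepth.
waitUrl :: DownloadState -> (Maybe String, DownloadState)
waitUrl (DownloadState inF q) =
  let (fill, q') = splitAt (pipelineDepth - length inF) q
  in case inF ++ fill of
       []       -> (Nothing, DownloadState [] q')
       (x : xs) -> (Just x, DownloadState xs q')

main :: IO ()
main = do
  let st0 = foldl speculate (DownloadState [] [])
              [ "patch" ++ show n | n <- [1 .. 8 :: Int] ]
      (done1, st1) = waitUrl st0
      (_,     st2) = waitUrl st1
  print done1           -- Just "patch1"
  print (inFlight st2)  -- never more than pipelineDepth entries
```

In this model, a "darcs changes -s --last 1" that speculates thousands of
files but calls waitUrl only a few times would leave almost all of the
speculative queue untouched, matching the self-throttling intuition above.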

Anyhow, it's just that if you call waitUrlFirst on a reversed list (which
is what I meant to suggest), you'd download the last few files before the
first file, which would mean that the first waitUrl wouldn't complete until
after the last few URLs were downloaded *and* the first.
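
The ordering issue can be illustrated with a small list model, treating
the pending-download queue as a plain list downloaded from the front.
This is only a sketch with made-up signatures; the real copyUrl and
copyUrlFirst in darcs take more arguments and perform IO:

```haskell
import Data.List (foldl')

-- Toy queue model: the pipeline downloads from the front of the list.
-- copyUrl appends a request (FIFO); copyUrlFirst prepends it, so
-- issuing copyUrlFirst for p1..pn in order leaves p1 at the *back*,
-- and waiting on p1 blocks until everything else has finished.
copyUrl :: [String] -> String -> [String]
copyUrl q u = q ++ [u]

copyUrlFirst :: [String] -> String -> [String]
copyUrlFirst q u = u : q

main :: IO ()
main = do
  let patches = ["p1", "p2", "p3", "p4"]
  -- copyUrlFirst on each patch in turn reverses the download order:
  print (foldl' copyUrlFirst [] patches)  -- ["p4","p3","p2","p1"]
  -- copyUrl keeps the order (the copyRemotesNormal fix):
  print (foldl' copyUrl [] patches)       -- ["p1","p2","p3","p4"]
  -- copyUrlFirst on the *reversed* list restores the order while still
  -- jumping ahead of anything already queued speculatively:
  let speculative = ["s1", "s2"]
  print (foldl' copyUrlFirst speculative (reverse patches))
  -- ["p1","p2","p3","p4","s1","s2"]
```

The last case shows the trade-off discussed above: needed files jump
ahead of speculative ones, but any requests already in flight still
finish before the first waited-on file completes.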
-- 
David Roundy
Department of Physics
Oregon State University
_______________________________________________
darcs-devel mailing list
darcs-devel@darcs.net
http://lists.osuosl.org/mailman/listinfo/darcs-devel
