On Thu, Feb 9, 2023 at 1:31 PM Daniel Stenberg <dan...@haxx.se> wrote:
>
> On Thu, 9 Feb 2023, Jeroen Ooms wrote:
>
> > OK, I had expected multiplexing to replace the need for
> > multi-connections.
>
> It does up to the point where the connection is "full" of streams and you ask
> for even more transfers. Then libcurl creates a new connection. Unless you
> limit the number of connections it can use.

Ah, OK, that's better than I thought. I was under the impression that
it would immediately start with 6 connections, even before considering
multiplexing.


> > Do browsers still make multiple connections to hosts that support http/2
> > multiplex?
>
> My guess: browsers probably only do that in certain situations but mostly no.
>
> > Perhaps a desirable default would be to do one or another, but not both?
>
> If you want to limit the number of used connections, libcurl offers the
> options to do so. Or you can wait with adding some of the transfer(s). The
> default libcurl behavior is generally to perform the transfer you ask for
> sooner rather than later.

Right. In my case I want to download 25k files, and let curl handle
the scheduling.

However, I noticed that even with CURLMOPT_MAX_HOST_CONNECTIONS set
to 1, GitHub still drops the connections at some point. So perhaps the
issue isn't the connection count as such: with so many streams
multiplexed over one connection, individual downloads idle for long
stretches, and we may be hitting a proxy_read_timeout or something
similar on the server side...

I did find that the problems disappear when I disable multiplexing,
and performance isn't much worse (about 6 minutes for downloading the
25k files), so this solves my immediate problem.
-- 
Unsubscribe: https://lists.haxx.se/listinfo/curl-library
Etiquette:   https://curl.se/mail/etiquette.html
