On Saturday, 28 November 2015 at 10:46:11 UTC, tcak wrote:
The only case where that would make sense is if the server limits the
upload speed of each TCP socket. Unless you are in that
position, I do not expect to see any difference from opening
multiple sockets and requesting different parts of the same file.
On Saturday, 28 November 2015 at 07:05:55 UTC, Mike McKee wrote:
Hey guys, as it turns out, someone on stackoverflow.com pointed
out in a Perl version of this question that the Bash example
that was given is really buggy and doesn't make sense. They say
that trying to download a single file using two socket handles
will not speed up the download.
On Saturday, 28 November 2015 at 06:40:49 UTC, Mike McKee wrote:
How could I achieve something like that in D? (Note, I'm using
OSX.)
I did it with vibe.d and HTTP byte ranges.
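Not a vibe.d example, but the byte-range mechanism itself can be sketched in a few lines of Python: a tiny local HTTP server that honors `Range` headers stands in for a real server, and two threads each fetch half of the file and reassemble it. Everything here (the in-memory payload, the `/sample.bin` path) is made up for the illustration.

```python
# Sketch of range-based parallel downloading: a local server honoring
# Range headers, plus two threads each fetching half the payload.
import http.server
import threading
import urllib.request

DATA = bytes(range(256)) * 64  # 16 KiB of sample payload

class RangeHandler(http.server.BaseHTTPRequestHandler):
    """Serve DATA, honoring a single 'bytes=start-end' Range header."""
    def do_GET(self):
        start, end = 0, len(DATA) - 1
        rng = self.headers.get("Range")
        if rng and rng.startswith("bytes="):
            s, _, e = rng[len("bytes="):].partition("-")
            if s:
                start = int(s)
            if e:
                end = int(e)
            self.send_response(206)  # Partial Content
            self.send_header("Content-Range",
                             f"bytes {start}-{end}/{len(DATA)}")
        else:
            self.send_response(200)
        body = DATA[start:end + 1]
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), RangeHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def fetch_range(url, start, end, out, idx):
    # Ask for just the bytes [start, end] of the resource.
    req = urllib.request.Request(
        url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        out[idx] = resp.read()

url = f"http://127.0.0.1:{port}/sample.bin"
mid = len(DATA) // 2
parts = [None, None]
threads = [
    threading.Thread(target=fetch_range, args=(url, 0, mid - 1, parts, 0)),
    threading.Thread(target=fetch_range,
                     args=(url, mid, len(DATA) - 1, parts, 1)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
whole = b"".join(parts)  # the two halves stitched back together
server.shutdown()
```

As tcak notes above, against a real server this only helps if per-connection bandwidth is capped; the range requests themselves add no speed on their own.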
In general I'm trying to see if I can make a command-line zip
file downloader that downloads faster than curl for
After weighing options, I'll use a CDN to get the faster
download, and stick with curl rather than recoding it in D.
Hey guys, as it turns out, someone on stackoverflow.com pointed
out in a Perl version of this question that the Bash example that
was given is really buggy and doesn't make sense. They say that
trying to download a single file using two socket handles will
not speed up the download. So, this
Looking at this example, which is an all-Bash technique, someone has
figured out how to use /dev/tcp on OSX (and Unix, Linux, FreeBSD,
etc.) to download a zip file in a multithreaded way (they use
only two threads, but you get the point):
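The Bash script itself isn't reproduced above, but /dev/tcp is just a raw TCP connection, so the same two-connection idea can be sketched at the socket level in Python. A toy server thread plays the remote host, and the `/file.zip` path and payload are invented for the example:

```python
# Two raw TCP connections (the Python analogue of bash's /dev/tcp),
# each sending a GET with its own Range header for half the file.
import socket
import threading

DATA = b"0123456789" * 100  # 1000 bytes of pretend zip-file content

def serve(listener):
    # Answer each connection with a 206 covering the requested range.
    while True:
        conn, _ = listener.accept()
        req = b""
        while b"\r\n\r\n" not in req:
            req += conn.recv(1024)
        start, end = 0, len(DATA) - 1
        for line in req.decode().split("\r\n"):
            if line.lower().startswith("range: bytes="):
                s, _, e = line.split("=", 1)[1].partition("-")
                start, end = int(s), int(e)
        body = DATA[start:end + 1]
        head = (f"HTTP/1.0 206 Partial Content\r\n"
                f"Content-Length: {len(body)}\r\n\r\n")
        conn.sendall(head.encode() + body)
        conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen()
port = listener.getsockname()[1]
threading.Thread(target=serve, args=(listener,), daemon=True).start()

def fetch(start, end, out, idx):
    # Roughly: exec 3<>/dev/tcp/host/port; printf 'GET ...' >&3
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(
            f"GET /file.zip HTTP/1.0\r\nHost: localhost\r\n"
            f"Range: bytes={start}-{end}\r\n\r\n".encode()
        )
        raw = b""
        while chunk := s.recv(4096):
            raw += chunk
        # Body starts after the blank line ending the headers.
        out[idx] = raw.split(b"\r\n\r\n", 1)[1]

mid_point = len(DATA) // 2
parts = [None, None]
workers = [
    threading.Thread(target=fetch, args=(0, mid_point - 1, parts, 0)),
    threading.Thread(target=fetch,
                     args=(mid_point, len(DATA) - 1, parts, 1)),
]
for w in workers:
    w.start()
for w in workers:
    w.join()
whole = parts[0] + parts[1]  # reassembled file
```

This makes the objection from the replies concrete: both connections share the same network path, so the split buys nothing unless the server throttles each connection individually.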