Hi, I often use the http client in ffmpeg to read from various websites and, most of the time, it works fine.
However, in some instances the website doesn't allocate much bandwidth to a single connection, and the download would benefit from multiple simultaneous HTTP connections (i.e., the bandwidth is the bottleneck, not local processing speed). This appears to kind of work when I do it manually, say by the following:

(1) Download the file with a command-line tool like aria2, which can download it sequentially over multiple connections:

aria2c --stream-piece-selector=inorder --enable-http-pipelining=true --min-split-size=4m --file-allocation=none --max-connection-per-server=10 --split=10 "[http link]" -o temp.mkv

(2) After letting the above run for a short time (maybe a minute or so, if even that long, whereas the whole file could take an hour or so to download), run something like this:

ffmpeg -re -i temp.mkv -vcodec copy -acodec mp3 -b:a 256k temp2.mkv

In this case ffmpeg won't overrun the downloaded content, since "-re" forces it to read at 1x, though that is only incidental given the various bitrates/bandwidth/IO involved (in theory it could still read too quickly and crash).

--> Is there some CLI-based download manager (on Windows) that would allow me to pipe to ffmpeg? (I haven't been able to with aria2, as it obviously doesn't support writing to stdout.)

--> Is there some other way to achieve my goal?

Thanks,
Dave
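P.S. To make the second question more concrete, below is a rough, untested sketch (Python, standard library only) of the kind of thing I have in mind: fetch the file with several ranged HTTP requests in parallel, but write the pieces to stdout strictly in order so the output can be piped into ffmpeg. The URL, piece size and connection count are placeholders, and it assumes the server honours Range requests.

#!/usr/bin/env python3
# Rough sketch (untested): download a file with several parallel ranged
# requests, but emit the pieces to stdout strictly in order, so the
# output can be piped straight into ffmpeg. URL, piece size and
# connection count are placeholders; assumes the server supports Range.
import sys
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://example.com/file.mkv"   # placeholder
PIECE = 4 * 1024 * 1024               # 4 MiB, like --min-split-size=4m
CONNS = 10                            # like --max-connection-per-server=10

def total_size(url):
    # HEAD request just to learn the file size
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])

def fetch(url, start, end):
    # one ranged GET for bytes start..end (inclusive)
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-%d" % (start, end)})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def main():
    size = total_size(URL)
    out = sys.stdout.buffer
    with ThreadPoolExecutor(max_workers=CONNS) as pool:
        pending = []
        for start in range(0, size, PIECE):
            end = min(start + PIECE, size) - 1
            pending.append(pool.submit(fetch, URL, start, end))
            # keep only a window of pieces in flight; always write the
            # oldest piece first so stdout stays strictly in order
            if len(pending) > CONNS:
                out.write(pending.pop(0).result())
        for fut in pending:
            out.write(fut.result())
    out.flush()

if __name__ == "__main__":
    main()

I would then hope to run something like (fetch_inorder.py being whatever the script is saved as, and assuming ffmpeg is happy reading mkv from stdin):

python fetch_inorder.py | ffmpeg -re -i pipe:0 -vcodec copy -acodec mp3 -b:a 256k temp2.mkv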
