In what way does it depend on the formats? The files are streamed to/from an HTTP server in a NUT container.
I've managed to fake it on the segmenting side by buffering through a file: the segmenter writes to disk, I wait for new segment_list entries, start uploading, and remove each segment from disk once its upload finishes. Unfortunately this only uploads completed segments; it cannot upload while a segment is still being written, since segment_list is only updated for finished segments. On the concat side I haven't found any workaround.

On Sun, Apr 26, 2015 at 11:20 AM, Nicolas George <[email protected]> wrote:

> On septidi, 7 Floréal, year CCXXIII, Robert Nagy wrote:
> > I'm using ffmpeg to segment/concat a 120Mbit/s file in 2 second segments
> > to/from a HTTP server. The problem is that ffmpeg only writes/reads one
> > segment at a time, which doesn't fully use all available bandwidth.
> >
> > Each request to/from the server has a limit of 10MB/s. However, using 5+
> > concurrent requests it can get all the way up to 50+ MB/s.
> >
> > Is there some way I can get the segment/concat muxer/demuxer to run
> > several requests concurrently?
>
> Not currently.
>
> I can see several manual solutions, based on using an external HTTP client
> instead of the one built into FFmpeg, but it all depends on the exact
> formats you use.
>
> Regards,
>
> --
> Nicolas George
>
> _______________________________________________
> ffmpeg-user mailing list
> [email protected]
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
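For what it's worth, the disk-buffering workaround described above can be sketched roughly like this. This is only an illustration, not my actual script: the segment directory, list path, and upload endpoint are hypothetical placeholders, and it assumes the segment muxer is run with -segment_list so finished segments get appended to a plain-text list file.

```python
import os
import threading
import time
import urllib.request

SEGMENT_DIR = "segments"                     # hypothetical segment output directory
SEGMENT_LIST = "segments/list.txt"           # written by ffmpeg via -segment_list
UPLOAD_BASE = "http://example.com/upload/"   # hypothetical upload endpoint

def read_finished(list_path):
    """Return the segment names currently listed as finished.

    The segment muxer only appends a name here once the segment is
    complete, which is exactly why in-progress segments can't be uploaded.
    """
    try:
        with open(list_path) as f:
            return [line.strip() for line in f if line.strip()]
    except FileNotFoundError:
        return []

def upload_and_remove(name):
    """Upload one finished segment via HTTP PUT, then delete it from disk."""
    path = os.path.join(SEGMENT_DIR, name)
    with open(path, "rb") as f:
        req = urllib.request.Request(UPLOAD_BASE + name, data=f.read(), method="PUT")
        urllib.request.urlopen(req)
    os.remove(path)

def watch_segment_list():
    """Poll the segment list; spawn one upload thread per newly finished segment.

    Because each segment gets its own thread, several HTTP requests can run
    concurrently, which is what recovers the aggregate bandwidth.
    """
    seen = set()
    while True:
        for name in read_finished(SEGMENT_LIST):
            if name not in seen:
                seen.add(name)
                threading.Thread(target=upload_and_remove, args=(name,)).start()
        time.sleep(0.5)
```

Running watch_segment_list() alongside the ffmpeg process approximates the buffering behaviour described above, at the cost of the extra disk round-trip.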
