I would like a live monitor of the output of an ffmpeg process (it's
actually a capture-and-compress process).

I believe I could achieve this by adding a streaming output and then
separately starting an ffplay process to display it (I haven't tried
this yet, but I have most of the elements working separately at least).
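For concreteness, the sort of pipeline I have in mind (untested; the
v4l2 capture device, encoder settings and port below are just
placeholders for my actual setup) is roughly:

  ffmpeg -f v4l2 -i /dev/video0 \
    -map 0:v -c:v libx264 -preset veryfast \
    -f tee "capture.mkv|[f=mpegts]udp://127.0.0.1:12345"

  ffplay udp://127.0.0.1:12345

i.e. the tee muxer writes the compressed file and also sends the same
encoded stream over loopback UDP for ffplay to display.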

However, this network layer (even over local loopback) seems like a
small but undesirable overhead. Can anyone tell me:

a) is my concern entirely unfounded?
b) is it possible to do this more efficiently, presumably within the
   main ffmpeg process?
c) is there some potential benefit to the
   local-network-between-two-separate-processes approach that I've not
   thought of?
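
To clarify what I mean by (b): I'm imagining (possibly wrongly) that a
single ffmpeg invocation could write the file and also drive a preview
window directly, e.g. via the sdl output device, something like:

  ffmpeg -f v4l2 -i /dev/video0 \
    -map 0:v -c:v libx264 -preset veryfast capture.mkv \
    -map 0:v -c:v rawvideo -pix_fmt yuv420p -f sdl "live preview"

but I don't know whether that is sensible, or actually cheaper than
the loopback approach.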

TIA, Simon