I would like to merge the two programs to create one program that does the
video capturing, encoding, and streaming all together. The multi-threaded
encode demo is too complex, with speech encoding, on-screen display, etc.,
but the venc example only encodes from a file. I tried using the
v4l2-loopback example that came with the previous SDK together with the
venc example, but the video came out very poor, and I'm not sure how to
change the settings either. I'm also not familiar enough with the live555
library to know how to add streaming functionality to one of the
demos/examples. I was thinking I would only need to change the writer
thread of the encode demo using the live555 library. Does anybody have an
idea how to do this, or how I could do it with the venc example?
Stephen Berry wrote:
You can try connecting the two programs with a named pipe. Use
'mkfifo /tmp/test.mp4' to create the pipe, run 'encode -v
/tmp/test.mp4', and then stream /tmp/test.mp4 with live555.
I don't know whether this will work, but it's worth a shot. If the
streamer tries to seek on the pipe it will fail, and you'll have to
find another way, such as merging the two programs.
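Stephen's named-pipe suggestion above can be sketched as below. This is just a minimal stand-alone demo of the FIFO mechanism: printf stands in for the encoder writing its bitstream, cat stands in for the live555 streamer reading it, and /tmp/test.m4e is a placeholder path.

```shell
# Create the FIFO that both programs will open by the same path.
mkfifo /tmp/test.m4e

# Writer side: the encoder would write its encoded bitstream here.
# (Opening the FIFO blocks until a reader opens it, so background it.)
printf 'frame-data' > /tmp/test.m4e &

# Reader side: the streamer reads the FIFO like a growing file.
cat /tmp/test.m4e   # prints frame-data

# Clean up the pipe when done.
rm /tmp/test.m4e
```

Note that this only works as long as the reader consumes the stream sequentially; as mentioned above, any seek on the FIFO will fail.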
Anthony Gutierrez wrote:
I can stream videos that are encoded from the composite input (using
the encode demo) or from a YUV file (using the venc example) with
live555. But this is a two-step process: first encode and store to a
.m4e file, then run a live555 test program to stream it. How can I
modify either the encode demo, the venc example, or the live555 source
to stream in real time without storing to disk?
_______________________________________________
Davinci-linux-open-source mailing list
[email protected]
http://linux.davincidsp.com/mailman/listinfo/davinci-linux-open-source