Hello, 

I have a use case that is conceptually pretty simple: I need to 
programmatically (i.e., in code using the ffmpeg libraries) capture video from 
a webcam and stream it to a server. The devil's always in the details; here 
are mine: 

1. I am capturing video from a webcam on Mac OS X using QTKit (already works, 
no problem there). 
2. I need to feed the frames captured in step 1 to ffmpeg and transcode the 
video to FLV format. 
3. I need to push the FLV video frames via RTMP to a Wowza streaming server. 
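
For step 2, here is a minimal, untested sketch of what the encoding side might 
look like with libavcodec. Everything below is an assumption on my part, not 
working code: the encoder name, bitrate, and frame rate just mirror the command 
line further down, the width/height/pixel format are placeholders for whatever 
QTKit actually hands back, and the function signatures are from the 
avcodec_encode_video2-era API, so check them against your libav version:

```c
#include <libavcodec/avcodec.h>

/* Hypothetical helper: open an encoder roughly matching -r 24 -b:v 1000k.
 * "flv" is the Sorenson/FLV1 encoder the ffmpeg binary picks by default
 * for -f flv; swap in "libx264" if your Wowza application expects H.264. */
static AVCodecContext *open_encoder(int width, int height)
{
    AVCodec *codec = avcodec_find_encoder_by_name("flv");
    AVCodecContext *c = avcodec_alloc_context3(codec);
    c->width     = width;
    c->height    = height;
    c->pix_fmt   = PIX_FMT_YUV420P;     /* convert QTKit frames via libswscale */
    c->time_base = (AVRational){1, 24}; /* -r 24 */
    c->bit_rate  = 1000000;             /* -b:v 1000k */
    if (avcodec_open2(c, codec, NULL) < 0)
        return NULL;
    return c;
}

/* Per captured frame: wrap the pixels in an AVFrame, encode to an AVPacket.
 * Returns 1 when pkt holds compressed data, 0 when the encoder buffered the
 * frame, negative on error. */
static int encode_frame(AVCodecContext *c, AVFrame *frame, AVPacket *pkt)
{
    int got_packet = 0;
    av_init_packet(pkt);
    pkt->data = NULL; /* let the encoder allocate the output buffer */
    pkt->size = 0;
    if (avcodec_encode_video2(c, pkt, frame, &got_packet) < 0)
        return -1;
    return got_packet;
}
```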

Now, I have proved a concept very near to this by successfully using the 
ffmpeg binary to transcode and stream to Wowza with the following command from 
the console: 

ffmpeg -i ~/SampleVideo.mp4 -re -r 24 -b:v 1000k -f flv \
rtmp://localhost/live/SAMPLE_STREAM
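
In code, I would guess that command line maps roughly onto libavformat calls 
like the following. Again, this is an untested sketch, not something I have 
running: the muxer name "flv" stands in for -f flv, the URL is the same one 
passed to the binary, and the stream-setup fields reflect the st->codec-era 
API, so the details may need adjusting for a given libav version:

```c
#include <libavformat/avformat.h>

/* Sketch: open an FLV muxer over RTMP, write encoded packets, close.
 * Error checks trimmed for brevity; every call here can fail. */
int stream_to_wowza(AVCodecContext *enc)
{
    const char *url = "rtmp://localhost/live/SAMPLE_STREAM";
    AVFormatContext *oc = NULL;

    av_register_all();
    avformat_network_init(); /* required for rtmp:// output */

    /* "flv" here plays the role of -f flv on the command line */
    avformat_alloc_output_context2(&oc, NULL, "flv", url);

    /* one video stream, mirroring the encoder's parameters */
    AVStream *st = avformat_new_stream(oc, enc->codec);
    st->codec->codec_id  = enc->codec_id;
    st->codec->width     = enc->width;
    st->codec->height    = enc->height;
    st->codec->time_base = enc->time_base;

    /* rtmp is not a local file, but guard the open call as the muxing
     * examples do */
    if (!(oc->oformat->flags & AVFMT_NOFILE))
        avio_open(&oc->pb, url, AVIO_FLAG_WRITE);

    avformat_write_header(oc, NULL);

    /* main loop: for each AVPacket the encoder produces, rescale its
     * pts/dts from the codec time base to st->time_base, then call
     *   av_interleaved_write_frame(oc, &pkt);
     */

    av_write_trailer(oc);
    avformat_free_context(oc);
    return 0;
}
```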

This works, and I can successfully play the stream published to Wowza in a 
video player, which seems a solid indication that ffmpeg can do what is being 
attempted. What I need from all of you ffmpeg gurus on the list is guidance on 
how to convert the command line above into code, i.e., which libraries to use 
(and how to use them programmatically) to accomplish the same transcoding and 
streaming. 

If anyone can lend some guidance, sample code, and/or maybe a good coffee to 
drink while figuring this out, I would be very appreciative. 

Thanks in advance, 

Brad

Brad O'Hearne
Founder/Lead Developer
Big Hill Software LLC
http://www.bighillsoftware.com



_______________________________________________
Libav-user mailing list
[email protected]
http://ffmpeg.org/mailman/listinfo/libav-user
