Please consider these two questions. Thanks very much.

BACKGROUND: I adapted testH264VideoStreamer.cpp and DeviceSource.cpp/.hh so that I could send a live video feed out over the network; I currently test with VLC. I generate H.264-encoded frames using a Microsoft Media Foundation (MF) back end to my existing MF streaming software. The call to env->taskScheduler().doEventLoop() in my adapted testH264VideoStreamer.cpp runs in a separate thread: my primary thread handles the MF streaming, and the secondary thread runs doEventLoop().
QUESTION ONE: My H.264 encoder repeatedly creates new output buffers that I pass to the live555 side via an adapted signalNewFrameData() function, which goes through a scheduler trigger event. This works. But how can I tell when live555 is finished with a buffer, so that I can return it to the pool, de-allocate it, or otherwise free it? Otherwise I'll leak memory indefinitely.

QUESTION TWO: Right now, the video I see in VLC is truly horrible. I'm no H.264 or MPEG-4 expert, but it looks as if half my detail frames are missing; it's like watching satellite TV during a rain storm. This could be related to the first question, but I doubt it, because an earlier single-threaded solution that guaranteed correct buffer allocation/de-allocation looked just as bad. So why might my video look so bad, and how can I go about diagnosing it? Being sufficiently ignorant, I can only guess at how you might answer. One guess is that there is some way to trace the H.264-encoded output to see whether all of it is getting through (I-frames, P-frames, or whatever). Thanks very much.
_______________________________________________ live-devel mailing list [email protected] http://lists.live555.com/mailman/listinfo/live-devel
