Hi, I'm experimenting with a minor add-on for Blender. Blender is a 3D modeler, renderer and audio/video sequencer. It can save the final rendering in a variety of video formats, but of course not the particular one *I* want it to =). I could of course just re-encode, but that would take a lot of extra disk space and time, potentially harm quality, and I want a user-friendly one-button-push operation.
What I want to do, seen from the user's point of view, is make a generic way to losslessly stream the raw images and audio into a user-defined command that processes them. The audio and video could either be encoded as separate streams and fed to the command through named pipes, or be encoded in the same stream and piped to the command through stdin. I must admit that in my daily use the command will be /usr/bin/ffmpeg with options to encode the stream(s) into MP4 (x264/AAC). However, even though this would be libav feeding ffmpeg, I'd like to stress that this feature should be as generic as possible and allow other utilities (mencoder, for example) to read the stream as well; so no libav/ffmpeg-specific formats.

What I want to do, in order to implement this, is use libav to create a stream from scratch and populate it with raw video and audio frames from Blender. So far I've only been able to find code examples that read streams. Blender already accepts some user-defined parameters and uses libav to encode a video stream. I've duplicated this code and modified it as best I could through experimentation, but the result is far from perfect; I get a video with red noise all over it. I haven't tried outputting audio yet. And I'm not sure this code is a good place to start from in the first place for what I'm trying to do; it may even be outdated.

I feel inspired by this http://ffmpeg.org/faq.html#SEC30 to separate the streams: PCM and YUV4MPEG pipe. My questions:

- Is separating the streams better for quality and/or easier to implement than combining them?
- How do I create a stream from scratch, set it to lossless, and add video frames to it?
- How do I create a stream for PCM audio?

Blender provides the video as an array of RGB pixels plus width and height. The audio is not as obvious, but I haven't really looked into that yet.
This is the existing code in Blender: http://asklandd.dk/tmp/blender/src/writeffmpeg.c

This is my code, which is currently not working: http://asklandd.dk/tmp/blender/src/writepipe.c

I really hope that someone can help me! I also hope I provided enough information =)

-- Stephan

_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user
