> I haven't put together a video from an OSG app yet, but it's
> inevitable that one day I'll need to, so I thought I'd pen down what I
> think might be a nice way to tackle it.  This might also make a good
> example to add to the OSG, so users are welcome to contribute one :-)
> 
> My thought was to have a slave Camera that tracks the master camera,
> but with an FBO or Pbuffer set to PAL dimensions (or whatever target
> resolution you want for the video).  This will record the frames at
> the correct size, without any toolboxes etc. interfering.
> 
> Second up would be to attach a post-draw callback to the slave
> Camera, and in this callback do an osg::Image::readPixels to grab the
> frame.  In theory one could set up a PixelBufferObject to help speed
> the read up.  Next up you have to save this image to disk, or just
> cache it.  Writing to disk will be slow, so you don't want to do it
> in the rendering thread; instead spawn an OperationThread to do the
> writing, with a custom ImageWriteOperation added to the
> OperationThread.  Due to the asynchronous nature of the writes one
> will need a circular buffer of osg::Image, with the one at the head
> being written to by the rendering thread and the rest being used by
> ImageWriteOperations; once an image has been written to disk one
> simply puts it back into the buffer, ready to be reused.
> 
> Then once you've stopped recording images, you wait until the last
> frame has been written to disk (i.e. the OperationThread's queue is
> empty) and then spawn a tool to glue all these frames together into a
> video.  Potentially you could use a video library like ffmpeg to do
> this in a background thread.
> 
> Other little changes one could make would be to cap the frame rate of
> the app at 25fps to match the needs of the video, or perhaps even let
> the app run at a multiple, say 50 or 75fps, blend frames together,
> and only write out at 25fps.
> 
> Robert.

Hi Robert,

Just some comments from my own perspective and experience, as I'm
currently working through the same thing that many others appear to be
doing (or wanting to do).

For the video captures, it was relayed to me that the best results come
from capturing individual images that can later be stitched together
into a video.  I had already tried various video capture utilities, but
there were always dropped frames and the performance and quality were
just horrible, so this was enlightening news.  Capturing this way allows
every frame to be grabbed at the desired resolution and quality, and a
video can then be created from these high-quality images at whatever
frame rate is desired.
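
Once the frames are on disk, a tool such as ffmpeg will happily glue a
numbered image sequence into a video at whatever frame rate you pick.
The exact options vary between ffmpeg versions, and the file-name
pattern here is just an example, but the command is roughly:

  ffmpeg -framerate 25 -i frame_%05d.png capture.mp4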

It appears that many in the community are doing it this way too.  I just
want to emphasize that, at least for me, the actual frame rate of my
application did not matter when doing it this way.  What mattered most
was capturing high-quality images of every frame, because the resulting
video can be made to exhibit whatever frame rate I desire.  In fact, I
now set my capture to 1280x960 (2x 640x480), with the resulting video
built from scaled-down images.  This captures at about 5Hz, but again
that doesn't matter to me.  Faster would be nice, but I'm not chasing
rendering performance while doing image captures.
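
The downscale itself can be left to the video-building step, e.g. by
adding a scale filter to the ffmpeg line above:

  ffmpeg -framerate 25 -i frame_%05d.png -vf scale=640:480 capture.mp4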

In my code I use my own render thread with the osgViewer, and I did
basically what you've proposed (albeit with a much cruder
implementation).  I added a writeImage call to my render thread loop to
save the individual images.  I didn't do it in a callback; this was a
two-hour feasibility test, so it wasn't the most well-thought-out
solution.  The issue I had with it was potential frame loss, because the
rendering was faster than the image capture (due to the disk writes).  I
could have added a check to capture only when the camera's view matrix
changed, but again, the feasibility test was a much lower effort than
that would have required.  :-)
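
For the curious, the crude version boils down to something like the
sketch below.  It isn't my exact code: I'm assuming a SingleThreaded
viewer, an image plug-in that can write (png here) and made-up file
names, and I grab the colour buffer by attaching an osg::Image to the
camera rather than calling readPixels myself.

#include <osgViewer/Viewer>
#include <osgDB/WriteFile>
#include <osg/Image>
#include <iomanip>
#include <sstream>

void runAndCapture(osgViewer::Viewer& viewer)
{
    // OSG copies the colour buffer into 'image' at the end of each frame.
    osg::ref_ptr<osg::Image> image = new osg::Image;
    viewer.getCamera()->attach(osg::Camera::COLOR_BUFFER, image.get());

    unsigned int frameNumber = 0;
    while (!viewer.done())
    {
        viewer.frame();   // render one frame; 'image' now holds its pixels

        // Synchronous disk write - this is exactly the stall I ran into.
        std::ostringstream name;
        name << "frame_" << std::setw(5) << std::setfill('0')
             << frameNumber++ << ".png";
        osgDB::writeImageFile(*image, name.str());
    }
}

The synchronous writeImageFile call inside the loop is precisely what
ties the capture rate to the disk speed.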

So my suggestion is that the image capture not be forced to run at
"rendering" speed, and possibly even that further rendering be blocked
each frame until the image capture has completed.  Maybe an option flag
could be used to select how the capture is performed.
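
To make that concrete, something along the lines Robert proposes (a
final-draw callback that hands each frame to a background writer thread,
plus a flag for whether rendering waits) might look roughly like the
untested sketch below.  The names WriteOperation, CaptureCallback and
blockUntilWritten are mine, invented for illustration.

#include <osg/Camera>
#include <osg/Image>
#include <osg/OperationThread>
#include <osgDB/WriteFile>
#include <OpenThreads/Block>
#include <iomanip>
#include <sstream>
#include <string>

// Writes a single captured frame to disk on the operation thread.
struct WriteOperation : public osg::Operation
{
    WriteOperation(osg::Image* image, const std::string& filename)
        : osg::Operation("WriteOperation", false),   // run once, then discard
          _image(image), _filename(filename) {}

    virtual void operator()(osg::Object*)
    {
        osgDB::writeImageFile(*_image, _filename);
        _done.release();                 // wake anyone waiting on this frame
    }

    void waitUntilDone() { _done.block(); }

    osg::ref_ptr<osg::Image> _image;
    std::string              _filename;
    OpenThreads::Block       _done;
};

// Final-draw callback: grab the pixels, hand them to the writer thread.
struct CaptureCallback : public osg::Camera::DrawCallback
{
    CaptureCallback(int width, int height, bool blockUntilWritten)
        : _width(width), _height(height),
          _blockUntilWritten(blockUntilWritten), _frameNumber(0)
    {
        _writerThread = new osg::OperationThread;
        _writerThread->startThread();
    }

    virtual void operator()(osg::RenderInfo&) const
    {
        // One fresh image per frame keeps the sketch simple; a real
        // version would recycle images from a circular buffer as Robert
        // suggests.
        osg::ref_ptr<osg::Image> image = new osg::Image;
        image->readPixels(0, 0, _width, _height, GL_RGB, GL_UNSIGNED_BYTE);

        std::ostringstream name;
        name << "frame_" << std::setw(5) << std::setfill('0')
             << _frameNumber++ << ".png";

        osg::ref_ptr<WriteOperation> op =
            new WriteOperation(image.get(), name.str());
        _writerThread->add(op.get());

        // The option flag: stall rendering until this frame is on disk.
        if (_blockUntilWritten) op->waitUntilDone();
    }

    int                                _width, _height;
    bool                               _blockUntilWritten;
    mutable unsigned int               _frameNumber;
    osg::ref_ptr<osg::OperationThread> _writerThread;
};

You'd hook it up with something like
camera->setFinalDrawCallback(new CaptureCallback(720, 576, false)); for
Robert's full scheme the camera would be a slave rendering into an FBO
at the video resolution.  I've also left out the clean shutdown, i.e.
waiting for the writer's queue to drain before quitting, which Robert
mentions as well.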

I would also like to mention that some of the image plug-ins do not
implement the writeImage method of the ReaderWriter class, so without
further community support in this area the option to write a desired
image format may not be available.
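
A quick way to find out whether your build can actually write a given
format is to try it on a small dummy image and check the return value; a
sketch:

#include <osg/Image>
#include <osgDB/WriteFile>
#include <string>

// Returns true if the loaded plug-ins managed to write the image; the
// pixel contents of the probe image don't matter.
bool canWriteFormat(const std::string& filename)      // e.g. "probe.png"
{
    osg::ref_ptr<osg::Image> probe = new osg::Image;
    probe->allocateImage(4, 4, 1, GL_RGB, GL_UNSIGNED_BYTE);
    return osgDB::writeImageFile(*probe, filename);
}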

Just my two cents ... thanks for reading.

chuck
