Hi Gianni,

There isn't a direct means in the OSG or OpenGL for converting between
RGB and YUV.  You'll have to convert it on the CPU, or more efficiently
create a render-to-texture and post-process stage on the GPU that converts
the RGB frame buffer to YUV, then copy the resulting frame buffer back
to the CPU via a callback and treat it as YUV.
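
To make the CPU route concrete: ffmpeg's own libswscale can do the colour
conversion for you, so you don't have to hand-code the YUV maths. Below is a
minimal, untested sketch, assuming the captured osg::Image is
GL_RGB/GL_UNSIGNED_BYTE and a reasonably recent ffmpeg; the helper name is
just illustrative and error handling is omitted:

extern "C" {
#include <libavutil/frame.h>
#include <libswscale/swscale.h>
}
#include <osg/Image>

// Illustrative helper: convert an RGB osg::Image into a YUV420P AVFrame.
AVFrame* rgbImageToYuvFrame(const osg::Image& image)
{
    const int width  = image.s();
    const int height = image.t();

    AVFrame* frame = av_frame_alloc();
    frame->format = AV_PIX_FMT_YUV420P;
    frame->width  = width;
    frame->height = height;
    av_frame_get_buffer(frame, 32);

    SwsContext* ctx = sws_getContext(width, height, AV_PIX_FMT_RGB24,
                                     width, height, AV_PIX_FMT_YUV420P,
                                     SWS_BILINEAR, 0, 0, 0);

    // OpenGL frame buffers are bottom-up, so point sws_scale at the last
    // row and use a negative stride to flip the image vertically.
    const int rowBytes = image.getRowSizeInBytes();
    const uint8_t* srcData[1]   = { image.data() + (height - 1) * rowBytes };
    const int      srcStride[1] = { -rowBytes };

    sws_scale(ctx, srcData, srcStride, 0, height,
              frame->data, frame->linesize);

    sws_freeContext(ctx);
    return frame;
}

If you go the render-to-texture route instead, the same BT.601-style
conversion can live in a fragment shader, and the image you read back in the
post-draw callback is then already YUV and only needs packing into the
AVFrame planes.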

Robert.

On 23 November 2012 14:39, Gianni Ambrosio <[email protected]> wrote:

> Hi All,
> I'm implementing a movie generator. I used ScreenCaptureHandler and
> implemented my own CaptureOperation that uses the ffmpeg library to encode
> a movie. When reimplementing the CaptureOperation's virtual operator I start
> from an osg::Image that I have to convert into an AVFrame to encode it with
> ffmpeg. AFAIK ffmpeg needs the YUV color space for an AVFrame.
>
> Now, is there a way to get YUV components from an osg::Image?
>
> Regards
> Gianni
>
> ------------------
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=51229#51229
>
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
