We are not ready to support this use case yet. It works on the G1 now only because the codec and the display processor use the same YUV format. We can get away with that because it's buried in the media framework and doesn't rely on an application having special knowledge of the format. The software renderer does not do full color conversion. On the emulator, that means you will only see the Y plane.
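(For reference, the per-pixel step the software renderer skips looks roughly like the sketch below. This is a generic BT.601-style integer approximation, not the actual renderer or display processor code; dropping the U/V terms is exactly what leaves the grayscale Y-only picture on the emulator.)

#include <stdint.h>

// Generic video-range YCbCr -> RGB conversion for one pixel.
static inline uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

static void yuvToRgb(uint8_t y, uint8_t u, uint8_t v,
                     uint8_t* r, uint8_t* g, uint8_t* b)
{
    int c = y - 16, d = u - 128, e = v - 128;
    *r = clamp8((298 * c + 409 * e + 128) >> 8);
    *g = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp8((298 * c + 516 * d + 128) >> 8);
}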
We are hashing out a new API that will provide deeper access to the video rendering pipeline in a future release.

On Jan 16, 4:39 am, iblues <[email protected]> wrote:
> Hi,
>
> My requirement needs me to draw the YUV data from the framework layer.
> Is this possible? From the Android code, I seem to understand the
> following:
>
> 1. Create an ISurface object.
> 2. Extract frame data in the form of YUV.
> 3. Create an IMemory object with the YUV data.
> 4. Call mSurface->RegisterBuffer().
>
> My doubts are:
> 1. Is my understanding right? Is this approach feasible?
> 2. How do we create an IMemory object referencing the frame data? Say
> I set the YUV file (on the SD card) as a data source to the framework
> class.
>
> Thanks & Regards,
> iblues
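To the second doubt in the quoted mail (how to create an IMemory referencing frame data): below is a minimal sketch using MemoryHeapBase/MemoryBase from the native framework. Treat it as an illustration only; the helper name is made up, header locations vary across releases, and the ISurface buffer-registration interfaces are internal and have changed, which is part of why we don't recommend building on them yet.

#include <utils/MemoryHeapBase.h>   // header paths differ between releases
#include <utils/MemoryBase.h>
#include <string.h>
#include <stdint.h>

using namespace android;

// Hypothetical helper: copy one decoded YUV frame into shared memory so
// whoever holds the returned IMemory can map and read it.
sp<IMemory> wrapYuvFrame(const uint8_t* yuvData, size_t frameSize)
{
    // One heap, one frame; a real player would size the heap for several
    // frames and hand out different offsets as frames are produced.
    sp<MemoryHeapBase> heap = new MemoryHeapBase(frameSize);
    memcpy(heap->getBase(), yuvData, frameSize);

    // IMemory referencing the frame at offset 0 within the heap.
    // Registering the heap with a surface and posting offsets is
    // deliberately left out here, since those ISurface calls are not
    // stable public API.
    return new MemoryBase(heap, 0, frameSize);
}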
