Hi Bhoj,

If you are looking to render a YUV dump on the screen from your
application, you can do it using a Canvas object.

You will need to convert your YUV data to RGB, build a color pixel
array from it, create a Bitmap object from that array, and then render
the Bitmap. If you are talking about rendering YUV frames in real time
with continuous buffering, this would not be an apt solution.

I do remember someone trying this approach with the Camera raw data in
the Android-Developers forum. Please do check over there; it should give
you some idea.

Have a nice day!

Regards,
Syed Ibrahim


On Feb 25, 2:11 pm, bhoj <[email protected]> wrote:
> iblues,
>
> I am trying to display a YUV dump on the screen. Can you tell me
> how you did it in your case? Did you write a JNI wrapper to access the
> display surface? Is it possible to render onto the display without a
> JNI wrapper? I don't want to introduce JNI at present. I know one
> cannot access the display from a native application, but I have read
> that it is possible, though that might stop working later. As of now I
> just want to display something on the screen.
>
> On Jan 23, 6:21 am, iblues <[email protected]> wrote:
>
> > Thanks Dave.
>
> > Regards,
> > iblues
>
> > On Jan 23, 10:40 am, Dave Sparks <[email protected]> wrote:
>
> > > 1. ISurface is the remote interface for SurfaceFlinger. When you call
> > > an ISurface method like postBuffer, you are executing an RPC in
> > > SurfaceFlinger. There is only one way to render 2D graphics through
> > > the window system and that is using SurfaceFlinger. In the future, you
> > > could use the overlay interface, but that is aimed more at hardware
> > > pipelined video rather than software rendering.
>
> > > 2. ashmem and pmem are very similar. Both are used for sharing memory
> > > between processes. ashmem uses virtual memory, whereas pmem uses
> > > physically contiguous memory. One big difference is that with ashmem,
> > > you have a ref-counted object that can be shared equally between
> > > processes. For example, if two processes are sharing an ashmem memory
> > > buffer, the buffer reference goes away when both processes have removed
> > > all their references by closing all their file descriptors. pmem
> > > doesn't work that way because it needs to maintain a physical to
> > > virtual mapping. This requires the process that allocates a pmem heap
> > > to hold the file descriptor until all the other references are closed.
>
> > > 3. You have the right idea for using shared memory. The choice between
> > > ashmem and pmem depends on whether you need physically contiguous
> > > buffers. In the case of the G1, we use the hardware 2D engine to do
> > > scaling, rotation, and color conversion, so we use pmem heaps. The
> > > emulator doesn't have a pmem driver and doesn't really need one, so we
> > > use ashmem in the emulator. If you use ashmem on the G1, you lose the
> > > hardware 2D engine capability, so SurfaceFlinger falls back to its
> > > software renderer, which does not do color conversion; that is why you
> > > see the monochrome image.
>
> > > On Jan 22, 8:46 pm, iblues <[email protected]> wrote:
>
> > > > Hi Dave,
>
> > > > I was able to display the YUV frame onto the ISurface. But as you had
> > > > mentioned, the emulator is considering only the Y-data and displaying
> > > > a gray-scale video. I just have a few clarifications from my exercise
> > > > above:
>
> > > > 1. In most of the posts and also in the source code, whenever posting
> > > > to the display comes up, everyone talks of Surface Flinger, whereas the
> > > > Camera and the Media Player use ISurface to render the display images
> > > > onto the screen. I also see the implementation of the Surface Flinger
> > > > to be more of a wrapper around ISurface. Which of these objects would
> > > > you recommend for real-time rendering of YUV data?
>
> > > > 2. I don't seem to understand the difference between ashmem and pmem.
> > > > Can you point out some overall differences between the two?
>
> > > > 3. In my requirement, I would have a native library A render the YUV
> > > > frames and map them to a memory heap. If another library, say B, wants
> > > > to have access to this memory heap, then as per my understanding I will
> > > > have to wrap the memory heap into IMemory and pass it via a callback
> > > > from A to B, right? If so, should my memory heap type be ashmem or
> > > > pmem?
>
> > > > Please correct me if my understanding is wrong anywhere.
>
> > > > Regards,
> > > > Syed Ibrahim M
>
> > > > --------------------------
>
> > > > On Jan 22, 12:33 am, Dave Sparks <[email protected]> wrote:
>
> > > > > Yes, you need to make your class a friend of SurfaceHolder.