If you just want to experiment, one possible approach is to use
SurfaceComposerClient to create a surface and give it a high enough z-order.


    int pid = getpid();
    sp<SurfaceComposerClient> videoClient = new SurfaceComposerClient;
    sp<Surface> videoSurface(videoClient->createSurface(pid, 0,
        176, 144, PIXEL_FORMAT_OPAQUE,
        ISurfaceComposer::eFXSurfaceNormal | ISurfaceComposer::ePushBuffers));
    videoClient->openTransaction();
    // set topmost z-order
    status_t nState = videoSurface->setLayer(INT_MAX);
    nState = videoSurface->show();
    videoClient->closeTransaction();


The surface will then always remain visible.

On Tue, Mar 24, 2009 at 11:25 AM, El Sol <[email protected]> wrote:

> I just wanted to drive the playerdriver from that spot in particular, I
> wanted to create an executable and run it through the adb shell. But I don't
> know how to instantiate the ISurface from that spot.
>
> thx
>
>
> On Mon, Mar 23, 2009 at 8:09 PM, Dave Sparks <[email protected]>wrote:
>
>>
>> There is no "official" way to do this from middleware. You need to go
>> through WindowManager to create a SurfaceView object.
>>
>> Why do you need to do this from middleware? The surface needs to be
>> associated with an app anyway to manage the activity life cycle.
>>
>> On Mar 23, 12:07 pm, MR <[email protected]> wrote:
>> > Hi Dave
>> >
>> > I'm currently trying to implement an app that is sitting on top of the
>> > PVPlayer, looks like I need to pass an ISurface instance when calling
>> > setVideoSurface in order to be able to render video, it also looks
>> > like this is being created in the upper layers and just being passed
>> > down. Is there a way to instantiate ISurface at the middleware level?
>> >
>> > thx
>> > MR
>> >
>> > On Jan 22, 10:40 pm, Dave Sparks <[email protected]> wrote:
>> >
>> > > 1. ISurface is the remote interface for SurfaceFlinger. When you call
>> > > an ISurface method like postBuffer, you are executing an RPC in
>> > > SurfaceFlinger. There is only one way to render 2D graphics through
>> > > the window system and that is using SurfaceFlinger. In the future, you
>> > > could use the overlay interface, but that is aimed more at hardware
>> > > pipelined video rather than software rendering.
>> >
>> > > 2. ashmem and pmem are very similar. Both are used for sharing memory
>> > > between processes. ashmem uses virtual memory, whereas pmem uses
>> > > physically contiguous memory. One big difference is that with ashmem,
>> > > you have a ref-counted object that can be shared equally between
>> > > processes. For example, if two processes are sharing an ashmem memory
>> > > buffer, the buffer reference goes away when both processes have removed
>> > > all their references by closing all their file descriptors. pmem
>> > > doesn't work that way because it needs to maintain a physical to
>> > > virtual mapping. This requires the process that allocates a pmem heap
>> > > to hold the file descriptor until all the other references are closed.
>> >
>> > > 3. You have the right idea for using shared memory. The choice between
>> > > ashmem and pmem depends on whether you need physically contiguous
>> > > buffers. In the case of the G1, we use the hardware 2D engine to do
>> > > scaling, rotation, and color conversion, so we use pmem heaps. The
>> > > emulator doesn't have a pmem driver and doesn't really need one, so we
>> > > use ashmem in the emulator. If you use ashmem on the G1, you lose the
>> > > hardware 2D engine capability, so SurfaceFlinger falls back to its
>> > > software renderer which does not do color conversion, which is why you
>> > > see the monochrome image.
>> >
>> > > On Jan 22, 8:46 pm, iblues <[email protected]> wrote:
>> >
>> > > > Hi Dave,
>> >
>> > > > I was able to display the YUV frame onto the ISurface. But as you
>> > > > had mentioned, the emulator is considering only the Y-data and
>> > > > displaying a gray-scale video. I just have a few clarifications from
>> > > > the exercise above:
>> >
>> > > > 1. In most of the posts and also in the source code, whenever we
>> > > > talk about posting to the display, everyone speaks of SurfaceFlinger,
>> > > > whereas the Camera and the Media Player use ISurface to render the
>> > > > display images onto the screen. I also see the implementation of
>> > > > SurfaceFlinger to be more of a wrapper around the ISurface. Which of
>> > > > these objects would you recommend for real-time rendering of YUV data?
>> >
>> > > > 2. I don't quite understand the difference between ashmem and pmem
>> > > > memory. Can you point out some overall differences between the two?
>> >
>> > > > 3. In my requirement, I would have a native library A render the
>> > > > YUV frames and map them to a memory heap. If another library, say B,
>> > > > wants access to this memory heap, as per my understanding, I will
>> > > > have to wrap the memory heap in an IMemory and pass it via a
>> > > > callback from A to B, right? If so, should my memory heap type be
>> > > > ashmem or pmem?
>> >
>> > > > Please correct me if my understanding is wrong anywhere.
>> >
>> > > > Regards,
>> > > > Syed Ibrahim M
>> >
>> > > > --------------------------
>> >
>> > > > On Jan 22, 12:33 am, Dave Sparks <[email protected]> wrote:
>> >
>> > > > > Yes, you need to make your class a friend of SurfaceHolder.
>>
>>
>
> >
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"android-framework" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/android-framework?hl=en
-~----------~----~----~----~------~----~------~--~---
