[android-developers] Re: Camera and Surface problems with 1.6

2009-11-11 Thread rajaram
Hi Tom and Anders,

In our test program, we did not create any SurfaceView or
SurfaceHolder. We simply opened the camera, set the preview
callback and started the preview, all in onStart() of our activity.
We were getting the preview callback without anything being displayed
on the screen.

This seems to address the problem you are referring to here. Am I
missing something?
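
For reference, the test activity does roughly the following (a trimmed-down
sketch; the callback body and the clean-up in onStop() are illustrative, not
our exact code):

import android.app.Activity;
import android.hardware.Camera;

public class PreviewTestActivity extends Activity {

    private Camera mCamera;

    @Override
    protected void onStart() {
        super.onStart();
        // No SurfaceView/SurfaceHolder anywhere: just open the camera,
        // register for preview frames and start the preview.
        mCamera = Camera.open();
        mCamera.setPreviewCallback(new Camera.PreviewCallback() {
            public void onPreviewFrame(byte[] data, Camera camera) {
                // process the frame; on our hardware it arrives as YUV_420_SP
            }
        });
        mCamera.startPreview();
    }

    @Override
    protected void onStop() {
        // Illustrative clean-up (not part of the original description).
        if (mCamera != null) {
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
        super.onStop();
    }
}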

The problem we are having is that after we do this, we run into memory
problems. I see a lot of "Low Memory: No more background processes"
errors, and then our activity itself seems to get killed.

This does not happen if we just run the test program on its own. But if we
either run it inside our application (which already takes a good bit of
memory) or run the test program after running and pausing some other
activities, this problem occurs.

Is this memory problem related to the way we are getting the preview
frames?

Thanks,

Rajaram


On Oct 1, 6:49 pm, Anders Johansson svi...@gmail.com wrote:
 Hi Tom,

 It does indeed seem like we're in the same situation.

 I'm going to play around some more with the position, size and
 visibility of my PUSH_BUFFERS surface to see what works best. All in
 all though, I completely agree with your conclusion...this is a hack,
 and there's no telling how long it will work.

 I also agree with your point about the mysterious Camera API, here are
 my main gripes:

 1. There's a callback for receiving viewfinder data, but it never
 occurred to the designer that 3rd parties might want to use that
 _instead_ of letting the framework draw it directly to a surface.

 2. The callback for viewfinder data only lets you receive YUV data.
 This is good from a performance perspective as it comes directly from
 the camera, but IMHO causes the following issues:

 a) The YUV format is dependent on the sensor/vendor hardware. Yes,
 there is an API to query the format, but considering the not so
 stellar Qualcomm driver implementation for HTC, this may not be
 trustworthy.

 b) 3rd parties wishing to use viewfinder data will have to be able to
 decode all forms of YUV data in order to be safe on future devices,
 assuming that the API mentioned in the point above is trustworthy.

 c) The requirement to decode the YUV raises the bar quite a bit for
 3rd party developers who might have ideas in this field.

 Thus, it would seem reasonable to provide an option to receive the
 data in either YUV or RGB format.

 Unfortunately, even if the APIs are modified in a future release we
 will be stuck with maintaining different code for different releases
 and/or OEMs...

 regards
 Anders

 On Oct 1, 1:51 pm, Tom Gibara m...@tomgibara.com wrote:

  If I understand correctly, you're doing something very similar to what I'm
  doing in my Moseycode application. In my case I render the camera YUV data
  via a GLSurfaceView.
  I can't say whether this will work for certain on all/any 1.6 devices, but
  my approach since 1.5 has been to make the PUSH_BUFFERS surface very small
  and to position it off-screen (a nasty hack that works in the emulator at
  least). I think the smallest supported dimensions that preserve the aspect
  ratio are 20px x 15px.

  That said, I'm just waiting for this circuitous implementation to blow up on
  me. Why the camera demands a surface in order to provide preview data is a
  mystery to me (as is so much of the Camera API's operation).

  Tom

  2009/10/1 Anders Johansson svi...@gmail.com

   Hi all,

   My company has so far developed four different camera-based
   applications that all work by manipulating the viewfinder feed from
   the camera. The Android camera API expects a Surface to draw the
   viewfinder feed to, however in our apps we rely on sidestepping the
   direct drawing and grabbing the YUV_420_SP data for manipulation and
   rendering to a Surface.

   On 1.5, we achieved this by changing the Surface type from
   PUSH_BUFFERS to NORMAL, which would in one stroke disable the direct
   feed to the surface from the camera as well as giving us a Surface
   onto which we could render the manipulated feed.

   The problem arises when upgrading to 1.6, as it appears that this
   hole has been plugged. The Camera class now refuses to start the
   preview feed if its associated preview display surface is of the wrong
   type (such as NORMAL). I realize that this is probably correct as per
   design, unfortunately it also makes our type of app very difficult to
   implement...

   I have tried to work around it by creating a dummy surface view to set
   as preview display, and although I have managed to hide it, I haven't
   been able to stop the direct feed, which of course means that
   performance slows to a crawl as both the direct feed and manipulated
   feed are active and drawing at the same time.

   I would be most grateful for any suggestions on how to resolve this
   issue...

   best regards
   Anders Johansson

  --
  Tom Gibara
  email: m...@tomgibara.com
  web: http://www.tomgibara.com

[android-developers] Re: Camera and Surface problems with 1.6

2009-10-03 Thread Pascal Merle

Are you sure that the missing PUSH_BUFFERS are the reason?

I have tried with and without on 1.6 but the preview data is still
random.



[android-developers] Re: Camera and Surface problems with 1.6

2009-10-01 Thread Tom Gibara
If I understand correctly, you're doing something very similar to what I'm
doing in my Moseycode application. In my case I render the camera YUV data
via a GLSurfaceView.
I can't say whether this will work for certain on all/any 1.6 devices, but
my approach since 1.5 has been to make the PUSH_BUFFERS surface very small
and to position it off-screen (a nasty hack that works in the emulator at
least). I think the smallest supported dimensions that preserve the aspect
ratio are 20px x 15px.
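
Very roughly, the surface setup looks something like the sketch below (view
names, the negative-margin positioning and the callback wiring are
illustrative only, and I can't promise it behaves the same on real hardware):

import java.io.IOException;

import android.content.Context;
import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.widget.FrameLayout;

// A tiny PUSH_BUFFERS surface that keeps the Camera happy while the
// real rendering happens elsewhere (e.g. a GLSurfaceView).
public class DummyPreviewSurface extends SurfaceView
        implements SurfaceHolder.Callback {

    private final Camera camera;

    public DummyPreviewSurface(Context context, Camera camera) {
        super(context);
        this.camera = camera;
        getHolder().setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        getHolder().addCallback(this);
    }

    // Attach at 20x15 and push the view just outside the visible area.
    public void attachTo(FrameLayout parent) {
        FrameLayout.LayoutParams lp = new FrameLayout.LayoutParams(20, 15);
        lp.leftMargin = -20;
        lp.topMargin = -15;
        parent.addView(this, lp);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        try {
            // The camera still gets the surface it insists on.
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
            // ignored in this sketch
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {}

    public void surfaceDestroyed(SurfaceHolder holder) {}
}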

That said, I'm just waiting for this circuitous implementation to blow up on
me. Why the camera demands a surface in order to provide preview data is a
mystery to me (as is so much of the Camera API's operation).

Tom

2009/10/1 Anders Johansson svi...@gmail.com


 Hi all,

 My company has so far developed four different camera-based
 applications that all work by manipulating the viewfinder feed from
 the camera. The Android camera API expects a Surface to draw the
 viewfinder feed to, however in our apps we rely on sidestepping the
 direct drawing and grabbing the YUV_420_SP data for manipulation and
 rendering to a Surface.

 On 1.5, we achieved this by changing the Surface type from
 PUSH_BUFFERS to NORMAL, which would in one stroke disable the direct
 feed to the surface from the camera as well as giving us a Surface
 onto which we could render the manipulated feed.

 The problem arises when upgrading to 1.6, as it appears that this
 hole has been plugged. The Camera class now refuses to start the
 preview feed if its associated preview display surface is of the wrong
 type (such as NORMAL). I realize that this is probably correct as per
 design, unfortunately it also makes our type of app very difficult to
 implement...

 I have tried to work around it by creating a dummy surface view to set
 as preview display, and although I have managed to hide it, I haven't
 been able to stop the direct feed, which of course means that
 performance slows to a crawl as both the direct feed and manipulated
 feed are active and drawing at the same time.

 I would be most grateful for any suggestions on how to resolve this
 issue...

 best regards
 Anders Johansson

 



-- 
Tom Gibara
email: m...@tomgibara.com
web: http://www.tomgibara.com
blog: http://blog.tomgibara.com
twitter: tomgibara




[android-developers] Re: Camera and Surface problems with 1.6

2009-10-01 Thread Anders Johansson

Hi Tom,

It does indeed seem like we're in the same situation.

I'm going to play around some more with the position, size and
visibility of my PUSH_BUFFERS surface to see what works best. All in
all though, I completely agree with your conclusion...this is a hack,
and there's no telling how long it will work.

I also agree with your point about the mysterious Camera API, here are
my main gripes:

1. There's a callback for receiving viewfinder data, but it never
occurred to the designer that 3rd parties might want to use that
_instead_ of letting the framework draw it directly to a surface.

2. The callback for viewfinder data only lets you receive YUV data.
This is good from a performance perspective as it comes directly from
the camera, but IMHO causes the following issues:

a) The YUV format is dependent on the sensor/vendor hardware. Yes,
there is an API to query the format, but considering the not so
stellar Qualcomm driver implementation for HTC, this may not be
trustworthy.

b) 3rd parties wishing to use viewfinder data will have to be able to
decode all forms of YUV data in order to be safe on future devices,
assuming that the API mentioned in the point above is trustworthy.

c) The requirement to decode the YUV raises the bar quite a bit for
3rd party developers who might have ideas in this field.

Thus, it would seem reasonable to provide an option to receive the
data in either YUV or RGB format.
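
To illustrate point c), this is roughly the sort of conversion every 3rd
party currently has to write for itself - a sketch of the widely circulated
integer approximation, assuming the device really does hand back YUV_420_SP
(NV21), which is exactly the assumption we'd rather not have to make:

// Sketch: decode a YUV_420_SP (NV21) preview frame into ARGB_8888 pixels.
// Written for clarity, not speed.
public final class Nv21Decoder {

    public static void decode(int[] argb, byte[] yuv420sp, int width, int height) {
        final int frameSize = width * height;
        for (int j = 0, yp = 0; j < height; j++) {
            // The interleaved VU plane starts after the Y plane; one VU pair
            // is shared by each 2x2 block of luma samples.
            int uvp = frameSize + (j >> 1) * width;
            int u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & yuv420sp[yp]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }
                int y1192 = 1192 * y;
                int r = y1192 + 1634 * v;
                int g = y1192 - 833 * v - 400 * u;
                int b = y1192 + 2066 * u;
                // Clamp to the fixed-point range before repacking.
                if (r < 0) r = 0; else if (r > 262143) r = 262143;
                if (g < 0) g = 0; else if (g > 262143) g = 262143;
                if (b < 0) b = 0; else if (b > 262143) b = 262143;
                argb[yp] = 0xff000000
                        | ((r << 6) & 0x00ff0000)
                        | ((g >> 2) & 0x0000ff00)
                        | ((b >> 10) & 0x000000ff);
            }
        }
    }
}

Even with the conversion in hand, you still have to trust that the format
reported by the camera parameters matches what the driver actually delivers.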


Unfortunately, even if the APIs are modified in a future release we
will be stuck with maintaining different code for different releases
and/or OEMs...

regards
Anders

On Oct 1, 1:51 pm, Tom Gibara m...@tomgibara.com wrote:
 If I understand correctly, you're doing something very similar to what I'm
 doing in my Moseycode application. In my case I render the camera YUV data
 via a GLSurfaceView.
 I can't say whether this will work for certain on all/any 1.6 devices, but
 my approach since 1.5 has been to make the PUSH_BUFFERS surface very small
 and to position it off-screen (a nasty hack that works in the emulator at
 least). I think the smallest supported dimensions that preserve the aspect
 ratio are 20px x 15px.

 That said, I'm just waiting for this circuitous implementation to blow up on
 me. Why the camera demands a surface in order to provide preview data is a
 mystery to me (as is so much of the Camera API's operation).

 Tom

 2009/10/1 Anders Johansson svi...@gmail.com

  Hi all,

  My company has so far developed four different camera-based
  applications that all work by manipulating the viewfinder feed from
  the camera. The Android camera API expects a Surface to draw the
  viewfinder feed to, however in our apps we rely on sidestepping the
  direct drawing and grabbing the YUV_420_SP data for manipulation and
  rendering to a Surface.

  On 1.5, we achieved this by changing the Surface type from
  PUSH_BUFFERS to NORMAL, which would in one stroke disable the direct
  feed to the surface from the camera as well as giving us a Surface
  onto which we could render the manipulated feed.

  The problem arises when upgrading to 1.6, as it appears that this
  hole has been plugged. The Camera class now refuses to start the
  preview feed if its associated preview display surface is of the wrong
  type (such as NORMAL). I realize that this is probably correct as per
  design, unfortunately it also makes our type of app very difficult to
  implement...

  I have tried to work around it by creating a dummy surface view to set
  as preview display, and although I have managed to hide it, I haven't
  been able to stop the direct feed, which of course means that
  performance slows to a crawl as both the direct feed and manipulated
  feed are active and drawing at the same time.

  I would be most grateful for any suggestions on how to resolve this
  issue...

  best regards
  Anders Johansson

 --
 Tom Gibara
 email: m...@tomgibara.com
 web: http://www.tomgibara.com
 blog: http://blog.tomgibara.com
 twitter: tomgibara



[android-developers] Re: Camera and Surface problems with 1.6

2009-10-01 Thread Tom Gibara

 1. There's a callback for receiving viewfinder data, but it never
 occurred to the designer that 3rd parties might want to use that
 _instead_ of letting the framework draw it directly to a surface.

I agree, it seems like an obvious use-case. Given the high quality of many
of the Android APIs I always have the suspicion that there are good reasons
why the API is this way, but I simply don't know them.

 2. The callback for viewfinder data only lets you receive YUV data.
 This is good from a performance perspective as it comes directly from
 the camera, but IMHO causes the following issues:

This one bothers me less. I'm happy to get the data in its rawest form
because it allows for the best performance. For example, with Moseycode, I
only use the luminance data to do the image processing - YUV is good for
this. And because I support scanning still images, I need to support RGB
anyway. Library functions for performing these conversions might be useful
though.
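
For what it's worth, if the preview format really is YUV_420_SP then grabbing
the luminance plane is trivial - the Y samples are simply the first
width*height bytes of the callback buffer (sketch below, helper name
illustrative):

// Sketch: extract the greyscale (Y) plane from an NV21/YUV_420_SP buffer;
// the chroma bytes that follow are ignored.
public final class Luma {
    public static byte[] extract(byte[] yuv420sp, int width, int height) {
        byte[] y = new byte[width * height];
        System.arraycopy(yuv420sp, 0, y, 0, y.length);
        return y;
    }
}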

There are lots of other gripes - like the lack of a way to select the
preview frame size, or even to interrogate the API for the available sizes.
But my biggest annoyance is the garbage collection problem. I did post a
thread to the now-defunct android-framework group with a proposed solution,
but it didn't elicit any response, and realistically I don't have the time
to tackle it right now.

My suggestion can be found here:

http://groups.google.com/group/android-framework/browse_thread/thread/4e9a968660ec6885#

Tom.

2009/10/1 Anders Johansson svi...@gmail.com


 Hi Tom,

 It does indeed seem like we're in the same situation.

 I'm going to play around some more with the position, size and
 visibility of my PUSH_BUFFERS surface to see what works best. All in
 all though, I completely agree with your conclusion...this is a hack,
 and there's no telling how long it will work.

 I also agree with your point about the mysterious Camera API, here are
 my main gripes:

 1. There's a callback for receiving viewfinder data, but it never
 occurred to the designer that 3rd parties might want to use that
 _instead_ of letting the framework draw it directly to a surface.

 2. The callback for viewfinder data only lets you receive YUV data.
 This is good from a performance perspective as it comes directly from
 the camera, but IMHO causes the following issues:

 a) The YUV format is dependent on the sensor/vendor hardware. Yes,
 there is an API to query the format, but considering the not so
 stellar Qualcomm driver implementation for HTC, this may not be
 trustworthy.

 b) 3rd parties wishing to use viewfinder data will have to be able to
 decode all forms of YUV data in order to be safe on future devices,
 assuming that the API mentioned in the point above is trustworthy.

 c) The requirement to decode the YUV raises the bar quite a bit for
 3rd party developers who might have ideas in this field.

 Thus, it would seem reasonable to provide an option to receive the
 data in either YUV or RGB format.


 Unfortunately, even if the APIs are modified in a future release we
 will be stuck with maintaining different code for different releases
 and/or OEMs...

 regards
 Anders

 On Oct 1, 1:51 pm, Tom Gibara m...@tomgibara.com wrote:
  If I understand correctly, you're doing something very similar to what I'm
  doing in my Moseycode application. In my case I render the camera YUV data
  via a GLSurfaceView.
  I can't say whether this will work for certain on all/any 1.6 devices, but
  my approach since 1.5 has been to make the PUSH_BUFFERS surface very small
  and to position it off-screen (a nasty hack that works in the emulator at
  least). I think the smallest supported dimensions that preserve the aspect
  ratio are 20px x 15px.

  That said, I'm just waiting for this circuitous implementation to blow up on
  me. Why the camera demands a surface in order to provide preview data is a
  mystery to me (as is so much of the Camera API's operation).
 
  Tom
 
  2009/10/1 Anders Johansson svi...@gmail.com
 
   Hi all,
 
   My company has so far developed four different camera-based
   applications that all work by manipulating the viewfinder feed from
   the camera. The Android camera API expects a Surface to draw the
   viewfinder feed to, however in our apps we rely on sidestepping the
   direct drawing and grabbing the YUV_420_SP data for manipulation and
   rendering to a Surface.
 
   On 1.5, we achieved this by changing the Surface type from
   PUSH_BUFFERS to NORMAL, which would in one stroke disable the direct
   feed to the surface from the camera as well as giving us a Surface
   onto which we could render the manipulated feed.
 
   The problem arises when upgrading to 1.6, as it appears that this
   hole has been plugged. The Camera class now refuses to start the
   preview feed if its associated preview display surface is of the wrong
   type (such as NORMAL). I realize that this is probably correct as per
   design, unfortunately it also makes our type of app very difficult to
   implement...