[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame

2008-12-02 Thread dmanpearl

I decoded the RGB color data from the Android Camera PreviewCallback
onPreviewFrame() frame.
My function that decodes the YUV byte[] buffer from the preview
callback and converts it into an ARGB_ int[] buffer is presented
below.

The luminance buffer takes up the first width * height bytes of the
byte buffer and can be displayed as a gray-scale image if desired.
The chrominance follows, with each U or V value representing a 2x2
square region of 4 luminance values. The bytes are packed as follows
(display in a monospaced font to view as a grid):
Y1a Y1b Y2c Y2d Y3e Y3f
Y1g Y1h Y2i Y2j Y3k Y3l
Y4m Y4n Y5o Y5p Y6q Y6r
Y4s Y4t Y5u Y5v Y6w Y6x
U1  V1  U2  V2  U3  V3
U4  V4  U5  V5  U6  V6
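For reference, the index arithmetic implied by this packing can be sketched in plain Java (a hypothetical helper, not code from this thread; x and y are the pixel's column and row):

```java
public class YuvIndex {
    // Index of the Y sample for pixel (x, y): the Y plane is one byte
    // per pixel, row-major, at the start of the buffer.
    static int yIndex(int x, int y, int width) {
        return y * width + x;
    }

    // Index of the first chroma byte shared by the 2x2 block containing
    // (x, y): the chroma plane starts at width*height, each chroma row
    // covers two image rows, and U/V are stored as interleaved pairs.
    static int chromaIndex(int x, int y, int width, int height) {
        return width * height + (y >> 1) * width + (x >> 1) * 2;
    }

    public static void main(String[] args) {
        // Using the 6x4 grid above: pixel (2, 1) is Y2i; its chroma
        // pair (U2, V2) starts after the 24-byte Y plane, at offset 26.
        System.out.println(yIndex(2, 1, 6));          // 8
        System.out.println(chromaIndex(2, 1, 6, 4));  // 26
    }
}
```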

I strongly believe that Google should have told us this themselves
and saved me several days.

This code has been tested on the HTC G1 device processing 480x320
frames.  Similar functions can downrez to a smaller frame size in the
same single pass through the image, for full display on the screen at
a faster rate.  Note: this processing must be done in a separate
Thread from onPreviewFrame().
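Since the callback's byte[] is only valid during the call, one way to structure the hand-off (a rough sketch, not code from this thread; class and method names are made up) is to copy the frame into a bounded queue and let the worker drain it, silently dropping frames while the worker is busy:

```java
import java.util.concurrent.ArrayBlockingQueue;

public class FrameHandoff {
    // Capacity 1: if the worker has not drained the previous frame,
    // offer() returns false and the new frame is simply skipped.
    private final ArrayBlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(1);

    // Called from the camera callback thread: the callback's buffer is
    // reused after the callback returns, so copy it before handing off.
    public boolean submitFrame(byte[] data) {
        byte[] copy = new byte[data.length];
        System.arraycopy(data, 0, copy, 0, data.length);
        return queue.offer(copy); // false => frame dropped, no blocking
    }

    // Called from the processing thread; null when no frame is pending.
    public byte[] pollFrame() {
        return queue.poll();
    }

    public static void main(String[] args) {
        FrameHandoff h = new FrameHandoff();
        System.out.println(h.submitFrame(new byte[8])); // true: queued
        System.out.println(h.submitFrame(new byte[8])); // false: dropped
        System.out.println(h.pollFrame() != null);      // true: drained
        System.out.println(h.submitFrame(new byte[8])); // true again
    }
}
```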

// decode Y, U, and V values on the YUV 420 buffer described as
// YCbCr_422_SP by Android
// David Manpearl 081201
public static void decodeYUV(int[] out, byte[] fg, int width, int height)
        throws NullPointerException, IllegalArgumentException {
    final int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer 'out' is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer 'out' size "
                + out.length + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz * 3 / 2)
        throw new IllegalArgumentException("buffer 'fg' size "
                + fg.length + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
        for (i = 0; i < width; i++) {
            Y = fg[pixPtr];
            if (Y < 0) Y += 255; // bytes are signed; recover 0..255 range
            if ((i & 0x1) != 1) {
                // one interleaved (U, V) pair per 2x2 block of Y samples
                final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
                Cb = fg[cOff];
                if (Cb < 0) Cb += 127; else Cb -= 128;
                Cr = fg[cOff + 1];
                if (Cr < 0) Cr += 127; else Cr -= 128;
            }
            // R = Y + 1.40625 * Cr, approximated with shifts
            int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if (R < 0) R = 0; else if (R > 255) R = 255;
            // G = Y - 0.34375 * Cb - 0.71875 * Cr, approximated with shifts
            int G = Y - ((Cb >> 2) + (Cb >> 4) + (Cb >> 5))
                    - ((Cr >> 1) + (Cr >> 3) + (Cr >> 4) + (Cr >> 5));
            if (G < 0) G = 0; else if (G > 255) G = 255;
            // B = Y + 1.765625 * Cb, approximated with shifts
            int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if (B < 0) B = 0; else if (B > 255) B = 255;
            out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }
}
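The shift sums in the decoder are fixed-point approximations of the usual YCbCr-to-RGB coefficients (roughly R = Y + 1.402*Cr, B = Y + 1.772*Cb). A quick standalone check, added here only to make the approximation explicit, not part of the original post:

```java
public class ShiftCoeffs {
    // Red term: 1 + 1/4 + 1/8 + 1/32 = 1.40625, vs. the exact 1.402
    static int redTerm(int cr)  { return cr + (cr >> 2) + (cr >> 3) + (cr >> 5); }

    // Blue term: 1 + 1/2 + 1/4 + 1/64 = 1.765625, vs. the exact 1.772
    static int blueTerm(int cb) { return cb + (cb >> 1) + (cb >> 2) + (cb >> 6); }

    public static void main(String[] args) {
        System.out.println(redTerm(64));        // 90
        System.out.println((int) (1.402 * 64)); // 89: off by 1 at most here
        System.out.println(blueTerm(64));       // 113
        System.out.println((int) (1.772 * 64)); // 113: exact match here
    }
}
```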

On Dec 1, 8:52 am, Dave Sparks [EMAIL PROTECTED] wrote:
 The G1 preview format is YUV 420 semi-planar (U and V are subsampled
 by 2 in both X and Y). The Y plane is first, followed by UV pairs - I
 believe the U sample comes first in the pair.

 Technically it's YCbCr 420 semi-planar, but very few people use that
 term.

 On Nov 26, 6:27 pm, dmanpearl [EMAIL PROTECTED] wrote:



  Hello Blindfold,

  Thanks for your help.  I solved the user interface problems I was
  experiencing by using a separate thread to do my image processing. I'm
  still using an ImageView, and without problems.  Perhaps I will try a
  Canvas in a SurfaceHolder later in this exercise to compare speeds.

  MORE ON THE CAMERA LIVE PREVIEW FILTERED DISPLAY CAPABILITY

  As you know, I want to display live camera data through a custom
  filter.

  I got most of the way through the YCbCr_422_SP data buffer returned to
  Android's Camera.PreviewCallback onCameraFrame() callback function,
  and now I am looking for help deciphering the U, V portion of the
  buffer.  I verified that the first (width*height) bytes are simple Y
  luminance values that can be displayed (via Bitmap and ImageView) to
  make a viable gray-scale image.  The total number of bytes is (width
  * height * 3 / 2).

  The remaining 1/2 image bytes are clearly used to store U, V (Cb, Cr)
  data.  Therefore, there are 1/4 image bytes for each U, V component
  (i.e. each U, V component is used for 4 pixels of the image).  This
  looks like 411 or 420 data, not 422, but we have bigger fish to fry.

  I cannot determine if the U V data is aligned adjacently, in
  alternating rows, or in squares as described in this Wikipedia
  graphical description:  http://en.wikipedia.org/wiki/Image:Yuv420.svg.
  Once I finally determine the structure of the U, V data, I have
  several equations to convert from YUV to RGB and I have tried many
  ways of combining the UV data with the luminance data of the first 2/3
  of the buffer to no avail.  So far I can only display mono-chrome.

[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame

2008-12-01 Thread Dave Sparks

The G1 preview format is YUV 420 semi-planar (U and V are subsampled
by 2 in both X and Y). The Y plane is first, followed by UV pairs - I
believe the U sample comes first in the pair.

Technically it's YCbCr 420 semi-planar, but very few people use that
term.

On Nov 26, 6:27 pm, dmanpearl [EMAIL PROTECTED] wrote:
 Hello Blindfold,

 Thanks for your help.  I solved the user interface problems I was
 experiencing by using a separate thread to do my image processing. I'm
 still using an ImageView, and without problems.  Perhaps I will try a
 Canvas in a SurfaceHolder later in this exercise to compare speeds.

 MORE ON THE CAMERA LIVE PREVIEW FILTERED DISPLAY CAPABILITY

 As you know, I want to display live camera data through a custom
 filter.

 I got most of the way through the YCbCr_422_SP data buffer returned to
 Android's Camera.PreviewCallback onCameraFrame() callback function,
 and now I am looking for help deciphering the U, V portion of the
 buffer.  I verified that the first (width*height) bytes are simple Y
 luminance values that can be displayed (via Bitmap and ImageView) to
 make a viable gray-scale image.  The total number of bytes is (width
 * height * 3 / 2).

 The remaining 1/2 image bytes are clearly used to store U, V (Cb, Cr)
 data.  Therefore, there are 1/4 image bytes for each U, V component
 (i.e. each U, V component is used for 4 pixels of the image).  This
 looks like 411 or 420 data, not 422, but we have bigger fish to fry.

 I cannot determine if the U V data is aligned adjacently, in
 alternating rows, or in squares as described in this Wikipedia
 graphical description:  http://en.wikipedia.org/wiki/Image:Yuv420.svg.
 Once I finally determine the structure of the U, V data, I have
 several equations to convert from YUV to RGB and I have tried many
 ways of combining the UV data with the luminance data of the first 2/3
 of the buffer to no avail.  So far I can only display mono-chrome.

 If you or others on this list can decode the Android YCbCr_422_SP
 data, please post the solution as soon as possible.  Your efforts and
 generosity are greatly appreciated.  I am convinced that
 representatives from Google/Android and others monitoring this list
 know how to do this.  Please share the information.  It is crucial to
 our project.  I do not care about the Emulator and its different
 encoding.  I realize that Google is probably waiting to implement a
 unified solution and share it through an API update, but we cannot
 wait.

  - Thank you, David Manpearl

 On Nov 26, 10:23 am, blindfold [EMAIL PROTECTED] wrote:

  Hi David,

   I can't seem to make coexist: SurfaceHolder for the camera & ImageView
   for the filtered Bitmap to display.

  ...

   Do you know why I can't make the Camera's Surface and an ImageView
   Bitmap simultaneous members of the same active ViewGroup?

  I do not use ImageView myself so I cannot really judge your problem. I
  draw my filtered Bitmap to a Canvas in a SurfaceView. No ImageView
  anywhere.

  Regards
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
[EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame

2008-11-26 Thread blindfold

Hi David,

 I can't seem to make coexist: SurfaceHolder for the camera & ImageView
 for the filtered Bitmap to display.

...

 Do you know why I can't make the Camera's Surface and an ImageView
 Bitmap simultaneous members of the same active ViewGroup?

I do not use ImageView myself so I cannot really judge your problem. I
draw my filtered Bitmap to a Canvas in a SurfaceView. No ImageView
anywhere.

Regards




[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame

2008-11-26 Thread dmanpearl

Hello Blindfold,

Thanks for your help.  I solved the user interface problems I was
experiencing by using a separate thread to do my image processing. I'm
still using an ImageView, and without problems.  Perhaps I will try a
Canvas in a SurfaceHolder later in this exercise to compare speeds.

MORE ON THE CAMERA LIVE PREVIEW FILTERED DISPLAY CAPABILITY

As you know, I want to display live camera data through a custom
filter.

I got most of the way through the YCbCr_422_SP data buffer returned to
Android's Camera.PreviewCallback onCameraFrame() callback function,
and now I am looking for help deciphering the U, V portion of the
buffer.  I verified that the first (width*height) bytes are simple Y
luminance values that can be displayed (via Bitmap and ImageView) to
make a viable gray-scale image.  The total number of bytes is (width
* height * 3 / 2).

The remaining 1/2 image bytes are clearly used to store U, V (Cb, Cr)
data.  Therefore, there are 1/4 image bytes for each U, V component
(i.e. each U, V component is used for 4 pixels of the image).  This
looks like 411 or 420 data, not 422, but we have bigger fish to fry.
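For the G1's 480x320 preview those fractions work out as follows; this is simple arithmetic, spelled out only to make the layout concrete:

```java
public class BufferSizes {
    public static void main(String[] args) {
        int width = 480, height = 320;       // G1 preview frame size
        int ySize = width * height;          // luminance plane, 1 byte/pixel
        int totalSize = ySize * 3 / 2;       // Y plane plus interleaved chroma
        int chromaSize = totalSize - ySize;  // the remaining 1/2 image of U/V
        int uvPairs = chromaSize / 2;        // one (U, V) pair per 2x2 block

        System.out.println(ySize);      // 153600
        System.out.println(totalSize);  // 230400
        System.out.println(chromaSize); // 76800
        System.out.println(uvPairs);    // 38400 == (width/2) * (height/2)
    }
}
```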

I cannot determine if the U V data is aligned adjacently, in
alternating rows, or in squares as described in this Wikipedia
graphical description:  http://en.wikipedia.org/wiki/Image:Yuv420.svg.
Once I finally determine the structure of the U, V data, I have
several equations to convert from YUV to RGB and I have tried many
ways of combining the UV data with the luminance data of the first 2/3
of the buffer to no avail.  So far I can only display mono-chrome.

If you or others on this list can decode the Android YCbCr_422_SP
data, please post the solution as soon as possible.  Your efforts and
generosity are greatly appreciated.  I am convinced that
representatives from Google/Android and others monitoring this list
know how to do this.  Please share the information.  It is crucial to
our project.  I do not care about the Emulator and its different
encoding.  I realize that Google is probably waiting to implement a
unified solution and share it through an API update, but we cannot
wait.

 - Thank you, David Manpearl

On Nov 26, 10:23 am, blindfold [EMAIL PROTECTED] wrote:
 Hi David,

  I can't seem to make coexist: SurfaceHolder for the camera & ImageView
  for the filtered Bitmap to display.

 ...

  Do you know why I can't make the Camera's Surface and an ImageView
  Bitmap simultaneous members of the same active ViewGroup?

 I do not use ImageView myself so I cannot really judge your problem. I
 draw my filtered Bitmap to a Canvas in a SurfaceView. No ImageView
 anywhere.

 Regards



[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame

2008-11-25 Thread blindfold

How come I recognize so many of these findings? ;-)

 4. I believe that processing breaks down whenever I spend too much
 time in the onPreviewFrame function.

That's what I observed too: so do the heavy duty image processing
outside onPreviewFrame(), with supplementary frame skipping and
subsampling as needed. Then there is no problem quitting the app
either.

 I'd incur an extra copy of the YUV_422 byte[] buffer from
 onCameraFrame into the Thread prior to processing.

Yes, you need to take care of the limited data[] lifetime in
onPreviewFrame().

 Between this and skipping frames that overlap the frame-processing thread,
 this might drastically reduce the filter/display speed.

I doubt that the buffer copying matters (especially when using
arraycopy) as compared to the time lost in decoding the preview image
(pixel-by-pixel) for lack of an Android API with a native decoding
method for the preview format(s).

 I've looked at various offsets within each expected set of 6 bytes for each 2 
 pixels.

For the Y (luminance) part just use the first block in data[]: the
colors are not interleaved with Y; Y is in one block of bytes with
offset zero in data[], with one byte per pixel. So decoding Y is easy,
and identical for emulator and G1 (except for the stride due to
preview image size differences: default 176x144 vs 480x320).
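Decoding just that Y block into displayable gray pixels is then one mask and three shifts per pixel; a minimal sketch (method and names are illustrative, not from this thread):

```java
public class GrayDecode {
    // Convert the first width*height bytes (the Y plane) into opaque
    // gray 0xAARRGGBB pixels; Java bytes are signed, so mask to 0..255.
    static int[] yPlaneToArgb(byte[] data, int width, int height) {
        int[] out = new int[width * height];
        for (int p = 0; p < out.length; p++) {
            int y = data[p] & 0xff; // unsigned luminance
            out[p] = 0xff000000 | (y << 16) | (y << 8) | y;
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] frame = new byte[6]; // 2x2 Y plane + 2 chroma bytes (ignored)
        frame[0] = (byte) 0xff;     // one white pixel, rest black
        int[] argb = yPlaneToArgb(frame, 2, 2);
        System.out.println(Integer.toHexString(argb[0])); // ffffffff
        System.out.println(Integer.toHexString(argb[1])); // ff000000
    }
}
```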

Regards



[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame

2008-11-25 Thread dmanpearl

Peter and Jeff,

Thanks to each of you for your fantastic help; I am very close to
achieving a filtered preview display as required for my project -
Modification of the Android Camera Display from onPreviewFrame.

Here is my current stumbling block:
I can't seem to make coexist: SurfaceHolder for the camera & ImageView
for the filtered Bitmap to display.

One must assign the Camera preview display into a SurfaceHolder in
order to get preview callbacks:
mCamera.setPreviewDisplay(surfaceHolder);
The SurfaceHolder's Surface must be a child of the current View in
order to receive surfaceCreated, surfaceChanged, and surfaceDestroyed
callbacks.
I have been using an ImageView object to display the Bitmap into which
I am writing my decoded preview data via an int[] array.

Therefore, both the SurfaceView and the ImageView seem to have to be
children of the same ViewGroup, which is currently displayed.
However, as soon as I add the ImageView to the ViewGroup via:
linearLayout.addView(imageView); or
absoluteLayout.addView(imageView); etc.,
my Activity stops receiving Camera preview callbacks in onPreviewFrame().
LogCat indicates an AndroidRuntime "ERROR: thread attach failed"
message, and I am not sure if this is related.

I have tried setting the Surface to the same size as the Camera
preview size setting in the surfaceChanged callback:
Camera.Parameters parameters = mCamera.getParameters();
final Size preSz = parameters.getPreviewSize();
mHolder.setFixedSize(preSz.width, preSz.height);

I tried to create and manage the Surface (needed by Camera preview)
outside of the SurfaceHolder callbacks using the Activity's onResume
and onPause methods.  I did this so that I would not have to put the
SurfaceView into the active display hierarchy (i.e. remove as child of
current View) so that it would not conflict with the ImageView as
above.  Unfortunately, this causes a Camera exception, "app passed NULL
surface", directly after my call to:
surfaceView = new SurfaceView(this);
mHolder = surfaceView.getHolder();
// mHolder.addCallback(this);


Do you know why I can't make the Camera's Surface and an ImageView
Bitmap simultaneous members of the same active ViewGroup?

Thanks again.  Your generous responses help me more than you might
realize.

 - Regards, David Manpearl

On Nov 25, 12:47 am, blindfold [EMAIL PROTECTED] wrote:
 How come I recognize so many of these findings? ;-)

  4. I believe that processing breaks down whenever I spend too much
  time in the onPreviewFrame function.

 That's what I observed too: so do the heavy duty image processing
 outside onPreviewFrame(), with supplementary frame skipping and
 subsampling as needed. Then there is no problem quitting the app
 either.

  I'd incur an extra copy of the YUV_422 byte[] buffer from
  onCameraFrame into the Thread prior to processing.

 Yes, you need to take care of the limited data[] lifetime in
 onPreviewFrame().

  Between this and skipping frames that overlap the frame-processing thread,
  this might drastically reduce the filter/display speed.

 I doubt that the buffer copying matters (especially when using
 arraycopy) as compared to the time lost in decoding the preview image
 (pixel-by-pixel) for lack of an Android API with a native decoding
 method for the preview format(s).

  I've looked at various offsets within each expected set of 6 bytes for each 
  2 pixels.

 For the Y (luminance) part just use the first block in data[]: the
 colors are not interleaved with Y; Y is in one block of bytes with
 offset zero in data[], with one byte per pixel. So decoding Y is easy,
 and identical for emulator and G1 (except for the stride due to
 preview image size differences: default 176x144 vs 480x320).

 Regards