The G1 preview format is YUV 420 semi-planar (U and V are each
subsampled by 2 in both X and Y). The Y plane comes first, followed by
interleaved UV pairs; I believe the U sample comes first in each pair.
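
In case it helps, here is a rough Java sketch of where the samples for
a given pixel sit in that buffer. The helper name is mine, and the
byte order inside each chroma pair is the part I am least sure of, so
swap the two reads if reds and blues come out reversed:

// Sketch only: index into a 420 semi-planar preview buffer for the
// pixel at (x, y). One chroma pair covers a 2x2 block of pixels.
static int[] sampleYuv420sp(byte[] data, int width, int height, int x, int y) {
    int frameSize = width * height;
    int yVal = data[y * width + x] & 0xFF;                      // full-resolution luma
    int pairOffset = frameSize + (y / 2) * width + (x / 2) * 2; // start of the UV pair
    int uVal = data[pairOffset] & 0xFF;                         // assumed first byte = U
    int vVal = data[pairOffset + 1] & 0xFF;                     // assumed second byte = V
    return new int[] { yVal, uVal, vVal };
}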

Technically it's YCbCr 420 semi-planar, but very few people use that
term.
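
For a whole frame, something along these lines should work. It is only
a sketch (method name is mine, not tested on the G1), it uses the
full-range JFIF coefficients, and a studio-swing source would need the
1.164*(Y-16) variant instead:

// Convert one 420 semi-planar preview frame to ARGB_8888 pixels.
// Swap the u/v reads if the colors come out wrong on your device.
static void decodeYuv420sp(byte[] yuv, int[] argb, int width, int height) {
    int frameSize = width * height;
    for (int row = 0; row < height; row++) {
        int pairRow = frameSize + (row >> 1) * width;   // chroma row offset
        for (int col = 0; col < width; col++) {
            int y = yuv[row * width + col] & 0xFF;
            int pairIndex = pairRow + (col & ~1);        // one pair per 2x2 block
            int u = yuv[pairIndex] & 0xFF;               // assumed first byte = U
            int v = yuv[pairIndex + 1] & 0xFF;           // assumed second byte = V

            int r = clamp((int) (y + 1.402f * (v - 128)));
            int g = clamp((int) (y - 0.344f * (u - 128) - 0.714f * (v - 128)));
            int b = clamp((int) (y + 1.772f * (u - 128)));

            argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
    }
}

static int clamp(int c) {
    return c < 0 ? 0 : (c > 255 ? 255 : c);
}

The resulting int[] can then be handed to Bitmap.createBitmap(argb,
width, height, Bitmap.Config.ARGB_8888) and drawn however you like.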

On Nov 26, 6:27 pm, dmanpearl <[EMAIL PROTECTED]> wrote:
> Hello Blindfold,
>
> Thanks for your help.  I solved the user interface problems I was
> experiencing by using a separate thread to do my image processing. I'm
> still using an ImageView, without any problems.  Perhaps I will try a
> Canvas in a SurfaceHolder later in this exercise to compare speeds.
>
> MORE ON THE CAMERA LIVE PREVIEW FILTERED DISPLAY CAPABILITY
>
> As you know, I want to display live camera data through a custom
> filter.
>
> I got most of the way through the YCbCr_422_SP data buffer passed to
> Android's Camera.PreviewCallback onPreviewFrame() callback function,
> and now I am looking for help deciphering the U, V portion of the
> buffer.  I verified that the first (width*height) bytes are simple Y
> luminance values that can be displayed (via Bitmap and ImageView) to
> make a viable gray-scale image.  The total number of bytes is (width
> * height * 3 / 2).
>
> The remaining (width * height / 2) bytes are clearly used to store U,
> V (Cb, Cr) data.  Therefore, there are (width * height / 4) bytes for
> each of the U and V components (i.e. each U, V sample covers 4 pixels
> of the image).  This looks like 411 or 420 data, not 422, but we have
> bigger fish to fry.
>
> I cannot determine whether the U, V data is arranged adjacently, in
> alternating rows, or in squares as shown in this Wikipedia graphic:
> http://en.wikipedia.org/wiki/Image:Yuv420.svg.  Once I know the
> structure of the U, V data, I have several equations for converting
> YUV to RGB, and I have tried many ways of combining the UV data with
> the luminance data in the first 2/3 of the buffer, to no avail.  So
> far I can only display monochrome.
>
> If you or others on this list can decode the Android YCbCr_422_SP
> data, please post the solution as soon as possible.  Your efforts and
> generosity are greatly appreciated.  I am convinced that
> representatives from Google/Android and others monitoring this list
> know how to do this.  Please share the information.  It is crucial to
> our project.  I do not care about the Emulator and its different
> encoding.  I realize that Google is probably waiting to implement a
> unified solution and share it through an API update, but we cannot
> wait.
>
>  - Thank you, David Manpearl
>
> On Nov 26, 10:23 am, blindfold <[EMAIL PROTECTED]> wrote:
>
> > Hi David,
>
> > > I can't seem to make coexist: SurfaceHolder for the camera & ImageView
> > > for the filtered Bitmap to display.
>
> > ...
>
> > > Do you know why I can't make the Camera's Surface and an ImageView
> > > Bitmap simultaneous members of the same active ViewGroup?
>
> > I do not use ImageView myself so I cannot really judge your problem. I
> > draw my filtered Bitmap to a Canvas in a SurfaceView. No ImageView
> > anywhere.
>
> > Regards