How come I recognize so many of these findings? ;-)

> 4. I believe that processing breaks down whenever I spend too much
> time in the onPreviewFrame function.

That's what I observed too: so do the heavy-duty image processing
outside onPreviewFrame(), with frame skipping and subsampling as
needed. Then there is no problem quitting the app either.

> I'd incur an extra copy of the YUV_422 byte[] buffer from
> onCameraFrame into the Thread prior to processing.

Yes, you need to take care of the limited lifetime of data[] in
onPreviewFrame(): the buffer may be reused once the callback returns,
so copy it out before handing it to another thread.
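The pattern (copy the frame inside the callback, skip frames while the worker is still busy) might look like this minimal sketch. FrameDispatcher, offerFrame and frameDone are hypothetical names, not Android API; only System.arraycopy is real:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical helper: copy data[] inside the callback (its contents are
// only valid until onPreviewFrame() returns) and skip frames while the
// worker thread has not finished the previous one.
public class FrameDispatcher {
    private final AtomicBoolean busy = new AtomicBoolean(false);
    private final byte[] copy;

    public FrameDispatcher(int bufferSize) {
        copy = new byte[bufferSize];
    }

    // Call from onPreviewFrame(byte[] data, Camera camera).
    // Returns false when the frame is dropped because the worker is busy.
    public boolean offerFrame(byte[] data) {
        if (!busy.compareAndSet(false, true)) {
            return false; // worker still processing the previous frame: skip
        }
        System.arraycopy(data, 0, copy, 0, copy.length); // cheap vs. decoding
        // ...hand `copy` to the processing thread here...
        return true;
    }

    // The worker calls this once it is done with the copied frame.
    public void frameDone() {
        busy.set(false);
    }
}
```

The AtomicBoolean gives you frame skipping for free: any preview frame arriving while the worker is busy is simply dropped.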

> Between this and skipping frames that overlap the frame-processing thread,
> this might drastically reduce the filter/display speed.

I doubt that the buffer copy matters (especially when using
System.arraycopy()) compared to the time lost decoding the preview
image pixel by pixel, for lack of an Android API with a native decoder
for the preview format(s).

> I've looked at various offsets within each expected set of 6 bytes for each 2 
> pixels.

For the Y (luminance) part, just use the first block in data[]: the
chroma values are not interleaved with Y. Y is a single block of bytes
at offset zero in data[], one byte per pixel, so decoding Y is easy and
identical on the emulator and the G1 (except for the stride, due to the
different default preview sizes: 176x144 vs 480x320).
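A minimal sketch of reading a single luminance value, assuming the layout described above (Y plane first, one byte per pixel, offset zero); YuvDecode and luminanceAt are names of my own, not Android API:

```java
// Grayscale access to the preview buffer: the first width*height bytes
// of data[] are the Y plane, row by row; the chroma bytes that follow
// can be ignored for luminance-only processing.
public class YuvDecode {
    public static int luminanceAt(byte[] data, int width, int x, int y) {
        // Mask with 0xFF because Java bytes are signed but the Y samples
        // are unsigned values in 0..255.
        return data[y * width + x] & 0xFF;
    }
}
```

The stride is just the preview width, so the same code works on the emulator and the G1 once you query the actual preview size instead of hard-coding it.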

Regards