Some background to my question: I am currently developing a system for the recognition and detection of traffic signs. For this I use colour segmentation in the HSV colour space. The system/algorithm works very well with a webcam on a desktop PC, and I have ported it to Android via JNI/the Android NDK. However, I realized that Android applies some image enhancements, e.g. white balance, and these changes to the picture, especially those caused by the automatic white balance, have a strong influence on the colour segmentation. Even the various modes such as WHITE_BALANCE_AUTO, WHITE_BALANCE_CLOUDY_DAYLIGHT, WHITE_BALANCE_DAYLIGHT, WHITE_BALANCE_FLUORESCENT and WHITE_BALANCE_INCANDESCENT do not work reliably; roughly, I set them as in the sketch below.
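This is how I switch between those modes (a minimal sketch; the mCamera field and the applyWhiteBalance() helper are just placeholders from my setup, not my complete preview code):

import android.hardware.Camera;

// Sketch: switch the white-balance mode on the running camera.
// mCamera is the android.hardware.Camera instance used for the preview.
private Camera mCamera;

private void applyWhiteBalance(String mode) {
    // mode is one of the Camera.Parameters.WHITE_BALANCE_* constants
    Camera.Parameters params = mCamera.getParameters();
    if (params.getSupportedWhiteBalance() != null
            && params.getSupportedWhiteBalance().contains(mode)) {
        params.setWhiteBalance(mode);
        mCamera.setParameters(params);
    }
}

// e.g. applyWhiteBalance(Camera.Parameters.WHITE_BALANCE_DAYLIGHT);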
For displaying my camera pictures I use a SurfaceView. Here are my questions:

1) Is there a possibility to access the raw image data without the automatic image enhancements that the Android camera API applies?

2) If not, is there a possibility to disable the above-mentioned image enhancements individually? Methods such as setAutoWhiteBalanceLock() only give me a black screen. Regarding the camera lifecycle, where exactly does this lock have to be activated (see the sketch at the end of this mail for how I currently call it)? This would only be a workaround anyway, since the white balance is still applied to the first frames and only locked afterwards.

Thanks in advance.
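For reference, a minimal sketch of how I currently try to apply the lock (the lockWhiteBalance() helper and the call right after startPreview() are just my current attempt, not a known-correct placement):

import android.hardware.Camera;

// Sketch: try to freeze the automatic white balance once the preview runs.
private void lockWhiteBalance(Camera camera) {
    Camera.Parameters params = camera.getParameters();
    if (params.isAutoWhiteBalanceLockSupported()) {   // requires API level 14+
        params.setAutoWhiteBalanceLock(true);
        camera.setParameters(params);
    }
}

// At the moment I call this in surfaceChanged(), directly after
// camera.startPreview(); calling it before startPreview() is when I
// get the black screen.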

