Our app has a problem with AudioRecord where the recorded audio plays back very slow (it sounds like it's running at half speed). We use AudioRecord.getMinBufferSize to try 16000 Hz first and fall back to 8000 Hz if that call returns an error, so my guess is that the audio is being reported as 8000 Hz even though it's really being sampled at 16000 Hz.
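For reference, here is roughly how we do the rate fallback. This is a simplified sketch (the class/method names and the Log call are just for illustration), but the getMinBufferSize probe and the 16000 → 8000 order match what I described above:

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.util.Log;

    public final class RecorderFactory {
        // Try 16000 Hz first, then fall back to 8000 Hz if getMinBufferSize rejects it.
        private static final int[] CANDIDATE_RATES = {16000, 8000};

        public static AudioRecord createRecorder() {
            for (int rate : CANDIDATE_RATES) {
                int minBuffer = AudioRecord.getMinBufferSize(
                        rate,
                        AudioFormat.CHANNEL_IN_MONO,
                        AudioFormat.ENCODING_PCM_16BIT);
                if (minBuffer == AudioRecord.ERROR || minBuffer == AudioRecord.ERROR_BAD_VALUE) {
                    continue; // this rate is not supported, try the next one
                }
                AudioRecord recorder = new AudioRecord(
                        MediaRecorder.AudioSource.MIC,
                        rate,
                        AudioFormat.CHANNEL_IN_MONO,
                        AudioFormat.ENCODING_PCM_16BIT,
                        minBuffer * 2); // extra headroom to avoid overruns
                if (recorder.getState() == AudioRecord.STATE_INITIALIZED) {
                    // getSampleRate() reports the rate the recorder was actually configured
                    // with; this is the value that should end up in the output's header.
                    Log.d("RecorderFactory", "Recording at " + recorder.getSampleRate() + " Hz");
                    return recorder;
                }
                recorder.release();
            }
            return null; // no supported configuration found
        }
    }

If the recorder reports 16000 Hz here but the data gets tagged as 8000 Hz downstream, that mismatch alone would make playback take twice as long.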
It also seems the problem could be that the captured stream is stereo rather than mono. We pass AudioFormat.CHANNEL_IN_MONO on API level 5 and above, and the deprecated AudioFormat.CHANNEL_CONFIGURATION_MONO on earlier devices. That seems to be how it works for AudioTrack, but is it the same for AudioRecord?
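Here is a sketch of how we pick the mono constant per API level and how one could sanity-check what the recorder actually gives back (again, the class and method names are just for illustration; I'm assuming getChannelCount() is the right thing to check):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.os.Build;

    public final class ChannelConfig {
        // CHANNEL_IN_MONO exists from API 5; older releases only have the
        // deprecated CHANNEL_CONFIGURATION_MONO.
        @SuppressWarnings("deprecation")
        public static int monoChannelConfig() {
            int sdk = Integer.parseInt(Build.VERSION.SDK); // available on all API levels
            return sdk >= 5
                    ? AudioFormat.CHANNEL_IN_MONO
                    : AudioFormat.CHANNEL_CONFIGURATION_MONO;
        }

        // After creating the recorder, check what it actually gave us:
        // getChannelCount() should be 1 for mono. If it reports 2, the captured
        // data is interleaved stereo and will come out at twice the byte rate,
        // which would also sound half speed when treated as mono downstream.
        public static boolean isMono(AudioRecord recorder) {
            return recorder.getChannelCount() == 1;
        }
    }

If getChannelCount() already reports 1 on the affected devices, then the stereo theory is out and the sample-rate mismatch is the more likely culprit.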

