[android-developers] AudioRecord sample rate + Touch Sounds + HDMI
Here is a strange interaction I discovered on one particular brand of Android tablet. It can cause the audio sample rate to change by about 9% under certain conditions. Those conditions are: 1. The HDMI video output is connected to a TV monitor. 2. Touch Sounds are enabled in Settings / Sound. (On this device, Touch Sounds are the little clicks that indicate you have touched a button.)

The reason I was using the HDMI output was to demonstrate my app, a professional piano tuning app, to a large group. I set up an AudioRecord object for a sample rate of 22050 samples per second. It was quite embarrassing at a recent presentation to a group of professional piano tuners when my app suddenly began reading about 9% high in pitch. My app had never misbehaved like that before, but then I had done only limited testing with the HDMI output connected.

After the presentation was over I began an exhaustive test routine. I still didn't realize that the HDMI connection was necessary to witness the failure, so I could not duplicate the fault. Finally I plugged the HDMI into a TV and found that the audio sample rate would sometimes shift to 20255 samples per second when the user touches one of my app buttons that makes a click. Apparently this device shares the audio-in and audio-out functions in an undocumented way. When the operating system (Android 4.0.3) makes a sound, there is about a 40% chance of the sample rate switching. Once switched, the sample rate stays switched until I stop and restart the AudioRecord object. Then I found I could simply turn off the Touch Sounds option in Settings, and the problem went away.

I doubt that this problem is universal; it is probably due to this one OEM's implementation of the hardware drivers. But if you have an app that uses AudioRecord, keep this possibility in mind.
The tablet was a 7" "MID" tablet from an obscure Chinese maker, model T01A. -- You received this message because you are subscribed to the Google Groups "Android Developers" group. To post to this group, send email to android-developers@googlegroups.com To unsubscribe from this group, send email to android-developers+unsubscr...@googlegroups.com For more options, visit this group at http://groups.google.com/group/android-developers?hl=en
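The pitch error described in the post follows directly from the rate mismatch. A minimal plain-Java sketch (no Android APIs; the numbers are the ones reported above): samples captured at 20255 Hz but analyzed as if they arrived at 22050 Hz appear sharp by the ratio of the two rates, roughly 8.9%, or about a semitone and a half.

```java
// Illustrates why a mis-clocked capture rate reads as a pitch error:
// a tone captured at `actualRate` but analyzed assuming `nominalRate`
// appears shifted by nominalRate / actualRate.
public class SampleRateMismatch {
    // Apparent pitch ratio when data captured at actualRate is
    // analyzed as if it were sampled at nominalRate.
    public static double pitchRatio(double nominalRate, double actualRate) {
        return nominalRate / actualRate;
    }

    public static void main(String[] args) {
        double ratio = pitchRatio(22050.0, 20255.0);
        double cents = 1200.0 * Math.log(ratio) / Math.log(2.0);
        System.out.printf("ratio=%.4f (%.1f%% sharp, about %.0f cents)%n",
                ratio, (ratio - 1.0) * 100.0, cents);
    }
}
```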
[android-developers] AudioRecord error : Unsupported configuration: sampleRate 44100, format 1, channelCount 1
Hello,

I am interested in recording audio using the AudioRecord API with the following configuration:

SAMPPERSEC = 44100;
channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
audioEncoding = AudioFormat.ENCODING_PCM_16BIT;

While initializing AudioRecord via the getMinBufferSize() method:

buffersizebytes = AudioRecord.getMinBufferSize(SAMPPERSEC, channelConfiguration, audioEncoding);

it returns an error value of -2 (ERROR_BAD_VALUE), which denotes a failure due to the use of an invalid value. Could you please help me check whether these configuration parameters look fine? More specifically, I would like to know whether a sampling rate of 44100 Hz is supported by the API.

Cheers,
Gurman.
Re: [android-developers] AudioRecord
Just go through this link: http://code.google.com/p/krvarma-android-samples/source/browse/trunk/AudioRecorder.2/?r=77
Re: [android-developers] AudioRecord
Hi,

Try doing recorder.stopRecording() at the end of the recording.

Regards,
Sandeep

On 5/8/12, ron simon wrote:
> Hi, I want to create a recorder and take the buffer to play the sound, but it
> doesn't work. I have no idea what my mistake is and hope for your help ;)
> [snip]
Re: [android-developers] AudioRecord
Please refer to this link; maybe you will get some ideas: http://code.google.com/p/krvarma-android-samples/source/browse/#svn/trunk/AudioRecorder.2
[android-developers] AudioRecord
Hi, I want to create a recorder and take the buffer to play the sound, but it doesn't work. I have no idea what my mistake is and hope for your help ;)

public class input {
    private static final String TAG = "Aufnahme";
    private AudioRecord recorder = null;
    private boolean isRecording = false;
    private int SAMPLERATE = 8000;
    private int CHANNELS = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    private int AUDIO_FORMAT = AudioFormat.ENCODING_PCM_16BIT;
    private int bufferSize = AudioRecord.getMinBufferSize(SAMPLERATE, CHANNELS, AUDIO_FORMAT);
    private Thread recordingThread = null;

    public void startRecording() {
        recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLERATE,
                CHANNELS, AUDIO_FORMAT, bufferSize);
        recorder.startRecording();
        isRecording = true;
        recordingThread = new Thread(new Runnable() {
            public void run() {
                writeAudioData();
            }
        });
        recordingThread.start();
    }

    public void stopRecording() {
        isRecording = false;
        recorder.stop();
        recorder.release();
        recorder = null;
        recordingThread = null;
    }

    private void writeAudioData() {
        byte data[] = new byte[bufferSize];
        while (isRecording) {
            recorder.read(data, 0, bufferSize);
            send(data);
        }
    }

    private void send(byte[] data) {
        int minBufferSize = AudioTrack.getMinBufferSize(8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBufferSize,
                AudioTrack.MODE_STREAM);
        at.play();
        at.write(data, 0, bufferSize);
        at.stop();
        at.release();
    }
}
[android-developers] AudioRecord is not working as expected
I have written the following code in which I am trying to record and play the audio, but all I get is just noise. Please help. I also want to know why my UI thread hangs when I have a separate thread for recording. Thanks in advance.

/**
 * @author amit
 */
public class AudioRecorder extends Activity {
    private String LOG_TAG = null;

    /* variables which are required to generate and manage the UI of the App */
    // private RecordButton mRecordButton = null;
    private Button recordBtn, stopBtn, playBtn;

    /* variables which are required for the actual functioning of the
       recording and playing */
    private AudioRecord recorder = null;
    private AudioTrack player = null;
    private int recorderBufSize, recordingSampleRate;
    private int trackBufSize;
    private byte[] audioData;
    private boolean isRecording = false, isPlaying = false;
    private Thread startRecThread;
    private AudioRecord.OnRecordPositionUpdateListener posUpdateListener;

    /** constructor method for initializing the variables */
    public AudioRecorder() {
        super();
        LOG_TAG = "Constructor";
        recorderBufSize = recordingSampleRate = trackBufSize = 0;
        // init function will initialize all the necessary variables ...
        init();
        if (recorder != null && player != null) {
            Log.e(LOG_TAG, "recorder and player initialized");
            audioData = new byte[recorderBufSize];
        } else {
            Log.e(LOG_TAG, "Problem inside init function ");
        }
        posUpdateListener = new AudioRecord.OnRecordPositionUpdateListener() {
            int bytesRead = 0;

            @Override
            public void onPeriodicNotification(AudioRecord recorder) {
                // Log.e(LOG_TAG, "inside position listener");
                bytesRead = recorder.read(audioData, 0, audioData.length);
                player.write(audioData, 0, bytesRead);
            }

            @Override
            public void onMarkerReached(AudioRecord recorder) {
                Log.e(LOG_TAG, "Marker Reached");
            }
        };
        // listener will be called every time 160 frames are reached
        recorder.setPositionNotificationPeriod(160);
        recorder.setRecordPositionUpdateListener(posUpdateListener);
        startRecThread = new Thread() {
            @Override
            public void run() {
                recorder.startRecording();
                // Log.e(LOG_TAG, "running");
                // while (recorder.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING) {
                recorder.read(audioData, 0, recorderBufSize);
                // }
            }
        };
        Log.e(LOG_TAG, "inside constructor");
    }

    private void init() {
        LOG_TAG = "initFunc";
        int[] mSampleRates = new int[] { 8000, 11025, 22050, 44100 };
        short audioFormat = AudioFormat.ENCODING_PCM_16BIT;
        for (int rate : mSampleRates) {
            try {
                Log.d(LOG_TAG, "Attempting rate " + rate + "Hz, bits: " + audioFormat);
                int bufrSize = AudioRecord.getMinBufferSize(rate,
                        AudioFormat.CHANNEL_IN_MONO, audioFormat);
                if (bufrSize != AudioRecord.ERROR_BAD_VALUE && bufrSize != AudioRecord.ERROR) {
                    // check if we can instantiate and have a success
                    AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                            rate, AudioFormat.CHANNEL_IN_MONO, audioFormat, bufrSize);
                    if (rec != null && rec.getState() == AudioRecord.STATE_INITIALIZED) {
                        // storing variables for future use . . .
                        this.recordingSampleRate = rate;
                        this.recorderBufSize = bufrSize;
                        Log.e(LOG_TAG, "Returning..(rate:channelConfig:audioFormat:bufferSize)"
                                + rate + ":" + AudioFormat.CHANNEL_IN_MONO + ":"
                                + audioFormat + ":" + bufrSize);
                        // Now create an instance of the AudioTrack
                        int audioTrackBufSize = AudioTrack.getMinBufferSize(rate,
                                AudioFormat.CHANNEL_OUT_MONO, audioFormat);
                        Log.e(LOG_TAG, "AudioTrack buf size :" + audioTrackBufSize);
                        this.trackBufSize = audioTrackBufSize;
                        this.player = new AudioTrack(AudioManager.STREAM_MUSIC, rate,
                                AudioFormat.CHANNEL_OUT_MONO, audioFormat,
                                audioTrackBufSize, AudioTrack.MODE_STREAM);
                        this.recorder = rec;
                        return;
                    }
                }
            } catch (IllegalArgumentException e) {
                Log.d(LOG_TAG, rate + "Exception, keep trying.", e);
            } catch (Exception e) {
                Log.e(LOG_TAG, rate + "Some Exception!!", e);
            }
            // for loop for channel config ended here. . . .
            // for loop for audioFormat ended here. . .
        }
        return;
    }

    private void startPlaying() {
        LOG_TAG = "startPlaying";
        Log.e(LOG_TAG, "start Playing");
    }

    private void stopPlaying() {
        LOG_TAG = "stopPlaying";
        Log.e(LOG_TAG, "stop Playing");
    }

    private void startRecording() {
        LOG_TAG = "startRecording";
        player.play();
        /* start a separate recording thread from here . . . */
        startRecThread.start();
        // recorder.read(audioData, 0, recorderBufSize);
        Log.e(LOG_TAG, "start Recording");
    }

    private void stopRecording() {
        LOG_TAG = "stopRecording";
        if (startRecThread.isAlive())
            startRecThread.stop();
        recorder.stop();
        player.stop();
        Log.e(LOG_TAG, "stop Recording");
    }

    private void stop() {
        if (isRecording) {
            isRecording = false;
            stopRecording();
        }
        if (isPlaying) {
            isPlaying = false;
            stopPlaying();
        }
        recordBtn.setEnabled(true);
        playBtn.setEnabled(true);
    }

    @Override
    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        LinearLayout ll = new LinearLayout(this);
        // creating Buttons one by one . . . .
        // but
[android-developers] AudioRecord startRecording call doesn't return when done through JNI
Hi, I'm calling the startRecording method of AudioRecord through JNI. In 2 out of 10 calls, the call doesn't return at all. I'm using AudioRecord through JNI in my application. Please find the code snippet attached:

jni_env->CallVoidMethod(pRecStream->channelObject, record_method);

where channelObject refers to a global reference to the AudioRecord class object. Thanks in advance for the help, Vineet
[android-developers] AudioRecord and AudioTrack
Hello, My application uses AudioTrack in streaming mode and AudioRecord simultaneously. My problem is that although I start them at the same time, playback and recording do not actually start at the same timestamp; there is some delay between recording and playback.
[android-developers] AudioRecord producing no sound
I have re-written this several times now; it is very nasty. I'm testing on a DroidX, where everything I use to record a call comes back empty. Does anyone know why, or have a fix? My recording class:

[code]
package com.call.tests;

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.media.MediaRecorder.AudioSource;
import android.os.Environment;
import android.util.Log;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;

class Record {
    private MediaRecorder recorder = null;
    private AudioRecord rawRecorder = null;
    public boolean recording = false;
    private Thread recordingThread = null;
    public String path = "";
    public String tempPath = "";

    public void BeginRecording(String incomingNumber) {
        try {
            //recorder = new MediaRecorder();
            if (incomingNumber.length() == 0)
                incomingNumber = "noNumber";
            Date dateNow = new Date();
            SimpleDateFormat dateformat = new SimpleDateFormat("_kms_MMdd");
            StringBuilder now = new StringBuilder(dateformat.format(dateNow));
            path = incomingNumber + now.toString();
            Log.d("TEST", "NEW PATH: " + path);
            String longPath = sanitizePath(path, false);
            File directory = new File(longPath).getParentFile();
            if (!directory.exists() && !directory.mkdirs()) {
                Log.d("TEST", "Path to file could not be created.");
            } else {
                Log.d("TEST", "Path exists.");
            }
            rawRecorder = findAudioRecord();
            rawRecorder.startRecording();
            recording = true;
            recordingThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    writeAudioDataToFile();
                }
            }, "AudioRecorder Thread");
            recordingThread.start();
            /*recorder.reset();
            recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_DOWNLINK);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
            recorder.setOutputFile(longPath);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
            Log.e("TEST", "Recording values set");
            //try{
            recorder.prepare();
            recorder.start();
            //Log.e("TEST", "Recording: " + recording);
            //} catch (IOException e) {
            //Log.e("TEST", "prepare() failed : " + e.toString());
            //}*/
        } catch (Exception ex) {
            Log.d("TEST", "BeginRecording: " + ex.toString());
        }
    }

    private static int[] mSampleRates = new int[] { 8000, 11025, 22050, 44100 };

    public AudioRecord findAudioRecord() {
        for (int rate : mSampleRates) {
            for (short audioFormat : new short[] { AudioFormat.ENCODING_PCM_8BIT,
                    AudioFormat.ENCODING_PCM_16BIT }) {
                for (short channelConfig : new short[] { AudioFormat.CHANNEL_IN_MONO,
                        AudioFormat.CHANNEL_IN_STEREO }) {
                    try {
                        Log.d("TEST", "Attempting rate " + rate + "Hz, bits: "
                                + audioFormat + ", channel: " + channelConfig);
                        int bufferSize = AudioRecord.getMinBufferSize(rate, channelConfig, audioFormat);
                        if (bufferSize != AudioRecord.ERROR_BAD_VALUE) {
                            AudioRecord recorder = new AudioRecord(AudioSource.VOICE_CALL,
                                    rate, channelConfig, audioFormat, bufferSize);
                            if (recorder.getState() == AudioRecord.STATE_INITIALIZED)
                                return recorder;
                        }
                    } catch (Exception e) {
                        Log.e("TEST", rate + "Exception, keep trying.", e);
                    }
                }
            }
        }
        return null;
    }

    public void writeAudioDataToFile() {
        tempPath = "temp_raw_record";
        String longPath = sanitizePath(tempPath, true);
        File directory = new File(longPath).getParentFile();
        if
[android-developers] AudioRecord problem
Did anyone try the sipdroid code given at http://sipdroid.org/ ? If yes, please help me. I have to read samples from the microphone and send them to a UDP socket. For this I'm using the AudioRecord class as given in RtpStreamSender.java, and I use the read method, shown below:

public int read(byte[] audioData, int offsetInBytes, int sizeInBytes)

This method reads audio data from the audio hardware for recording into a buffer. Its parameters are: audioData, the array to which the recorded audio data is written; offsetInBytes, the index in audioData from which the data is written, expressed in bytes; sizeInBytes, the number of requested bytes. It returns the number of bytes that were read, or ERROR_INVALID_OPERATION if the object wasn't properly initialized, or ERROR_BAD_VALUE if the parameters don't resolve to valid data and indexes. The number of bytes will not exceed sizeInBytes.

I have used this method in my code like this:

int num;
byte[] buf = new byte[160];
num = record.read(buf, 0, 160);

The problem is that it always returns 160 (the number of requested bytes), never less, even if the data is not available. I am getting voice samples from the microphone. My sampling rate is 8000 Hz, encoded in 16 bits, so reading 160 bytes should take at least 10 milliseconds; but it sends data every 1 ms. If it is a blocking method, it should return 160 only after actually reading 160 bytes. Why does it return the maximum value even if it has read only 10-20 bytes? What's the problem? Please help me. Thanks in advance.
My code is like:

public void run() {
    running = true;
    int frame_size = 160;
    android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
    try {
        minBuffersize = AudioRecord.getMinBufferSize(8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
        Log.i(TAG, "trying to capture " + String.format("%d", frameSize) + " bytes");
        record = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, 2);
        record.startRecording();
        byte[] buffer = new byte[frameSize];
        while (running) {
            record.read(buffer, 0, frameSize);
            Log.i(TAG, "Captured " + String.format("%d", frameSize) + " bytes of audio");
        }
    } catch (Exception e) {
        Log.e(TAG, "Failed to capture audio");
    }
}
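The timing claim in the post can be sanity-checked without any Android APIs. A minimal plain-Java sketch, using the numbers from the post: at 8000 Hz, 16-bit mono, each frame is 2 bytes, so 160 bytes correspond to 80 frames, i.e. 10 ms of audio, and a blocking read of 160 bytes should therefore take on the order of 10 ms, not 1 ms.

```java
// Sanity-check the arithmetic in the post: how much audio time does a
// given byte count represent for a given PCM format? (plain Java)
public class CaptureTiming {
    // Milliseconds of audio represented by `bytes` of PCM data.
    public static double millisForBytes(int bytes, int sampleRateHz,
                                        int bitsPerSample, int channels) {
        int bytesPerFrame = (bitsPerSample / 8) * channels;
        double frames = (double) bytes / bytesPerFrame;
        return 1000.0 * frames / sampleRateHz;
    }

    public static void main(String[] args) {
        // 160 bytes at 8000 Hz, 16-bit mono -> 80 frames -> 10 ms
        System.out.println(millisForBytes(160, 8000, 16, 1)); // prints 10.0
    }
}
```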
[android-developers] AudioRecord on HTC Evo or Android 2.2
I am using AudioRecord to record audio. It works very well on the G1 and G7. However, when I tested on an HTC Evo, the recorded audio is much, much faster than the real input. Does anyone have the same experience? Thanks! Cindy
[android-developers] AudioRecord is not working on emulator
Hi All, I used AudioRecord on the emulator and on a phone. I found that AudioRecord is not working on the emulator: 1) the minimum buffer size on the emulator is much smaller than on the phone; 2) the audio data I read from the emulator is all zero. Does anyone have the same experience? Thanks! Cindy
[android-developers] AudioRecord Class Problem: Callback is never called (revisited/resolved)
I've been perplexed by the AudioRecord documentation on the OnRecordPositionUpdateListener interface, but I think I've got it figured out. All of the posts in this group asking why the callbacks are never called were assuming that the callbacks would tell you when to do a read. But what you actually need to be doing is reading at the same time as you receive callbacks.

Setting up a listener with setPositionNotificationPeriod will cause the callback to be called every time that many frames have been read. Setting up a listener with setNotificationMarkerPosition will cause that callback to be called when you first reach that many frames in total.

Let's say you start a thread that loops on your AudioRecord, reading 1000 frames at a time. You could get periodic callbacks every 2500 frames, or you could get a marker callback after 100,000 frames. If you're not reading, you won't trigger callbacks. So don't do the reads in your callbacks; use the callbacks to trigger other actions.

I'm pretty sure I'm right about this, but if someone wants to chime in with corrections, please let us know.

- dave
www.androidbook.com
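The semantics described above can be sketched without any Android APIs. The little simulation below is plain Java (PeriodTracker is a hypothetical stand-in for illustration, not part of the Android SDK): a reader pulls 1000 frames per read with a notification period of 2500 frames, and notifications fire only as a consequence of reads, which is why code that never calls read() never sees a callback.

```java
// Plain-Java sketch of setPositionNotificationPeriod semantics: a
// periodic callback fires each time another `period` frames have been
// read, and only as a consequence of read() calls.
import java.util.ArrayList;
import java.util.List;

public class PeriodTracker {
    private final int period;
    private long framesRead = 0;
    private long nextNotification;
    public final List<Long> notifiedAt = new ArrayList<>();

    public PeriodTracker(int period) {
        this.period = period;
        this.nextNotification = period;
    }

    // Simulates one AudioRecord.read(): account for `frames` frames and
    // fire any periodic notifications that this read crossed.
    public void read(int frames) {
        framesRead += frames;
        while (framesRead >= nextNotification) {
            notifiedAt.add(nextNotification); // "onPeriodicNotification"
            nextNotification += period;
        }
    }

    public static void main(String[] args) {
        PeriodTracker t = new PeriodTracker(2500);
        for (int i = 0; i < 10; i++) {
            t.read(1000); // ten reads of 1000 frames = 10000 frames total
        }
        // Notifications fired at frame counts 2500, 5000, 7500, 10000.
        System.out.println(t.notifiedAt);
    }
}
```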
Re: [android-developers] AudioRecord weirdness on Samsung Moment (SPH-M900)
Try the "soundcheck" app on the Android Market to check the valid frequencies supported on the device.

-Dan

On Tue, Sep 7, 2010 at 10:52 PM, Krishna Mohan wrote:
> Hi, Android's AudioFlinger does the job of downsampling to 8 kHz irrespective
> of the top-level application's sampling-rate setting. [snip]
Re: [android-developers] AudioRecord weirdness on Samsung Moment (SPH-M900)
Hi,

Android's AudioFlinger does the job of downsampling to 8 kHz irrespective of the top-level application's sampling-rate setting. AudioFlinger will downsample to 8 kHz and record. You may try another method of recording using the arecord utility of ALSA:

#alsa_arecord -f 16000 -c 2 -Dhw:0,0 test.wav

In the above, alsa_arecord is a soft link to alsa_aplay, and -Dhw:0,0 or -Dhw:0,1 depends on your sound card; it is the capture device. If this works fine, then something has gone wrong in Android when passing audio parameters to the kernel.

Regards,
-D Krishna Mohan

On Tue, Sep 7, 2010 at 2:29 AM, dan raaka wrote:
> what is the build fingerprint on your device ?
>
> $ adb shell getprop | grep finger
> -Dan
> [snip]
Re: [android-developers] AudioRecord weirdness on Samsung Moment (SPH-M900)
what is the build fingerprint on your device ?

$ adb shell getprop | grep finger

-Dan

On Wed, Sep 1, 2010 at 12:44 PM, Steve Hugg wrote:
> Our app has a problem with AudioRecord where the recorded audio is
> very slow (it sounds like it's 2x slower). [snip]
[android-developers] AudioRecord weirdness on Samsung Moment (SPH-M900)
Our app has a problem with AudioRecord where the recorded audio is very slow (sounds like it's 2x slower). We use AudioRecord.getMinBufferSize to try 16000 Hz first and then back down to 8000 Hz if that function returns an error. So my guess is that the audio is reported as 8000 Hz even though it's really sampling at 16000 Hz. It seems that the problem could also be that the output stream is stereo, not mono. We're passing AudioFormat.CHANNEL_IN_MONO on API levels 5 and above and AudioFormat.CHANNEL_CONFIGURATION_MONO (deprecated) on the earlier devices. This seems to be how it works for AudioTrack. But is this how it works for AudioRecord?
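A cheap sanity check for this kind of "half-speed" symptom (my own sketch, not something from the thread; the class and method names are mine): count how many PCM bytes read() delivers over a wall-clock second and back out the rate the hardware is actually using, then compare it against the rate you asked for (or against AudioRecord.getSampleRate()).

```java
// Hypothetical diagnostic: estimate the true capture rate from observed
// read() throughput. If you configured 8000 Hz but this comes out near
// 16000 Hz, playback at the nominal rate will sound 2x slow.
public class PcmRateCheck {
    /** bytesPerSecond: PCM bytes delivered by read() over one wall-clock second. */
    public static int estimatedSampleRate(long bytesPerSecond, int channels, int bitsPerSample) {
        return (int) (bytesPerSecond / (channels * (bitsPerSample / 8)));
    }

    public static void main(String[] args) {
        // 32000 B/s of 16-bit mono PCM means the mic is really running at 16 kHz.
        System.out.println(estimatedSampleRate(32000, 1, 16)); // 16000
    }
}
```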
[android-developers] AudioRecord API - STATE_UNINITIALIZED - What do you do?
I have an app that records 8 kHz audio, saves it to the SD card and plays it back. It uses the AudioRecord API and works just fine on my Nexus One (2.2), Behold II (1.6), Cliq XT (1.5) and the emulator (tested from 1.5 to 2.2). However, in the Market error reports I am seeing the following IllegalStateException:

Caused by: java.lang.IllegalStateException: startRecording() called on an uninitialized AudioRecord.
at android.media.AudioRecord.startRecording(AudioRecord.java:495)

It occurs when I attempt to call startRecording() on an instantiated AudioRecord object. If I catch this exception, or use the getState() method to determine that the AudioRecord instance is in an uninitialized state, what can I do at that point? I've tried creating a new AudioRecord object, but it just fails with the same IllegalStateException upon calling startRecording().

if (audioRecordInstance.getState() == AudioRecord.STATE_UNINITIALIZED) {
    // what can I do here??
    audioRecordInstance.stop(); // this throws an IllegalStateException
}
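One pattern that is worth trying here (my speculation, not a confirmed fix): always release() the failed instance before constructing a new one, so a half-initialized native input isn't left claimed, and fall back to more conservative parameters. Pseudocode sketch, untested:

```
// Sketch (untested): release() a failed AudioRecord before retrying.
AudioRecord r = new AudioRecord(MediaRecorder.AudioSource.MIC, 16000,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
if (r.getState() != AudioRecord.STATE_INITIALIZED) {
    r.release();                         // free half-initialized native resources
    r = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
}
if (r.getState() != AudioRecord.STATE_INITIALIZED) {
    r.release();
    // Give up gracefully: another app may hold the mic, or the manifest
    // may be missing android.permission.RECORD_AUDIO on that device.
}
```

On field devices the two common root causes reported for this state are microphone contention (another app holds the input) and a missing RECORD_AUDIO permission, neither of which a retry can fix.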
[android-developers] AudioRecord class - Disable AGC and noise suppression
Hi, Is there a way to disable AGC and noise suppression in the AudioRecord class? I see in the cpp source that these are enabled with no method to disable on the java side. Thanks, Brian
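For what it's worth (editor's note, not from the thread): at the time of this post there was no public toggle, though MediaRecorder.AudioSource.VOICE_RECOGNITION is documented as tuned for recognition and on many devices applies less processing than MIC. Later API levels (16+) added android.media.audiofx.NoiseSuppressor and AutomaticGainControl, which attach to the record session. Pseudocode sketch for the later API, untestable off-device:

```
// Sketch, API 16+ only (not available when this question was asked):
if (NoiseSuppressor.isAvailable()) {
    NoiseSuppressor ns = NoiseSuppressor.create(recorder.getAudioSessionId());
    if (ns != null) ns.setEnabled(false);      // explicitly turn it off
}
if (AutomaticGainControl.isAvailable()) {
    AutomaticGainControl agc = AutomaticGainControl.create(recorder.getAudioSessionId());
    if (agc != null) agc.setEnabled(false);
}
```

Note that these effects control the platform's own preprocessing hooks; whether the hardware path honors them is still vendor-dependent.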
[android-developers] AudioRecord: onPeriodicNotification and onMarkerReached are not called
I've seen posts regarding problems with the AudioRecord OnRecordPositionUpdateListener, but I haven't seen any real answers. I'm using AudioRecord to record from the mic, and I want to get the input from the mic periodically so I can check sound levels and update a meter. I set up the listener like this:

private OnRecordPositionUpdateListener mRecordListener = new AudioRecord.OnRecordPositionUpdateListener() {
    public void onPeriodicNotification(AudioRecord recorder) {
        if (recorder.getRecordingState() != AudioRecord.RECORDSTATE_STOPPED) {
            mAudioBuffer = new short[8000];
            mSamplesRead = recorder.read(mAudioBuffer, 0, 8000);
            if (mSamplesRead > 0) {
                // do something
            }
        }
    }

    public void onMarkerReached(AudioRecord recorder) {
        if (recorder.getRecordingState() != AudioRecord.RECORDSTATE_STOPPED) {
            mAudioBuffer = new short[8000];
            mSamplesRead = recorder.read(mAudioBuffer, 0, 8000);
        }
    }
};

I instantiate the AudioRecord object like this:

// audioSource = MediaRecorder.AudioSource.MIC
// sampleRate = 16000
// channelConfig = AudioFormat.CHANNEL_IN_MONO
// audioFormat = AudioFormat.ENCODING_PCM_16BIT
// bufferSize = 409600
aRecorder = new AudioRecord(audioSource, sampleRate, channelConfig, audioFormat, bufferSize);
if (aRecorder.getState() != AudioRecord.STATE_INITIALIZED)
    throw new Exception("AudioRecord initialization failed");
aRecorder.setRecordPositionUpdateListener(mRecordListener);
aRecorder.setPositionNotificationPeriod(400);
aRecorder.setNotificationMarkerPosition(400);

What happens is that onMarkerReached gets called once, after the recorder is released, so that's not very useful. Do I need to change the marker position and notification period? Is the listener working? Is there another way to accomplish periodically getting the mic input?
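Editor's note: since the position callbacks are unreliable on many devices of this era, the common workaround is a dedicated reader thread that blocks in read() and derives the meter update from the data itself; read() blocks until the requested samples are available, so no notification machinery is needed to pace the loop. Pseudocode sketch (field names borrowed from the post above; keepRecording and updateMeter are hypothetical):

```
// Sketch: a blocking reader thread instead of position callbacks.
new Thread(new Runnable() {
    public void run() {
        short[] buf = new short[1600];   // 100 ms at 16 kHz mono
        aRecorder.startRecording();
        while (keepRecording) {          // your own volatile flag
            int n = aRecorder.read(buf, 0, buf.length);  // blocks until filled
            if (n > 0) updateMeter(buf, n);  // compute a level, post to the UI thread
        }
        aRecorder.stop();
    }
}).start();
```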
[android-developers] AudioRecord Problem, please help!
Hi all, I am using AudioRecord to get raw PCM data from the audio device and do some extra processing with the NDK. It works fine for the first 16 seconds, and crashes after that. There is meaningful debug info I can refer to. Here is my code and the log info. Please help! Thanks!

class AudioReceiveThread extends Thread {
    private InetAddress serverAddr;
    private InputStream inStream;
    int callback_count = 0;
    short[] mAudioLocalBuffer = null;

    public void run() {
        super.run();
        mAudioEncodedBuffer = new byte[1000];
        // mAudioLocalBuffer = new short[4000];

        /* audio */
        mRecordListener = new OnRecordPositionUpdateListener() {
            public void onPeriodicNotification(AudioRecord recorder) {
                mAudioLocalBuffer = new short[4000];
                byte mode = 6;
                mSamplesRead = recorder.read(mAudioLocalBuffer, 0, AUDIO_BUFFER_SAMPLEREAD_SIZE);
                if (mSamplesRead > 0) {
                    // do something here...
                    System.out.println("Read samples\n");
                    System.out.println(mSamplesRead);
                    System.out.println(callback_count);
                    System.out.println("Going to encode\n");
                    n = EncoderInterfaceEncode(mode, mAudioLocalBuffer, mAudioEncodedBuffer, 0);
                    System.out.println("After Encoder...\n");
                    System.out.println(n);
                    callback_count++;
                } else {
                    System.out.println("Read nothing\n");
                }
                mAudioLocalBuffer = null;
            }

            public void onMarkerReached(AudioRecord recorder) {
                System.out.println("What? Hu!? Where am I?");
            }
        };

        try {
            mAudioRecorder = new AudioRecord(
                android.media.MediaRecorder.AudioSource.MIC,
                AUDIO_SAMPLE_FREQ,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                AUDIO_BUFFER_BYTESIZE);
        } catch (Exception e) {
            System.out.println("Unable to init audio recording!");
        }

        EncoderInterfaceInit(0);
        mAudioBuffer = new short[AUDIO_BUFFER_SAMPLEREAD_SIZE];
        mAudioRecorder.setPositionNotificationPeriod(AUDIO_BUFFER_SAMPLEREAD_SIZE / 2);
        mAudioRecorder.setRecordPositionUpdateListener(mRecordListener);
        mAudioRecorder.startRecording();

        /* test if I can read anything at all... (and yes, this here works!) */
        mSamplesRead = mAudioRecorder.read(mAudioBuffer, 0, AUDIO_BUFFER_SAMPLEREAD_SIZE);
    }
}

Log info:

05-07 14:59:20.040: INFO/System.out(3994): Read samples
05-07 14:59:20.040: INFO/System.out(3994): 4000
05-07 14:59:20.040: INFO/System.out(3994): 36
05-07 14:59:20.040: INFO/System.out(3994): Going to encode
05-07 14:59:20.040: DEBUG/libnav(3994): Before Encode, mode = 6, forceSpeech = 0
05-07 14:59:20.050: DEBUG/libnav(3994): After Encode
05-07 14:59:20.060: INFO/System.out(3994): After Encoder...
05-07 14:59:20.060: INFO/System.out(3994): 27
05-07 14:59:20.210: INFO/DEBUG(2965): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
05-07 14:59:20.210: INFO/DEBUG(2965): Build fingerprint: 'tmobile/kila/dream/trout:1.6/DMD64/21415:user/ota-rel-keys,release-keys'
05-07 14:59:20.210: INFO/DEBUG(2965): pid: 3994, tid: 3994 >>> com.example.LiveStreamer <<<
05-07 14:59:20.210: INFO/DEBUG(2965): signal 11 (SIGSEGV), fault addr 170c8554
05-07 14:59:20.210: INFO/DEBUG(2965): r0 r1 032c r2 170c8534 r3 45114508
05-07 14:59:20.210: INFO/DEBUG(2965): r4 0003 r5 be91a328 r6 4000c238 r7 0061
05-07 14:59:20.220: INFO/DEBUG(2965): r8 be91a2f0 r9 4328b2b0 10 4000c238 fp ad083e1c
05-07 14:59:20.220: INFO/DEBUG(2965): ip 4000c1e8 sp be91a2a0 lr 0003 pc ad01622c cpsr 6010
05-07 14:59:20.380: INFO/DEBUG(2965): #00 pc 0001622c /system/lib/libdvm.so
05-07 14:59:20.380: INFO/DEBUG(2965): #01 pc 00016c44 /system/lib/libdvm.so
05-07 14:59:20.380: INFO/DEBUG(2965): #02 pc 000146f8 /system/lib/libdvm.so
05-07 14:59:20.390: INFO/DEBUG(2965): #03 pc 00014818 /system/lib/libdvm.so
05-07 14:59:20.390: INFO/DEBUG(2965): #04 pc 0001492c /system/lib/libdvm.so
05-07 14:59:20.390: INFO/DEBUG(2965): #05 pc 00016c8c /system/lib/libdvm.so
[android-developers] AudioRecord delay in recording
I'm using a class that uses AudioRecord.

RehearsalAudioRecorder recorder = new RehearsalAudioRecorder(
    RehearsalAudioRecorder.RECORDING_UNCOMPRESSED,
    MediaRecorder.AudioSource.MIC,
    16000,
    AudioFormat.CHANNEL_IN_MONO,
    AudioFormat.ENCODING_PCM_16BIT);

The RehearsalAudioRecorder constructor instantiates an AudioRecord:

aRecorder = new AudioRecord(audioSource, sampleRate, channelConfig, audioFormat, bufferSize);

I'm creating a .wav file on the SD card. I'm experiencing a 5-second delay in the recording; I play back the file, and the recording starts about 5 seconds after I started speaking. I did some measurements with System.currentTimeMillis(), and I measured the following:

buffer size: 204800 bytes
constructor: 17 ms
write the file header: 6 ms
AudioRecord.startRecording(): 483 ms
AudioRecord.read(): 6268 ms

There's correspondingly less delay when I use a smaller buffer:

buffer size: 81920 bytes
constructor: 16 ms
write the file header: 5 ms
AudioRecord.startRecording(): 486 ms
AudioRecord.read(): 2427 ms

Does anyone have any suggestions as to how to reduce the delay? A delay of a few seconds is unacceptable.
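Editor's note: the measured numbers are consistent with read() simply blocking until it can return the full chunk requested. At 16000 Hz, 16-bit mono, 204800 bytes is 204800 / (16000 × 2) = 6.4 s of audio, which matches the observed 6268 ms; the 81920-byte buffer is 2.56 s, matching 2427 ms. The audio itself is not delayed; the first read just takes that long to return, so reading in small chunks removes the apparent lag. The arithmetic as a helper (class and method names are mine):

```java
// How much wall-clock audio a PCM byte count represents.
public class PcmTiming {
    public static long bufferMillis(long bytes, int sampleRate, int channels, int bitsPerSample) {
        long bytesPerSecond = (long) sampleRate * channels * (bitsPerSample / 8);
        return bytes * 1000 / bytesPerSecond;
    }

    public static void main(String[] args) {
        System.out.println(bufferMillis(204800, 16000, 1, 16)); // 6400 ms ~ the observed 6268 ms
        System.out.println(bufferMillis(81920, 16000, 1, 16));  // 2560 ms ~ the observed 2427 ms
    }
}
```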
[android-developers] AudioRecord creates a stereo file instead of a mono file.
I'm developing with Android 2.1 on a Nexus One with firmware 2.1 update 1. I'm using the RehearsalAudioRecorder class from here: http://rehearsalassist.svn.sourceforge.net/viewvc/rehearsalassist/android/trunk/src/urbanstew/RehearsalAssistant/RehearsalAudioRecorder.java?view=markup

public RehearsalAudioRecorder(boolean uncompressed, int audioSource, int sampleRate, int channelConfig, int audioFormat) {
    try {
        rUncompressed = uncompressed;
        if (rUncompressed) { // RECORDING_UNCOMPRESSED
            if (audioFormat == AudioFormat.ENCODING_PCM_16BIT) {
                bSamples = 16;
            } else {
                bSamples = 8;
            }
            if (channelConfig == AudioFormat.CHANNEL_CONFIGURATION_MONO) {
                nChannels = 1;
            } else {
                nChannels = 2;
            }
            aSource = audioSource;
            sRate = sampleRate;
            aFormat = audioFormat;
            framePeriod = sampleRate * TIMER_INTERVAL / 1000;
            bufferSize = framePeriod * 2 * bSamples * nChannels / 8;
            if (bufferSize < AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat)) {
                // Check to make sure buffer size is not smaller than the smallest allowed one
                bufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
                // Set frame period and timer interval accordingly
                framePeriod = bufferSize / (2 * bSamples * nChannels / 8);
                Log.w(RehearsalAudioRecorder.class.getName(), "Increasing buffer size to " + Integer.toString(bufferSize));
            }
            aRecorder = new AudioRecord(audioSource, sampleRate, channelConfig, audioFormat, bufferSize);
            if (aRecorder.getState() != AudioRecord.STATE_INITIALIZED)
                throw new Exception("AudioRecord initialization failed");
            // aRecorder.setRecordPositionUpdateListener(updateListener);
            aRecorder.setPositionNotificationPeriod(framePeriod);
        } else { // RECORDING_COMPRESSED
            mRecorder = new MediaRecorder();
            mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        }
        cAmplitude = 0;
        fPath = null;
        state = State.INITIALIZING;
    } catch (Exception e) {
        if (e.getMessage() != null) {
            Log.e(RehearsalAudioRecorder.class.getName(), e.getMessage());
        } else {
            Log.e(RehearsalAudioRecorder.class.getName(), "Unknown error occurred while initializing recording");
        }
        state = State.ERROR;
    }
}

Here's how I'm instantiating it:

RehearsalAudioRecorder recorder = new RehearsalAudioRecorder(
    RehearsalAudioRecorder.RECORDING_UNCOMPRESSED,
    MediaRecorder.AudioSource.MIC,
    16000,
    AudioFormat.CHANNEL_IN_FRONT,
    AudioFormat.CHANNEL_CONFIGURATION_MONO);

I write a .wav file to the SD card. I specified mono, and I checked that CHANNEL_CONFIGURATION_MONO is indeed set in the AudioRecord object. But when I try to play the file, it's treated as if it's stereo, and voices sound like chipmunks. Apparently the header information is wrong. For instance, I open the .wav file in Audacity, and Audacity says it's stereo. Is this a bug? What can I do?
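Editor's note, my reading rather than a thread answer: the constructor is declared (uncompressed, audioSource, sampleRate, channelConfig, audioFormat), but the call passes AudioFormat.CHANNEL_IN_FRONT in the channelConfig slot and CHANNEL_CONFIGURATION_MONO in the audioFormat slot. Since channelConfig then isn't CHANNEL_CONFIGURATION_MONO, the class sets nChannels = 2 and writes a stereo header, which would produce exactly the chipmunk symptom. Independently, one way to rule the header logic out is to emit the canonical 44-byte PCM WAV header yourself and check the channel-count field (bytes 22–23). A minimal builder, my own sketch (WAV is little-endian throughout):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: canonical 44-byte PCM WAV header. dataLen is the payload size in bytes.
public class WavHeader {
    public static byte[] build(int sampleRate, int channels, int bitsPerSample, int dataLen) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put(new byte[]{'R', 'I', 'F', 'F'}).putInt(36 + dataLen)
         .put(new byte[]{'W', 'A', 'V', 'E'})
         .put(new byte[]{'f', 'm', 't', ' '}).putInt(16)
         .putShort((short) 1)                                // PCM format tag
         .putShort((short) channels)                         // byte offset 22: channel count
         .putInt(sampleRate)
         .putInt(byteRate)
         .putShort((short) (channels * bitsPerSample / 8))   // block align
         .putShort((short) bitsPerSample)
         .put(new byte[]{'d', 'a', 't', 'a'}).putInt(dataLen);
        return b.array();
    }
}
```

A mono 16-bit file built this way has header byte 22 equal to 1; if Audacity reads your file as stereo, that byte in your file is 2.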
[android-developers] AudioRecord fails on Android 2.1
Hi, I can successfully create an AudioRecord instance on the emulator, on Android 1.5, 1.6, and 2.0 with the following statement:

new AudioRecord(
    MediaRecorder.AudioSource.MIC,
    8000,
    AudioFormat.CHANNEL_CONFIGURATION_MONO,
    AudioFormat.ENCODING_PCM_16BIT,
    16384);

But it fails on an emulator running Android 2.1, with the following error:

ERROR/AudioRecord(299): Could not get audio input for record source 1
ERROR/AudioRecord-JNI(299): Error creating AudioRecord instance: initialization check failed.
ERROR/AudioRecord-Java(299): [ android.media.AudioRecord ] Error code -20 when initializing native AudioRecord object.

For information, AudioRecord.getMinBufferSize() returns 640 bytes. I've played with all constructor arguments, it didn't help... I'm running the SDK Tools revision 5 on Linux. Any clue? -- Olivier
[android-developers] AudioRecord amplitude
How can I get the amplitude/volume when recording with AudioRecord?
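Editor's note: AudioRecord has no getMaxAmplitude() equivalent (that's a MediaRecorder method), so you compute the level from the samples read() hands you. A self-contained sketch (class and method names are mine):

```java
// Level metering over a buffer of 16-bit PCM samples.
public class Amplitude {
    /** Peak absolute sample value in buf[0..n). */
    public static int peak(short[] buf, int n) {
        int max = 0;
        for (int i = 0; i < n; i++) max = Math.max(max, Math.abs((int) buf[i]));
        return max;
    }

    /** Root-mean-square level: a steadier "volume" reading than the raw peak. */
    public static double rms(short[] buf, int n) {
        double sum = 0;
        for (int i = 0; i < n; i++) sum += (double) buf[i] * buf[i];
        return Math.sqrt(sum / Math.max(1, n));
    }
}
```

In recording code you would feed it with the count read() returned, e.g. `int n = recorder.read(buf, 0, buf.length); double level = Amplitude.rms(buf, n);`.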
[android-developers] AudioRecord on Emulator with frequency > 8 kHz
Hello dear developers, I'm doing voice recording using the AudioRecord class, and the main requirement is that the sample rate must be 16000 Hz. I'm using only the emulator in the development process for now, and ran into the problem that I can't set a frequency higher than 8000 Hz; otherwise I get bufferSize == -2 and can't create an instance of AudioRecord with such a buffer. Thanks to Szabolcs Vrbos, who told me that this problem occurs only with the emulator and several Samsung phones; a real device can record at a higher frequency. I need a word from Android engineers that I will be able to record voice at 16000 Hz on, for instance, an HTC Hero (Android 1.5) or a similar device with OS 1.5 or higher. Thank you.
[android-developers] AudioRecord Class Problem: Callback is never called
Hello Android Developers! My Android Java application needs to record audio data into RAM and process it. This is why I use the class "AudioRecord" and not "MediaRecorder" (which records only to file). Until now, I used a busy loop polling with "read()" for the audio data. This has been working so far, but it pegs the CPU too much. Between two polls, I put the thread to sleep to avoid 100% CPU usage. However, this is not really a clean solution, since the duration of the sleep is not guaranteed and you must subtract a safety margin in order not to lose audio snippets. This is not CPU optimal. I need as many free CPU cycles as possible for a parallel running thread.

Now I have implemented the recording using the "OnRecordPositionUpdateListener". This looks very promising and the right way to do it according to the SDK docs. Everything seems to work (opening the audio device, read()ing the data etc.) but the listener is never called. Does anybody know why?

Info: I am working with a real device, not under the emulator. The recording using a busy loop basically works (however, it is not satisfying). Only the callback listener is never called. Here is a snippet from my source code:

__ snip!

public class myApplication extends Activity {
    /* audio recording */
    private static final int AUDIO_SAMPLE_FREQ = 16000;
    private static final int AUDIO_BUFFER_BYTESIZE = AUDIO_SAMPLE_FREQ * 2 * 3; // = 3000ms
    private static final int AUDIO_BUFFER_SAMPLEREAD_SIZE = AUDIO_SAMPLE_FREQ / 10 * 2; // = 200ms
    private short[] mAudioBuffer = null; // audio buffer
    private int mSamplesRead;            // how many samples were recently read
    private AudioRecord mAudioRecorder;  // audio recorder

    ...

    private OnRecordPositionUpdateListener mRecordListener = new OnRecordPositionUpdateListener() {
        public void onPeriodicNotification(AudioRecord recorder) {
            mSamplesRead = recorder.read(mAudioBuffer, 0, AUDIO_BUFFER_SAMPLEREAD_SIZE);
            if (mSamplesRead > 0) {
                // do something here...
            }
        }

        public void onMarkerReached(AudioRecord recorder) {
            Error("What? Hu!? Where am I?");
        }
    };

    ...

    public void onCreate(Bundle savedInstanceState) {
        try {
            mAudioRecorder = new AudioRecord(
                android.media.MediaRecorder.AudioSource.MIC,
                AUDIO_SAMPLE_FREQ,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                AUDIO_BUFFER_BYTESIZE);
        } catch (Exception e) {
            Error("Unable to init audio recording!");
        }

        mAudioBuffer = new short[AUDIO_BUFFER_SAMPLEREAD_SIZE];
        mAudioRecorder.setPositionNotificationPeriod(AUDIO_BUFFER_SAMPLEREAD_SIZE);
        mAudioRecorder.setRecordPositionUpdateListener(mRecordListener);
        mAudioRecorder.startRecording();

        /* test if I can read anything at all... (and yes, this here works!) */
        mSamplesRead = mAudioRecorder.read(mAudioBuffer, 0, AUDIO_BUFFER_SAMPLEREAD_SIZE);
    }
}

__ snip!
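Editor's note, a speculation worth checking: the notifications are dispatched through a Handler, and AudioRecord has a two-argument setRecordPositionUpdateListener(listener, handler) overload. If the thread that set the listener never services a Looper, or is busy (as in onCreate above, which blocks in read()), the events may never be delivered. Pseudocode sketch routing the callbacks to a dedicated event thread:

```
// Sketch: give the listener its own Looper thread so events can be dispatched.
HandlerThread ht = new HandlerThread("audio-events");
ht.start();
mAudioRecorder.setRecordPositionUpdateListener(mRecordListener, new Handler(ht.getLooper()));
mAudioRecorder.setPositionNotificationPeriod(AUDIO_BUFFER_SAMPLEREAD_SIZE);
mAudioRecorder.startRecording();
// ...and do not block this thread in read(); let the callbacks do the reading.
```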
Re: [android-developers] AudioRecord on Samsung Moment
What build does this device have? There is a new update for this device. Did you try it on that build? http://community.sprint.com/baw/thread/26322?featured=true -Dan

On Mon, Jan 18, 2010 at 12:02 PM, Rico wrote:
> When switching between the emulator (which supports 8KHz) and a real
> phone like the G1 (which supports 16KHz), I used the following code to
> set up a valid AudioRecord object:
>
> AudioRecord ar;
>
> // Try to construct at 16KHz
> ar = new AudioRecord(
>     MediaRecorder.AudioSource.MIC,
>     16000,
>     AudioFormat.CHANNEL_CONFIGURATION_MONO,
>     AudioFormat.ENCODING_PCM_16BIT,
>     AUDIO_BUFFER_SIZE);
>
> if (ar.getState() != AudioRecord.STATE_INITIALIZED) {
>     // Unable to set up at 16KHz, try at 8KHz
>     ar = new AudioRecord(
>         MediaRecorder.AudioSource.MIC,
>         8000,
>         AudioFormat.CHANNEL_CONFIGURATION_MONO,
>         AudioFormat.ENCODING_PCM_16BIT,
>         AUDIO_BUFFER_SIZE);
> }
>
> This seems to work just fine, and a 16KHz AudioRecord object is
> instantiated on the G1, and an 8KHz object is instantiated on the
> emulator.
>
> However, on the Samsung Moment, which only supports 8KHz,
> AudioRecord.getState() returns STATE_INITIALIZED when trying to
> construct with 16KHz. This ends up causing recording to fail since
> it's trying to record at 16KHz, even though the phone doesn't support
> it. Also, I don't see any exceptions thrown when I try to wrap the
> construction in a try-catch.
>
> Does anyone know of a better way to detect 16KHz vs. 8KHz capabilities
> on the device?
[android-developers] AudioRecord on Samsung Moment
When switching between the emulator (which supports 8KHz) and a real phone like the G1 (which supports 16KHz), I used the following code to set up a valid AudioRecord object:

AudioRecord ar;

// Try to construct at 16KHz
ar = new AudioRecord(
    MediaRecorder.AudioSource.MIC,
    16000,
    AudioFormat.CHANNEL_CONFIGURATION_MONO,
    AudioFormat.ENCODING_PCM_16BIT,
    AUDIO_BUFFER_SIZE);

if (ar.getState() != AudioRecord.STATE_INITIALIZED) {
    // Unable to set up at 16KHz, try at 8KHz
    ar = new AudioRecord(
        MediaRecorder.AudioSource.MIC,
        8000,
        AudioFormat.CHANNEL_CONFIGURATION_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        AUDIO_BUFFER_SIZE);
}

This seems to work just fine, and a 16KHz AudioRecord object is instantiated on the G1, and an 8KHz object is instantiated on the emulator. However, on the Samsung Moment, which only supports 8KHz, AudioRecord.getState() returns STATE_INITIALIZED when trying to construct with 16KHz. This ends up causing recording to fail since it's trying to record at 16KHz, even though the phone doesn't support it. Also, I don't see any exceptions thrown when I try to wrap the construction in a try-catch. Does anyone know of a better way to detect 16KHz vs. 8KHz capabilities on the device?
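Editor's sketch of a more portable probe (names are mine, not from the thread): try candidate rates in preference order and accept the first one the device vouches for. The decision logic is kept pure so it can be tested; on a device the predicate would wrap something like `AudioRecord.getMinBufferSize(rate, CHANNEL_CONFIGURATION_MONO, ENCODING_PCM_16BIT) > 0` combined with a getState() check on a trial instance that is then release()d.

```java
import java.util.function.IntPredicate;

// Pick the first sample rate a device claims to support.
public class RateProbe {
    /** First rate in candidates accepted by the probe, or -1 if none is. */
    public static int firstSupported(int[] candidates, IntPredicate supported) {
        for (int rate : candidates) {
            if (supported.test(rate)) return rate;
        }
        return -1;
    }

    public static void main(String[] args) {
        // Simulated device that only handles 8 kHz:
        System.out.println(firstSupported(new int[]{16000, 11025, 8000}, r -> r == 8000)); // 8000
    }
}
```

Note the thread's caveat still applies: on the Samsung Moment even getState() lies, so the only fully reliable probe is to startRecording() briefly and confirm that read() actually returns data at the expected rate.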
Re: [android-developers] AudioRecord works with short but static with byte
Knee-jerk answer: because the individual samples are 16 bits wide, and by reading just a byte each time, you're only getting half the sample.

> Any idea why reading bytes from AudioRecord would cause an error while
> reading shorts would not? The sound comes through great with short but
> I get only static with bytes. Working code with short[]:
>
> int minBufferSize = AudioRecord.getMinBufferSize(this.getSampleRate(), this.getChannelConfiguration(), this.getAudioEncoding());
> AudioRecord recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, this.getSampleRate(), this.getChannelConfiguration(), this.getAudioEncoding(), minBufferSize);
> bufferedStreamInstance = new BufferedOutputStream(new FileOutputStream(this.pcmFile));
> DataOutputStream dataOutputStreamInstance = new DataOutputStream(bufferedStreamInstance);
> short[] tempBuffer = new short[minBufferSize];
> recordInstance.startRecording();
> while (totalBytesRead < timeToRecord * sampleRate) {
>     bufferRead = recordInstance.read(tempBuffer, 0, minBufferSize);
>     for (int idxBuffer = 0; idxBuffer < bufferRead; ++idxBuffer) {
>         dataOutputStreamInstance.writeShort(tempBuffer[idxBuffer]);
>     }
> }
> // Close resources
> recordInstance.stop();
> bufferedStreamInstance.close();
>
> If I just change these two lines though, I get static:
>
> 1) Change: short[] tempBuffer = new short[minBufferSize];
> To this: byte[] tempBuffer = new byte[minBufferSize];
>
> 2) And this:
> for (int idxBuffer = 0; idxBuffer < bufferRead; ++idxBuffer) {
>     dataOutputStreamInstance.writeShort(tempBuffer[idxBuffer]);
> }
> To this: dataOutputStreamInstance.write(tempBuffer, 0, bufferRead);
>
> Anyone else experience this or have suggestions?

-- jason.vp.engineering.particle
[android-developers] AudioRecord works with short but static with byte
Any idea why reading bytes from AudioRecord would cause an error while reading shorts would not? The sound comes through great with short but I get only static with bytes. Working code with short[]:

int minBufferSize = AudioRecord.getMinBufferSize(this.getSampleRate(), this.getChannelConfiguration(), this.getAudioEncoding());
AudioRecord recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, this.getSampleRate(), this.getChannelConfiguration(), this.getAudioEncoding(), minBufferSize);
bufferedStreamInstance = new BufferedOutputStream(new FileOutputStream(this.pcmFile));
DataOutputStream dataOutputStreamInstance = new DataOutputStream(bufferedStreamInstance);
short[] tempBuffer = new short[minBufferSize];
recordInstance.startRecording();
while (totalBytesRead < timeToRecord * sampleRate) {
    bufferRead = recordInstance.read(tempBuffer, 0, minBufferSize);
    for (int idxBuffer = 0; idxBuffer < bufferRead; ++idxBuffer) {
        dataOutputStreamInstance.writeShort(tempBuffer[idxBuffer]);
    }
}
// Close resources…
recordInstance.stop();
bufferedStreamInstance.close();

If I just change these two lines though, I get static:

1) Change: short[] tempBuffer = new short[minBufferSize];
To this: byte[] tempBuffer = new byte[minBufferSize];

2) And this:
for (int idxBuffer = 0; idxBuffer < bufferRead; ++idxBuffer) {
    dataOutputStreamInstance.writeShort(tempBuffer[idxBuffer]);
}
To this: dataOutputStreamInstance.write(tempBuffer, 0, bufferRead);

Anyone else experience this or have suggestions?
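Editor's note, my analysis rather than a thread answer: one concrete difference between the two paths is byte order. DataOutputStream.writeShort() always emits big-endian, while read(byte[]) hands you the samples in the device's native order (little-endian on ARM), so the two files do not have the same byte layout; playing the byte-path file with the byte order the player expects for the short-path file yields exactly the "static" described. If you must use the byte API, reinterpret explicitly (class name is mine):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Reinterpret little-endian PCM bytes, as read(byte[]) returns them on a
// little-endian device, as signed 16-bit samples.
public class PcmBytes {
    public static short[] toShorts(byte[] pcm, int len) {
        short[] out = new short[len / 2];
        ByteBuffer.wrap(pcm, 0, len).order(ByteOrder.LITTLE_ENDIAN)
                  .asShortBuffer().get(out);
        return out;
    }

    public static void main(String[] args) {
        // bytes 0x01 0x02 little-endian -> 0x0201 = 513
        System.out.println(toShorts(new byte[]{0x01, 0x02}, 2)[0]);
    }
}
```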
[android-developers] AudioRecord callbacks
Hi, I am trying to develop an app which will consume data from the mic. I have used AudioRecord for this. To read the data from the mic I am using a separate thread. I want to change this. Is there any way to register a callback on the AudioRecord object so as to get a notification when data starts arriving? I went through setRecordPositionUpdateListener but the docs are not at all helpful and just add to your frustration. Please help.
[android-developers] AudioRecord + Android 1.5 = Error
Hi, I'm using AudioRecord to capture raw PCM sound data. On both 1.5 devices (Motorola Cliq and Droid Eris) I get:

AudioRecord(28661): Error obtaining an audio buffer, giving up.
I/AudioHardwareMSM72XX( 2224): AudioHardware PCM record is going to standby.

Any ideas why?
[android-developers] AudioRecord producing no-sound data. Why??
Here is my AudioRecorder class, using AudioRecord. Why is it not producing any sound data?

    import java.io.FileOutputStream;
    import java.io.IOException;
    import android.content.Context;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class AudioRecorder implements Runnable {

        public boolean isRecording = false;
        byte[] tempBuffer = new byte[AudioRecord.getMinBufferSize(44100,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT)];
        byte[] saveBuffer = new byte[tempBuffer.length * 1000];
        int saveBufferPos = 0;
        Context ctx;
        String filePath;

        public AudioRecorder(Context ctx, String filePath) {
            super();
            this.filePath = filePath;
            this.ctx = ctx;
        }

        public void run() {
            // We're important...
            android.os.Process.setThreadPriority(
                    android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
            AudioRecord recordInstance = new AudioRecord(
                    MediaRecorder.AudioSource.MIC, 44100,
                    AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT, tempBuffer.length);
            recordInstance.startRecording();
            this.isRecording = true;
            // Record for five seconds.
            long cms = System.currentTimeMillis();
            while (System.currentTimeMillis() - cms < 5000) {
                // read() returns how many bytes were actually delivered;
                // copy exactly that many, appending at the current save offset.
                int bytesRead = recordInstance.read(tempBuffer, 0, tempBuffer.length);
                if (bytesRead > 0 && saveBufferPos + bytesRead <= saveBuffer.length) {
                    System.arraycopy(tempBuffer, 0, saveBuffer, saveBufferPos, bytesRead);
                    saveBufferPos += bytesRead;
                }
            }
            recordInstance.stop();
            recordInstance.release();
            try {
                FileOutputStream ofo = new FileOutputStream(filePath);
                ofo.write(saveBuffer, 0, saveBufferPos); // write only what was captured
                ofo.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
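A frequent cause of "silent" captures in loops like the one above is appending chunks without honoring read()'s return value, so only the first chunk ever lands in the save buffer. A minimal, device-independent sketch of the append logic (the class name `ChunkAccumulator` is mine, for illustration; a `ByteArrayOutputStream` stands in for the hand-managed save buffer):

```java
import java.io.ByteArrayOutputStream;

public class ChunkAccumulator {
    // Collects variable-sized chunks (as AudioRecord.read() would return them)
    // into one contiguous buffer, honoring the actual byte count of each read.
    private final ByteArrayOutputStream out = new ByteArrayOutputStream();

    public void append(byte[] chunk, int bytesRead) {
        if (bytesRead > 0) {
            out.write(chunk, 0, bytesRead); // copy only the bytes actually read
        }
    }

    public byte[] toByteArray() {
        return out.toByteArray();
    }

    public static void main(String[] args) {
        ChunkAccumulator acc = new ChunkAccumulator();
        byte[] chunk = new byte[]{1, 2, 3, 4};
        acc.append(chunk, 4); // full read
        acc.append(chunk, 2); // partial read: keep only the first 2 bytes
        System.out.println(acc.toByteArray().length); // 6
    }
}
```

The same pattern applies verbatim to the recording loop: append `bytesRead` bytes per iteration instead of assuming a full buffer.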
[android-developers] AudioRecord APIs
Hi, I have two questions:
1. Has anyone been able to use AudioRecord properly WITHOUT the "buffer overflow" messages?
2. What is the audio latency parameter?
Cheers, Earlence
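Buffer overflows generally mean the app is not draining the record buffer as fast as the hardware fills it, so it helps to size reads from the PCM arithmetic. A small sketch of that arithmetic (helper name `bytesFor` is mine, for illustration), using the 8 kHz / mono / 16-bit figures that come up in these threads:

```java
public class PcmMath {
    // Bytes needed to hold `millis` ms of PCM audio at the given format.
    static int bytesFor(int sampleRateHz, int channels, int bytesPerSample, int millis) {
        return sampleRateHz * channels * bytesPerSample * millis / 1000;
    }

    public static void main(String[] args) {
        // 8 kHz, mono, 16-bit: one 20 ms packet is 320 bytes (160 shorts).
        System.out.println(bytesFor(8000, 1, 2, 20)); // 320
        // A 60 ms read at the same format is 960 bytes (480 shorts).
        System.out.println(bytesFor(8000, 1, 2, 60)); // 960
    }
}
```

If the internal buffer passed to the AudioRecord constructor holds only a few tens of milliseconds, any scheduling hiccup longer than that overflows it; a buffer covering several hundred milliseconds gives the reader slack.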
[android-developers] audioRecord buffer overflow
Hi, can anyone send me some pseudo code for recording from AudioRecord without getting buffer-overflow messages? This is my setup: 8 kHz, 16-bit PCM, mono; the buffer size specified at object creation is 1024. I read 480 shorts (960 bytes) every 60 ms, and in addition, every 20 ms I encode a 160-short (320-byte) packet. So operation is like this:
1. Sleep for an initial 60 ms so that the buffer fills.
2. Then, every 20 ms: if I have not yet consumed all 480 shorts, take the next 160 shorts, encode, and store; once all 480 shorts are consumed, execute another read on the AudioRecord object.
The problem is that I still get "buffer overflow" messages. What is the reason for this? Here is some source code:

    class Packetizer extends TimerTask {
        @Override
        public void run() {
            android.os.Process.setThreadPriority(
                    android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
            short[] temp = new short[Codec.BLOCKL_MAX]; // 240 shorts
            byte[] tempdata;
            short[] enc = new short[Codec.ILBCNOOFWORDS_MAX];
            int toBeRead;
            if (offset < actualBytesRead) {
                // Take the next frame (at most 160 shorts) from the last read.
                if ((actualBytesRead - offset) < 160)
                    toBeRead = (actualBytesRead - offset);
                else
                    toBeRead = 160;
                System.arraycopy(buff, offset, temp, 0, toBeRead);
                Codec.encodeData(Codec.encoder_ptr, enc, temp);
                tempdata = convert(enc); // convert() turns the short array into a byte array
                try {
                    fout.write(tempdata);
                } catch (IOException e) {
                    e.printStackTrace();
                }
                offset += toBeRead;
            } else {
                // Everything consumed: pull the next block from the recorder.
                offset = 0;
                actualBytesRead = ar.read(buff, 0, buffSize);
            }
        }
    }

The size of temp is 240 shorts and offset is initially zero. Cheers, Earlence
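One plausible source of the overflows above is that the read happens only on the timer tick that finds the previous block fully consumed, so the recorder's buffer is drained in bursts rather than continuously. A common alternative is a dedicated capture thread that reads as fast as data arrives and hands fixed 160-short encoder frames over a queue. A runnable, device-independent sketch of the splitting-and-handoff part (names `FrameQueue` and `splitIntoFrames` are mine; a plain array stands in for an `AudioRecord.read()` result):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FrameQueue {
    static final int FRAME = 160; // 20 ms at 8 kHz mono, in shorts

    // Split one large capture into fixed-size frames for the encoder.
    static List<short[]> splitIntoFrames(short[] buff, int validShorts) {
        List<short[]> out = new ArrayList<>();
        for (int off = 0; off + FRAME <= validShorts; off += FRAME) {
            short[] frame = new short[FRAME];
            System.arraycopy(buff, off, frame, 0, FRAME);
            out.add(frame);
        }
        return out;
    }

    public static void main(String[] args) throws InterruptedException {
        // Capture thread side: one 60 ms read (480 shorts) becomes three frames.
        BlockingQueue<short[]> q = new ArrayBlockingQueue<>(64);
        short[] read = new short[480]; // would come from AudioRecord.read()
        for (short[] f : splitIntoFrames(read, read.length)) {
            q.put(f);
        }
        // Encoder thread side: drain frames at its own pace.
        int encoded = 0;
        while (!q.isEmpty()) {
            encoded += q.take().length;
        }
        System.out.println(encoded); // 480
    }
}
```

With this split, the capture thread blocks only in read() and never waits on the encoder, so the hardware buffer is serviced continuously even if a single encode runs long.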
[android-developers] AudioRecord , buffer overflow issue.
Hi, I'm using the Android 1.5 SDK. I have audio capture working using AudioRecord, but after a few seconds I get a buffer overflow, and that tears down the whole call. Can any of you please let me know how to fix this issue? Thanks, Anu
[android-developers] AudioRecord
Hi, I'd like to use the AudioRecord API, but it's not clear to me how it should be called. I put a call to the logger in the listener callbacks, but don't see any message logged anywhere. Any pointers? Thanks, Ben

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.util.Log;

    public class AudioListener {

        public static final int DEFAULT_SAMPLE_RATE = 8000;
        private static final int DEFAULT_BUFFER_SIZE = 4096;
        private static final int CALLBACK_PERIOD = 4000; // in frames: 4000 / 8000 Hz = 500 ms

        private final AudioRecord recorder;

        public AudioListener() {
            this(DEFAULT_SAMPLE_RATE);
        }

        private AudioListener(int sampleRate) {
            recorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, sampleRate,
                    AudioFormat.CHANNEL_CONFIGURATION_DEFAULT,
                    AudioFormat.ENCODING_DEFAULT, DEFAULT_BUFFER_SIZE);
        }

        public void start() {
            recorder.setPositionNotificationPeriod(CALLBACK_PERIOD);
            recorder.setRecordPositionUpdateListener(
                    new AudioRecord.OnRecordPositionUpdateListener() {
                @Override
                public void onMarkerReached(AudioRecord recorder) {
                    Log.e(this.getClass().getSimpleName(), "onMarkerReached called");
                }

                @Override
                public void onPeriodicNotification(AudioRecord recorder) {
                    Log.e(this.getClass().getSimpleName(), "onPeriodicNotification called");
                }
            });
            recorder.startRecording();
        }
    }
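One subtlety worth checking in code like the above: setPositionNotificationPeriod and setNotificationMarkerPosition take frame counts, not milliseconds, so wall-clock intervals have to be converted through the sample rate. A small runnable sketch of that conversion (helper names are mine, for illustration):

```java
public class NotificationMath {
    // AudioRecord position units are frames; convert a wall-clock period to frames.
    static int periodInFrames(int sampleRateHz, int periodMillis) {
        return sampleRateHz * periodMillis / 1000;
    }

    // Inverse: how long a frame-count period lasts at a given sample rate.
    static int periodInMillis(int sampleRateHz, int periodFrames) {
        return periodFrames * 1000 / sampleRateHz;
    }

    public static void main(String[] args) {
        // A period of 4000 frames at 8000 Hz fires every 500 ms.
        System.out.println(periodInMillis(8000, 4000)); // 500
        System.out.println(periodInFrames(8000, 500));  // 4000
    }
}
```

Two other things to verify on-device: onMarkerReached will never fire unless setNotificationMarkerPosition has been set, and on some devices the position callbacks are only delivered while the app is actively pulling data with read(), so a listener with no read loop can stay silent.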
[android-developers] AudioRecord can't get any notification when record/marker position updated
Hi all, I want to use the AudioRecord class to record audio in PCM format. After creating the object and setting setRecordPositionUpdateListener, I start recording, but I never get any notification from the system. Why? Please help me, thanks.

    public class Recorder {

        private static final int AUDIO_SAMPLE_FREQ = 8000;
        // The record buffer must be at least the device minimum;
        // a hard-coded 80 bytes is far too small and the constructor rejects it.
        private static final int AUDIO_BUFFER_SIZE = AudioRecord.getMinBufferSize(
                AUDIO_SAMPLE_FREQ,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        private AudioRecord recorder;

        public Recorder() {
            try {
                // init recorder
                recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                        AUDIO_SAMPLE_FREQ,
                        AudioFormat.CHANNEL_CONFIGURATION_MONO,
                        AudioFormat.ENCODING_PCM_16BIT,
                        AUDIO_BUFFER_SIZE);
            } catch (IllegalArgumentException e) {
                e.printStackTrace();
            }
            recorder.setRecordPositionUpdateListener(mNotification);
            recorder.setPositionNotificationPeriod(50);                // in frames
            recorder.setNotificationMarkerPosition(AUDIO_SAMPLE_FREQ); // one second of frames
        }

        public OnRecordPositionUpdateListener mNotification =
                new OnRecordPositionUpdateListener() {
            public void onMarkerReached(AudioRecord arg0) {
                // read the PCM buffer
                byte[] audioBuffer = new byte[AUDIO_SAMPLE_FREQ];
                arg0.read(audioBuffer, 0, AUDIO_SAMPLE_FREQ);
            }

            public void onPeriodicNotification(AudioRecord arg0) {
                // read the PCM buffer
                byte[] audioBuffer = new byte[AUDIO_SAMPLE_FREQ];
                arg0.read(audioBuffer, 0, AUDIO_SAMPLE_FREQ);
            }
        };

        public void StartRecord() {
            recorder.startRecording();
        }

        public void StopRecord() {
            recorder.stop();
        }

        public void ReleaseRecord() {
            recorder.release();
        }
    }
[android-developers] AudioRecord can't get any notification when record/marker position updated
Hello everyone, I would like to use the new AudioRecord class to record in PCM format. I create the object, attach a listener with setRecordPositionUpdateListener, and start recording, but I never get any notification from the system (and no error while running). Why? Please help me, thanks. Here is my code:

    public class Recorder {

        private static final int AUDIO_SAMPLE_FREQ = 8000;
        // The record buffer must be at least the device minimum;
        // a hard-coded 20 bytes is far too small and the constructor rejects it.
        private static final int AUDIO_BUFFER_SIZE = AudioRecord.getMinBufferSize(
                AUDIO_SAMPLE_FREQ,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        private AudioRecord recorder;

        public Recorder() {
            try {
                // init recorder
                recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                        AUDIO_SAMPLE_FREQ,
                        AudioFormat.CHANNEL_CONFIGURATION_MONO,
                        AudioFormat.ENCODING_PCM_16BIT,
                        AUDIO_BUFFER_SIZE);
            } catch (IllegalArgumentException e) {
                e.printStackTrace();
            }
            recorder.setRecordPositionUpdateListener(mNotification);
            recorder.setPositionNotificationPeriod(50);
            recorder.setNotificationMarkerPosition(AUDIO_SAMPLE_FREQ);
        }

        public OnRecordPositionUpdateListener mNotification =
                new OnRecordPositionUpdateListener() {
            public void onMarkerReached(AudioRecord arg0) {
                // read PCM buffer here
                byte[] audioBuffer = new byte[AUDIO_SAMPLE_FREQ];
                arg0.read(audioBuffer, 0, AUDIO_SAMPLE_FREQ);
            }

            public void onPeriodicNotification(AudioRecord arg0) {
                // read PCM buffer here
                byte[] audioBuffer = new byte[AUDIO_SAMPLE_FREQ];
                arg0.read(audioBuffer, 0, AUDIO_SAMPLE_FREQ);
            }
        };

        public void StartRecord() {
            recorder.startRecording();
        }

        public void StopRecord() {
            recorder.stop();
        }

        public void ReleaseRecord() {
            recorder.release();
        }
    }