I believe you should use a short[] buffer in your AudioRecord.read().

On return, you pass that short[] buffer to your encoder, and that is it.
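
Something along these lines might work (a rough sketch only, not tested;
the G711 class here is just a placeholder for your own codec with the
encode(short lin[], int offset, byte enc[], int frames) and
decode(byte enc[], short lin[], int frames) methods you described, and I
have used 8000 Hz since G.711 is defined for 8 kHz narrowband audio):

        // Rough sketch, assuming 8 kHz, 16-bit mono PCM and a hypothetical
        // G711 wrapper class around your own codec implementation.
        int buffersize = AudioRecord.getMinBufferSize(8000,
                        AudioFormat.CHANNEL_CONFIGURATION_MONO,
                        AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord arec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                        8000,
                        AudioFormat.CHANNEL_CONFIGURATION_MONO,
                        AudioFormat.ENCODING_PCM_16BIT,
                        buffersize);

        AudioTrack atrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL,
                        8000,
                        AudioFormat.CHANNEL_CONFIGURATION_MONO,
                        AudioFormat.ENCODING_PCM_16BIT,
                        buffersize,
                        AudioTrack.MODE_STREAM);

        G711 codec = new G711();            // placeholder for your codec

        short[] pcm = new short[160];       // one 20 ms frame at 8 kHz
        byte[] encoded = new byte[160];     // G.711 gives one byte per sample
        short[] decoded = new short[160];

        arec.startRecording();
        atrack.play();

        while (isRecording) {
                // read() fills the short[] with 16-bit samples directly
                int samples = arec.read(pcm, 0, pcm.length);
                if (samples > 0) {
                        // PCM -> G.711 (this is what you would send out)
                        codec.encode(pcm, 0, encoded, samples);
                        // G.711 -> PCM again for playback on the earpiece
                        codec.decode(encoded, decoded, samples);
                        atrack.write(decoded, 0, samples);
                }
        }

        arec.stop();
        atrack.stop();

The point is that both AudioRecord.read() and AudioTrack.write() have
short[] overloads, so the raw PCM never has to go through a byte[]; the
byte[] is only the G.711 side of the conversion.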

On Mar 10, 3:39 am, "draf...@gmail.com" <draf...@gmail.com> wrote:
> I currently have a loopback program for testing audio on Android
> devices.
>
> It uses AudioRecord and AudioTrack to record PCM audio from the mic
> and play PCM audio out of the earpiece.
>
> Here is the code:
>
> public class Record extends Thread {
>
>         static final int bufferSize = 200000;
>         final short[] buffer = new short[bufferSize];
>         short[] readBuffer = new short[bufferSize];
>
>         public void run() {
>                 isRecording = true;
>                 android.os.Process.setThreadPriority(
>                                 android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
>
>                 int buffersize = AudioRecord.getMinBufferSize(11025,
>                                 AudioFormat.CHANNEL_CONFIGURATION_MONO,
>                                 AudioFormat.ENCODING_PCM_16BIT);
>
>                 arec = new AudioRecord(MediaRecorder.AudioSource.MIC,
>                                 11025,
>                                 AudioFormat.CHANNEL_CONFIGURATION_MONO,
>                                 AudioFormat.ENCODING_PCM_16BIT,
>                                 buffersize);
>
>                 atrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL,
>                                 11025,
>                                 AudioFormat.CHANNEL_CONFIGURATION_MONO,
>                                 AudioFormat.ENCODING_PCM_16BIT,
>                                 buffersize,
>                                 AudioTrack.MODE_STREAM);
>
>                 atrack.setPlaybackRate(11025);
>
>                 byte[] buffer = new byte[buffersize];
>                 arec.startRecording();
>                 atrack.play();
>
>                 while (isRecording) {
>                         arec.read(buffer, 0, buffersize);
>                         atrack.write(buffer, 0, buffer.length);
>                 }
>         }
> }
>
> So, as you can see, in the creation of the AudioTrack and AudioRecord
> the encoding is supplied via AudioFormat, but this only allows 16-bit
> or 8-bit PCM.
>
> I now have my own G711 codec implementation, and I want to be able to
> encode the audio from the mic and decode it going into the earpiece.
> I have encode(short lin[], int offset, byte enc[], int frames) and
> decode(byte enc[], short lin[], int frames) methods, but I'm unsure
> how to use them to encode and then decode the audio from the
> AudioRecord and AudioTrack.
>
> Can anyone help me or point me in the right direction?
