I do something very similar in my apps (Grand Piano and Opus #1) without
problems on ICS. Sound generation is quite expensive in my case, so I have
problems on slow devices (< 1 GHz / single core / old Android version) but
not on the Galaxy Nexus.
One thing I noticed, however, is that you use the same size ("minSize")
for both the AudioTrack buffer and the audio-generation buffer passed to
processNativeAudio. Although more JNI calls mean more overhead, it might
be better to use a smaller buffer size for the JNI calls, based on
AudioTrack.getNativeFrameCount(), or better yet a divisor of it, so that
the audio-generation thread never has to wait long on the blocking call
to AudioTrack.write().
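For example, something along these lines could pick the chunk size (just a
rough sketch; the helper name chunkBytes and the maxChunks value of 4 are
purely illustrative, and you'd feed the result of
AudioTrack.getMinBufferSize() into it instead of the hardcoded example
value):

```java
public class ChunkSize {
    // Pick a JNI buffer size that evenly divides the AudioTrack buffer,
    // so several short native calls fill one track buffer without drift.
    static int chunkBytes(int trackBufferBytes, int maxChunks) {
        for (int n = maxChunks; n >= 1; n--) {
            if (trackBufferBytes % n == 0) {
                return trackBufferBytes / n;
            }
        }
        return trackBufferBytes; // fall back to one big call per buffer
    }

    public static void main(String[] args) {
        int minSize = 8192; // example AudioTrack.getMinBufferSize() result
        int chunk = chunkBytes(minSize, 4);
        System.out.println(chunk); // 2048: four native calls per buffer
        // The play loop would then call processNativeAudio(byteBuffer, chunk)
        // and myTrack.write(byteArray, 0, chunk) once per iteration.
    }
}
```

That way each AudioTrack.write() blocks for a fraction of the buffer
duration instead of the whole thing.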
On Friday, April 27, 2012 at 11:25:14 AM UTC+2, piezo wrote:
>
> Hello
>
>
> I have a music (sequencer/synth) app in Android Market. One of its
> strengths is that it uses very little resources and plays smoothly
> even on the oldest and cheapest devices and only requires Android
> version 1.6.
>
> However, since the arrival of ICS, I get more and more complaints
> about stuttering playback and a sluggish interface, mainly on Galaxy
> Nexus. The app hasn't changed and still works fine on earlier versions
> of Android.
>
> I wonder what may have changed.
>
> My app does the audio processing in native code and I suspect the way
> data are passed between java and native might be the problem (I still
> use AudioTrack to maintain 1.6 compatibility and because it always
> worked fine):
>
> public class PlayThread extends Thread
> {
>     public void run()
>     {
>         android.os.Process.setThreadPriority(
>                 android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
>
>         int minSize = AudioTrack.getMinBufferSize( 44100,
>                 AudioFormat.CHANNEL_CONFIGURATION_STEREO,
>                 AudioFormat.ENCODING_PCM_16BIT );
>
>         ByteBuffer byteBuffer = ByteBuffer.allocateDirect(minSize);
>         byte[] byteArray = new byte[minSize];
>
>         myTrack = new AudioTrack( AudioManager.STREAM_MUSIC,
>                 44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
>                 AudioFormat.ENCODING_PCM_16BIT,
>                 minSize, AudioTrack.MODE_STREAM);
>
>         myTrack.play();
>
>         while (true)
>         {
>             byteBuffer.position(0);
>             processNativeAudio(byteBuffer, minSize);
>             byteBuffer.position(0);
>             byteBuffer.get(byteArray, 0, minSize);
>             myTrack.write( byteArray, 0, minSize );
>         }
>     }
> }
>
> In the main activity, a new PlayThread is created on startup. The
> native function looks like this:
>
> void Java_com_myapp_processNativeAudio( JNIEnv * env, jobject thiz,
>         jobject buffer, jint buflng )
> {
>     jbyte *jbuffer = (*env)->GetDirectBufferAddress(env, buffer);
>     short *shortBuffer = (short *) jbuffer;
>
>     // processing audio here and writing to shortBuffer
> }
>
>
> Does anyone have similar problems, or even know a workaround?
>
> Thank you.
--
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en