I just read the whole discussion.  All I can say is... What were they
thinking??  How is a developer supposed to generate or synthesize
audio in real time?  These are important apps.  Just look at the
guitar or harmonica apps for the iPhone.  Those are very popular
and do exactly that.

I'm just dumbfounded that the API developers don't provide such basic
functionality as setting a byte stream datasource for a media player.
Media players are also inadequate for games beyond providing the
background music.  Games often have small samples that are played
frequently and need to be layered on top of each other.  Maintaining a
media player pool seems very heavy and cumbersome just to achieve
this, not to mention you could end up with 90 media players if you
have 30 different samples and estimate that up to 3 of each could play
at once.
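To make that bookkeeping concrete, here is a minimal sketch in plain Java of the per-sample pooling a game would have to maintain just to layer sounds.  The names (`PlayerPool`, `Player`) and the voice counts are purely illustrative, and a stub interface stands in for the real `MediaPlayer` class:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Supplier;

// Stub standing in for android.media.MediaPlayer (illustrative only).
interface Player {
    boolean isPlaying();
    void start();
}

// Tracks up to maxVoices player instances per sample id, creating one
// on demand: 30 samples x 3 voices means up to 90 live players.
class PlayerPool {
    private final Map<Integer, List<Player>> pools = new HashMap<>();
    private final int maxVoices;
    private final Supplier<Player> factory;

    PlayerPool(int maxVoices, Supplier<Player> factory) {
        this.maxVoices = maxVoices;
        this.factory = factory;
    }

    // Returns true if a free or newly created voice was started.
    boolean play(int sampleId) {
        List<Player> voices =
                pools.computeIfAbsent(sampleId, k -> new ArrayList<>());
        for (Player p : voices) {
            if (!p.isPlaying()) {
                p.start();
                return true;
            }
        }
        if (voices.size() < maxVoices) {
            Player p = factory.get();
            voices.add(p);
            p.start();
            return true;
        }
        return false;  // all voices for this sample are busy
    }

    int totalPlayers() {
        int n = 0;
        for (List<Player> v : pools.values()) n += v.size();
        return n;
    }
}
```

Even this toy version has to juggle per-sample voice lists and idle detection; with real `MediaPlayer` objects you would also be paying their full decode/playback machinery for every one of those 90 instances.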

Another problem: in racing, flying, and other games that have an
"engine" sound, it's not possible to make the pitch of the engine
match the speed shown in the game without being able to change the
sampling rate of the sound.  I used SoundPool for that because it
supports changing the sampling rate during playback, and it works
correctly for me right now.  Whoever designed SoundPool definitely
knew what games need; it's the right kind of API for that.  But we
still can't produce our own stream of audio on the fly and play it,
which limits what kinds of apps we can write and the general quality
of the audio in games and other applications.
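The speed-to-pitch mapping itself is simple.  A sketch in plain Java (the helper class and parameter names are illustrative, not part of any Android API; the 0.5x-2.0x clamp matches the rate range SoundPool's play()/setRate() are documented to accept):

```java
// Illustrative helper (not an Android API): maps engine speed as a
// fraction of top speed to a playback-rate multiplier for a looping
// engine sample, clamped to SoundPool's 0.5x-2.0x rate range.
class EnginePitch {
    static float engineRate(float speedFraction, float idleRate, float maxRate) {
        // Clamp the input so reverse/overspeed can't produce odd rates.
        float clamped = Math.max(0f, Math.min(1f, speedFraction));
        // Linear interpolation between the idle and top-speed rates.
        float rate = idleRate + (maxRate - idleRate) * clamped;
        // Keep the result inside SoundPool's supported range.
        return Math.max(0.5f, Math.min(2.0f, rate));
    }
}
```

In a game's update loop this would drive a call like `soundPool.setRate(streamId, EnginePitch.engineRate(speed / maxSpeed, 0.5f, 2.0f))`, assuming the SoundPool rate-change call the post relies on.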

Originally I was thinking I'd go the old-school route on the sound in
my game: mix the audio myself in an update routine and constantly feed
a sound buffer.  But there is no option for this.
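The mixing itself is the easy part; what's missing is anywhere to write the result.  A sketch of such a routine in plain Java (16-bit PCM, class name illustrative) shows what feeding a raw buffer would look like if the SDK exposed one:

```java
// Mixes several 16-bit PCM source buffers into one output frame,
// summing with saturation so overlapping samples clip instead of
// wrapping around.
class Mixer {
    static short[] mix(short[][] sources, int frameSize) {
        short[] out = new short[frameSize];
        for (int i = 0; i < frameSize; i++) {
            int sum = 0;  // widen to int so the sum can't overflow
            for (short[] src : sources) {
                if (i < src.length) sum += src[i];
            }
            // Saturate to the 16-bit range instead of wrapping.
            if (sum > Short.MAX_VALUE) sum = Short.MAX_VALUE;
            if (sum < Short.MIN_VALUE) sum = Short.MIN_VALUE;
            out[i] = (short) sum;
        }
        return out;
    }
}
```

With a writable audio buffer, an update routine would just call something like this once per frame and push the result out; without one, the mixed data has nowhere to go.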

Unbelievable.  I find this really discouraging.  It's also very
frustrating that there is a SoundPool API which DOES work in SDK 1.0
(given a few bugs and workarounds) and is suitable for 90% of gaming
applications, as well as soundboard-style apps and surely other
things.  Why would they get rid of this?!

blindfold - Where did you find that it's not supported under 1.0 r1?
I can't find any documentation on whether it's supported, intended for
the future, or recently added and already deprecated.


On Oct 15, 3:01 am, blindfold <[EMAIL PROTECTED]> wrote:
> > it works on both the emulator and a real G1.
>
> So you have access to a real G1. Lucky you!!
>
> > I'm really surprised that there is no access to the audio buffer.
> > How are we supposed to write dynamic audio generation apps?
>
> Unfortunately not supported by Android SDK 1.0 r1. See
>
> http://code.google.com/p/android/issues/detail?id=739
>
> and search this group for discussions on "ByteArrayInputStream".
>
> Regards
>
> On Oct 15, 12:43 am, Robert Green <[EMAIL PROTECTED]> wrote:
>
> > I used it for my game and it works on both the emulator and a real
> > G1.  Just follow my example.
>
>
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~----------~----~----~----~------~----~------~--~---