[android-developers] Re: Most accurate method of using SoundPool

2009-03-02 Thread Dave Sparks

You can't get good timing from SoundPool. If you're talking about a
music application where a few msecs of jitter affects the musical
feel, you're going to need to mix your own streams to get the timing
accuracy.

We are planning on adding an API in a future release for music
applications, probably something along the lines of a lightweight ASIO
for mobile devices.
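
For illustration only, a minimal sketch of the kind of stream mixing described above, written against the AudioTrack API that arrived in later SDKs; the mono format, 44.1 kHz rate, and class name are assumptions, not code from this thread:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class PcmMixer {
    private static final int SAMPLE_RATE = 44100;

    // Mix two 16-bit PCM buffers sample-by-sample and push the result to one
    // AudioTrack, so the relative timing of the two sources is sample-accurate.
    public void mixAndPlay(short[] a, short[] b) {
        int bufSize = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize, AudioTrack.MODE_STREAM);
        track.play();

        int len = Math.min(a.length, b.length);
        short[] mixed = new short[len];
        for (int i = 0; i < len; i++) {
            int sum = a[i] + b[i];                            // sum the sources
            if (sum > Short.MAX_VALUE) sum = Short.MAX_VALUE; // clamp to avoid wrap-around
            if (sum < Short.MIN_VALUE) sum = Short.MIN_VALUE;
            mixed[i] = (short) sum;
        }
        track.write(mixed, 0, mixed.length);                  // blocks until queued
    }
}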

On Mar 2, 7:15 am, Sundog sunns...@gmail.com wrote:
 I need a recommendation from someone for the most accurate way of
 timing SoundPool sounds. I can't loop them because what's desired is
 not a loop but very accurate (to the ear at least) timing.

 Thanks for any suggestions.



[android-developers] Re: srcQuality parameter for SoundPool constructor

2009-02-28 Thread Dave Sparks

The parameter is currently ignored. You should use 0 (default) for
now.
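
For reference, a minimal sketch of the three-argument constructor with srcQuality left at 0 as suggested; 'context' and the R.raw.click resource are placeholders, not from this thread:

import android.media.AudioManager;
import android.media.SoundPool;

// maxStreams = 4, streamType = STREAM_MUSIC, srcQuality = 0 (currently ignored)
SoundPool pool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
int clickId = pool.load(context, R.raw.click, 1 /* priority */);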

On Feb 28, 10:48 am, clark clarkd...@gmail.com wrote:
 Can anyone fill me in as to the possible values that can be passed to
 the SoundPool constructor for the srcQuality parameter?

 I see that it is an int, but no description is given in the
 documentation about what values are acceptable and what they even
 mean.  I've seen examples that use 100 for this value and other
 examples that use 0.

 regards,
 ~clark



[android-developers] Re: Recording Audio

2009-02-27 Thread Dave Sparks

That's because it doesn't get registered as a music file. We felt that
most people did not want to have voice messages from friends show up
in the party shuffle playlists. It could be very embarrassing, if you
know what I mean. :)

On Feb 26, 11:06 pm, MMF android...@gmail.com wrote:
 Hi Dave,

 If I record using this Soundrecorder app, it creates a .3gpp file.
 But this file is not getting listed in the Music app.
 I have tried restarting the emulator.
 Any idea what could be the reason for this?

 Thanks
 ~

 On 2/27/09, Dave Sparks davidspa...@android.com wrote:



  I confess, I don't write much Java code (I work on the native media
  framework), so I could be wrong about this. This API looks promising
  though:

  Environment.getDataDirectory()

  Hopefully someone knowledgeable will correct me if I have steered you
  wrong.

  On Feb 26, 2:40 pm, benmccann benjamin.j.mcc...@gmail.com wrote:
   each application has its own private data directory /
   data/app-private/app-package. I believe your working directory is set
   to this directory by default

  Cool.  So it sounds like I should just be able to use a relative path
  from the current location then.  Unfortunately, I'm getting the
  following exception (with no clues as to why start is failing):

  02-26 14:34:55.132: ERROR/AndroidRuntime(164):
  java.lang.RuntimeException: start failed.
  02-26 14:34:55.132: ERROR/AndroidRuntime(164): at
  android.media.MediaRecorder.start(Native Method)

  Here's my code:

  final MediaRecorder recorder = new MediaRecorder();
  recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
  recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
  recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
  recorder.setOutputFile("test.3gpp");
  recorder.prepare();
  recorder.start();

  On Feb 26, 12:14 am, Dave Sparks davidspa...@android.com wrote:

   You can get the path to external storage (e.g. SD card) with
   Environment.getExternalStorageDirectory(). This is world read/
   writable.

   Alternatively, each application has its own private data directory /
   data/app-private/app-package. I believe your working directory is set
   to this directory by default. This is onboard flash, so it will
   survive the user ejecting an SD card. However, there is a limited
   amount to go around, so you don't want to store monster media files
   there.

   On Feb 25, 9:22 pm, benmccann benjamin.j.mcc...@gmail.com wrote:

 setOutputFile() expects a path to where you want the file stored.

Yep, figured that much by the method name, but what's a valid path?  I
mean I'm figuring it's UNIX-like, but other than that I'm in the
dark.  Is there a preferred place for apps to store data?  Are there
certain directories that I have permission to write to?  What
directories exist on the device by default?  It'd be nice for the docs
on data storage to mention any of these
things: http://developer.android.com/guide/topics/data/data-storage.html

 You can take a look at the source to SoundRecorder in the open
 source tree for some working code

Thanks for the example.  This would be a much better example than the
one in the docs that won't compile and implies content must first be
added to a database:
   http://developer.android.com/guide/topics/media/index.html

On Feb 24, 8:03 pm, Dave Sparks davidspa...@android.com wrote:

 setOutputFile() expects a path to where you want the file stored.

 You can take a look at the source to SoundRecorder in the open
 source
 tree for some working code:

http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder...

 On Feb 24, 4:43 pm, benmccann benjamin.j.mcc...@gmail.com wrote:

  Hi,
  I'd like to create an audio recording in Android.  (Actually, I just
  want access to the mic without recording it, but it seems that's not
  supported so I'll have to create a recording and tail the file).
  I'm having a very hard time getting started.  Mostly I'm just
  hoping
  that someone from Google reads this and will update the
  documentation
  because the example won't compile - it looks like it's from some
  previous version of the SDK because there's an error in every
  other
  line.  I made my best guess as to what the usage should be, but I
  keep
  getting a number of different exceptions.
  One question I had is whether I can just specify an arbitrary path
  to
  the MediaRecorder to start recording or whether I have to create an
  entry in the content database.  The JavaDoc for
  MediaRecorder.setOutputFile isn't clear on what it's expecting.

  Thanks,
  Ben

[android-developers] Re: How test the openCORE on Android Emulator?

2009-02-26 Thread Dave Sparks

Please move this question to the android-framework list. This list is
for application developers.

On Feb 25, 10:30 pm, Vishwanatha S vishy.s1...@gmail.com wrote:
 Dear All

 I am trying to integrate my codec on the ARM side using my own OMX core.
 Now I would like to test it on the Android Emulator using the Eclipse
 and Android SDK. I would like to know,

 1. Tool chain used to compile openCore
 2. How to set up Android Emulator to test my latest openCore (after
 integrating the Codec)?

 Thank you for the help

 Regards,
 -Vishwa



[android-developers] Re: Recording Audio

2009-02-26 Thread Dave Sparks

You can get the path to external storage (e.g. SD card) with
Environment.getExternalStorageDirectory(). This is world read/
writable.

Alternatively, each application has its own private data directory /
data/app-private/app-package. I believe your working directory is set
to this directory by default. This is onboard flash, so it will
survive the user ejecting an SD card. However, there is a limited
amount to go around, so you don't want to store monster media files
there.
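
A minimal sketch (not from the thread) that combines this advice with the MediaRecorder snippet quoted below, using an absolute path on external storage; the file name is illustrative:

import java.io.File;
import android.media.MediaRecorder;
import android.os.Environment;

File out = new File(Environment.getExternalStorageDirectory(), "test.3gpp");
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(out.getAbsolutePath());   // absolute path, not a relative name
recorder.prepare();                              // throws IOException; wrap in try/catch
recorder.start();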

On Feb 25, 9:22 pm, benmccann benjamin.j.mcc...@gmail.com wrote:
  setOutputFile() expects a path to where you want the file stored.

 Yep, figured that much by the method name, but what's a valid path?  I
 mean I'm figuring it's UNIX-like, but other than that I'm in the
 dark.  Is there a preferred place for apps to store data?  Are there
 certain directories that I have permission to write to?  What
 directories exist on the device by default?  It'd be nice for the docs
 on data storage to mention any of these 
 things: http://developer.android.com/guide/topics/data/data-storage.html

  You can take a look at the source to SoundRecorder in the open source tree 
  for some working code

 Thanks for the example.  This would be a much better example than the
 one in the docs that won't compile and implies content must first be
 added to a database:  
 http://developer.android.com/guide/topics/media/index.html

 On Feb 24, 8:03 pm, Dave Sparks davidspa...@android.com wrote:

  setOutputFile() expects a path to where you want the file stored.

  You can take a look at the source to SoundRecorder in the open source
  tree for some working code:

 http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder...

  On Feb 24, 4:43 pm, benmccann benjamin.j.mcc...@gmail.com wrote:

   Hi,
   I'd like to create an audio recording in Android.  (Actually, I just
   want access to the mic without recording it, but it seems that's not
   supported so I'll have to create a recording and tail the file).
   I'm having a very hard time getting started.  Mostly I'm just hoping
   that someone from Google reads this and will update the documentation
   because the example won't compile - it looks like it's from some
   previous version of the SDK because there's an error in every other
   line.  I made my best guess as to what the usage should be, but I keep
   getting a number of different exceptions.
   One question I had is whether I can just specify an arbitrary path to
   the MediaRecorder to start recording or whether I have to create an
   entry in the content database.  The JavaDoc for
   MediaRecorder.setOutputFile isn't clear on what it's expecting.

   Thanks,
   Ben



[android-developers] Re: SoundPool working correctly yet?

2009-02-26 Thread Dave Sparks

SoundPool issues were fixed in Cupcake. The fixes were dependent on
other changes to the audio system and it was considered too risky to
port those fixes back to the 1.x branch. We haven't released a Cupcake
SDK yet.

Others have had success with SoundPool by setting the maxStreams to a
large value. This avoids the most egregious deadlock issue, although
there are other issues that may result in ANRs depending on your use
case. It also requires that you do your own stream management.
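
A rough sketch of that workaround: a generous maxStreams value plus explicit stream management; 'context' and R.raw.laser are placeholders, not from this thread:

import android.media.AudioManager;
import android.media.SoundPool;

SoundPool pool = new SoundPool(16 /* deliberately large */, AudioManager.STREAM_MUSIC, 0);
int soundId = pool.load(context, R.raw.laser, 1);
// Keep the returned stream ID and stop sounds yourself instead of letting
// SoundPool steal a stream when it runs out.
int streamId = pool.play(soundId, 1.0f, 1.0f, 1, 0, 1.0f);
// ... later:
pool.stop(streamId);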

On Feb 24, 9:49 pm, clark clarkd...@gmail.com wrote:
 There I go again with my assumptions.  I was basically assuming the
 next SDK was 1.1r1 based on the following 
 thread http://groups.google.com/group/android-developers/browse_thread/threa...
 where Dave mentioned it would be addressed.  Anyhow thanks for the
 response and perhaps Dave can provide some insight or an update.

 ~clark

 On Feb 24, 9:20 pm, Romain Guy romain...@google.com wrote:

  I don't know if it's safe to make such an assumption. I'm just saying
  that you cannot make an assumption about which SDK was the next SDK
  :)) And I know next to nothing about SoundPool so i can't tell you
  more unfortunately.

  On Tue, Feb 24, 2009 at 9:18 PM, clark clarkd...@gmail.com wrote:

   Thanks for the quick reply Romain Guy.  So is it safe to assume that
   SoundPool was not updated in the 1.1r1/RC33 update?  I don't mean to
   be pedantic about it but I just want to make sure I'm reading into
   your reply correctly.

   Thanks again,
   ~clark

   On Feb 24, 9:14 pm, Romain Guy romain...@google.com wrote:
   The next SDK might actually be cupcake. Not all fixes went into SDK
   1.1r1/rc33 since the release of 1.0. SDK 1.1 is actually a rather
   limited update with very specific fixes.

   On Tue, Feb 24, 2009 at 9:11 PM, clark clarkd...@gmail.com wrote:

I've recently started working with the SoundPool class in SDK 1.1R1,
and noticed that my app deadlocks from time to time, or sounds are not
loaded.  I remember reading, back in November, that this was to be
addressed in the next SDK.

I've used pretty much the same code that others have used and most of
the time it works for a bit.  Just a minute ago, the app completely
crashed with no Force Close and returned to the home screen.
Logcat reveals that HeapWorker is wedged: 26577ms spent inside
Landroid/media/SoundPool;.finalize()V

I'm not sure if this is the same issue others had with SoundPool or if
I have encountered an entirely new beast.

Regards,
~clark

   --
   Romain Guy
   Android framework engineer
   romain...@android.com

   Note: please don't send private questions to me, as I don't have time
   to provide private support.  All such questions should be posted on
   public forums, where I and others can see and answer them

  --
  Romain Guy
  Android framework engineer
  romain...@android.com

  Note: please don't send private questions to me, as I don't have time
  to provide private support.  All such questions should be posted on
  public forums, where I and others can see and answer them



[android-developers] Re: Recording Audio

2009-02-26 Thread Dave Sparks

I confess, I don't write much Java code (I work on the native media
framework), so I could be wrong about this. This API looks promising
though:

Environment.getDataDirectory()

Hopefully someone knowledgeable will correct me if I have steered you
wrong.
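
For what it's worth, a small sketch (not Dave's suggestion verbatim): from inside an Activity the app's private directory is easiest to reach through the Context, e.g.:

import java.io.File;

// Inside an Activity or other Context:
File privateDir = getFilesDir();                     // e.g. /data/data/<package>/files
File outFile = new File(privateDir, "test.3gpp");
recorder.setOutputFile(outFile.getAbsolutePath());   // 'recorder' as in the quoted snippet below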

On Feb 26, 2:40 pm, benmccann benjamin.j.mcc...@gmail.com wrote:
  each application has its own private data directory /
  data/app-private/app-package. I believe your working directory is set
  to this directory by default

 Cool.  So it sounds like I should just be able to use a relative path
 from the current location then.  Unfortunately, I'm getting the
 following exception (with no clues as to why start is failing):

 02-26 14:34:55.132: ERROR/AndroidRuntime(164):
 java.lang.RuntimeException: start failed.
 02-26 14:34:55.132: ERROR/AndroidRuntime(164): at
 android.media.MediaRecorder.start(Native Method)

 Here's my code:

 final MediaRecorder recorder = new MediaRecorder();
 recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
 recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
 recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
 recorder.setOutputFile("test.3gpp");
 recorder.prepare();
 recorder.start();

 On Feb 26, 12:14 am, Dave Sparks davidspa...@android.com wrote:

  You can get the path to external storage (e.g. SD card) with
  Environment.getExternalStorageDirectory(). This is world read/
  writable.

  Alternatively, each application has its own private data directory /
  data/app-private/app-package. I believe your working directory is set
  to this directory by default. This is onboard flash, so it will
  survive the user ejecting an SD card. However, there is a limited
  amount to go around, so you don't want to store monster media files
  there.

  On Feb 25, 9:22 pm, benmccann benjamin.j.mcc...@gmail.com wrote:

setOutputFile() expects a path to where you want the file stored.

   Yep, figured that much by the method name, but what's a valid path?  I
   mean I'm figuring it's UNIX-like, but other than that I'm in the
   dark.  Is there a preferred place for apps to store data?  Are there
   certain directories that I have permission to write to?  What
   directories exist on the device by default?  It'd be nice for the docs
   on data storage to mention any of these 
    things: http://developer.android.com/guide/topics/data/data-storage.html

You can take a look at the source to SoundRecorder in the open source 
tree for some working code

   Thanks for the example.  This would be a much better example than the
   one in the docs that won't compile and implies content must first be
   added to a database:  
   http://developer.android.com/guide/topics/media/index.html

   On Feb 24, 8:03 pm, Dave Sparks davidspa...@android.com wrote:

setOutputFile() expects a path to where you want the file stored.

You can take a look at the source to SoundRecorder in the open source
tree for some working code:

   http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder...

On Feb 24, 4:43 pm, benmccann benjamin.j.mcc...@gmail.com wrote:

 Hi,
 I'd like to create an audio recording in Android.  (Actually, I just
 want access to the mic without recording it, but it seems that's not
 supported so I'll have to create a recording and tail the file).
 I'm having a very hard time getting started.  Mostly I'm just hoping
 that someone from Google reads this and will update the documentation
 because the example won't compile - it looks like it's from some
 previous version of the SDK because there's an error in every other
 line.  I made my best guess as to what the usage should be, but I keep
 getting a number of different exceptions.
 One question I had is whether I can just specify an arbitrary path to
 the MediaRecorder to start recording or whether I have to create an
 entry in the content database.  The JavaDoc for
 MediaRecorder.setOutputFile isn't clear on what it's expecting.

 Thanks,
 Ben



[android-developers] Re: Recording Audio

2009-02-24 Thread Dave Sparks

setOutputFile() expects a path to where you want the file stored.

You can take a look at the source to SoundRecorder in the open source
tree for some working code:

http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder.git;a=summary

On Feb 24, 4:43 pm, benmccann benjamin.j.mcc...@gmail.com wrote:
 Hi,
 I'd like to create an audio recording in Android.  (Actually, I just
 want access to the mic without recording it, but it seems that's not
 supported so I'll have to create a recording and tail the file).
 I'm having a very hard time getting started.  Mostly I'm just hoping
 that someone from Google reads this and will update the documentation
 because the example won't compile - it looks like it's from some
 previous version of the SDK because there's an error in every other
 line.  I made my best guess as to what the usage should be, but I keep
 getting a number of different exceptions.
 One question I had is whether I can just specify an arbitrary path to
 the MediaRecorder to start recording or whether I have to create an
 entry in the content database.  The JavaDoc for
 MediaRecorder.setOutputFile isn't clear on what it's expecting.

 Thanks,
 Ben



[android-developers] Re: m4v files in android

2009-02-18 Thread Dave Sparks

No, I don't believe OpenCore can play raw m4v files, only
containerized mp4 and m4a files.

On Feb 18, 6:17 am, Dilli dilliraomca...@gmail.com wrote:
 Hi all

 I am developing a simple application to play m4v files

 While trying to play the m4v files, it shows this exception:

 E/QCvdecH264(   31): get_parameter: unknown param 0ff7a347
 W/QCvdec  (   31): vdec: opened
 W/QCvdec  (   31): VDL_Configure_HW: Interface Not supported
 E/QCvdec  (   31): Driver Layer hardware config failed with error code
 7
 W/QCvdec  (   31): error - H264Decoder::InitializeDecInternal()
 failed!!
 W/QCvdec  (   31): There is no input node available
 E/QCvdec  (   31): partner/qct/proprietary/libOmxH264Dec/
 vdecoder_i.cpp:952 *** ERROR ASSERT(0)
 W/(   31): [vdec_core] vdec_queue error: 5
 W/(   31): [vdec_core] frame buffer malloc failed, index: 8
 W/QCvdec  (   31): Unable to allocate buffers (out of memory)
 W/QCvdec  (   31): VDL_Configure_HW: Interface Not supported
 E/QCvdec  (   31): Driver Layer hardware config failed with error code
 7
 W/QCvdec  (   31): error - H264Decoder::InitializeDecInternal()
 failed!!

 Does Android support m4v files?

 need help

 Thank you
 Dilli



[android-developers] Re: com.google.android.voicesearch

2009-02-18 Thread Dave Sparks

Voice search uses a private Google API optimized for search queries.
We are not making those APIs public at this time.

On Feb 18, 5:10 am, Rob Franz rob.fr...@gmail.com wrote:
 Ok let me ask differently then...is it possible to access the part that
 takes the incoming speech, analyzes the audio, and forms words/sentences?

 On Wed, Feb 18, 2009 at 7:28 AM, Mike Hearn mh.in.engl...@gmail.com wrote:

  Voice search isn't a generic transcriber. It's optimized for search
  queries specifically, so I'm not sure what use an API would be.



[android-developers] Re: How to use AsyncPlayer ()??? Uri?

2009-02-18 Thread Dave Sparks

AsyncPlayer is just a helper class on top of MediaPlayer. It is not
going to help you with gapless playback.

I can see how the statement "plays a series of audio URIs" is causing
confusion. It does not maintain a list of URIs to play. It just plays
a file until it reaches the end or it is told to play something else.

On Feb 18, 9:14 am, Moto medicalsou...@gmail.com wrote:
 Dave thanks for your response...
 Now, I really do believe, unless I'm misinterpreting this class, that
 this particular class would help me achieve gap-less playback for
 multiple files since it runs on it's own thread.
 As it says Plays a series of audio URIs which I like to pass a few
 links on the URI? if that's possible?, than
 does all the hard work on another thread so that any slowness with
 preparing or loading doesn't block the calling thread tells me that
 the initial issue with MediaPlayer using the prepare parameter which
 takes a long time therefore giving gaps between file playback, will
 solve this issue...

 Would you agree? or am I way off? if this isn't an option, would you
 happen to suggest a way to achieve gap-less playback using multiple
 files?

 Thanks!
 Moto!



[android-developers] Re: how to pluging our own window manager

2009-02-18 Thread Dave Sparks

This question should be directed to android-framework.

On Feb 18, 12:40 am, Nasam forum.nami...@gmail.com wrote:
 hi

 I was going through the window manager of Android. I wonder if it is
 possible to replace this window manager with our own window manager. Is
 this supported? What dependencies are there for this?

 please answer this question if anybody knows
 thanks in advance
 NASAM



[android-developers] Re: Mediaplayer with streaming

2009-02-17 Thread Dave Sparks

There is no way to synchronize the two players so that they start and
stop synchronously. It involves multiple IPC calls - the best you can
do is probably +/- 100 msecs.

We plan to support shoutcast and improve streaming in general in a
future release.
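
For reference, a rough sketch of the dual-player handoff described in the quoted message below; prepareNextChunk() is a hypothetical helper, and as noted above the swap still crosses process boundaries, so some gap is unavoidable:

import android.media.MediaPlayer;

playerA.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    public void onCompletion(MediaPlayer finished) {
        playerB.start();                 // assumes playerB was already prepare()'d
        prepareNextChunk(finished);      // hypothetical: load the next 100 KB file into the idle player
    }
});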

On Feb 16, 11:13 pm, Dilli dilliraomca...@gmail.com wrote:
 Hi all

 I am developing a media player which will download the media content
 from
 network (Shoutcast)

 It works fine. The problem is a small gap between the two players.

 I am using two MediaPlayers to play the data, while a thread downloads
 the data and stores it in 100 KB files continuously.

 On the first player's onCompletionListener() I start the second player,
 and vice versa.

 I think there is a problem with mp3 frames (header sync byte) when
 splitting the 100 KB files.

 How do I resolve the gap between players?

 Any suggestions?

 Thank U
 Dilli



[android-developers] Re: Does Camera API really works ?

2009-02-17 Thread Dave Sparks

This code shouldn't even compile, this line is wrong:

mCamera.takePicture(null, mPictureCallback); // requires 3 callback
functions

Camera is only supported in landscape mode. Cupcake release will
unofficially support portrait mode (there will be no API for it, but
I'll probably put some sample code out that shows how to do it).
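
For reference, the 1.x takePicture() signature takes three callbacks (shutter, raw, JPEG); unused ones can be null, so the quoted line would become something like:

mCamera.takePicture(null /* shutter */, null /* raw */, mPictureCallback /* jpeg */);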

On Feb 16, 9:35 pm, cindy ypu01...@yahoo.com wrote:
 I tried Google's camera API sample code. I found there are 2 problems:
 1. It can only take pictures in landscape orientation.
 2. After clicking the space key, the application crashes. Wow :)

 XML:
 <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
 android:layout_width="fill_parent"
 android:layout_height="fill_parent"
 android:orientation="vertical">
 <SurfaceView android:id="@+id/surface"
 android:layout_width="fill_parent"
 android:layout_height="10dip"
 android:layout_weight="1">
 </SurfaceView>
 </LinearLayout>
 Java code:
 /**
  * Copyright (c) 2007, Google Inc.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
  *http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing,
 software
  * distributed under the License is distributed on an "AS IS" BASIS,
  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
 implied.
  * See the License for the specific language governing permissions
 and
  * limitations under the License.
  */

 package com.android.cameraapitest;

 import android.app.Activity;
 import android.content.Intent;
 import android.graphics.Canvas;
 import android.graphics.Paint;
 import android.graphics.PixelFormat;
 import android.graphics.Rect;
 import android.net.Uri;
 import android.os.Handler;
 import android.os.Message;
 import android.os.Bundle;
 import android.provider.MediaStore.Images;
 import android.provider.MediaStore.Video;
 import android.view.Menu;
 import android.view.MenuItem;
 import android.view.SurfaceHolder;
 import android.view.SurfaceView;
 import android.view.KeyEvent;
 import android.hardware.Camera;

 import android.util.Log;

 public class CameraApiTest extends Activity implements
 SurfaceHolder.Callback
 {
 private static final String TAG = "CameraApiTest";
 SurfaceView mSurfaceView;
 SurfaceHolder mSurfaceHolder;
 Camera mCamera;
 boolean mPreviewRunning = false;

 public void onCreate(Bundle icicle)
 {
 super.onCreate(icicle);

 Log.e(TAG, "onCreate");

 getWindow().setFormat(PixelFormat.TRANSLUCENT);

 setContentView(R.layout.camera_api_test);
 mSurfaceView = (SurfaceView)findViewById(R.id.surface);

 mSurfaceHolder = mSurfaceView.getHolder();
 mSurfaceHolder.addCallback(this);
 mSurfaceHolder.setType
 (SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
 }

 public boolean onCreateOptionsMenu(android.view.Menu menu) {
 MenuItem item = menu.add(0, 0, 0, "goto gallery");
 item.setOnMenuItemClickListener(new
 MenuItem.OnMenuItemClickListener() {
 public boolean onMenuItemClick(MenuItem item) {
 Uri target = Uri.parse("content://media/external/images/media");
 Intent intent = new Intent(Intent.ACTION_VIEW,
 target);
 startActivity(intent);
 return true;
 }
 });
 return true;
 }

 @Override
 protected void onRestoreInstanceState(Bundle savedInstanceState)
 {
 super.onRestoreInstanceState(savedInstanceState);
 }

 Camera.PictureCallback mPictureCallback = new
 Camera.PictureCallback() {
 public void onPictureTaken(byte[] data, Camera c) {
 Log.e(TAG, "PICTURE CALLBACK: data.length = " + data.length);
 mCamera.startPreview();
 }
 };

 public boolean onKeyDown(int keyCode, KeyEvent event)
 {
 if (keyCode == KeyEvent.KEYCODE_BACK) {
 return super.onKeyDown(keyCode, event);
 }

 if (keyCode == KeyEvent.KEYCODE_SPACE) {
 mCamera.takePicture(null, mPictureCallback);
 return true;
 }

 return false;
 }

 protected void onResume()
 {
 Log.e(TAG, "onResume");
 super.onResume();
 }

 protected void onSaveInstanceState(Bundle outState)
 {
 super.onSaveInstanceState(outState);
 }

 protected void onStop()
 {
 Log.e(TAG, "onStop");
 super.onStop();
 }

 public void surfaceCreated(SurfaceHolder holder)
 {
 Log.e(TAG, "surfaceCreated");
 mCamera = Camera.open();
 //mCamera.startPreview();
 }

 public void surfaceChanged(SurfaceHolder holder, int format, int
 w, int h)
 {
 Log.e(TAG, "surfaceChanged");

 // XXX stopPreview() will crash if preview is not running
 if (mPreviewRunning) {
 mCamera.stopPreview();
 }

 

[android-developers] Re: How to force MediaStore to rescan the SD card

2009-02-17 Thread Dave Sparks

I'm not sure about the process of removing a file from the database. I
suggest looking at the Music Player source - it has an option to
delete files from the SD card.

Under what circumstances would files be deleted without mounting and
re-mounting the SD card? That sounds like a poorly behaved app.
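
Not from the thread, but one plausible way to drop a stale row (the Music app does something along these lines); the path and the choice of the audio table are assumptions:

import android.content.ContentResolver;
import android.provider.MediaStore;

ContentResolver resolver = context.getContentResolver();    // 'context' assumed
resolver.delete(MediaStore.Audio.Media.EXTERNAL_CONTENT_URI,
        MediaStore.Audio.Media.DATA + "=?",
        new String[] { "/sdcard/music/deleted-file.mp3" }); // hypothetical path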

On Feb 17, 6:03 pm, info.sktechnol...@gmail.com
info.sktechnol...@gmail.com wrote:
 Thanks, this solves my first problem.
 Now what if media files are deleted?
 What if I do not know which ones have been deleted?
 Is there a way to have it rescan the entire SD card in the
 way it must do if the card has been removed and reinserted?

 On Feb 14, 2:01 pm, Dave Sparks davidspa...@android.com wrote:

  You want something like this in your activity:

  import android.media.MediaScannerConnection;
  import android.media.MediaScannerConnection.MediaScannerConnectionClient;

  private static class MediaScannerNotifier implements
  MediaScannerConnectionClient {
  private Context mContext;
  private MediaScannerConnection mConnection;
  private String mPath;
  private String mMimeType;

  public MediaScannerNotifier(Context context, String path, String
  mimeType) {
  mContext = context;
  mPath = path;
  mMimeType = mimeType;
  mConnection = new MediaScannerConnection(context, this);
  mConnection.connect();
  }

  public void onMediaScannerConnected() {
  mConnection.scanFile(mPath, mMimeType);
  }

  public void onScanCompleted(String path, Uri uri) {
  // OPTIONAL: scan is complete, this will cause the viewer to
  render it
  try {
  if (uri != null) {
  Intent intent = new Intent(Intent.ACTION_VIEW);
  intent.setData(uri);
  mContext.startActivity(intent);
  }
  } finally {
  mConnection.disconnect();
  mContext = null;
  }
  }

  }

  To scan a file, you just create a new MediaScannerNotifier:

  new MediaScannerNotifier(context, path, mimeType);

  On Feb 14, 9:45 am, kolby kolbys...@gmail.com wrote:

   You can make an android.media.MediaScannerConnection, connect to it,
   and provide a client to scan a directory.

   Michael

   On Feb 14, 7:05 am, info.sktechnol...@gmail.com

   info.sktechnol...@gmail.com wrote:
If I progammatically store new media files on the SD card, the
MediaStore does not know about them until I remove and reinsert the SD
card.  Is there a way to tell the MediaStore to rescan the SD card
 without first unmounting the SD card?



[android-developers] Re: Integrating my external library

2009-02-17 Thread Dave Sparks

This topic has been covered many times. See this thread for one
example:

http://groups.google.com/group/android-developers/browse_thread/thread/d68364976e5d98ff/733eea4a1195527e?lnk=gstq=native+support#733eea4a1195527e


On Feb 17, 10:09 pm, Android Groups wqhl.m...@gmail.com wrote:
 I'm also facing the same issue and would appreciate some help from the forum.

 On Feb 15, 7:21 pm, Ashutosh Agrawal ashuto...@lge.com wrote:

  Hi,

  Though I have gone through the available documentation and the source code,
  I am still somewhat unclear about the approach to follow for integrating my
  external library (a daemon/service written in C) into the Android
  architecture. I think the following main actions need to be done:

  1.  Port and Make the service as an external library with Android tool
  chain
  2.  Write an application framework for connecting application world to
  my external library i.e. providing Java API interface and mapping the
  corresponding C APIs by JNI
  3.  Write a Java application according to the Android analogy and use my
  application framework (2).

  I would appreciate if somebody from Android team can correct my
  understanding on this, and would provide me the piece of advice.

  Thanks in advance,

  Ashutosh



[android-developers] Re: A safer way of using SoundPool?

2009-02-16 Thread Dave Sparks

I believe we were able to do the fixes for SoundPool without changing
the public API. There are no plans to deprecate it at this time.

On Feb 16, 6:05 am, Blake B. bbuckle...@yahoo.com wrote:
 Great idea, Jon.  Thanks for sharing the code.  I'm about to start
 work on sound in my game, so I'll look into using this.

 Hopefully, Google realizes that many are using SoundPool and will keep
 the existing API in place for a while, even if deprecated, long enough
 to migrate apps.  I think we are all interested in keeping Android's
 good name and not annoying users without need.

 On Feb 16, 2:05 am, Jon Colverson jjc1...@gmail.com wrote:

  On Feb 16, 6:25 am, Marco Nelissen marc...@android.com wrote:

   Why do you say it's not a public API?

  SoundPool is undocumented because it is not ready as a public API and
  is subject to change.

 http://groups.google.com/group/android-developers/msg/6c360f2d0662be0a

  --
  Jon



[android-developers] Re: On android.media.MediaRecorder, can it record the voice of speaker?

2009-02-16 Thread Dave Sparks

You might be able to get away with recording the mic - I haven't
checked. I'm pretty sure if you change the sample rate to something
other than 8KHz, it will kill uplink audio.

We may be able to support access to uplink and downlink audio in a future
release (not in Cupcake or the release after that).

On Feb 16, 4:18 am, Shawn_Chiu qiuping...@gmail.com wrote:
 Hello, Dave
 Thanks for your reply.
 I do agree with you that the HTC G1 cannot record uplink or downlink
 audio, but I wrote an application myself to record audio while a
 conversation is in progress. It can record the voice from the microphone
 whether the AudioSource is MIC or DEFAULT. Actually, they do the
 same thing.
 Would other sources, such as the speaker or both speaker and mic, be
 supported in a future release?
 BR
 Shawn

 On Feb 16, 7:12 am, Dave Sparks davidspa...@android.com wrote:

  The G1 does not support recording uplink or downlink audio.

  On Feb 14, 6:20 am, Shawn_Chiu qiuping...@gmail.com wrote:

   Hello, buddies
   It's about android.media.MediaRecorder.
   I want to implement a feature to record the conversation, both voice
   from speaker and microphone. But I tried on G1 phone, the
  MediaRecorderonly could record the voice from microphone. I looked
   into the Android doc, there are only two audio sources, MIC and DEFAULT
   (either to record microphone). So these are my questions:
   1. whether can I implement it or not?
   2. will more audio sources be supported? If so, which release or when?
   Thank you:-0
   Shawn



[android-developers] Re: Audio stops after some time in android from video/Music player

2009-02-16 Thread Dave Sparks

Please don't cross-post. This question isn't appropriate for the
application developer forum.

On Feb 16, 4:01 am, getandroid sampath...@gmail.com wrote:
 Hi,

As mentioned audio stops after some random number of times when
 played from either Music/Video player. After some debugging, I found
 that the problem is:

 In android_audio_output.cpp->audout_thread_func() there is a call to
 wait(iAudioThreadSem->Wait()) just before the while(1) and it is
 waiting indefinitely for something.

 Can anybody tell me what is it waiting for and why is it not able to
 come out? From what I can understand, it is waiting for a signal
 (semaphore) but where is it expecting a signal from?



[android-developers] Re: playing video from browser in sdk1.1

2009-02-16 Thread Dave Sparks

This list is for application developers. Please post questions about
source code in one of the open source forums (android-porting, android-
platform, or android-framework).

Short answer: There are no plans to publish source to any of the
Google properties at this time.

On Feb 15, 8:10 pm, Harishkumar V harishpres...@gmail.com wrote:
 Dear All,

 I have downloaded android source from android git tree and built it, loaded
 the new images into emulator.
 but when i opened up the browser and browse throughhttp://youtube.com, only
 the blank white screen appears.
 i am able to browse other websites.http://m.youtube.comworks fine, videos are 
 listed and if i try to play
 video, the same error, it loads and does not play video.

 is youtube app available in the current source.
 any link or reference available abt youtube player in the android source or
 how to build it in from the source.

 In frameworks/base/packages/SettingsProvider/etc/bookmarks.xml, the
 com.google.android.youtube was mentioned. how to get the package.

 HarishKumar.V



[android-developers] Re: Can MediaPlayer play .wav fomat audio?

2009-02-16 Thread Dave Sparks

The media player can play 16-bit WAVE files, but only if the format
type is PCM and not the extended format type. I've been meaning to fix
the OpenCore WAVE parser to handle extended format, but it's not a
high priority right now.
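
A quick way to tell the two apart (not from the thread; assumes a canonical RIFF layout where the fmt chunk starts at byte offset 12):

import java.io.FileInputStream;
import java.io.IOException;

static int waveFormatTag(String path) throws IOException {
    FileInputStream in = new FileInputStream(path);
    try {
        byte[] header = new byte[22];
        if (in.read(header) < header.length) return -1;
        // Bytes 20-21 hold the little-endian format tag:
        // 1 = plain PCM (supported), 0xFFFE = WAVE_FORMAT_EXTENSIBLE (rejected).
        return (header[20] & 0xFF) | ((header[21] & 0xFF) << 8);
    } finally {
        in.close();
    }
}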

On Feb 16, 6:39 pm, herain herainw...@gmail.com wrote:
 I tried to play a piece of 16-bit .wav audio file, no sound came out
 from my earphone, but .mp3 file would be OK. Do any one else have this
 kind of experience?

 This is the code:

 player = MediaPlayer.create(this, R.raw.test);// I put test.wav
 in /res/raw
 player.start();

 Thanks for any help!



[android-developers] Re: How to use AsyncPlayer ()??? Uri?

2009-02-16 Thread Dave Sparks

I think your confusion probably comes from the phrase "plays a series
of audio URIs". AsyncPlayer is just a simple helper class for
playing audio files that runs on its own thread, instead of on the UI
thread.

The first time you call play, it will start playing a sound. If you
call it a second time while the first sound is playing, it will stop
the first sound and start playing the second one. It's not really
usable for a music player if that's what you had in mind.

On Feb 16, 5:31 pm, Moto medicalsou...@gmail.com wrote:
 Hi all!
 I'm a little way too confused using the AsyncPlayer class.  So here
 are my questions hope someone can help.

 1. URI? how do I set a list of local files to play? can this be done?
 it seems like it can since I think it says that on the API docs.

 http://developer.android.com/reference/android/media/AsyncPlayer.html...)

 2. Do I need to set permission to use URIs? on my android manifest
 file? If yes how do I do this? I got really confused trying to set
 that.

 3. How well does AsyncPlayer work in regards to latency between file
 plays?

 Thanks for your help!
 Moto!



[android-developers] Re: Can MediaPlayer play .wav fomat audio?

2009-02-16 Thread Dave Sparks

If you replace the test.wav file with a test.mp3 file, does it play
OK?

If it does, then it has to be something about the WAVE file that
OpenCore doesn't like. Have you looked in the log?

On Feb 16, 7:25 pm, herain herainw...@gmail.com wrote:
 Thank you for your reply.

 My WAV file was built by Cool Edit, which saved it as a Windows PCM
 format file.
 I guess it doesn't contain the extended format type you mentioned.

 On Feb 17, 10:57 am, Dave Sparks davidspa...@android.com wrote:

  The media player can play 16-bit WAVE files, but only if the format
  type is PCM and not the extended format type. I've been meaning to fix
  the OpenCore WAVE parser to handle extended format, but it's not a
  high priority right now.

  On Feb 16, 6:39 pm, herain herainw...@gmail.com wrote:

   I tried to play a piece of 16-bit .wav audio file, no sound came out
   from my earphone, but .mp3 file would be OK. Do any one else have this
   kind of experience?

   This is the code:

   player = MediaPlayer.create(this, R.raw.test);// I put test.wav
   in /res/raw
   player.start();

   Thanks for any help!



[android-developers] Re: Playing video in android

2009-02-15 Thread Dave Sparks

Are you looking at the log in DDMS? There must be some log output
associated with your application. At the very least, you should see
ActivityManager starting and stopping it.

You can also add some logging to your code.

import android.util.Log;
private static final String TAG = "MyProgram";

Then scatter log statements around the code in various places:

Log.d(TAG, "This will output to the log");

On Feb 14, 11:28 pm, Ash ashwin.disco...@gmail.com wrote:
 thanx for reply... the above code after making changes as mentioned by
 Dave Sparks
 does not show any error...  when i found the application in emulator
 and run it myself...
 i jus see a black screen for a second  and the program exits

 Can u pls help me in correcting the code or by any other video display
 code...
 thank u so much for support and replies...

 On Feb 14, 1:05 pm, Dave Sparks davidspa...@android.com wrote:

  We can't help you if you can't describe the errors you are seeing.
  What is in the log?

  On Feb 14, 10:01 am, Ash ashwin.disco...@gmail.com wrote:

   Thank u so much for the reply but The program does Not execute at
   all.. Can u please help

   On Feb 9, 2:53 am, Dilli dilliraomca...@gmail.com wrote:

 Hi

 i think you fixed the size of video display

  mHolder.setFixedSize(176, 144);

  it may cause the problem of video displaying

 use   mHolder.setFixedSize(mp.getVideoWidth(),
mp.getVideoHeight());

 Regards

On Feb 9, 3:58 am, Ash ashwin.disco...@gmail.com wrote:

 Play video on android
 -

 I'm trying to play video files on android... can anyone please help
 me...
 I'm not able to see the video but audio works fine... here is the code

 PLEASE HELP ME... BY GUIDING ME.. BY CORRECTING THE CODE
 OR WITH ANY NEW CODE

 package com.vi3;

 import java.io.IOException;
 import android.app.Activity;
 import android.os.Bundle;
 import android.content.Context;
 import android.graphics.PixelFormat;
 import android.media.MediaPlayer;
 import android.media.MediaPlayer.OnBufferingUpdateListener;
 import android.media.MediaPlayer.OnCompletionListener;
 import android.media.MediaPlayer.OnErrorListener;
 import android.util.Log;
 import android.view.Menu;
 import android.view.SurfaceHolder;
 import android.view.SurfaceView;
 import android.view.Surface;
 import android.view.Window;
 //import android.view.Menu.Item;

 public class vi3 extends Activity
 {
private static final String LOG_TAG = "|";
private MediaPlayer mp;

private Preview mPreview;
//private myAcListener myListener = new myAcListener()this;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle icicle)
{
   super.onCreate(icicle);
   Log.i(LOG_TAG, "CameraApp.onCreate");
   mPreview = new Preview(this);
   //requestWindowFeature(W);

 //  stopMedia();
// releaseMedia();
   setContentView(R.layout.main);

   //setContentView(mPreview);
   playMedia();

}

private void playMedia(String s_filePath)
{
   setContentView(mPreview);
   //s_filePath = "/tmp/mp4.mp4";
   s_filePath = "/data/local/video/test_qcif_200_aac_64.mp4";
   //s_filePath = "/tmp/test.mpg";
   //s_filePath = "/tmp/3.3gp";
   Log.i(LOG_TAG, "CameraApp.playMedia");
   mp = new MediaPlayer();
   try
   {
  mp.setDataSource(s_filePath);
   }
   catch (IllegalArgumentException e)
   {
  // TODO Auto-generated catch block
  Log.v(LOG_TAG,
 "CameraApp.playMedia:IllegalArgumentException");
  e.printStackTrace();
   }
   catch (IOException e)
   {
  Log.v(LOG_TAG, "CameraApp.playMedia:IOException");
  // TODO Auto-generated catch block
  e.printStackTrace();
   }
   try
   {

  //mp.setDisplay(mPreview.getHolder().getSurface());
  mp.prepare();
  int i = mp.getDuration();
  Log.i(LOG_TAG, "Duration: " + String.valueOf(i));
  mp.start();
   }
   catch (Exception e)
   {
  Log.v(LOG_TAG, e.toString());
  mp.stop();
  mp.release();
   }
   //setContentView(mPreview);
}

private void pauseMedia()
{
   Log.i(LOG_TAG, "CameraApp.pauseMedia");
   if (null != mp)
   {
  mp.pause();
   }
}

private void stopMedia()
{
   Log.i(LOG_TAG, "CameraApp.stopMedia");
   if (null != mp)
   {
  mp.stop();
   }
}
private void releaseMedia()
{
   Log.i(LOG_TAG

[android-developers] Re: On android.media.MediaRecorder, can it record the voice of speaker?

2009-02-15 Thread Dave Sparks

The G1 does not support recording uplink or downlink audio.

On Feb 14, 6:20 am, Shawn_Chiu qiuping...@gmail.com wrote:
 Hello, buddies
 It's about android.media.MediaRecorder.
 I want to implement a feature to record the conversation, both voice
 from speaker and microphone. But I tried on G1 phone, the
 MediaRecorder only could record the voice from microphone. I looked
 into the Android doc, there are only two audio sources, MIC and DEFAULT
 (either to record microphone). So these are my questions:
 1. whether can I implement it or not?
 2. will more audio sources be supported? If so, which release or when?
 Thank you:-0
 Shawn



[android-developers] Re: RTSP and MMS - mediaplayer shenanigans

2009-02-14 Thread Dave Sparks

I'm pretty sure that OpenCore is going to reject the mms URI.

On Feb 13, 8:57 pm, Rob Franz rob.fr...@gmail.com wrote:
 I believe this is WMA on the other end.  Does this present a problem?

 On Feb 13, 2009 11:13 PM, Rob Franz rob.fr...@gmail.com wrote:

 Hi all
 I'm trying to get an RTSP stream going with a verified source - I know
 there's something on the other end in this case.

 However, the format of the URL is like this:

 mms://
 a757.l1265761171.c12657.g.lm.akamaistream.net/D/757/12657/v0001/reflector:61171?auth=daEdLa4adcAc2aoc1bYceclcFcQdfbwckcA-bjLJ8.-b4-NvGjptD

 I admit I'm new to RTSP in general but I think I understand how it
 works - for my purposes, all I need to do is pass the RTSP URL to the
 mediaPlayer.setDataSource() method, prepare it, and start.  That
 should be pretty much it, if I understand correctly.

 No matter what I do, I always get "Prepare failed.:
 status=0xFFFC", etc. I try to modify different things, but I can
 never prepare the stream, and so I can't start it.

 Does the above URL qualify as an rtsp stream (i.e. remove the mms and
 put in rtsp)?

 As I understood it, RTSP obsoleted MMS, and MMS has been completely
 phased out, but apparently some services are still passing out that
 URL.

 Anyone seen anything like this before?

 Thanks



[android-developers] Re: How to force MediaStore to rescan the SD card

2009-02-14 Thread Dave Sparks

You want something like this in your activity:

import android.content.Context;
import android.content.Intent;
import android.media.MediaScannerConnection;
import android.media.MediaScannerConnection.MediaScannerConnectionClient;
import android.net.Uri;

private static class MediaScannerNotifier implements
MediaScannerConnectionClient {
private Context mContext;
private MediaScannerConnection mConnection;
private String mPath;
private String mMimeType;

public MediaScannerNotifier(Context context, String path, String
mimeType) {
mContext = context;
mPath = path;
mMimeType = mimeType;
mConnection = new MediaScannerConnection(context, this);
mConnection.connect();
}

public void onMediaScannerConnected() {
mConnection.scanFile(mPath, mMimeType);
}

public void onScanCompleted(String path, Uri uri) {
// OPTIONAL: scan is complete, this will cause the viewer to
render it
try {
if (uri != null) {
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setData(uri);
mContext.startActivity(intent);
}
} finally {
mConnection.disconnect();
mContext = null;
}
}
}

To scan a file, you just create a new MediaScannerNotifier:

new MediaScannerNotifier(context, path, mimeType);

On Feb 14, 9:45 am, kolby kolbys...@gmail.com wrote:
 You can make an android.media.MediaScannerConnection, connect to it,
 and provide a client to scan a directory.

 Michael

 On Feb 14, 7:05 am, info.sktechnol...@gmail.com

 info.sktechnol...@gmail.com wrote:
  If I progammatically store new media files on the SD card, the
  MediaStore does not know about them until I remove and reinsert the SD
  card.  Is there a way to tell the MediaStore to rescan the SD card
  without first unmounting the SD card?



[android-developers] Re: sound effect using SoundPool

2009-02-14 Thread Dave Sparks

There are a lot of fixes to SoundPool coming in the Cupcake release.

I need to check on the crash you mentioned - I don't recall seeing
that before and it should give you an error, not crash. The range is
dependent on the ratio of the sample rate of the source and the
hardware output.
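
Robert's workaround quoted below boils down to clamping the rate argument before calling play(); a minimal sketch, with 'pool', 'soundId', and 'desiredRate' assumed to exist:

// Keep the playback rate inside SoundPool's safe range before playing.
float rate = Math.max(0.5f, Math.min(1.5f, desiredRate));
pool.play(soundId, 1.0f /* left vol */, 1.0f /* right vol */, 1 /* priority */, 0 /* no loop */, rate);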

On Feb 13, 9:46 am, Robert Green rbgrn@gmail.com wrote:
 One more thing you need to know:

 Soundpool crashes if you change the pitch over 1.5 or under .5.  I
 created multiple engine samples for higher and higher pitches, so no
 matter how high you hear the engine on light racer, it never goes
 above 1.5 playback rate.  Soundpool is actually a really nice API for
 game sound FX and I really hope it doesn't go away because it's
 perfect for that application.  I just hope that things like that are
 fixed in the future.

 On Feb 13, 10:34 am, Marco Nelissen marc...@android.com wrote:

  There is a known problem (fixed in cupcake) with SoundPool that will cause
  it to lock up if you try to play more simultaneous sounds than the capacity
  of the SoundPool. If you specify the capacity of the SoundPool to be higher
  than the largest number of sounds you will ever play at the same time, then
  it should work.

  On Thu, Feb 12, 2009 at 1:11 PM, djp david.perou...@gmail.com wrote:

   Hi,

   there seems to be a dead-lock problem when using sound effects with
    SoundPool. I was hoping that the problem would be fixed in the new
    firmware update, but unfortunately, it was not. The application still
    locks up after a couple of minutes when playing sound effects using
   SoundPool. My application is a game playing a background music
   scenario with multiple simultaneous sound effects.

   Any word from devs as when we can expect stable SoundPool would be
   very welcome.

   Best
   David
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: I want to use mediaplayer to play avi and other file formats

2009-02-13 Thread Dave Sparks

This is not the appropriate list for your questions.

There are lots of threads about this in android-framework. Search for
OMX hardware codecs. There is also a guide to integrating OMX codecs
in the OpenCore project.

On Feb 12, 7:57 pm, susanner zsusan...@163.com wrote:
 Dear all
 Is there any support for AVI and other media formats besides MP4 and
 3GPP?
 If there is no support, may I add support by modifying the code myself?
 Would that be too difficult? I don't know where to start: should I
 modify the code in OpenCore first, or should I just modify the code in
 the frameworks/base/include/media directory? I want a clear picture of
 how I can achieve my goal; I need a roadmap to guide me. Does anyone
 have any suggestions? I am trying to get familiar with SDK 1.0 now.
 Another question: my chip is from Freescale and it supports hardware
 codecs for MPEG-4, H.263 and H.264. I don't know whether I can change
 the code or the architecture of the media player to use my hardware
 codecs instead of the software codecs provided by Android's OpenCore.
 I once ported MPlayer to my platform, and there were problems with
 audio/video sync due to the limits of my chip (the CPU can run the
 software codecs but is not fast enough to keep audio and video in
 sync), so I am worried that when Android is brought up there will be
 the same problem. For this reason, I am wondering whether I can use my
 hardware codecs for the media player.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: What's in raw data from PictureCallback of camera?

2009-02-13 Thread Dave Sparks
. The vector operators
  would not become obsolete because a JIT compiler would in all
  likelihood never reach the efficiency for low level processing
  obtainable through hand-crafted or even hardware accelerated
  implementations of the small set of vector operators under the
  proposed media and signal processing API.

  Of course it remains the responsibility of the Android application
  programmer to make good use of the vector operators to replace the
  most costly Android for-loops and conditional branches.

  Regards

  On Feb 12, 5:29 am, Dave Sparks davidspa...@android.com wrote:

   I think we'll be able to give you something that will meet your needs.
   It's always a balancing act between taking the time to get the API
   just right and getting a product to market.

   Keep making suggestions, we are listening.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: RTSP Video Issues on 1.1 firmware

2009-02-13 Thread Dave Sparks

Can you supply some links so we can try to figure out what's wrong?

On Feb 12, 12:15 pm, jz0o0z floresje...@gmail.com wrote:
 Update: After a little more testing I found some rtsp links that do
 play in 1.1, but other links, which I verified are still active and
 were working before are giving me the error.

 Is there an easy way to revert back to the 1.0 firmware so I can
 confirm that the problem started with 1.1?

 On Feb 11, 7:01 pm, jz0o0z floresje...@gmail.com wrote:

  My application plays RTSP video by launching the browser.  The code
  looks like this, where url is a string for an RTSP video :

  Intent i = new Intent(Intent.ACTION_VIEW, Uri.parse(url));
  startActivity(i);

  This was working fine in 1.0, but when I upgraded to 1.1 I started
  getting this error : ERROR/MediaPlayer(316): Error (-1,0)

  Any ideas?
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Is VideoView Broken?

2009-02-11 Thread Dave Sparks

You can't play a resource file using VideoView by passing a pathname.
The file doesn't exist; it's just a binary blob inside the APK file.
If you create a content provider, you could pass a URI to setVideoURI().
I'm not entirely sure that will work because the video file might be
compressed inside the APK. We exclude audio files from compression,
but I don't think video files are excluded, because they don't make a lot
of sense as a resource - they are too big.

The simplest way to get this working is to copy the file to the SD
card. You can either adb push the file to the SD card or UMS mount
the SD card and copy it over from your host computer. Then modify your
setVideoPath() call to use "/sdcard/movie.mp4" (or whatever directory you
put it in on the SD card).
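
For example, the snippet below then becomes the following (a sketch,
assuming the file was pushed to /sdcard/movie.mp4 and "context" is your
activity):

VideoView videoView = new VideoView(context);
videoView.setVideoPath("/sdcard/movie.mp4");
videoView.setMediaController(new MediaController(context));
videoView.requestFocus();
videoView.start();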

On Feb 10, 11:21 pm, Brendan raven...@gmail.com wrote:
 So I'm trying to do this:

 VideoView video_view = new VideoView(context);
 video_view.setVideoPath("res/raw/movie.mp4");
 video_view.setMediaController(new MediaController(context));
 video_view.requestFocus();
 video_view.start();

 Seems simple, but even with video files that I've seen work in other
 video players on the phone, I keep getting the "Sorry, this video
 cannot be played" message. And this associated debug info:

 INFO/MediaPlayer-JNI(234): prepareAsync: surface=0x1a3ff0 (id=1)
 ERROR/MediaPlayer(234): Error (-4,0)
 DEBUG/VideoView(234): Error: -4,0

 Any ideas as to what those errors mean or how I can fix this?

 If this is not how I should be using VideoView please let me know. I'd
 appreciate being sent in the right direction for how to get a local
 video file playing within a view.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: the android's built in video player, supports which file format

2009-02-11 Thread Dave Sparks

We have no plans to support those formats. Android manufacturers
always have the option of including other file formats and codecs if
the demand is there.

On Feb 11, 5:57 pm, waterblood guoyin.c...@gmail.com wrote:
 Does Google has any plan for other format support , as avi, rm?

 On 1月15日, 下午1时27分, rktb yend...@pv.com wrote:

  Did you really mean pm4 or mp4?

  pm4 -- I don't know what this is
  avi -- To the best of my knowledge, we don't have support for this
  currently.

  Audio -- AAC (AAC-LC, AAC+, Enhanced AAC+), AMR (narrowband and
  wideband), mp3, wav, midi, ogg-vorbis.

  On Jan 15, 10:19 am, jalandar jagtap...@gmail.com wrote:

   thank u for reply
   will you clear little more, how to play pm4 or avi with android built
   in player, is there any converter.
   And for audio, is it only support mp3 format.- 隐藏被引用文字 -

  - 显示引用的文字 -
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: how to get camera object for double camera

2009-02-11 Thread Dave Sparks

SDK 1.0 only has support for one camera.

When we have demand for a second camera from an Android partner, we'll
add a new API so that you can select the camera.

On Feb 11, 1:35 am, Link link.li...@gmail.com wrote:
 hi, all

 i wonder how to get camera object in android, if there are two or more
 cameras in a telephone. thanks a lot!

 best regards!
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: MediaRecorder Sound Quality?

2009-02-11 Thread Dave Sparks

The codec is AMR-NB with an 8 kHz sample rate. In the Cupcake
release we will provide access to the raw 16-bit PCM stream so you can
do your own encoding or signal processing.
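
For reference, a minimal recorder setup along those lines (the output path
is just an example; exception handling omitted):

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // the 8 kHz AMR-NB codec described above
recorder.setOutputFile("/sdcard/recording.3gp");
recorder.prepare();
recorder.start();
// ... record ...
recorder.stop();
recorder.release();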

On Feb 11, 9:18 am, g1bb corymgibb...@gmail.com wrote:
 Hello,

 Is anyone else experiencing poor playback quality on files recorded
 with MediaRecorder? Is there a way to improve this?

 Thanks!
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: how to get the first frame of a video file ?

2009-02-11 Thread Dave Sparks

There is no support for thumbnail extraction in SDK 1.0. It's coming
in Cupcake as an adjunct to video record.

On Feb 11, 7:30 am, Freepine freep...@gmail.com wrote:
 Opencore has a frame and metadata utility, and there is also an API as
 android.media.MediaMetadataRetriever.captureFrame()
 in Java layer, but it might not be available in public SDK.

 On Wed, Feb 11, 2009 at 8:49 PM, trust_chen chen trustc...@gmail.comwrote:

  how to get the first frame of a video file ?
  Are there such APIs in OPENCORE?
   Thanks !
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: What's in raw data from PictureCallback of camera?

2009-02-11 Thread Dave Sparks

I think we'll be able to give you something that will meet your needs.
It's always a balancing act between taking the time to get the API
just right and getting a product to market.

Keep making suggestions, we are listening.

On Feb 11, 4:08 am, blindfold seeingwithso...@gmail.com wrote:
 Thank you David, I feel relieved to hear that. :-)

  Rather than trying to do all your image processing in Java, wouldn't you
  prefer to have built-in native signal processing kernels that are optimized
  for the platform?

 Yes, of course. One can parameterize and wrap under an API a number of
 useful low-level image (and audio) operations, such as various
 filtering operations, edge detection, corner detection, segmentation,
 convolution and so on. That is certainly very nice to have, but not
 enough. I would want to implement my own pixel-level (and audio-byte-
 level) processing algorithms that do not fit within the pre-canned
 categories, while benefiting from the platform and CPU independence of
 Android (no JNI if I can avoid it). It would be purely computational
 code with loops and conditional branches, operating on arrays, array
 elements and scalars. At that level, C code (minus any pointering) and
 Java code actually looks almost the same. No real need for any fancy
 data structures etc at *that* level that then covers the expensive
 parts of the processing, so I feel that a simple, light-weight,
 targeted JIT compiler could come a long way to meeting all these
 needs: I really would not mind if it would only compile a
 (computational) subset of Java, and it may leave code optimizations to
 the developer by compiling rather straightforwardly. (Cannot help
 being reminded of FORTRAN-77 on old IBM mainframes getting compiled
 almost 1-to-1 to easy-to-read machine code.) So perhaps rather than a
 built-in native signal processing *kernel* I am here thinking of a
 built-in native signal processing (JIT) *compiler*. ;-)

 Regards

 On Feb 11, 11:03 am, Dave Sparks davidspa...@android.com wrote:

  I'm talking about deprecating the raw picture callback that has never
  worked. It won't affect any existing applications. As for the camera
  API in SDK 1.0: It was never intended for signal processing. It was
  intended only for taking snapshots. It just happens that creative
  people like yourself have found other uses for it.

  I certainly don't want to break your application. I do want to give
  you a better API in the future. Rather than trying to do all your
  image processing in Java, wouldn't you prefer to have built-in native
  signal processing kernels that are optimized for the platform?
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Recording Audio with MediaRecorder on G1

2009-02-11 Thread Dave Sparks

I looked over the code and didn't see anything obvious. You won't see
anything in the log unless an error occurs - we try to minimize
logging in production code.

Run adb bugreport and take a look at the kernel log. You should see
something like this:

<6>[  820.265000] adsp: opening module AUDPREPROCTASK
<6>[  820.265488] audmgr_rpc_thread() start
<6>[  820.266281] adsp: module AUDPREPROCTASK has been registered
<6>[  820.266586] adsp: opening module AUDRECTASK
<6>[  820.267838] adsp: module AUDRECTASK has been registered
<6>[  820.268326] snd_set_device 256 1 0
<6>[  820.280289] audmgr: rpc_reply status 0
<6>[  820.291489] audmgr: rpc READY handle=0x
<6>[  820.355332] audmgr: rpc CODEC_CONFIG volume=0x2ff4
<6>[  820.355759] msm_adsp_enable() 'AUDPREPROCTASK'
<6>[  820.357468] adsp: rpc event=0, proc_id=2, module=14, image=0
<6>[  820.357926] adsp: module AUDPREPROCTASK: READY
<6>[  820.358688] msm_adsp_enable() 'AUDRECTASK'
<6>[  820.359909] adsp: rpc event=0, proc_id=2, module=13, image=0
<6>[  820.360825] adsp: module AUDRECTASK: READY
<6>[  820.362259] audpre: CFG ENABLED
<6>[  820.363663] audrec: PARAM CFG DONE
<6>[  822.417343] audrec: CFG SLEEP
<6>[  822.417801] msm_adsp_disable() 'AUDRECTASK'
<6>[  822.420151] msm_adsp_disable() 'AUDPREPROCTASK'
<6>[  822.423081] adsp: closing module AUDRECTASK
<6>[  822.423508] adsp: closing module AUDPREPROCTASK
<6>[  822.423843] adsp: disable interrupt
<6>[  822.424667] snd_set_device 256 1 1
<6>[  822.428391] audmgr: rpc_reply status 0
<3>[  822.458450] audmgr: DISABLED

If so, then we know that the audio input driver is being opened
correctly.

On Feb 11, 7:02 am, michael kuh...@gmail.com wrote:
 Hi,

 I've written a little program (see below) to record audio using
 MediaRecorder. While everything works fine in the emulator (SDK1r2),
 the program does not work on the real phone (HTC G1). On the phone,
 after pressing the start button, the audio file is created, but no
 content is written to it (i.e. the file size remains 0). Also, no
 exception is shown in logcat, and no AudioStreamInGeneric events are
 generated (such events are generated when running the program on the
 emulator).

 Is there maybe a setting on the phone that I have to turn on (or off)?
 Or am I missing something else?

 I've set the permission to record audio in the android-manifest.xml
 file using
 uses-permission android:name=android.permission.RECORD_AUDIO/uses-
 permission

 Thanks for any help!

 Michael
 --
 My program code:

 package ch.ethz.dcg.mictest;

 import android.app.Activity;
 import android.media.MediaRecorder;
 import android.os.Bundle;
 import android.util.Log;
 import android.view.View;
 import android.widget.Button;

 public class MicTest extends Activity {

 private final static String TAG = MicTest.class.getSimpleName();

 private MediaRecorder recorder;
 private Button button;

 @Override
 public void onCreate(Bundle savedInstanceState) {
 super.onCreate(savedInstanceState);
 setContentView(R.layout.main);

 recorder = new MediaRecorder();

 button = (Button)findViewById(R.id.button);
 button.setText("start");
 button.setOnClickListener(new View.OnClickListener() {
 @Override
 public void onClick(View v) {
 buttonClicked();
 }
 });
 }

 private void buttonClicked() {
 if (button.getText().equals("start")) {
 try {
 recorder = new MediaRecorder();
 String path = "/sdcard/test.3gpp";

 recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
 recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
 recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
 recorder.setOutputFile(path);

 recorder.prepare();
 recorder.start();
 button.setText("stop");
 } catch (Exception e) {
 Log.w(TAG, e);
 }
 } else {
 try {
 recorder.stop();
 recorder.release(); // Now the object cannot be reused
 button.setEnabled(false);
 } catch (Exception e) {
 Log.w(TAG, e);
 }
 }
 }

 }
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to 

[android-developers] Re: 3gpp files

2009-02-10 Thread Dave Sparks

Can you elaborate a bit more on how you are playing the file? Did you
write your own video player?

On Feb 9, 2:57 pm, KC grssmount...@gmail.com wrote:
 I have a video file in 3gpp format and I can use QuickTime to play it.
 But when tried with the SDK Windows emulator, I got the error msg:

 [2009-02-09 14:46:23 - DeviceMonitor]Error reading jdwp list: EOF
 [2009-02-09 14:46:23 - DeviceMonitor]Connection Failure when starting
 to monitor device 'emulator-5554' : device (emulator-5554) request
 rejected: device not found

 Any ideas??

 -KC
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: What's in raw data from PictureCallback of camera?

2009-02-10 Thread Dave Sparks

On the G1, no data is returned - only a null pointer. The original
intent was to return an uncompressed RGB565 frame, but this proved to
be impractical.
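
In practice that means relying on the JPEG callback instead. A sketch
(assumes an already-opened Camera with a running preview; imports are
android.hardware.Camera, android.graphics.Bitmap and
android.graphics.BitmapFactory):

Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        // data holds a JPEG-compressed frame; decode it if you need pixels
        Bitmap picture = BitmapFactory.decodeByteArray(data, 0, data.length);
        // ... analyze or save the bitmap here ...
    }
};

// shutter and raw callbacks may be null; the raw callback returns null data on the G1 anyway
camera.takePicture(null, null, jpegCallback);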

On Feb 9, 3:57 pm, Xster xyxyx...@gmail.com wrote:
 Hi,
 Our university is intending to use the Android as a platform for
 mobile image analysis. We're wondering what kind of information is
 returned in the raw format when android.hardware.Camera.takePicture()
 is called with a raw Camera.PictureCallback. I can't seem to find more
 information about it on 
 thehttp://code.google.com/android/reference/android/hardware/Camera.Pict...
 page.

 Thanks,
 Xiao
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: 3gpp files

2009-02-10 Thread Dave Sparks

The problem is that your video path is on your host machine. You have
to remember that the emulator is actually emulating an ARM processor
running its own Linux kernel. The file system is not your host computer's,
but a virtual file system inside the emulated Linux environment.

Use mksdcard to create an SD card virtual file system (the first
argument sets the image size, e.g. 256M):

mksdcard 256M sdcard.img

Start the emulator and tell it to use your SD card image:

emulator -sdcard sdcard.img

Copy the file from your host computer to the virtual SD card image:

adb push "D:/Profiles/mgia0013/My Documents/Android/Movies/HomeAlone.3gp" /sdcard

Now change the path in your setVideoPath:

video.setVideoPath("/sdcard/HomeAlone.3gp");

On Feb 10, 8:55 am, K. Chen grssmount...@gmail.com wrote:
 Nope. I was using sample code from a book (I forgot the name). It seems
 fairly straightforward (below). Of course, I'm a newbie, so please bear with
 me. Thanks.

 public class VideoDemo extends Activity {
 private VideoView video;
 private MediaController ctlr;

 @Override
 public void onCreate(Bundle icicle) {
 super.onCreate(icicle);
 getWindow().setFormat(PixelFormat.TRANSLUCENT);
 setContentView(R.layout.main);

 Button show=(Button)findViewById(R.id.show);

 show.setOnClickListener(new View.OnClickListener() {
 public void onClick(View view) {
 ctlr.show();
 }
 });

 video=(VideoView)findViewById(R.id.video);
 video.setVideoPath("D:/Profiles/mgia0013/My Documents/Android/Movies/HomeAlone.3gp");

 ctlr=new MediaController(this);
 ctlr.setMediaPlayer(video);
 video.setMediaController(ctlr);
 video.requestFocus();
 }

 On Tue, Feb 10, 2009 at 12:00 AM, Dave Sparks davidspa...@android.comwrote:



  Can you elaborate a bit more on how you are playing the file? Did you
  write your own video player?

  On Feb 9, 2:57 pm, KC grssmount...@gmail.com wrote:
   I have a video file in 3gpp format and I can use QuickTime to play it.
   But when tried with the SDK Windows emulator, I got the error msg:

   [2009-02-09 14:46:23 - DeviceMonitor]Error reading jdwp list: EOF
   [2009-02-09 14:46:23 - DeviceMonitor]Connection Failure when starting
   to monitor device 'emulator-5554' : device (emulator-5554) request
   rejected: device not found

   Any ideas??

   -KC
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: changing voice attributes with android

2009-02-10 Thread Dave Sparks

No, there is no way to do this in SDK 1.0.

On Feb 10, 9:48 am, eliak...@gmail.com eliak...@gmail.com wrote:
 hello,
 I want to create an application that can change one's voice during a
 call in real time
 is there a way to do that in android?
 can you point me to the right package?
 thanx
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Sampling

2009-02-10 Thread Dave Sparks

It's on the roadmap for Cupcake.

On Feb 10, 6:44 pm, clark clarkd...@gmail.com wrote:
 How about SDK 1.1? Or 1.2?  Any idea where on the roadmap this feature
 stands?

 On Feb 6, 10:18 am, Dave Sparks davidspa...@android.com wrote:

  No, this is not supported in SDK 1.0.

  On Feb 6, 8:34 am, Sundog sunns...@gmail.com wrote:

   Is it possible to piggyback the audio stream or themicrophoneand
   get raw sample data from it? Can anyone point me to some
   documentation?

   Thanks.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: What's in raw data from PictureCallback of camera?

2009-02-10 Thread Dave Sparks

Highly unlikely. Applications are restricted to a heap of 16MB. An 8MP
image in RGB888 (stored at 4 bytes per pixel) will use 32MB.

I am inclined to deprecate that API entirely and replace it with hooks
for native signal processing.

On Feb 10, 6:17 pm, gjs garyjamessi...@gmail.com wrote:
 Hi,

 I'm hoping that Android will support this in future for RGB888 and
 (at least) 8MP images.

 I know this is a big ask, probably requiring a much larger process heap
 size ( > 16MB ) and a high-speed bus/memory, but there already are 8MP
 camera phones available on other platforms.

 And yes, I know these phones do not yet support loading such large
 images into (Java) application memory, but I'm hoping the Android
 architecture can accommodate this in the not too distant future.

 Regards

 On Feb 10, 7:01 pm, Dave Sparks davidspa...@android.com wrote:

  On the G1, no data is returned - only a null pointer. The original
  intent was to return an uncompressed RGB565 frame, but this proved to
  be impractical.

  On Feb 9, 3:57 pm, Xster xyxyx...@gmail.com wrote:

   Hi,
   Our university is intending to use the Android as a platform for
   mobile image analysis. We're wondering what kind of information is
   returned in the raw format when android.hardware.Camera.takePicture()
   is called with a raw Camera.PictureCallback. I can't seem to find more
   information about it on 
   thehttp://code.google.com/android/reference/android/hardware/Camera.Pict...
   page.

   Thanks,
   Xiao
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: 3gpp files

2009-02-10 Thread Dave Sparks

You'll find a complete list of codecs and file formats here:

http://developer.android.com/guide/appendix/media-formats.html

On Feb 10, 3:59 pm, K. Chen grssmount...@gmail.com wrote:
 Thanks; it works.. A somewhat related question.

 In addition to 3GPP video file, does Android support MP4 as well? Any plan
 for MP4 video (if it's not doing it), in light of the trend that more and
 more codec chip vendors are supporting it.

 -KC

 On Tue, Feb 10, 2009 at 9:56 AM, Dave Sparks davidspa...@android.comwrote:



  The problem is that your video path is on your host machine. You have
  to remember that the emulator is actually emulating an ARM processor
  run its own Linux kernel. The file system is not your host computer,
  but a virtual file system within the Linux kernel emulation.

  Use mksdcard to create an SD card virtual file system (there are
  options to specify the size):

  mksdcard sdcard.img

  Start the emulator and tell it to use your SD card image:

  emulator -sdcard sdcard.img

  Copy the file from your host computer to the virtual SD card image:

  adb push D:/Profiles/mgia0013/MyDocuments/Android/Movies/HomeAlone.
  3gp /sdcard

  Now change the path in your setVideoPath:

  video.setVideoPath(/sdcard/HomeAlone.3gp);

  On Feb 10, 8:55 am, K. Chen grssmount...@gmail.com wrote:
   Nope. I was using a sample code from a book (I forgot the name). It seems
   fairly straightforward (below). Of course, I'm newbie, so please bear
  with
   me. Thanks.

   public class VideoDemo extends Activity {
   private VideoView video;
   private MediaController ctlr;

   @Override
   public void onCreate(Bundle icicle) {
   super.onCreate(icicle);
   getWindow().setFormat(PixelFormat.TRANSLUCENT);
   setContentView(R.layout.main);

   Button show=(Button)findViewById(R.id.show);

   show.setOnClickListener(new View.OnClickListener() {
   public void onClick(View view) {
   ctlr.show();
   }
   });

   video=(VideoView)findViewById(R.id.video);
   video.setVideoPath(D:/Profiles/mgia0013/My
   Documents/Android/Movies/HomeAlone.3gp);

   ctlr=new MediaController(this);
   ctlr.setMediaPlayer(video);
   video.setMediaController(ctlr);
   video.requestFocus();
   }

   On Tue, Feb 10, 2009 at 12:00 AM, Dave Sparks davidspa...@android.com
  wrote:

Can you elaborate a bit more on how you are playing the file? Did you
write your own video player?

On Feb 9, 2:57 pm, KC grssmount...@gmail.com wrote:
 I have a video file in 3gpp format and I can use QuickTime to play
  it.
 But when tried with the SDK Windows emulator, I got the error msg:

 [2009-02-09 14:46:23 - DeviceMonitor]Error reading jdwp list: EOF
 [2009-02-09 14:46:23 - DeviceMonitor]Connection Failure when starting
 to monitor device 'emulator-5554' : device (emulator-5554) request
 rejected: device not found

 Any ideas??

 -KC
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Media file formats and codecs supported by Android

2009-02-10 Thread Dave Sparks

A list of media file formats and codecs supported by Android can be
found here:

http://developer.android.com/guide/appendix/media-formats.html


--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Multimedia buffering

2009-02-09 Thread Dave Sparks

The files are not stored, they are streamed into a temporary memory
buffer.

What kind of file are you trying to stream? If it's an MP4 file, you
need to make sure that the 'moov' atom comes before the 'mdat' atom.
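
If you want to check the atom order yourself, the top-level MP4 boxes are
simple to walk. A rough sketch (assumes a local copy of the file and
ignores the rare 64-bit box sizes):

import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.io.IOException;

public class AtomOrder {
    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
        try {
            while (true) {
                long size = in.readInt() & 0xFFFFFFFFL; // 32-bit box size
                byte[] type = new byte[4];
                in.readFully(type);
                // prints the boxes in file order; 'moov' should appear before 'mdat'
                System.out.println(new String(type, "US-ASCII") + " (" + size + " bytes)");
                long toSkip = size - 8;
                while (toSkip > 0) {
                    long skipped = in.skip(toSkip);
                    if (skipped <= 0) break;
                    toSkip -= skipped;
                }
            }
        } catch (EOFException done) {
            // reached the end of the file
        } finally {
            in.close();
        }
    }
}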

On Feb 9, 3:08 am, AliBaba kanul1...@gmail.com wrote:
 Hi All,

 I am trying to run the media player from API Demo by specifying the
 http based url of the video. In specific I want to play streaming
 Video.

 By debugging I found that it is showing the logs for streaming the
 video from 1-100 %. But then also it is not playing the video. Can
 anybody please help me where(file location) streamed video file get
 stored on Android.

 -AliBaba
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Playing video in android

2009-02-09 Thread Dave Sparks

First, the surface type needs to be push buffers. In your Preview
constructor, add the following:

getHolder().setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

Second, you need to tell the media player where to display the video.
You have a line commented out:

//mp.setDisplay(mPreview.getHolder().getSurface());

It should look like this:

mp.setDisplay(mPreview.getHolder());

If you get stuck, take a look at the VideoView.java widget source code
on android.git.kernel.org. It's in the frameworks/base project at
core/java/widget/VideoView.java.
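
Putting both fixes together, a minimal sketch (the class name and file
path below are placeholders, not taken from the code in this thread):

import android.content.Context;
import android.media.MediaPlayer;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

class VideoSurface extends SurfaceView implements SurfaceHolder.Callback {
    VideoSurface(Context context) {
        super(context);
        getHolder().addCallback(this);
        // fix 1: video playback needs a push-buffers surface
        getHolder().setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        try {
            MediaPlayer mp = new MediaPlayer();
            mp.setDataSource("/sdcard/test.mp4"); // placeholder path
            // fix 2: pass the SurfaceHolder itself, not holder.getSurface()
            mp.setDisplay(holder);
            mp.prepare();
            mp.start();
        } catch (Exception e) {
            android.util.Log.e("VideoSurface", "playback failed", e);
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) { }

    public void surfaceDestroyed(SurfaceHolder holder) { }
}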

On Feb 8, 2:58 pm, Ash ashwin.disco...@gmail.com wrote:
 Play video on android
 -

 I'm trying to play video files on android...can anyone please help
 me...
 I'm not able to see the video but audio works fine... here is the code

 PLEASE HELP ME... BY GUIDING ME.. BY CORRECTING THE CODE
 OR WITH ANY NEW CODE

 package com.vi3;

 import java.io.IOException;
 import android.app.Activity;
 import android.os.Bundle;
 import android.content.Context;
 import android.graphics.PixelFormat;
 import android.media.MediaPlayer;
 import android.media.MediaPlayer.OnBufferingUpdateListener;
 import android.media.MediaPlayer.OnCompletionListener;
 import android.media.MediaPlayer.OnErrorListener;
 import android.util.Log;
 import android.view.Menu;
 import android.view.SurfaceHolder;
 import android.view.SurfaceView;
 import android.view.Surface;
 import android.view.Window;
 //import android.view.Menu.Item;

 public class vi3 extends Activity
 {
 private static final String LOG_TAG = "|";
private MediaPlayer mp;

private Preview mPreview;
//private myAcListener myListener = new myAcListener()this;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle icicle)
{
   super.onCreate(icicle);
   Log.i(LOG_TAG, "CameraApp.onCreate");
   mPreview = new Preview(this);
   //requestWindowFeature(W);

 //  stopMedia();
// releaseMedia();
   setContentView(R.layout.main);

   //setContentView(mPreview);
   playMedia();

}

private void playMedia(String s_filePath)
{
   setContentView(mPreview);
   //s_filePath = "/tmp/mp4.mp4";
   s_filePath = "/data/local/video/test_qcif_200_aac_64.mp4";
   //s_filePath = "/tmp/test.mpg";
   //s_filePath = "/tmp/3.3gp";
   Log.i(LOG_TAG, "CameraApp.playMedia");
   mp = new MediaPlayer();
   try
   {
  mp.setDataSource(s_filePath);
   }
   catch (IllegalArgumentException e)
   {
  // TODO Auto-generated catch block
  Log.v(LOG_TAG, "CameraApp.playMedia:IllegalArgumentException");
  e.printStackTrace();
   }
   catch (IOException e)
   {
  Log.v(LOG_TAG, "CameraApp.playMedia:IOException");
  // TODO Auto-generated catch block
  e.printStackTrace();
   }
   try
   {

  //mp.setDisplay(mPreview.getHolder().getSurface());
  mp.prepare();
  int i = mp.getDuration();
  Log.i(LOG_TAG, "Duration: " + String.valueOf(i));
  mp.start();
   }
   catch (Exception e)
   {
  Log.v(LOG_TAG, e.toString());
  mp.stop();
  mp.release();
   }
   //setContentView(mPreview);
}

private void pauseMedia()
{
   Log.i(LOG_TAG, "CameraApp.pauseMedia");
   if (null != mp)
   {
  mp.pause();
   }
}

private void stopMedia()
{
   Log.i(LOG_TAG, "CameraApp.stopMedia");
   if (null != mp)
   {
  mp.stop();
   }
}
private void releaseMedia()
{
   Log.i(LOG_TAG, "CameraApp.releaseMedia");
   if (null != mp)
   {
  mp.release();
   }
}
class Preview extends SurfaceView implements
 SurfaceHolder.Callback
{
SurfaceHolder   mHolder;
private boolean mHasSurface;
Preview(Context context) {
super(context);

mHolder = getHolder();
mHolder.addCallback(this);
mHasSurface = false;

//mHolder.setFixedSize(320, 240);
mHolder.setFixedSize(176, 144);
//mHolder.setFixedSize(192, 242);
}

public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, start our main acquisition thread.
mHasSurface = true;
}

public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return. Stop the preview.
mHasSurface = false;
}

public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Surface size or format has changed. This should not happen in this
// example.
}
}

 }
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers 

[android-developers] Re: the image captured using Intent i = new Intent(android.media.action.IMAGE_CAPTURE); is very small

2009-02-09 Thread Dave Sparks

This is an issue that will be fixed in the Cupcake release.

On Feb 8, 11:31 pm, jj jagtap...@gmail.com wrote:
 hello everybody

 I am capturing an image from my app using:
 Intent i = new Intent("android.media.action.IMAGE_CAPTURE");

 but the captured image is very small (25*50).

 Why is this happening? Will anybody suggest a solution?

 thank you
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Reg playing youtube video in android

2009-02-09 Thread Dave Sparks

I'm pretty sure this is due to the way the emulator handles UDP
packets. There is an outstanding bug about this, but no one has had
time to work on it.

On Feb 9, 10:59 pm, Harishkumar V harishpres...@gmail.com wrote:
 Michael,

 Using the browser running in Android in emulator mode, I launched
 http://m.youtube.com.
 It opened up a page containing a list of video files. When I click any one of
 them, it starts to load the video, but finally it fails with the message,

 Sorry, this video cannot be played.

 using adb logcat,  i got the log messages as,

 I/ActivityManager(   50): Starting activity: Intent {
 action=android.intent.action.VIEW
 categories={android.intent.category.BROWSABLE} data=rtsp://
 bfug.rtsp-youtube.l.google.com/CkYLENy73wIaPQkcSEPa1Fc6nRMYDSANFEIJbXYtZ29v 
 Z2xlSARSBWluZGV4Wg5DbGlja1RodW1ibmFpbGCqqamFtdjqsQkM/0/0/0/video.3gpcomp={c 
 om.android.music/com.android.music.MovieView}}

 W/SensorService(   50): could not enable sensor 2
 I/MediaPlayer-JNI(  189): prepareAsync: surface=0x1ae340 (id=1)
 I/ActivityManager(   50): Displayed activity com.android.music/.MovieView:
 418 ms
 D/dalvikvm(  166): GC freed 3935 objects / 290168 bytes in 75ms
 E/MediaPlayer(  189): Error (-1,0)
 D/VideoView(  189): Error: -1,0
 W/PlayerDriver(   25): PVMFInfoErrorHandlingComplete

 in the shell,
 when i do ps command,

 app_8    189   24    91620 13156  afe0c824 S
 com.android.music:MovieView is appearing.

 How does the complete flow work?
 Who is responsible for launching the video stream? Does MovieView consist of
 a MediaPlayer and a VideoView?

 Have there been any format changes in the m.youtube.com videos? Is there any
 way to see clearly what Error(-1,0) means?

 Thanks and Regards,
 HarishKumar.V

 On Fri, Jan 30, 2009 at 8:15 PM, kolby kolbys...@gmail.com wrote:

  Here is the sample code to launch the view intent:

         Uri uri = Uri.parse(url);
         // check if others handle this url
         Intent intent = new Intent(Intent.ACTION_VIEW, uri);
         intent.addCategory(Intent.CATEGORY_BROWSABLE);
         try {
           if (startActivityIfNeeded(intent, -1)) {
           // success
           }
         } catch (ActivityNotFoundException ex) {
           // fail
         }

  The rtsp link doesn't seem to be recognized by anything on the
  emulator.

  Michael
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: debugging integrated Java and native code

2009-02-07 Thread Dave Sparks

A few of our developers use Eclipse as a front-end for gdb. I recall
that the setup is a bit tricky. Maybe someone can post the magic
formula.

I use gdb myself, but then I still use vi and makefiles. IDE's are for
wimps. :)

On Feb 7, 9:22 am, Sergey Ten sergeyte...@gmail.com wrote:
 Hi,

 I am trying to figure out what is the best way to debug a mix of Java
 and native code? Please notice, that I am NOT trying to develop a native
 app. The app will be written entirely in Java, using Android SDK.
 However, I noticed that some pieces of the SDK use native methods (e.g.
 AssetManager, WebKit, etc). I wonder which tools Google developers use
 if/when they need to debug a mix of Java and C/C++ code? Eclipse/gdb or
 there are commercial tools which make the debugging experience less
 painful?

 I googled on this topic and the results returned do not look very
 encouraging.

 Thanks,
 Sergey
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Sampling

2009-02-06 Thread Dave Sparks

No, this is not supported in SDK 1.0.

On Feb 6, 8:34 am, Sundog sunns...@gmail.com wrote:
 Is it possible to piggyback the audio stream or the microphone and
 get raw sample data from it? Can anyone point me to some
 documentation?

 Thanks.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Updation of the Media Data base in emulator only on bootup/mount/recording?

2009-02-06 Thread Dave Sparks

You need to tell the media scanner that you have added a new file. See
http://code.google.com/android/reference/android/media/MediaScannerConnection.html
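
A stripped-down version of that pattern (the full MediaScannerNotifier
example appears in an earlier message in this digest; the context, path
and MIME type here are placeholders):

final MediaScannerConnection[] conn = new MediaScannerConnection[1];
conn[0] = new MediaScannerConnection(context,
        new MediaScannerConnection.MediaScannerConnectionClient() {
            public void onMediaScannerConnected() {
                conn[0].scanFile("/sdcard/music/new-track.mp3", "audio/mpeg");
            }
            public void onScanCompleted(String path, Uri uri) {
                conn[0].disconnect();
            }
        });
conn[0].connect();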

On Feb 6, 4:38 am, Rishi kaurari...@gmail.com wrote:
 When i added a media file to the sdcard an update in the MediaProvider
 database is not happening. When i bootup the emulator it happens .
 Is this the expected behaviour ?

 - Rishi Kaura
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: problem with playing sounds using media player

2009-02-06 Thread Dave Sparks

There is a lot of new audio support in the Cupcake SDK release. This
includes working SoundPool for low-latency sounds and streaming audio
API's.

On Feb 6, 2:13 am, suhas gavas suhas.ga...@gmail.com wrote:
 Hi,
 I used MediaPlayer in my former game, which was a 2D game.
 At that time I used a View, but this is a 3D game, so I have
 extended SurfaceView. A lot of rendering is done at run time,
 so I think that must be the issue, since the CPU is overloaded; I have
 checked this in DDMS (CPU usage). I am still not satisfied
 with the media player API of Android. Another problem with SoundPool is
 that sometimes a sample is not ready to play, as if there is some problem
 there. Truly, I think the sound API should be better polished; I have
 checked on the iPhone and the same game works fine there, with sounds
 playing correctly, even with large sound files.

 On Fri, Feb 6, 2009 at 12:04 PM, Dave Sparks davidspa...@android.comwrote:



  Suggest you try running top to find out what's hogging the CPU.

  On Feb 5, 9:22 pm, suhas gavas suhas.ga...@gmail.com wrote:
   Hi,
   No ... i m not playing 6 to 7 mp3 files at same time ..
   later that day i have tried with just playing one single file and same
  issue
   . But then i tried SoundPool api and it worked fine
   . Then also i wonder what was the prb with mediaplayer

   On Fri, Feb 6, 2009 at 9:07 AM, Dave Sparks davidspa...@android.com
  wrote:

If you are playing 6 or 7 MP3 files at the same time, you are probably
saturating the CPU just decoding the audio.

On Feb 5, 1:10 am, suhas gavas suhas.ga...@gmail.com wrote:
 hi,

 My program is a 3d game . And i m creating 6 to 7
  media
 player in my game .. so is this y the warning is flashed ?

 On Thu, Feb 5, 2009 at 1:04 PM, Dave Sparks davidspa...@android.com

wrote:

  The message could be a clue, it's trying to tell you that the CPU
  is
  overloaded, i.e. you're trying to do too much. Have you tried
  running
  top to check the CPU load?

  On Feb 4, 10:32 pm, suhas suhas.ga...@gmail.com wrote:
   Hi all,
   I m using mp3 sound in my game (used wav format also) 
   while
   playing sound i get problem as  obtainBuffer timeout (is the CPU
   pegged) and then sound gets played after some time i.e it is not
   syncronized with the game play ... plz help me with this
  issue

   Thnxs
   Suhas
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: OpenGL-ES 2.0 support

2009-02-05 Thread Dave Sparks

We are planning Open GL ES 2.0 hardware binding support for Donuts
(the next release). There will not be a software renderer, so you'll
need to have hardware that supports it. Theoretically it should be
possible to write a software renderer as well.

On Feb 5, 3:55 am, AndroidDev son...@hotmail.com wrote:
 Hi.

 Is there any plan to support OpenGL-ES 2.0 in Android?
 Is it possible to write a renderer youself based on OpenGL-ES 2.0?
 In that case, where to start :)
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Processing audio

2009-02-05 Thread Dave Sparks

This is not possible in SDK 1.0.

On Feb 4, 1:01 pm, Natalie natlinn...@gmail.com wrote:
 I would like to be able to extract frequency/amplitude info from
 incoming mic audio.  From looking at previous posts, it looks like the
 way to do this with the current sdk is to write to a file, then tail
 that file.  This means I need to be able to extract frequency/
 amplitude information from .mp4 or 3gpp files, since these are the
 audio formats supported by MediaRecorder.  Now, I'd really prefer not
 to have to decode this data by hand.  :)  MediaPlayer is obviously
 decoding the .mp4 files, so I was wondering if there is any way to tap
 into this functionality, or some other way to get my .mp4 files
 decoded?
 Thanks!
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: problem with playing sounds using media player

2009-02-05 Thread Dave Sparks

If you are playing 6 or 7 MP3 files at the same time, you are probably
saturating the CPU just decoding the audio.

On Feb 5, 1:10 am, suhas gavas suhas.ga...@gmail.com wrote:
 hi,

 My program is a 3d game . And i m creating 6 to 7 media
 player in my game .. so is this y the warning is flashed ?

 On Thu, Feb 5, 2009 at 1:04 PM, Dave Sparks davidspa...@android.com wrote:

  The message could be a clue, it's trying to tell you that the CPU is
  overloaded, i.e. you're trying to do too much. Have you tried running
  top to check the CPU load?

  On Feb 4, 10:32 pm, suhas suhas.ga...@gmail.com wrote:
   Hi all,
   I m using mp3 sound in my game (used wav format also)   while
   playing sound i get problem as  obtainBuffer timeout (is the CPU
   pegged) and then sound gets played after some time i.e it is not
   syncronized with the game play ... plz help me with this issue

   Thnxs
   Suhas
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: problem with playing sounds using media player

2009-02-05 Thread Dave Sparks

Suggest you try running top to find out what's hogging the CPU.

On Feb 5, 9:22 pm, suhas gavas suhas.ga...@gmail.com wrote:
 Hi,
 No ... i m not playing 6 to 7 mp3 files at same time ..
 later that day i have tried with just playing one single file and same issue
 . But then i tried SoundPool api and it worked fine
 . Then also i wonder what was the prb with mediaplayer

 On Fri, Feb 6, 2009 at 9:07 AM, Dave Sparks davidspa...@android.com wrote:

  If you are playing 6 or 7 MP3 files at the same time, you are probably
  saturating the CPU just decoding the audio.

  On Feb 5, 1:10 am, suhas gavas suhas.ga...@gmail.com wrote:
   hi,

   My program is a 3d game . And i m creating 6 to 7 media
   player in my game .. so is this y the warning is flashed ?

   On Thu, Feb 5, 2009 at 1:04 PM, Dave Sparks davidspa...@android.com
  wrote:

The message could be a clue, it's trying to tell you that the CPU is
overloaded, i.e. you're trying to do too much. Have you tried running
top to check the CPU load?

On Feb 4, 10:32 pm, suhas suhas.ga...@gmail.com wrote:
 Hi all,
 I m using mp3 sound in my game (used wav format also)   while
 playing sound i get problem as  obtainBuffer timeout (is the CPU
 pegged) and then sound gets played after some time i.e it is not
 syncronized with the game play ... plz help me with this issue

 Thnxs
 Suhas
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Native code is not supported in the current SDK.

2009-02-05 Thread Dave Sparks

Further clarification:

"I was under the impression it is possible to download the java
source code AND the C source code and build them all."

It is possible to download the open source code and build for the
emulator. If you want the code to run on a specific device, you need
additional libraries that are not necessarily available in open
source. These libraries are provided by the manufacturer and/or third
party vendors and contain hardware specific code.

"If its not possible does it just mean its not possible for 3rd
party developers to do but it is possible for device manufacturers to
do?"

Manufacturers do write native code to adapt Android to their hardware
in order to create these device specific libraries. This code is
usually exposed through a hardware abstraction layer that hides the
eccentricities of the device from the application developer.

We do intend to support native code development in the future. We just
want to take time to refine the native API's before we make them
public. There is nothing more painful than changing an API because you
overlooked something and then having to support the legacy API for
years to come.

On Feb 5, 10:57 am, Jean-Baptiste Queru j...@android.com wrote:
 it is possible to make it work for a given device running a given
 revision of the software. It is not currently possible to make it work
 in a way that behaves predictably on different devices or different
 revisions of the software (including, importantly, devices or software
 revisions that haven't been released yet).

 That's why it's not supported: if you make it work on your device
 right now and it doesn't work in another environment, you shouldn't
 expect to receive any help or to see any effort spent at the framework
 level to try to make your application work in the future. The same is
 true of undocumented classes, functions, or parameter values.

 JBQ



 On Wed, Feb 4, 2009 at 11:28 AM, fructose david_prest...@hotmail.com wrote:

  I keep seeing this quote everywhere whenever anybody asks a question
  about about C code.

   However I don't understand it, nor another quote which is used to
   further explain it: "The SDK does not include support for native ARM
   code."

  I was under the impression it is possible to download the java source
  code AND the C source code and build them all.

  Is this therefore incorrect, you cannot build the existing C code for
  a target platform and therefore make changes to it?

  I want to know if it is possible to write some C code in, for example,
   the Library layer, and then write some Java code in the Application/
   Application Framework layer that accesses that C code via JNI.

  If this is not possible will it be possible in the future, and if so
  at what point?

  If its not possible does it just mean its not possible for 3rd party
  developers to do but it is possible for device manufacturers to do?
  Otherwise I don't understand how a device manufacturer can create a
  device containing proprietary functionality if there is no ability to
  write and access C code. Is Android forcing manufacturers to write
  *everything* in Java?

 --
 Jean-Baptiste M. JBQ Queru
 Android Engineer, Google.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Playing an audio file during a call

2009-02-04 Thread Dave Sparks

No, this is not supported. It requires access to in-call audio which
is currently not available to the apps processor.

On Feb 4, 3:36 am, Mak kemper.mar...@gmx.de wrote:
 I want to accept incoming calls and play an audio file for the caller.
 Is there a possibility of playing an audio file during a phonecall,
 so that the caller hears a wav- file?
 It seems to me that nothing is possible during a phone call since
 release 1.0.
 Hope this is getting better in the next release.
 Has anybody information about the next release?



[android-developers] Re: About media player

2009-02-04 Thread Dave Sparks

This is not a scenario we can support today.

You can get close by decrypting the stream, writing it to a file in
your application's private directory, and playing it from there. This
would keep it reasonably secure unless the phone is jail-broken. Of
course, it's not real-time streaming - you need to write out the
complete file first.
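
For illustration, a minimal sketch of that approach is below. The decrypt()
call is a placeholder for whatever scheme the app uses and the file name is
made up; the clear file lives in the app's private storage and is handed to
MediaPlayer through a file descriptor to the private file.

    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import android.content.Context;
    import android.media.MediaPlayer;

    // Sketch only: decrypt() stands in for the app's own DRM scheme.
    void playDecrypted(Context context, byte[] encrypted) throws Exception {
        // write the clear data to the app's private directory
        FileOutputStream out = context.openFileOutput("clear.mp4", Context.MODE_PRIVATE);
        out.write(decrypt(encrypted));
        out.close();

        // play it back from a file descriptor to the private file
        FileInputStream in = context.openFileInput("clear.mp4");
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(in.getFD());
        player.prepare();
        player.start();
    }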

On Feb 4, 7:08 pm, Jerry Yang jer...@intertrust.com wrote:
 Hi, all
 Any feedback? What I need is this: I am trying to play back a scrambled
 stream/file. I want to decrypt the file and feed the clear stream to the
 player. But I did not find any detailed information about the media player
 on the android platform.
 My possible solutions are:
 1. Build a C based service/process to do the decryption, then hand the
 stream to the player in some way. I still do not know how to give the
 clear stream to the decoder. Does anyone have any idea about it?
 2. Build a C based library to do the decryption; the android Java app
 calls the API to decrypt the stream/file and transfers the clear
 stream/file to the player. But I also noticed that android native code
 support has not been added, so how to transfer the clear stream to the
 decoder is also a problem here.

 With best wishes
 Jerry

 -Original Message-
 From: android-developers@googlegroups.com

 [mailto:android-develop...@googlegroups.com] On Behalf Of Jerry Yang
 Sent: Tuesday, February 03, 2009 4:45 PM
 To: android-developers@googlegroups.com
 Subject: [android-developers] Re: About media player

 I want to add a plug-in with a certain decryption scheme. That means I
 receive encrypted media and decrypt it at run time with a certain key. My
 purpose is to set up a demo system for research and study.
 With best wishes
 Jerry

 -Original Message-
 From: android-developers@googlegroups.com
 [mailto:android-develop...@googlegroups.com] On Behalf Of Dave Sparks
 Sent: Tuesday, February 03, 2009 6:43 AM
 To: Android Developers
 Subject: [android-developers] Re: About media player

 What kind of plug-in do you want to write?

 media player is kind of a vague term. There is the Music player
 application, the MusicPlaybackService, the MovieView activity, the
 VideoView activity, and the MediaPlayer object. Source for all of
 those is available at source.android.com.

 On Feb 1, 12:36 am, Jerry Yang jer...@intertrust.com wrote:
  Hi, all

  I have a simple question, except the default player, is there any open
  source media player we can modify or is there anyway we can add some
  feature to the media player in android platform? Like we add a plug-in
  into the MS media player?

  Thanks

  With best wishes

  Jerry Yang

  Client System Engineer Intertrust.



[android-developers] Re: problem with playing sounds using media player

2009-02-04 Thread Dave Sparks

The message could be a clue, it's trying to tell you that the CPU is
overloaded, i.e. you're trying to do too much. Have you tried running
top to check the CPU load?

On Feb 4, 10:32 pm, suhas suhas.ga...@gmail.com wrote:
 Hi all,
 I am using mp3 sound in my game (I tried wav format also). While
 playing sound I get the message obtainBuffer timeout (is the CPU
 pegged?) and then the sound gets played after some delay, i.e. it is not
 synchronized with the game play. Please help me with this issue.

 Thnxs
 Suhas



[android-developers] Re: RTSP Streaming on g1 phone

2009-02-03 Thread Dave Sparks

Frame size? Video and audio bitrate? Anything in the log?

On Feb 3, 3:59 am, Jeff Oh jeff.o...@gmail.com wrote:
  Hi, I'm trying to receive RTSP streaming video with g1. The video
 file I made was encoded using QuickTime pro, and they are progressive
 streamable with a hint track. Video is encoded in H.264, and audio is
 encoded in AAC LC. File container is MP4. (They can be played via
 sdcard)

  I used Darwin Streaming Server to stream this file.

  With sample media player given from android, I changed 'path' to the
 address like rtsp://172.29.10.109/test.mp4

  The results are really odd. It sometimes (like once per 20~30 times)
 runs well, but in other times, only audio is played and video freezes
 after first 1~3 frames.

  Is there anyone having same problem or any idea? Any comments will be
 very appreciated.
  Thanks in advance.



[android-developers] Re: About media player

2009-02-02 Thread Dave Sparks

What kind of plug-in do you want to write?

media player is kind of a vague term. There is the Music player
application, the MusicPlaybackService, the MovieView activity, the
VideoView activity, and the MediaPlayer object. Source for all of
those is available at source.android.com.

On Feb 1, 12:36 am, Jerry Yang jer...@intertrust.com wrote:
 Hi, all

 I have a simple question, except the default player, is there any open
 source media player we can modify or is there anyway we can add some
 feature to the media player in android platform? Like we add a plug-in
 into the MS media player?

 Thanks

 With best wishes

 Jerry Yang

 Client System Engineer Intertrust.



[android-developers] Re: Audio in the emulator...

2009-02-02 Thread Dave Sparks

Sorry, I don't have a Windows machine for testing. Maybe one of our
developer advocates can help you with that.

From the debug spew from the audio input driver, it sounds like it
should be recording. Do you see the file being written to the SD card?

On Feb 1, 2:45 am, Nicolas Cosson dodgemysp...@gmail.com wrote:
 Thanks for the advice,

 I work on windows vista,

 the audio backend found is:
  winaudio - Windows wave audio
 But it doesn't work with audio-out and audio-in at the same time for me.
 emulator: warning: opening audio output failed

 audio-out works alone
 I tested it with the sample provided with the sdk when reading ressources.
 It doesn't when trying to read the sdcard.

 the sdcard isn't readable in settings too, and it won't launch itself from
 eclipse with the additionnal emulator command line option, but only from
 cmd.exe with the arguments -sdcard sdimg.iso

 However, the logcat says this repetitivly when I click on record with
 audio-in:
 D/AudioHardware   25: AudioStreamInGeneric::read0x40308160, 320 from fd
 7

 I finally found the sound recorder application you were talking about. I
 haven't tested it yet, but the source code is 100x bigger than mine...
 http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder...

 Maybe I have made a mistake somewhere, any help is greatly appreciated

 thanks for your time



 On Sun, Feb 1, 2009 at 5:15 AM, Dave Sparks davidspa...@android.com wrote:

  Try this:

  emulator -help-audio-in

  It will tell you which audio backends are available on your system.
  You didn't specify what OS you are using.

  I think there was also some sample code in the SDK at one point. Maybe
  one of the developer advocates can point you to it. Another option is
  to look for the source for the Sound Recorder application on
  source.android.com. It should be in packages/apps/SoundRecorder.

  On Jan 31, 10:45 am, nicolas cosson dodgemysp...@gmail.com wrote:
   Hello,

    I have been searching for some time and I can't find a detailed
   tutorial on how to easily record and then read audio on the emulator
   under eclipse. I have found these steps:

   - You have to install a virtual sd card with mksdcard.exe 1024M
   sdimg.iso

   -then run the emulator : emulator.exe -sdcard
   sdimg.iso//where sdimg.iso it is the path to
   the sdcard

   -then run adb.exe : adb push local_file sdcard/remote_file

   -then you should put : uses-permission
   android:name=android.permission.RECORD_AUDIO/uses-permission in
   the androidmanifest.xml

   -then there is some code to implement which should look like :
   private void startRecord() {

  recorder = new MediaRecorder();
   recorder.setAudioSource(MediaRecorder.AudioSource.MIC);  //ok
  so
   I say audio source is the microphone, is it windows/linux microphone
   on the emulator?

  recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
   recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
   recorder.setOutputFile(/sdcard/test.3gpp);

   recorder.prepare();
   recorder.start();

   }

   -then you should stop with : recorder.stop(); // at some point (I have
   no idea when and where to decide to stop but I haven't searched yet)

   -then you should play it.

   I have also heard about :
 http://code.google.com/intl/fr/android/reference/emulator.html#sdcard
   in this page of the manual (~1/4 of the total scroll), there are some
   informations about Emulator Startup Options, one of them is about
   Media  -audio backend
   I couldn't find much about that backend thing, google didn't said much
   about it. I still don't know if it's important to the audio recording
   process.

   The fact is all these steps are pretty blurry to me, and I believe I
   am not the only android newbie trying to record some sound :)

   Anyone knows where we can find a complete tutorial for dummies
   teaching this feature?

   Any help is of course greatly appreciated

   Thanks.

   On 27 jan, 20:08, Breno breno.min...@gmail.com wrote:

Hey Andrei,

   To record audio it's pretty easy. But you must record to the
sdcard only. Be sure your file path is pointing to the sdcard, and
you have one mounted in eclipse (or something else). It's working
perfectly.

Regards

Breno

On Jan 15, 8:58 am, Andrei Craciun avcrac...@gmail.com wrote:

 Thanks David...

 2009/1/15 David Turner di...@android.com

  the emulator now supports audio recording. If you have problems with it,
  you should report more detailed information about it here

  On Thu, Jan 15, 2009 at 11:05 AM, Andrei Craciun 
  avcrac...@gmail.comwrote:

  Hi All,
  As reported on this blog:

 http://blog.roychowdhury.org/2008/04/29/sip-ua-for-android-stack-rtp-...
 problems in recording audio on the emulator, but everything
  works fine on the real phone. Does anyone have a workaround for
  recording audio on the emulator?

[android-developers] Re: MediaRecorder - No value other than 0 returned from getMaxAmplitude

2009-01-31 Thread Dave Sparks

Are you running on a G1 or on the emulator? If on the emulator, maybe
audio input isn't working correctly and it's failing to open the audio
input device.

On Jan 31, 9:59 am, Phill Midwinter ph...@grantmidwinter.com wrote:
 Looking at adb logcat I'm getting this error:

 *Record channel already open*

 Could this be a bug? I don't understand how it could already be open..
 nothing else is recording on the device?

 2009/1/31 ph...@grantmidwinter.com ph...@grantmidwinter.com





  I've got a media recorder, prepared and started in the following way:

  mRecorder = new MediaRecorder();
             mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
             mRecorder.setOutputFormat
  (MediaRecorder.OutputFormat.THREE_GPP);
             mRecorder.setAudioEncoder
  (MediaRecorder.AudioEncoder.AMR_NB);
             mRecorder.setOutputFile(/sdcard/test.3gpp);
             mRecorder.prepare();
             mRecorder.start();

  I'm then using a thread to return the mRecorder.getMaxAmplitude int,
  but it never returns as anything but 0. I've looked all over for why
  this might be happening - but I'm at a complete loss now so help would
  really be appreciated.

  Thanks.

 --
 Phill Midwinter
 Director
 Grant Midwinter Limited
 d: 0844 736 5234 x: 0
 m: 07538 082156
 e: ph...@grantmidwinter.com



[android-developers] Re: gapless playback

2009-01-31 Thread Dave Sparks

Yes, two reasons actually.

One, because the Ogg spec allows for specifying the exact number of
samples in the stream. MP3 does not have an official way to specify
the number of samples (there are unofficial ways).

Two, because the Ogg player has a native loop capability that allows
us to seamlessly loop at the exact sample boundary. OpenCore does not
support seamless loops, we have to explicitly seek to the beginning of
the file in loop mode. Thus any streams that are rendered by OpenCore
(i.e. everything but Ogg and MIDI) will have gaps.
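
A minimal sketch of the looping case, assuming an Ogg Vorbis asset has been
placed at res/raw/loop.ogg (the resource name is made up):

    import android.content.Context;
    import android.media.MediaPlayer;

    // Sketch: R.raw.loop is an assumed resource id for an Ogg asset in res/raw/.
    void startSeamlessLoop(Context context) {
        MediaPlayer player = MediaPlayer.create(context, R.raw.loop);
        player.setLooping(true);  // Ogg streams loop at the exact sample boundary
        player.start();
    }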

On Jan 31, 1:01 am, Phill Midwinter ph...@grantmidwinter.com wrote:
 That works perfectly, thanks for the help.

 Do you know why it works?

 2009/1/31 Dave Sparks davidspa...@android.com





  Use Ogg files, you can get a nice seamless loop. We use this for the
  ringtones.

  On Jan 30, 10:30 am, ph...@grantmidwinter.com
  ph...@grantmidwinter.com wrote:
   Hoya,

   When using a mediaplayer to play back an audio file, you can set
   looping to true - that's not the issue here.

   If looping is set to true, there's still an audible gap between the
   file finishing and then starting again.

   How could I achieve true gapless playback? I've attempted using two
   instances of the same file, overriding oncomplete and onseek.. can't
   seem to improve the gap though. Any help appreciated.

 --
 Phill Midwinter
 Director
 Grant Midwinter Limited
 d: 0844 736 5234 x: 0
 m: 07538 082156
 e: ph...@grantmidwinter.com



[android-developers] Re: MapView disable?

2009-01-31 Thread Dave Sparks

I don't know anything about MapView. What service are you on e.g.
WiFi, 3G/EDGE (what carrier)? Is there anything useful in the log?
Maybe a proxy failure?

On Jan 30, 5:46 pm, Keiji Ariyama ml_andr...@c-lis.co.jp wrote:
 Hi folks,

 Now, I'm developing an Android app called Echo.
 But since 5 hours ago, my Dev Phone hasn't displayed MapView.

 First, I supposed that my API key had been disabled.
 I tried the Maps API Key signup again, but the google server responded with an error.

  Server Error
  The server encountered a temporary error and could not complete your 
  request.

  Please try again in 30 seconds.

 And not only Echo but also all other map apps cannot display the map.

 The default Maps app displays the error message [Attention] There is a
 connection problem... we'll keep trying..

 If you can think of anything that may help me, I'll appreciate it.

 --
 Keiji,
 ml_andr...@c-lis.co.jp



[android-developers] Re: Audio in the emulator...

2009-01-31 Thread Dave Sparks

Try this:

emulator -help-audio-in

It will tell you which audio backends are available on your system.
You didn't specify what OS you are using.

I think there was also some sample code in the SDK at one point. Maybe
one of the developer advocates can point you to it. Another option is
to look for the source for the Sound Recorder application on
source.android.com. It should be in packages/apps/SoundRecorder.

On Jan 31, 10:45 am, nicolas cosson dodgemysp...@gmail.com wrote:
 Hello,

 I have been searching for some time and I can't find a detailed
 tutorial on how to easily record and then read audio on the emulator
 under eclipse. I have found these steps:

 - You have to install a virtual sd card with mksdcard.exe 1024M
 sdimg.iso

 -then run the emulator : emulator.exe -sdcard
 sdimg.iso                        //where sdimg.iso it is the path to
 the sdcard

 -then run adb.exe : adb push local_file sdcard/remote_file

 -then you should put : uses-permission
 android:name=android.permission.RECORD_AUDIO/uses-permission in
 the androidmanifest.xml

 -then there is some code to implement which should look like :
 private void startRecord() {

            recorder = new MediaRecorder();
             recorder.setAudioSource(MediaRecorder.AudioSource.MIC);  //ok so
 I say audio source is the microphone, is it windows/linux microphone
 on the emulator?
             recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
             recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
             recorder.setOutputFile(/sdcard/test.3gpp);

             recorder.prepare();
             recorder.start();

                 }

 -then you should stop with : recorder.stop(); // at some point (I have
 no idea when and where to decide to stop but I haven't searched yet)

 -then you should play it.

 I have also heard about:
 http://code.google.com/intl/fr/android/reference/emulator.html#sdcard
 on this page of the manual (~1/4 of the total scroll), there is some
 information about Emulator Startup Options, one of them being about
 Media  -audio backend
 I couldn't find much about that backend thing; google didn't say much
 about it. I still don't know if it's important to the audio recording
 process.

 The fact is all these steps are pretty blurry to me, and I believe I
 am not the only android newbie trying to record some sound :)

 Anyone knows where we can find a complete tutorial for dummies
 teaching this feature?

 Any help is of course greatly appreciated

 Thanks.

 On 27 jan, 20:08, Breno breno.min...@gmail.com wrote:

  Hey Andrei,

             To record audio it's pretty easy. But you must record to the
  sdcard only. Be sure your file path is pointing to the sdcard, and
  you have one mounted in eclipse (or something else). It's working
  perfectly.

  Regards

  Breno

  On Jan 15, 8:58 am, Andrei Craciun avcrac...@gmail.com wrote:

   Thanks David...

   2009/1/15 David Turner di...@android.com

the emulator now supports audio recording. If you have problems with it,
you should report more detailed information about it here

On Thu, Jan 15, 2009 at 11:05 AM, Andrei Craciun 
avcrac...@gmail.comwrote:

Hi All,
As reported on this blog:
   http://blog.roychowdhury.org/2008/04/29/sip-ua-for-android-stack-rtp-...
problems in recording audio on the emulator, but everything
works fine on the real phone. Does anyone have a workaround for recording
   audio on the emulator?

Thanks in advance,
A.



[android-developers] Re: MediaRecorder - No value other than 0 returned from getMaxAmplitude

2009-01-31 Thread Dave Sparks

You shouldn't have to reboot the device. The release() call just
forces the release of the hardware resources instead of waiting for
the garbage collector to come along.

You should always call release() in your onPause() if you are using a
singleton hardware resource like a media player, media recorder,
camera, etc.
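
As an illustration, inside the recording activity that might look something
like the sketch below, with mRecorder being the field set up in the snippet
quoted above:

    @Override
    protected void onPause() {
        super.onPause();
        if (mRecorder != null) {
            mRecorder.release();  // free the mic and codec hardware immediately
            mRecorder = null;
        }
    }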

On Jan 31, 11:10 am, Phill Midwinter ph...@grantmidwinter.com wrote:
 I was running on a G1.

 I've just found the problem though. On previous debug sessions - the
 mediarecorder hadn't been properly released.

 Solution was to reboot the phone, add overrides to onpause and ondestroy,
 making sure it's released in each case and then run again. Works like a
 charm!

 2009/1/31 Dave Sparks davidspa...@android.com





  Are you running on a G1 or on the emulator? If on the emulator, maybe
  audio input isn't working correctly and it's failing to open the audio
  input device.

  On Jan 31, 9:59 am, Phill Midwinter ph...@grantmidwinter.com wrote:
   Looking at adb logcat I'm getting this error:

   *Record channel already open*

   Could this be a bug? I don't understand how it could already be open..
   nothing else is recording on the device?

   2009/1/31 ph...@grantmidwinter.com ph...@grantmidwinter.com

I've got a media recorder, prepared and started in the following way:

mRecorder = new MediaRecorder();
           mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
           mRecorder.setOutputFormat
(MediaRecorder.OutputFormat.THREE_GPP);
           mRecorder.setAudioEncoder
(MediaRecorder.AudioEncoder.AMR_NB);
           mRecorder.setOutputFile(/sdcard/test.3gpp);
           mRecorder.prepare();
           mRecorder.start();

I'm then using a thread to return the mRecorder.getMaxAmplitude int,
but it never returns as anything but 0. I've looked all over for why
this might be happening - but I'm at a complete loss now so help would
really be appreciated.

Thanks.

   --
   Phill Midwinter
   Director
   Grant Midwinter Limited
   d: 0844 736 5234 x: 0
   m: 07538 082156
   e: ph...@grantmidwinter.com

 --
 Phill Midwinter
 Director
 Grant Midwinter Limited
 d: 0844 736 5234 x: 0
 m: 07538 082156
 e: ph...@grantmidwinter.com



[android-developers] Re: android audio

2009-01-31 Thread Dave Sparks

In SDK 1.0, you can only record to a file using the AMR-NB codec,
which is bandwidth limited to 4KHz, and the encoding process itself is
pretty lossy. If you want to experiment with this on a G1, go to
Messaging, create a new message, click menu and select Attach, and
select Record Audio. This will take you to the Sound Recorder activity
where you can record and playback audio to get an idea of the audio
quality. A future release of the SDK will enable getting at the raw
audio samples.

On Jan 31, 8:34 pm, solid young...@gmail.com wrote:
 I am trying to discover the frequency of recorded audio.  I am kinda new
 to multimedia programming (I am very new to android).  Is there a way
 to get raw frequency information from the mic?



[android-developers] Re: gapless playback

2009-01-30 Thread Dave Sparks

Use Ogg files, you can get a nice seamless loop. We use this for the
ringtones.

On Jan 30, 10:30 am, ph...@grantmidwinter.com
ph...@grantmidwinter.com wrote:
 Hoya,

 When using a mediaplayer to play back an audio file, you can set
 looping to true - that's not the issue here.

 If looping is set to true, there's still an audible gap between the
 file finishing and then starting again.

 How could I achieve true gapless playback? I've attempted using two
 instances of the same file, overriding oncomplete and onseek.. can't
 seem to improve the gap though. Any help appreciated.



[android-developers] Re: how to overlay an image over the camera preview?

2009-01-30 Thread Dave Sparks

No, you don't draw on the camera preview surface. You create a
transparent surface above it in the Z stack and draw on that.

On Jan 30, 5:31 pm, srajpal sraj...@gmail.com wrote:
 I checked out the api demo, it helps to place the camera preview on
 top of the surface view, but the buffers are handid over to the camera
 so anything drawn on the canvas, which is received from the handler,
 is ignored.

 There must be some way.  I just don't know it yet.

 On Jan 30, 7:27 pm, Dianne Hackborn hack...@android.com wrote:

  There is an Api Demo showing how to generally do this with a surface view.
  It's very easy, since SurfaceView essentially operates like any other view
  in terms of compositing.

  On Fri, Jan 30, 2009 at 3:48 PM, srajpal sraj...@gmail.com wrote:

   Does someone know how I can overlay an image over the camera preview?

  --
  Dianne Hackborn
  Android framework engineer
  hack...@android.com

  Note: please don't send private questions to me, as I don't have time to
  provide private support.  All such questions should be posted on public
  forums, where I and others can see and answer them.



[android-developers] Re: Reg playing youtube video in android

2009-01-29 Thread Dave Sparks

Interesting, I didn't know that made it to the 1.0 release. Thanks!

On Jan 29, 6:48 am, kolby kolbys...@gmail.com wrote:
 Dave,

 you can play youtube video by launching a View intent with the youtube
 video page url right now on the device (no cupcake or developer key
 required). It will launch the youtube player.
 The problem is, it doesn't work in the emulator, since there's no
 youtube app.

 Michael

 On Jan 28, 3:37 pm, Dave Sparks davidspa...@android.com wrote:

  If you specifically want to play a YouTube video, you need to register
  as a YouTube developer to get the keys you need to access the Gdata
  feeds.

  I think that the Cupcake release will support a new intent for playing
  YouTube videos with the YouTube app, but it will be some time before
  that feature makes it into an SDK or actual device. That would remove
  the need for getting the developer keys.

  Aside from that, playing a YT video is no different than any other
  HTTP streamed video. You just pass the URL through setDataSource and
  the media player does the rest.
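
In code that amounts to something like the sketch below; the URL is a
placeholder (not a real YouTube feed URL) and the SurfaceHolder is assumed to
come from an already-created SurfaceView.

    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;

    void playHttpVideo(SurfaceHolder holder) throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("http://example.com/clip.mp4");  // placeholder URL
        player.setDisplay(holder);  // surface that receives the decoded frames
        player.prepare();           // buffers enough data before returning
        player.start();
    }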

  On Jan 28, 1:42 am, harish harishpres...@gmail.com wrote:

   Dear All,

   Using Mediaplayer (Api Demo) example, i was able to play a local video
   file and also an video mp4 streamed over internet on my android
   emulator using sdk in my development machine.

    When i run a youtube video from the browser running in an emulator, it
    asks me to download a flash player.

   Is there any way to play youtube video using mediaplayer or any other
   alternative methods in android.

    In an emulator session, is it possible to play a youtube video?

   Guys, help me in this. any reference or example of youtube video being
   played in the emulator using sdk is available.

   Thanks and Regards,
   HarishKumar.V



[android-developers] Re: what is maximum height and width of image which can display?

2009-01-29 Thread Dave Sparks

The application heap is limited to 16MB. For RGB565, that's a max of
8M pixels, not including the heap that the app uses for other objects.
Chances are, you are probably decoding from a JPEG, so you need room
for both the compressed and uncompressed version.
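
If the full-resolution bitmap will not fit, the usual workaround is to
subsample while decoding. A minimal sketch (the path is a placeholder):

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;

    Bitmap decodeScaled(String path) {
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inSampleSize = 4;  // decode at 1/4 width and height, 1/16 the pixels
        return BitmapFactory.decodeFile(path, opts);
    }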

On Jan 29, 7:41 am, Phill Midwinter ph...@grantmidwinter.com wrote:
 That's a Z buffer.  A correct comparison would be running a game in
 1280x1024 but only having an 800x600 capable screen... in which case the
 game wouldn't display.

 This is simply a 2d image, why not load it all into memory if you're going
 to be dragging it around?

 2009/1/29 Stoyan Damov stoyan.da...@gmail.com





  I still don't get it. The viewport is 480x320. Imagine if you are
  trying to play Quake or Unreal Tournament and the game tries to draw
  the entire level, even though only a tiny part of it is visible.

  On Thu, Jan 29, 2009 at 4:39 PM, Phill Midwinter
  ph...@grantmidwinter.com wrote:
   I suppose for something like the camera gallery, so you can view your
  photos
   at full res?

   2009/1/29 Stoyan Damov stoyan.da...@gmail.com

   Why would you want to draw a 1000x1000 image on 480x320 screen?

   On Thu, Jan 29, 2009 at 2:05 PM, jj jagtap...@gmail.com wrote:

what is the maximum height and width of an image which can be displayed?

When I try 1000px height and 1000px width it works, but beyond that
it gives an exception and the app force closes.

   --
   Phill Midwinter
   Director
   Grant Midwinter Limited
   d: 0844 736 5234 x: 0
   m: 07538 082156
   e: ph...@grantmidwinter.com

 --
 Phill Midwinter
 Director
 Grant Midwinter Limited
 d: 0844 736 5234 x: 0
 m: 07538 082156
 e: ph...@grantmidwinter.com



[android-developers] Re: How to extract an image from a video

2009-01-29 Thread Dave Sparks

This is not possible with the 1.0 SDK. This feature will be available
in the Cupcake release.

On Jan 28, 11:16 pm, Raghu gragh...@gmail.com wrote:
 Hi,

       I want to get a thumbnail from a video, so I need to extract the first
 frame of the video. Please let me know how I can do that in Android.

 Thanks in Advance,
 Raghu



[android-developers] Re: Audio chat

2009-01-29 Thread Dave Sparks

This is not really possible with 1.0 SDK. Someone has done a walkie-
talkie like program that records audio to a file and then streams it,
but not real-time audio.

The Cupcake release will add the capability to stream audio from the
mic into a Java app and stream audio from a Java app to the audio
output. However, there is no support for streaming audio over RTSP, so
you have a fair bit of work cut out for you.
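
For what it's worth, here is a rough sketch of what that Cupcake-era
capability looks like with the AudioRecord and AudioTrack classes; a real
voice-chat app would send and receive the buffers over the network instead of
looping them locally, and the 8 kHz mono format is just an example.

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.AudioTrack;
    import android.media.MediaRecorder;

    void micToSpeakerLoop() {
        int rate = 8000;  // illustrative sample rate
        int bufSize = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize);
        AudioTrack out = new AudioTrack(AudioManager.STREAM_MUSIC, rate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize, AudioTrack.MODE_STREAM);

        byte[] buffer = new byte[bufSize];
        rec.startRecording();
        out.play();
        while (!Thread.interrupted()) {
            int n = rec.read(buffer, 0, buffer.length);  // PCM from the mic
            if (n > 0) out.write(buffer, 0, n);          // PCM to the output
        }
        rec.stop();
        out.stop();
        rec.release();
        out.release();
    }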

On Jan 29, 12:57 am, gunar adrian.proco...@gmail.com wrote:
 Hi!

 I have a question? Can someone give me a direction where to look to
 make an audio chat application?
 I saw some threads related to audio streaming, also other related to
 audio recording/playing.
 How to setup a RTP/RTCP/RTSP session between two Android clients?

 Best regards!
 Adrian



[android-developers] Re: Video resolution support

2009-01-29 Thread Dave Sparks

I don't know that there are any specific restrictions on the video
resolution for the software codecs. It's really a matter of whether
the CPU has adequate cycles to decode it in real-time.

Some of the codecs are hardware accelerated and they do have
restrictions about frame size, bit-rate, and fps. Each product will
have different capabilities based on the underlying hardware, so it's
not possible to give maximums.

We are going to try to establish a minimum performance benchmark for
all Android compatible devices in the near future.

On Jan 29, 12:52 am, Raghu D K dk.ra...@gmail.com wrote:
 Hello All,

 What are the video resolutions supported by the PV codecs in the
 opencore module? Is there a manual or guide describing this?

 Warm Regards,
 Raghu



[android-developers] Re: MediaPlayer seeking back

2009-01-29 Thread Dave Sparks

I believe the OpenCore HTTP streaming engine maintains a circular
buffer of data. As data is played out, the buffer space it occupied is
re-filled with new data. When you seek backwards in the stream, it has
to re-fill the buffer from the earlier part of the stream.

On Jan 29, 1:05 pm, ed edwin.fuq...@gmail.com wrote:
 I've been using androids MediaPlayer to stream from an http url and
 have a question about seeking.  Currently, our urls expire after they
 have been used once or a certain time out has expired to dissuade
 scraping content.  Now, this obviously makes progressive streaming
 past the buffer impossible with the exact same url as you need to open
 a new http connection with the same mangled key, which we
 intentionally don't allow.

 However, MediaPlayer seems to do this when seeking before the current
 position (i.e. seeking from 1:00 in the audio to 0:30).  As the file
 has already been downloaded up to the current position I'm confused as
 to why MediaPlayer is still trying to initate a new http connection in
 this case?  The only thing I can think of is that MediaPlayer is
 getting rid of audio its already played up to the current position,
 and hence needs to restart the connection if you try to seek back on
 the stream.  Is this correct, or is there something else going on?

 Thanks,
 Ed



[android-developers] Re: Simultaneous Photography using Andriod

2009-01-28 Thread Dave Sparks

The simplest approach is just firing off an intent to the existing
camera app to take a picture. This requires the user to push the
shutter button.

If you want it purely under program control, you could have the
application snap the picture without the user pressing a button. It
just takes a bit more work.

On Jan 28, 2:20 am, Thomas thomas.gar...@gmail.com wrote:
 I am working on an art project with several other people that involves
 simultaneity and photography.  We want to coordinate a series of
 events involving simultaneous photographs based on the Android system.

 In simple terms, here is what we are proposing.  Someone writes an app
 for Android Phone/Camera that makes the camera take a photograph at a
 predetermined time so that many thousands of people all take a
 simultaneous photograph.  The art involved here is not only the
 synchronizing of the event to approximate simultaneity but also making
 social networking into an element of an art project.

 Could someone please help with this?  Even if just to give a few
 suggestions about how to move the project forward?



[android-developers] Re: Reg playing youtube video in android

2009-01-28 Thread Dave Sparks

If you specifically want to play a YouTube video, you need to register
as a YouTube developer to get the keys you need to access the Gdata
feeds.

I think that the Cupcake release will support a new intent for playing
YouTube videos with the YouTube app, but it will be some time before
that feature makes it into an SDK or actual device. That would remove
the need for getting the developer keys.

Aside from that, playing a YT video is no different than any other
HTTP streamed video. You just pass the URL through setDataSource and
the media player does the rest.

On Jan 28, 1:42 am, harish harishpres...@gmail.com wrote:
 Dear All,

 Using Mediaplayer (Api Demo) example, i was able to play a local video
 file and also an video mp4 streamed over internet on my android
 emulator using sdk in my development machine.

 When i run a youtube video from the browser running in an emulator, it
 asks me to download a flash player.

 Is there any way to play youtube video using mediaplayer or any other
 alternative methods in android.

 In an emulator session, is it possible to play a youtube video?

 Guys, help me in this. any reference or example of youtube video being
 played in the emulator using sdk is available.

 Thanks and Regards,
 HarishKumar.V



[android-developers] Re: Video and Audio format Supported by Android

2009-01-28 Thread Dave Sparks

We are getting ready to post some new pages in the SDK area that will
cover this information.

If you can't wait for that, do a search on this forum for codecs. It
has been covered a number of times.

On Jan 28, 2:33 am, Tom vmspa...@gmail.com wrote:
 Hi All

  i want to know what audio and video formats are supported by
 Android by default.

 are there any plug-ins that need to be added to the SDK?



[android-developers] Re: Audio Streaming Integration

2009-01-27 Thread Dave Sparks

If you are really ambitious, you can download the Cupcake source,
unhide all the new API's and build the SDK yourself. However, that is
a topic for a different list.

On Jan 27, 5:58 am, Jean-Baptiste Queru j...@google.com wrote:
 You can't. You'll have to wait for an SDK built from the cupcake code
 base, and there is currently no such thing.

 JBQ

 On Tue, Jan 27, 2009 at 5:56 AM, Tez earlencefe...@gmail.com wrote:

  Hi,

  How can I integrate the cupcake audio streaming code into the existing
  android sdk?

  Cheers,
  Earlence

 --
 Jean-Baptiste M. JBQ Queru
 Android Engineer, Google.



[android-developers] Re: the problem with libopencoreplayer.so when compile Android source code

2009-01-27 Thread Dave Sparks

This message is off-topic, this forum is for application developers.

Try the android-framework list.

On Jan 26, 10:53 pm, bardshen bard.s...@gmail.com wrote:
 Dear Sirs:
     when i try to build the Android source code downloaded from the internet
 using repo, i meet the following problem:

     external/opencore//android/mediascanner.cpp:924: undefined
 reference to `ucnv_open_3_8'
 external/opencore//android/mediascanner.cpp:957: undefined reference
 to `ucnv_convertEx_3_8'
 external/opencore//android/mediascanner.cpp:970: undefined reference
 to `ucnv_close_3_8'
 external/opencore//android/mediascanner.cpp:971: undefined reference
 to `ucnv_close_3_8'
 external/opencore//android/mediascanner.cpp:919: undefined reference
 to `ucnv_open_3_8'
 collect2: ld returned 1 exit status

 My environment was set up following the instructions on the how to get the
 Android source code web page, under ubuntu 8.04.

 would you please give me some tips or advice?
 thank you very much!



[android-developers] Re: MediaRecorder docs fix

2009-01-26 Thread Dave Sparks

Would you please post a bug with specifics? Thanks!

On Jan 26, 12:03 pm, Tim Bray timb...@gmail.com wrote:
 The section Recording Media Resources of
 http://code.google.com/android/toolbox/apis/media.html seems to be out of
 date and wrong.  I got working code from
 http://rehearsalassist.svn.sourceforge.net/viewvc/rehearsalassist/and...
 which is quite different.
 It would be a good idea to correct or at least remove the misleading online
 version. -Tim



[android-developers] Re: Recording audio stream

2009-01-26 Thread Dave Sparks

I can't give you specifics about deployment because:

(a) I don't know, it's up to the carrier and manufacturer to decide
when they deploy new releases

(b) Even I did know, there's a fairly good chance I couldn't tell you
due to NDA's we have with our partners

You can see the code in development on the Cupcake branch at
android.git.kernel.org.

On Jan 26, 11:31 am, benmccann benjamin.j.mcc...@gmail.com wrote:
 I'm happy to hear future releases will support the ability to stream
 audio being recorded.  Any ETA on this?

 On Dec 30 2008, 9:58 am, Dave Sparks davidspa...@android.com wrote:

  It's probably not really streaming audio. Some people are working
  around the issue by tailing the file as it is being written.

  On Dec 30, 5:03 am, FranckLefevre flas...@gmail.com wrote:

   The application Phone Recorder available in Market softwares already
   does this pretty well.
   I don't know if sources are available somewhere...

   Franck.

   On Dec 25, 1:27 am, vitalii.mi...@gmail.com

   vitalii.mi...@gmail.com wrote:
  Is there any way torecordaudiostreamand send streaming audio  to
network ??  Instead of recording to file.



[android-developers] Re: Identifying pictures taken on 'this' device

2009-01-26 Thread Dave Sparks

The camera application in the Cupcake branch does it somehow. You
could try looking at the code in packages/apps/Camera.

On Jan 26, 4:38 pm, GiladH gila...@gmail.com wrote:
 Hey,

 Is there a way to identify which of the MediaStore images has been
 taken
 on 'this' device, as opposed to pictures exported/downloaded from an
 external source ?

 I have started my search by creating a Camera.PictureCallback and
 registering it
 via camera.takePicture(), only to realize that it only gets called for
 images taken from _my own_ application - so that idea goes in the basket.

 I have also looked within MediaStore.Images.Media for fields that
 might carry relevant information, with no luck.

 TIA,
 GiladH



[android-developers] Re: How to record calls? Or Is it possible to record calls?

2009-01-24 Thread Dave Sparks

It is not possible to access call audio in the G1. This is a function
of the firmware in the radio and DSP and is currently not supported.

It is possible that future devices may enable this functionality, but
at the moment it is not part of a planned release.

On Jan 24, 2:32 am, javame_android su...@saltriver.com wrote:
 Hi,

 I have read about intents in particular ACTION_CALL,
 ACTION_CALL_BUTTON, ACTION_OUTGOING_CALL. I think application will get
 started on receiving this intents and then call recording
 functionality should be invoked. So, can someone please guide me as to
 how to record call after that.

 Moreover, whatever I have suggested is correct or not. If its not then
 please let me know the correct way.

 Thanks  Regards
 Sunil



[android-developers] Re: android.hardware.Camera - JPEG image of correct size but always black

2009-01-23 Thread Dave Sparks

The camera service has no concept of foreground activity. It simply
gives the camera to the app that requests it. If another app currently
owns the camera, it is notified when the camera is stolen.

I don't know all the rationale for that design decision. It's probably
not the way I would have designed it, but I'm not sure that it's
wrong, either. However, we do need a way to reset the camera to a
known state when the new owner takes over.

Also, there is a new mechanism in Cupcake for sharing a camera between
two processes. The remote can be passed around and the owner can
unlock the camera to allow a cooperative use of the camera. This is
used to switch quickly between still camera mode and video record mode
(the codecs live in another process and need direct access to the
camera to get frame buffers).

On Jan 23, 6:08 am, Pascal Merle pmerl...@googlemail.com wrote:
 Yes, that would be great.

 I have been able to test my app running in foreground. In background
 will be much better. It sometimes just needs to take a snapshot so
 there is no problem with the battery.

 When implementing you have to take care about the locking mechanism.
 As you know the camera can only be used by the activity shown on
 screen now (implicitly locked by the preview). I found out however,
 that this concept is not 100% clean. If your surface got created and
 you start the preview immediately, without any delay, you can conflict
 with another app still stopping its preview. So work has to be done on
 proper locking of the camera anyway.



[android-developers] Re: android.hardware.Camera - JPEG image of correct size but always black

2009-01-23 Thread Dave Sparks

By contract, apps are supposed to release the camera in onPause().
Failing to do so could result in the battery draining quickly because
we have the DSP and sensor hardware spun up.

The problem that the current design choice introduces is that the
camera can be in the middle of any operation when another app steals
it away. There is no mechanism for returning the camera to a known
state or even querying the state so that appropriate steps can be
taken to get it to the right state.

On Jan 23, 9:39 am, Dianne Hackborn hack...@android.com wrote:
 The locking is done that way because as you move through the UI the next
 activity receiving focus should be the one to get the camera, as soon as it
 comes up.  Otherwise if you switch from one activity using the camera
 directly to another using the camera, the new one won't be able to access it
 because the previous one has not yet had a chance to close it.

 On Fri, Jan 23, 2009 at 9:08 AM, Dave Sparks davidspa...@android.comwrote:





  The camera service has no concept of foreground activity. It simply
  gives the camera to the app that requests it. If another app currently
  owns the camera, it is notified when the camera is stolen.

  I don't know all the rationale for that design decision. It's probably
  not the way I would have designed it, but I'm not sure that it's
  wrong, either. However, we do need a way to reset the camera to a
  known state when the new owner takes over.

  Also, there is a new mechanism in Cupcake for sharing a camera between
  two processes. The remote can be passed around and the owner can
  unlock the camera to allow a cooperative use of the camera. This is
  used to switch quickly between still camera mode and video record mode
  (the codecs live in another process and need direct access to the
  camera to get frame buffers).

  On Jan 23, 6:08 am, Pascal Merle pmerl...@googlemail.com wrote:
   Yes, that would be great.

   I have been able to test my app running in foreground. In background
   will be much better. It sometimes just needs to take a snapshot so
   there is no problem with the battery.

   When implementing you have to take care about the locking mechanism.
   As you know the camera can only be used by the activity shown on
   screen now (implicitly locked by the preview). I found out however,
   that this concept is not 100% clean. If your surface got created and
   you start the preview immediately, without any delay, you can conflict
   with another app still stopping its preview. So work has to be done on
   proper locking of the camera anyway.

 --
 Dianne Hackborn
 Android framework engineer
 hack...@android.com

 Note: please don't send private questions to me, as I don't have time to
 provide private support.  All such questions should be posted on public
 forums, where I and others can see and answer them.



[android-developers] Re: Porting JMF - Native source code to android based

2009-01-22 Thread Dave Sparks

We do not support native code on Android at this time, but we have
plans to publish a native SDK soon.

On Jan 22, 2:03 am, MRK infoto...@gmail.com wrote:
 I am creating an Android application which uses the JMF (SIP, RTP,
 JAIN). So i downloaded the JMF source code for some adhoc change to my
 application.

 The basic questions
 1. How will the native files (C/C++/headers) in the downloaded JMF source
 behave? Because android is based on java right now.
 2. Is there any completely different way of getting SIP and RTP
 working in android?

 Note : The downloaded JMF source code is not direct under from sun.com
 (this) site but thread start from this forum only.

 Is there any link from sun.com(this) site for JMF source code?

 Any suggestion and comment about this?

 Advance thanks.



[android-developers] Re: android.hardware.Camera - JPEG image of correct size but always black

2009-01-21 Thread Dave Sparks

Right now, the answer is no. Most cameras require that you go to
preview mode before you can take a picture so that the image processor
can grab some frames for auto-focus, white balance, etc.

I'll see if we can get a change into Cupcake that allows you to start
preview without a surface. That should allow you to do what you want.
It will be some time, though, before this feature appears on devices
in the field.

The other thing you need to bear in mind is that the camera eats a LOT
of power. You don't want to run it for long periods of time in the
background. I forget what the power numbers are on the G1, but I bet
that if kept fully spun up, it will drain the battery in a couple of
hours.

On Jan 21, 9:28 am, Pascal Merle pmerl...@googlemail.com wrote:
 Please also try to address my last question Can this be done from
 within a service (no activity)?. Since a service does not have a UI I
 don't see how I can start a preview from there. I want to take the
 picture from the service running in the background.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Make a photo with Android Emulator

2009-01-21 Thread Dave Sparks

You need to call startPreview() before takePicture. You also need to
supply a PictureCallback function to receive the encoded JPEG. By
passing null, you are telling the camera service you don't want the
final JPEG image.
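
(For illustration, a minimal corrected sketch along the lines described above;
the surface holder is assumed to come from the activity's surfaceCreated()
callback, and the save step is left as a comment:)

Camera camera = Camera.open();
camera.setPreviewDisplay(surfaceHolder);   // holder from surfaceCreated(); throws IOException
camera.startPreview();

Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] jpegData, Camera c) {
        // jpegData holds the encoded JPEG; write it to the SD card here
    }
};
camera.takePicture(null, null, jpegCallback);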

On Jan 21, 2:50 am, ANDREA P andrewpag...@gmail.com wrote:
 I want to use the camera to take a photo snapshot with Android.

 At the beginning, my code is:

 Camera mCamera = Camera.open();

 PictureCallback callback = null;
 mCamera.takePicture(null, null, callback);

 but nothing happens.

 And if I use the Camera program on the phone, the emulator says that I
 have to insert the SD card.

 Please help me. Thanks.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Camera Focus Facility

2009-01-21 Thread Dave Sparks

Roughly:

Camera mCamera = Camera.open();

// this sequence should occur after onResume and surfaceCreated
mCamera.setPreviewDisplay(mSurfaceHolder); // holder saved from the surfaceCreated call
mCamera.startPreview();

// to start focus - usually called from your onClickListener
mCamera.autoFocus(afCallback);

// to take picture - usually called from your onClick listener
mCamera.takePicture(shutterCallback, null, pictureCallback);

// auto focus callback
public void autoFocusCallback(boolean focused, Camera camera) {
    // play focused sound
}

// shutter callback function
public void shutterCallback() {
    // play shutter sound
}

// picture callback function
public void pictureCallback(byte[] rawData, Camera camera) {
    // save JPEG
}

You should also note that there is significant lag from the time
takePicture() is called until the image capture begins. You can
shorten this time by calling auto-focus first. You don't want to move
the camera until after the shutter callback.
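
(For reference, a slightly more concrete version of the rough sequence above,
using the callback interfaces as they appear in the SDK; the field and method
names are made up for illustration:)

// fields in the activity
Camera.PictureCallback pictureCallback = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] jpegData, Camera camera) {
        // save the JPEG to storage
    }
};

Camera.ShutterCallback shutterCallback = new Camera.ShutterCallback() {
    public void onShutter() {
        // play the shutter sound; safe to move the camera after this fires
    }
};

Camera.AutoFocusCallback afCallback = new Camera.AutoFocusCallback() {
    public void onAutoFocus(boolean success, Camera camera) {
        // focus finished (successfully or not); now capture the image
        camera.takePicture(shutterCallback, null, pictureCallback);
    }
};

// after onResume() and surfaceCreated():
mCamera = Camera.open();
mCamera.setPreviewDisplay(mSurfaceHolder);  // throws IOException
mCamera.startPreview();

// from the button's onClick listener:
mCamera.autoFocus(afCallback);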

On Jan 21, 1:59 am, mobilek...@googlemail.com
mobilek...@googlemail.com wrote:
 Could you list the proper sequence, as I'm having a hard time working it
 out? Thanks

 On Jan 21, 3:21 am, Dave Sparks davidspa...@android.com wrote:

  Sounds like you might have some sequencing issues. Did you call
  startPreview first?

  On Jan 20, 2:51 pm, mobilek...@googlemail.com

  mobilek...@googlemail.com wrote:
   Hi, thanks for the hint! I've tried that but I got this:

   java.io.IOException: autoFocus failed

    I registered the callback in the CameraActivity.surfaceCreated() method.

    Could you advise on how to get that working? Thank you!

   On Jan 20, 5:09 pm, Dave Sparks davidspa...@android.com wrote:

Camera.autoFocus(cb);

where cb is a callback function you supply that tells you focus is
successful or not.

On Jan 20, 5:27 am, mobilek...@googlemail.com

mobilek...@googlemail.com wrote:
 Hi,

 My app is struggling to take focused shots. Is there a built-in
 facility that sets an auto-focus property on the camera, so it
 automatically takes clear and focused images? I've noticed that
 feature in numerous well-known apps such as ShopSavvy,
 CompareEverywhere, etc.

 Could you advise on how to achieve that?

 Thank you.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Play wav files in Media player

2009-01-21 Thread Dave Sparks

I suspect that your problem is in some details that you haven't given
us yet.

How many media players are you creating at the same time?
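
(For illustration: one common cause of these AudioTrack failures is creating
many MediaPlayer instances without releasing them, which eventually exhausts
the audio output resources. A minimal sketch that releases each player when it
finishes; playNext() is a hypothetical method that starts the next clip:)

MediaPlayer player = MediaPlayer.create(context, resId);
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    public void onCompletion(MediaPlayer mp) {
        mp.release();   // free the codec and audio track resources
        playNext();     // hypothetical: create and start the next clip
    }
});
player.start();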

On Jan 20, 10:47 pm, ena enu1...@gmail.com wrote:
 On Jan 21, 8:23 am, Dave Sparks davidspa...@android.com wrote:
  What is the format of the data in the WAVE file?

 I play an audio file in WAV format.

  OpenCore only supports 8- and 16-bit linear PCM.

 I think it is not a file problem, because sometimes the files can be
 played and sometimes not...

  On Jan 20, 11:59 am, ena enu1...@gmail.com wrote:

   Please help me out. Actually I want to play many files one by one in
   the media player. I'm using this code:

   MediaPlayer melodyPlayer = MediaPlayer.create(context, resID);
   melodyPlayer.seekTo(0);
   melodyPlayer.start();

   but after some time I get this error:

   01-20 23:12:01.785: ERROR/AudioTrack(24): Could not get control block
   01-20 23:12:01.785: ERROR/AudioSink(24): Unable to create audio track
   01-20 23:12:01.785: ERROR/audiothread(24): Error creating AudioTrack
   01-20 23:12:01.876: WARN/PlayerDriver(24):
   PVMFInfoErrorHandlingComplete
   01-20 23:12:01.886: DEBUG/MediaPlayer(316): create failed:
   01-20 23:12:01.886: DEBUG/MediaPlayer(316): java.io.IOException:
   Prepare failed.: status=0x
   01-20 23:12:01.886: DEBUG/MediaPlayer(316): at
   android.media.MediaPlayer.prepare(Native Method)
   01-20 23:12:01.886: DEBUG/MediaPlayer(316): at
   android.media.MediaPlayer.create(MediaPlayer.java:169)
   01-20 23:12:01.886: DEBUG/MediaPlayer(316): at
   org.isol.MyCustomButton$1.onClick(MyCustomButton.java:89)
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Camera Focus Facility

2009-01-20 Thread Dave Sparks

Camera.autoFocus(cb);

where cb is a callback function you supply that tells you focus is
successful or not.

On Jan 20, 5:27 am, mobilek...@googlemail.com
mobilek...@googlemail.com wrote:
 Hi,

 My app is struggling to take focused shots. Is there a built-in
 facility that sets an auto-focus property on the camera, so it
 automatically takes clear and focused images? I've noticed that
 feature in numerous well-known apps such as ShopSavvy,
 CompareEverywhere, etc.

 Could you advise on how to achieve that?

 Thank you.
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: taking picture from emulator, if pc having webcam

2009-01-20 Thread Dave Sparks

No, this is not supported.

On Jan 20, 3:57 am, jalandar jagtap...@gmail.com wrote:
 Is it possible to take a photo with the emulator's camera if the PC that
 the emulator runs on has a webcam?
 Thank you
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: Fwd: Audio Streaming

2009-01-20 Thread Dave Sparks

I think you may need to remove the @hide directives in the new APIs.
I can't help you with the Eclipse plug-in, though; I still use VI and
make. :)
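
(For illustration, this is roughly what an @hide directive looks like in the
framework javadoc; the method below is hypothetical. Deleting the tag before
rebuilding the SDK is what makes the class or method visible in the generated
API:)

/**
 * A hypothetical framework method hidden from the public SDK.
 *
 * @hide   <-- remove this tag so the method appears in the rebuilt SDK
 */
public void someNewAudioMethod() {
    // framework implementation
}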

On Jan 20, 3:17 am, Breno breno.min...@gmail.com wrote:
 Hey Tez,

 It's not only a .jar... you need to:

 1) download source code
 2) Compile it (make)
 3) Create SDK (make sdk)

   It's quite easy. The problem is that you cannot use this new SDK in
 Eclipse, because of conflicts between the SDK and the Eclipse plugin.
 There is a way to generate the plugin using the source code, but that
 is what I'm still trying to figure out.

 Cheers

 Breno

 On Jan 20, 2:14 am, Tez earlencefe...@gmail.com wrote:

  Hi,

  I noticed in your post that you were successful in building the new
  SDK. I have been trying to accomplish the same thing. Could you send
  me the jar that you compiled, or maybe just the Cupcake branch source
  code?

  Cheers,
  Earlence

  On Jan 19, 9:18 pm, Breno T.Minzon breno.min...@gmail.com wrote:

   Hey Dave,

   I downloaded the Android source code, and it already has the Cupcake
   branch merged into main. I compiled it and made the SDK (make sdk)
   successfully. But now I need an Eclipse plugin compatible with this
   custom SDK. Note that I just downloaded and compiled it. I need to work
   with audio streaming; in our prototype we worked around it, but now we
   really need this feature, so I'm trying to use it. I looked at the
   source and, apparently, Cupcake solves our needs...
   I tried this link, http://review.source.android.com/5290/diff/201/z474cb3132ea0f2114f9e5...,
   but when I import the projects into my workspace, many compile errors
   appear. I think there are a lot of new functions and classes that the
   Eclipse plugin doesn't detect... So, how can I successfully build the
   Eclipse plugin?

   Thanks a lot

   Breno

    On Jan 17, 5:44 am, Dave Sparks davidspa...@android.com wrote:
    OK, now I see where you're going with it. :)

What you want is coming in Cupcake. There is a streaming interface
for audio input and output that gives you an array of audio samples
you can do your signal processing on.

If you need something for SDK 1.0, there is a MediaRecorder function
called getMaxAmplitude(). You should be able to get what you want by
writing the audio file to /dev/null and calling into getMaxAmplitude
periodically. Take a look at the source code for SoundRecorder on
android.git.kernel.org.
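
(For illustration, a minimal sketch of the /dev/null approach described above;
the output format and encoder are placeholder choices, and error handling for
prepare() is omitted:)

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile("/dev/null");  // discard the audio; we only want levels
recorder.prepare();                   // throws IOException
recorder.start();
// poll periodically, e.g. from a Handler or timer:
int amplitude = recorder.getMaxAmplitude();  // max amplitude since the last call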

On Jan 16, 3:20 pm, Matt Flax flat...@gmail.com wrote:

 We are implementing a sound level meter. For privacy reasons, we don't
 want the audio lying around on the disk.

 We could do it on the fly without recording to disk, however I don't
 think that is possible with the SDK... is it?

 Matt

 On Fri, Jan 16, 2009 at 12:16 PM, Dave Sparks davidspa...@android.com
   wrote:

  I am pretty sure that won't work. Why do you want to record a bunch of
  small audio files without dropping samples?

  On Jan 14, 7:52 pm, flatmax flat...@gmail.com wrote:
   Hi there,

   Has anyone managed to record audio to small files without dropping
   samples in between?

   Perhaps it is possible to use two recorders which toggle between
   recording states... when one is recording, the other is priming and
   waiting to record...

   Something like :

   //declare two recorders
   MediaRecorder recorder1 = new MediaRecorder();
   MediaRecorder recorder2 = new MediaRecorder();

   /// continuously prime, start and stop the two streams so one is
   always capturing ...

   //prime and start recorder1 - this is the loop roll in
   recorder1.setAudioSource
   recorder1.setOutputFormat
   recorder1.setAudioEncoder
   recorder1.setOutputFile
   recorder1.prepare
   recorder1.start

   while (1){

   //prime and start recording from recorder2
   recorder2.setAudioSource
   recorder2.setOutputFormat
   recorder2.setAudioEncoder
   recorder2.setOutputFile
   recorder2.prepare

   //wait a certain amount of time to allow capture
   sleep

   recorder2.start

   recorder1.stop
   recorder1.reset

   //prime recorder1
   recorder1.setAudioSource
   recorder1.setOutputFormat
   recorder1.setAudioEncoder
   recorder1.setOutputFile
   recorder1.prepare

   //wait a certain amount of time to allow capture
   sleep

   /// switch recorders
   recorder1.start

   recorder2.stop
   recorder2.reset

   }

   recorder1.release();
   recorder2.release();

 --http://www.flatmaxstudios.com
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---

[android-developers] Re: Equalizer...

2009-01-20 Thread Dave Sparks

At this time, there is no mechanism to get at the raw audio after it
is decoded.

On Jan 19, 11:03 am, Valeria vscarba...@gmail.com wrote:
 Hi everyone,

 I'm developing a multimedia player and I want to create an equalizer
 in it, but I can't find any information about how to do it. Could
 anyone help me? I think I read some time ago that it could be
 implemented in Android... am I wrong? Thanks in advance...

 Bye
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers-unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~--~~~~--~~--~--~---



[android-developers] Re: missing javax audio?

2009-01-20 Thread Dave Sparks

We haven't released a Cupcake SDK yet. Docs will come when the SDK is
published.

On Jan 17, 6:17 pm, peter cowan paco...@gmail.com wrote:
 Static audio buffers and PCM streaming should do what I need. I don't
 see any javadoc API for the new Cupcake features on the Android site;
 can you point to where this code is located (i.e. what package)?

 -peter

 have you written up this FAQ?
 On Jan 7, 6:50 pm, Dave Sparks davidspa...@android.com wrote:

  There is no plan to support javax.sound. I guess I need to write up a
  media FAQ because this question gets asked repeatedly.

  Cupcake has support for streaming PCM audio in and out of Java. It
  also supports static buffers, i.e. load a buffer with sound data and
  trigger (one-shot) or loop. Both static and streamed buffers have
  support for pitch and volume controls.
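
  (For illustration, a minimal sketch of the streaming half of this, using the
  class that shipped in Cupcake as android.media.AudioTrack; the code that
  fills the sample buffer is a placeholder:)

  int sampleRate = 44100;
  int minBuf = AudioTrack.getMinBufferSize(sampleRate,
          AudioFormat.CHANNEL_CONFIGURATION_MONO,
          AudioFormat.ENCODING_PCM_16BIT);
  AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
          AudioFormat.CHANNEL_CONFIGURATION_MONO,
          AudioFormat.ENCODING_PCM_16BIT, minBuf, AudioTrack.MODE_STREAM);
  track.play();

  short[] samples = new short[minBuf / 2];
  // fill samples[] with your own PCM data here, then push it to the hardware;
  // write() blocks until the data has been queued
  track.write(samples, 0, samples.length);

  track.stop();
  track.release();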

  On Jan 7, 1:19 pm, Pv p...@swooby.com wrote:

    Any update on anyone getting javax.sound.sampled (or something
    similar) working?

   Pv

   On Nov 23 2008, 12:49 am, MichaelEGR foun...@egrsoftware.com wrote:

Greets...  Wow.. good thread and discussion thus far. Best and most
recent audio thread I've seen.

I am an audio/graphics professional and my first project, for kicks,
is to port my desktop Java Quake3-class game engine to Android and
OpenGL ES as a test of my ability to get Q3 tech playing with ES
and Android, and it didn't take long to realize the state of audio in
SDK 1.0_r1 is not suitable for lots of audio stuff I'd like to
accomplish in general and some novel features for my port (IE voice
chat with desktop clients from built in mic on G1, etc; fun!). We'll
see how the Q3 port turns out; gonna hack a file based push to talk
feature most likely via MediaRecorder; delayed audio w/ this though
and only one way to desktop clients to show proof of concept.

I specifically did not adopt or even look at Android until a handset
device was released and the SDK hit 1.0 simply because I knew the
platform probably would not be complete and I don't have the time /
desire to work around incomplete APIs / implementation (yes, was
correct on that; for audio at least; lots of gold to be had elsewhere;
seriously, kudos Android team et al, I haven't been this excited in a
while about a new tech!). So, yes, lets continue to discuss how to
solve the audio issue that encroaches on all of us who want
performance audio on Android and all 'droid devices in general. I will
throw in my two cents on how I _am_ (not would be) solving it after
dealing with the woes and inadequacies of Java Sound on the desktop
_for years_ from an API perspective to simply incomplete
implementations on various platforms and JVM versions. (Java Sound -
Write once, test everywhe... oh wait you mean it doesn't fully work on
X platform at all?.?. gah!!!)

One solution on the desktop has been to ditch Java Sound and support
PortAudio (www.portaudio.com) via JNI. I propose that Android can also
provide the best and most _portable_ audio solution not only on the G1
and other devices, but _future_ hardware that supports Android by
adopting PortAudio and exposing a lean and mean API to developers that
then can be further extended with higher level APIs for purpose built
processes (speech detection / Jspeex port / DSP processing, even a
javax.sound implementation built _on top_ of PortAudio, etc). What is
needed is simple and efficient raw audio I/O functionality at the
core; PortAudio provides this and only this! For file I/O it's not a
bad idea to support Libsndfile (http://www.mega-nerd.com/libsndfile/).
Between PortAudio and Libsndfile, raw hardware and file based audio I/O
can be solved by time tested open source solutions.

Now... I mentioned future hardware supporting Android... I just so
happen to be developing an embedded audio hardware product that
focuses on advanced spatial processing (think large scale 2D and 3D
sound arrays; I have a 32 speaker hemisphere setup at my facility in
SF for instance; check here for those interested in an overview I
published for the recent AES conference in SF w/ picts  equipment
specifications 
--http://research.egrsoftware.com/conferences/2008/aes2008/)
and after finally dipping into Android (IE G1  SDK 1.0 finally
available and in my hands) I've decided unanimously and almost
instantly to ditch my previous path which was Analog Devices Blackfin/
Sharc based processors running uClinux and am switching to the TI
OMAP3 3550 and Android as the processor/stack. In doing this I already
am adopting PortAudio on my future Android based hardware and this is
how I presentlyplanto expose audio I/O to Android developers on my
hardware. So in time this is _going_ to be done already.

Seeing as there isn't a published / unified vision on where to take
audio I/O for Android as is I do
