You can't get good timing from SoundPool. If you're talking about a
music application where a few msecs of jitter affects the musical
feel, you're going to need to mix your own streams to get the timing
accuracy.
We are planning to add an API for music applications in a future
release.
The parameter is currently ignored. You should use 0 (default) for
now.
On Feb 28, 10:48 am, clark clarkd...@gmail.com wrote:
Can anyone fill me in as to the possible values that can be passed to
the SoundPool constructor for the srcQuality parameter?
I see that it is an int, but no
,
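To tie the advice above together, here is a minimal sketch of creating a SoundPool with srcQuality set to 0 as recommended. This is an illustrative example, not from the thread; `R.raw.click` is a hypothetical sound resource.

```java
import android.app.Activity;
import android.media.AudioManager;
import android.media.SoundPool;
import android.os.Bundle;

// Sketch only: R.raw.click is a hypothetical resource in res/raw.
public class ClickDemo extends Activity {
    private SoundPool mPool;
    private int mClickId;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // maxStreams = 4; srcQuality must be 0 (the parameter is ignored)
        mPool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
        mClickId = mPool.load(this, R.raw.click, 1 /* priority */);
    }

    private void click() {
        // left/right volume 1.0, priority 1, no loop, normal playback rate
        mPool.play(mClickId, 1.0f, 1.0f, 1, 0, 1.0f);
    }
}
```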
If I record using this Soundrecorder app, it creates a .3gpp file.
But this file is not getting listed in the Music app.
I have tried restarting the emulator.
Any idea what could be the reason for this?
Thanks
On 2/27/09, Dave Sparks davidspa...@android.com wrote:
I confess, I don't
Please move this question to the android-framework list. This list is
for application developers.
On Feb 25, 10:30 pm, Vishwanatha S vishy.s1...@gmail.com wrote:
Dear All
I am trying to integrate my codec on the ARM side using my own OMX core.
Now I would like to test it on the Android
for the example. This would be a much better example than the
one in the docs that won't compile and implies content must first be
added to a database:
http://developer.android.com/guide/topics/media/index.html
On Feb 24, 8:03 pm, Dave Sparks davidspa...@android.com wrote:
setOutputFile() expects
SoundPool issues were fixed in Cupcake. The fixes were dependent on
other changes to the audio system and it was considered too risky to
port those fixes back to the 1.x branch. We haven't released a Cupcake
SDK yet.
Others have had success with SoundPool by setting the maxStreams to a
large
);
recorder.prepare();
recorder.start();
On Feb 26, 12:14 am, Dave Sparks davidspa...@android.com wrote:
You can get the path to external storage (e.g. SD card) with
Environment.getExternalStorageDirectory(). This is world read/
writable.
Alternatively, each application has its own
setOutputFile() expects a path to where you want the file stored.
You can take a look at the source to SoundRecorder in the open source
tree for some working code:
http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder.git;a=summary
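Putting the pieces above together (a path from `Environment.getExternalStorageDirectory()` passed to `setOutputFile()`), a minimal record-to-SD-card flow might look like this. A sketch only; the output file name is arbitrary, and real code should handle errors and call `stop()`/`release()`.

```java
import android.media.MediaRecorder;
import android.os.Environment;

import java.io.File;
import java.io.IOException;

// Sketch of recording AMR-NB audio to a 3GPP file on the SD card.
public class RecorderSketch {
    public MediaRecorder startRecording() throws IOException {
        File out = new File(Environment.getExternalStorageDirectory(), "demo.3gp");
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(out.getAbsolutePath()); // a plain path, as noted above
        recorder.prepare();
        recorder.start();
        return recorder; // caller should later call stop() and release()
    }
}
```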
On Feb 24, 4:43 pm, benmccann
No, I don't believe OpenCore can play raw m4v files, only
containerized mp4 and m4a files.
On Feb 18, 6:17 am, Dilli dilliraomca...@gmail.com wrote:
Hi all
I am developing a simple application to play m4v files
while trying to play the m4v files, it shows an exception
E/QCvdecH264( 31):
Voice search uses a private Google API optimized for search queries.
We are not making those APIs public at this time.
On Feb 18, 5:10 am, Rob Franz rob.fr...@gmail.com wrote:
Ok let me ask differently then...is it possible to access the part that
takes the incoming speech, analyzes the
AsyncPlayer is just a helper class on top of MediaPlayer. It is not
going to help you with gapless playback.
I can see how the statement "plays a series of audio URIs" is causing
confusion. It does not maintain a list of URIs to play. It just plays
a file until it reaches the end or it is told to
This question should be directed to android-framework.
On Feb 18, 12:40 am, Nasam forum.nami...@gmail.com wrote:
hi
I was going through the window manager of Android. I wonder if it is
possible to replace this window manager with our own window manager. Is
this supported? What dependencies are
There is no way to synchronize the two players so that they start and
stop synchronously. It involves multiple IPC calls - the best you can
do is probably +/- 100 msecs.
We plan to support shoutcast and improve streaming in general in a
future release.
On Feb 16, 11:13 pm, Dilli
This code shouldn't even compile, this line is wrong:
mCamera.takePicture(null, mPictureCallback); // requires 3 callback
functions
Camera is only supported in landscape mode. Cupcake release will
unofficially support portrait mode (there will be no API for it, but
I'll probably put some sample
if the card has been removed and reinserted?
On Feb 14, 2:01 pm, Dave Sparks davidspa...@android.com wrote:
You want something like this in your activity:
import android.media.MediaScannerConnection;
import
android.media.MediaScannerConnection.MediaScannerConnectionClient;
private static
This topic has been covered many times. See this thread for one
example:
http://groups.google.com/group/android-developers/browse_thread/thread/d68364976e5d98ff/733eea4a1195527e?lnk=gstq=native+support#733eea4a1195527e
On Feb 17, 10:09 pm, Android Groups wqhl.m...@gmail.com wrote:
I'm also
I believe we were able to do the fixes for SoundPool without changing
the public API. There are no plans to deprecate it at this time.
On Feb 16, 6:05 am, Blake B. bbuckle...@yahoo.com wrote:
Great idea, Jon. Thanks for sharing the code. I'm about to start
work on sound in my game, so I'll
microphone
no matter whether the AudioSource is MIC or DEFAULT. Actually, they do
the same thing.
Could other devices, such as the speaker, or both speaker and mic, be
supported in a future release?
BR
Shawn
On Feb 16, 7:12 am, Dave Sparks davidspa...@android.com wrote:
The G1 does not support recording
Please don't cross-post. This question isn't appropriate for the
application developer forum.
On Feb 16, 4:01 am, getandroid sampath...@gmail.com wrote:
Hi,
As mentioned audio stops after some random number of times when
played from either Music/Video player. After some debugging, I found
This list is for application developers. Please post questions about
source code in one of the open source forums (android-porting, android-
platform, or android-framework).
Short answer: There are no plans to publish source to any of the
Google properties at this time.
On Feb 15, 8:10 pm,
The media player can play 16-bit WAVE files, but only if the format
type is PCM and not the extended format type. I've been meaning to fix
the OpenCore WAVE parser to handle extended format, but it's not a
high priority right now.
On Feb 16, 6:39 pm, herain herainw...@gmail.com wrote:
I tried
I think your confusion probably comes from the phrase "plays a series
of audio URIs". AsyncPlayer is just a simple helper class for
playing audio files that runs on its own thread, instead of on the UI
thread.
The first time you call play, it will start playing a sound. If you
call it a
by Cool Edit, which saved a PCM file in Windows PCM format.
I guess it doesn't contain any extended format type as you mentioned.
On Feb 17, 10:57 am, Dave Sparks davidspa...@android.com wrote:
The media player can play 16-bit WAVE files, but only if the format
type is PCM
;
Then scatter log statements around the code in various places:
Log.d(TAG, "This will output to the log");
On Feb 14, 11:28 pm, Ash ashwin.disco...@gmail.com wrote:
thanx for reply... the above code after making changes as mentioned by
Dave Sparks
does not show any error... when i found
The G1 does not support recording uplink or downlink audio.
On Feb 14, 6:20 am, Shawn_Chiu qiuping...@gmail.com wrote:
Hello, buddies
It's about android.media.MediaRecorder.
I want to implement a feature to record the conversation, both voice
from speaker and microphone. But I tried on G1
I'm pretty sure that OpenCore is going to reject the mms URI.
On Feb 13, 8:57 pm, Rob Franz rob.fr...@gmail.com wrote:
I believe this is WMA on the other end. Does this present a problem?
On Feb 13, 2009 11:13 PM, Rob Franz rob.fr...@gmail.com wrote:
Hi all
I'm trying to get an RTSP
You want something like this in your activity:
import android.media.MediaScannerConnection;
import
android.media.MediaScannerConnection.MediaScannerConnectionClient;
private static class MediaScannerNotifier implements
MediaScannerConnectionClient {
private Context mContext;
private
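The snippet above is cut off after the field declarations. A guess at how the rest of the class might look, based on the `MediaScannerConnectionClient` interface: connect, scan one file when the connection is ready, then disconnect. The class and field names follow the fragment; the rest is an illustrative reconstruction, not the original code.

```java
import android.content.Context;
import android.media.MediaScannerConnection;
import android.media.MediaScannerConnection.MediaScannerConnectionClient;
import android.net.Uri;

// Sketch of a complete MediaScannerNotifier: tells the media scanner
// about one newly added file so it shows up in the Music app.
public class MediaScannerNotifier implements MediaScannerConnectionClient {
    private MediaScannerConnection mConnection;
    private String mPath;
    private String mMimeType;

    public MediaScannerNotifier(Context context, String path, String mimeType) {
        mPath = path;
        mMimeType = mimeType;
        mConnection = new MediaScannerConnection(context, this);
        mConnection.connect(); // onMediaScannerConnected fires when ready
    }

    public void onMediaScannerConnected() {
        mConnection.scanFile(mPath, mMimeType);
    }

    public void onScanCompleted(String path, Uri uri) {
        mConnection.disconnect(); // the file is now visible to other apps
    }
}
```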
There are a lot of fixes to SoundPool coming in the Cupcake release.
I need to check on the crash you mentioned - I don't recall seeing
that before and it should give you an error, not crash. The range is
dependent on the ratio of the sample rate of the source and the
hardware output.
On Feb
This is not the appropriate list for your questions.
There are lots of threads about this in android-framework. Search for
OMX hardware codecs. There is also a guide to integrating OMX codecs
in the OpenCore project.
On Feb 12, 7:57 pm, susanner zsusan...@163.com wrote:
Dear all
Is there any
programmer to make good use of the vector operators to replace the
most costly Android for-loops and conditional branches.
Regards
On Feb 12, 5:29 am, Dave Sparks davidspa...@android.com wrote:
I think we'll be able to give you something that will meet your needs.
It's always a balancing act
Can you supply some links so we can try to figure out what's wrong?
On Feb 12, 12:15 pm, jz0o0z floresje...@gmail.com wrote:
Update: After a little more testing I found some rtsp links that do
play in 1.1, but other links, which I verified are still active and
were working before are giving
You can't play a resource file using VideoView by passing a pathname.
The file doesn't exist; it's just a binary blob inside the APK file.
If you create a content provider, you could pass a URI to setVideo().
I'm not entirely sure that will work because the video file might be
compressed inside
We have no plans to support those formats. Android manufacturers
always have the option of including other file formats and codecs if
the demand is there.
On Feb 11, 5:57 pm, waterblood guoyin.c...@gmail.com wrote:
Does Google have any plan to support other formats, such as AVI or RM?
On Jan 15,
SDK 1.0 only has support for one camera.
When we have demand for a second camera from an Android partner, we'll
add a new API so that you can select the camera.
On Feb 11, 1:35 am, Link link.li...@gmail.com wrote:
hi, all
i wonder how to get camera object in android, if there are two or more
The codec is AMR-NB with an 8KHz sample frequency. In the Cupcake
release we will provide access to the raw 16-bit PCM stream so you can
do your own encoding or signal processing.
On Feb 11, 9:18 am, g1bb corymgibb...@gmail.com wrote:
Hello,
Is anyone else experiencing poor playback quality
There is no support for thumbnail extraction in SDK 1.0. It's coming
in Cupcake as an adjunct to video record.
On Feb 11, 7:30 am, Freepine freep...@gmail.com wrote:
Opencore has a frame and metadata utility, and there is also an API as
android.media.MediaMetadataRetriever.captureFrame()
in
.) So perhaps rather than a
built-in native signal processing *kernel* I am here thinking of a
built-in native signal processing (JIT) *compiler*. ;-)
Regards
On Feb 11, 11:03 am, Dave Sparks davidspa...@android.com wrote:
I'm talking about deprecating the raw picture callback that has never
I looked over the code and didn't see anything obvious. You won't see
anything in the log unless an error occurs - we try to minimize
logging in production code.
Run adb bugreport and take a look at the kernel log. You should see
something like this:
<6>[ 820.265000] adsp: opening module
Can you elaborate a bit more on how you are playing the file? Did you
write your own video player?
On Feb 9, 2:57 pm, KC grssmount...@gmail.com wrote:
I have a video file in 3gpp format and I can use QuickTime to play it.
But when tried with the SDK Windows emulator, I got the error msg:
On the G1, no data is returned - only a null pointer. The original
intent was to return an uncompressed RGB565 frame, but this proved to
be impractical.
On Feb 9, 3:57 pm, Xster xyxyx...@gmail.com wrote:
Hi,
Our university is intending to use the Android as a platform for
mobile image
, Dave Sparks davidspa...@android.comwrote:
Can you elaborate a bit more on how you are playing the file? Did you
write your own video player?
On Feb 9, 2:57 pm, KC grssmount...@gmail.com wrote:
I have a video file in 3gpp format and I can use QuickTime to play it.
But when tried
No, there is no way to do this in SDK 1.0.
On Feb 10, 9:48 am, eliak...@gmail.com eliak...@gmail.com wrote:
hello,
I want to create an application that can change one's voice during a
call in real time
is there a way to do that in android?
can you point me to the right package?
thanx
It's on the roadmap for Cupcake.
On Feb 10, 6:44 pm, clark clarkd...@gmail.com wrote:
How about SDK 1.1? Or 1.2? Any idea where on the roadmap this feature
stands?
On Feb 6, 10:18 am, Dave Sparks davidspa...@android.com wrote:
No, this is not supported in SDK 1.0.
On Feb 6, 8:34 am
such large
images into (Java) application memory, but I'm hoping the Android
architecture can accommodate this in the not too distant future.
Regards
On Feb 10, 7:01 pm, Dave Sparks davidspa...@android.com wrote:
On the G1, no data is returned - only a null pointer. The original
intent
as well? Any plan
for MP4 video (if it's not doing it), in light of the trend that more and
more codec chip vendors are supporting it.
-KC
On Tue, Feb 10, 2009 at 9:56 AM, Dave Sparks davidspa...@android.comwrote:
The problem is that your video path is on your host machine. You have
A list of media file formats and codecs supported by Android can be
found here:
http://developer.android.com/guide/appendix/media-formats.html
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google
Groups Android Developers
The files are not stored, they are streamed into a temporary memory
buffer.
What kind of file are you trying to stream? If it's an MP4 file, you
need to make sure that the 'moov' atom comes before the 'mdat' atom.
On Feb 9, 3:08 am, AliBaba kanul1...@gmail.com wrote:
Hi All,
I am trying to
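The 'moov'-before-'mdat' requirement mentioned above can be checked without any Android code: MP4/3GP files are a sequence of boxes, each a 32-bit big-endian size followed by a 4-byte type. The standalone checker below is an illustrative sketch, not from the thread, and ignores edge cases like a size-0 box that runs to end of file.

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

// Walks the top-level boxes of an MP4/3GP stream and reports whether
// 'moov' appears before 'mdat' (required for progressive streaming).
public class AtomOrderCheck {
    public static boolean moovBeforeMdat(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        byte[] type = new byte[4];
        while (true) {
            long size;
            try {
                size = data.readInt() & 0xffffffffL;
            } catch (EOFException eof) {
                return false; // stream ended without seeing 'moov'
            }
            data.readFully(type);
            String name = new String(type, "US-ASCII");
            if (name.equals("moov")) {
                return true;
            }
            if (name.equals("mdat")) {
                return false;
            }
            if (size == 1) {
                // 64-bit extended size: 8 more header bytes, then payload
                size = data.readLong();
                data.skipBytes((int) (size - 16));
            } else {
                data.skipBytes((int) (size - 8));
            }
        }
    }
}
```

A file that fails this check needs its atoms reordered (re-exporting with "fast start" or progressive-streaming options enabled usually does it).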
First, the surface type needs to be push buffers. In your Preview
constructor, add the following:
getHolder().setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
Second, you need to tell the media player where to display the video.
You have a line commented out:
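The two fixes above (push-buffers surface type, then telling the media player where to render) might be combined as in the sketch below. This is illustrative only; `R.layout.main`, `R.id.preview`, and the video path are hypothetical names.

```java
import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

// Sketch: play a video file on a SurfaceView.
public class VideoDemo extends Activity implements SurfaceHolder.Callback {
    private MediaPlayer mPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        SurfaceView preview = (SurfaceView) findViewById(R.id.preview);
        SurfaceHolder holder = preview.getHolder();
        holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); // first fix
        holder.addCallback(this);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        try {
            mPlayer = new MediaPlayer();
            mPlayer.setDataSource("/sdcard/demo.3gp"); // hypothetical path
            mPlayer.setDisplay(holder); // second fix: where to render video
            mPlayer.prepare();
            mPlayer.start();
        } catch (java.io.IOException e) {
            // handle a missing or invalid file here
        }
    }

    public void surfaceChanged(SurfaceHolder h, int format, int w, int ht) {}

    public void surfaceDestroyed(SurfaceHolder h) {
        if (mPlayer != null) mPlayer.release();
    }
}
```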
This is an issue that will be fixed in the Cupcake release.
On Feb 8, 11:31 pm, jj jagtap...@gmail.com wrote:
hello everybody
I am capturing Image from app using :
the image captured using Intent i = new Intent
(android.media.action.IMAGE_CAPTURE);
but it is very small (25*50)
Why this
I'm pretty sure this is due to the way the emulator handles UDP
packets. There is an outstanding bug about this, but no one has had
time to work on it.
On Feb 9, 10:59 pm, Harishkumar V harishpres...@gmail.com wrote:
Michael,
using browser running in the android in emulator mode, i
A few of our developers use Eclipse as a front-end for gdb. I recall
that the setup is a bit tricky. Maybe someone can post the magic
formula.
I use gdb myself, but then I still use vi and makefiles. IDE's are for
wimps. :)
On Feb 7, 9:22 am, Sergey Ten sergeyte...@gmail.com wrote:
Hi,
I am
No, this is not supported in SDK 1.0.
On Feb 6, 8:34 am, Sundog sunns...@gmail.com wrote:
Is it possible to piggyback the audio stream or the microphone and
get raw sample data from it? Can anyone point me to some
documentation?
Thanks.
You need to tell the media scanner that you have added a new file. See
http://code.google.com/android/reference/android/media/MediaScannerConnection.html
On Feb 6, 4:38 am, Rishi kaurari...@gmail.com wrote:
When i added a media file to the sdcard an update in the MediaProvider
database is not
that same game is working fine on iphone with huge sound files in
term of size also
On Fri, Feb 6, 2009 at 12:04 PM, Dave Sparks davidspa...@android.comwrote:
Suggest you try running top to find out what's hogging the CPU.
On Feb 5, 9:22 pm, suhas gavas suhas.ga...@gmail.com wrote:
Hi
We are planning OpenGL ES 2.0 hardware binding support for Donut
(the next release). There will not be a software renderer, so you'll
need to have hardware that supports it. Theoretically it should be
possible to write a software renderer as well.
On Feb 5, 3:55 am, AndroidDev son...@hotmail.com
This is not possible in SDK 1.0.
On Feb 4, 1:01 pm, Natalie natlinn...@gmail.com wrote:
I would like to be able to extract frequency/amplitude info from
incoming mic audio. From looking at previous posts, it looks like the
way to do this with the current sdk is to write to a file, then tail
the warning is flashed ?
On Thu, Feb 5, 2009 at 1:04 PM, Dave Sparks davidspa...@android.com wrote:
The message could be a clue, it's trying to tell you that the CPU is
overloaded, i.e. you're trying to do too much. Have you tried running
top to check the CPU load?
On Feb 4, 10:32 pm
But then I tried the SoundPool API and it worked fine.
I also wonder what the problem was with MediaPlayer.
On Fri, Feb 6, 2009 at 9:07 AM, Dave Sparks davidspa...@android.com wrote:
If you are playing 6 or 7 MP3 files at the same time, you are probably
saturating the CPU just
Further clarification:
I was under the impression it is possible to download the java
source code AND the C source code and build them all.
It is possible to download the open source code and build for the
emulator. If you want the code to run on a specific device, you need
additional
No, this is not supported. It requires access to in-call audio which
is currently not available to the apps processor.
On Feb 4, 3:36 am, Mak kemper.mar...@gmx.de wrote:
I want to accept incoming calls and play an audio file for the caller.
Is there a possibility of playing an audio file
@googlegroups.com
[mailto:android-develop...@googlegroups.com] On Behalf Of Dave Sparks
Sent: Tuesday, February 03, 2009 6:43 AM
To: Android Developers
Subject: [android-developers] Re: About media player
What kind of plug-in do you want to write?
"Media player" is kind of a vague term. There is the Music
The message could be a clue, it's trying to tell you that the CPU is
overloaded, i.e. you're trying to do too much. Have you tried running
top to check the CPU load?
On Feb 4, 10:32 pm, suhas suhas.ga...@gmail.com wrote:
Hi all,
I m using mp3 sound in my game (used wav format also)
Frame size? Video and audio bitrate? Anything in the log?
On Feb 3, 3:59 am, Jeff Oh jeff.o...@gmail.com wrote:
Hi, I'm trying to receive RTSP streaming video with g1. The video
file I made was encoded using QuickTime pro, and they are progressive
streamable with a hint track. Video is
What kind of plug-in do you want to write?
"Media player" is kind of a vague term. There is the Music player
application, the MusicPlaybackService, the MovieView activity, the
VideoView activity, and the MediaPlayer object. Source for all of
those is available at source.android.com.
On Feb 1,
* bigger than
mine...http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder...
Maybe I have made a mistake somewhere, any help is greatly appreciated
thanks for your time
On Sun, Feb 1, 2009 at 5:15 AM, Dave Sparks davidspa...@android.com wrote:
Try this:
emulator -help-audio
Are you running on a G1 or on the emulator? If on the emulator, maybe
audio input isn't working correctly and it's failing to open the audio
input device.
On Jan 31, 9:59 am, Phill Midwinter ph...@grantmidwinter.com wrote:
Looking at adb logcat I'm getting this error:
*Record channel already
...@grantmidwinter.com wrote:
That works perfectly, thanks for the help.
Do you know why it works?
2009/1/31 Dave Sparks davidspa...@android.com
Use Ogg files, you can get a nice seamless loop. We use this for the
ringtones.
On Jan 30, 10:30 am, ph...@grantmidwinter.com
ph
I don't know anything about MapView. What service are you on e.g.
WiFi, 3G/EDGE (what carrier)? Is there anything useful in the log?
Maybe a proxy failure?
On Jan 30, 5:46 pm, Keiji Ariyama ml_andr...@c-lis.co.jp wrote:
Hi folks,
Now, I'm developing an Android app called Echo.
But 5 hours
Try this:
emulator -help-audio-in
It will tell you which audio backends are available on your system.
You didn't specify what OS you are using.
I think there was also some sample code in the SDK at one point. Maybe
one of the developer advocates can point you to it. Another option is
to look
and ondestroy,
making sure it's released in each case and then run again. Works like a
charm!
2009/1/31 Dave Sparks davidspa...@android.com
Are you running on a G1 or on the emulator? If on the emulator, maybe
audio input isn't working correctly and it's failing to open the audio
input
In SDK 1.0, you can only record to a file using the AMR-NB codec,
which is bandwidth limited to 4KHz, and the encoding process itself is
pretty lossy. If you want to experiment with this on a G1, go to
Messaging, create a new message, click menu and select Attach, and
select Record Audio. This
Use Ogg files, you can get a nice seamless loop. We use this for the
ringtones.
On Jan 30, 10:30 am, ph...@grantmidwinter.com
ph...@grantmidwinter.com wrote:
Hoya,
When using a mediaplayer to play back an audio file, you can set
looping to true - that's not the issue here.
If looping is
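The Ogg-plus-`setLooping(true)` advice above can be sketched in a few lines. Illustrative only; `R.raw.loop` is a hypothetical Ogg resource in res/raw.

```java
import android.content.Context;
import android.media.MediaPlayer;

// Sketch: start a seamless audio loop from an Ogg resource.
public class LoopPlayer {
    public static MediaPlayer startLoop(Context context) {
        MediaPlayer player = MediaPlayer.create(context, R.raw.loop);
        player.setLooping(true); // with Ogg, the loop point is seamless
        player.start();
        return player; // caller releases when done
    }
}
```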
No, you don't draw on the camera preview surface. You create a
transparent surface above it in the Z stack and draw on that.
On Jan 30, 5:31 pm, srajpal sraj...@gmail.com wrote:
I checked out the api demo, it helps to place the camera preview on
top of the surface view, but the buffers are
the youtube player.
The problem is, it doesn't work in the emulator, since there's no
youtube app.
Michael
On Jan 28, 3:37 pm, Dave Sparks davidspa...@android.com wrote:
If you specifically want to play a YouTube video, you need to register
as a YouTube developer to get the keys you need
The application heap is limited to 16MB. For RGB565, that's a max of
8M pixels, not including the heap that the app uses for other objects.
Chances are, you are probably decoding from a JPEG, so you need room
for both the compressed and uncompressed version.
On Jan 29, 7:41 am, Phill Midwinter
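Working through the arithmetic above: a 16 MB heap at 2 bytes per RGB565 pixel caps out at about 8M pixels, and in practice far less once the compressed JPEG and other objects are counted. A small pure-Java sketch of the budget math; the resulting factor is what you would feed into `BitmapFactory.Options.inSampleSize` when decoding.

```java
// Worked arithmetic for the heap limit described above.
public class HeapBudget {
    static final long HEAP_BYTES = 16L * 1024 * 1024;   // 16 MB app heap
    static final int RGB565_BYTES_PER_PIXEL = 2;

    // Maximum pixels if the whole heap held one RGB565 bitmap (~8M pixels).
    public static long maxPixels() {
        return HEAP_BYTES / RGB565_BYTES_PER_PIXEL;
    }

    // Smallest power-of-two subsample factor that fits an image into a
    // pixel budget, leaving headroom for the compressed source and other objects.
    public static int sampleSizeFor(long widthPx, long heightPx, long budgetPixels) {
        int sample = 1;
        while ((widthPx / sample) * (heightPx / sample) > budgetPixels) {
            sample *= 2;
        }
        return sample;
    }
}
```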
This is not possible with the 1.0 SDK. This feature will be available
in the Cupcake release.
On Jan 28, 11:16 pm, Raghu gragh...@gmail.com wrote:
Hi,
I want to get thumbnail from a video. So I need to extract first
frame in the video. Please let me know how can I do that in Android.
This is not really possible with the 1.0 SDK. Someone has done a
walkie-talkie-like program that records audio to a file and then streams it,
but not real-time audio.
The Cupcake release will add the capability to stream audio from the
mic into a Java app and stream audio from a Java app to the
I don't know that there are any specific restrictions on the video
resolution for the software codecs. It's really a matter of whether
the CPU has adequate cycles to decode it in real-time.
Some of the codecs are hardware accelerated and they do have
restrictions about frame size, bit-rate, and
I believe the OpenCore HTTP streaming engine maintains a circular
buffer of data. As data is played out, the buffer space it occupied is
re-filled with new data. When you seek backwards in the stream, it has
to re-fill the buffer from the earlier part of the stream.
On Jan 29, 1:05 pm, ed
The simplest approach is just firing off an intent to the existing
camera app to take a picture. This requires the user to push the
shutter button.
If you want it purely under program control, you could have the
application snap the picture without the user pressing a button. It
just takes a bit
If you specifically want to play a YouTube video, you need to register
as a YouTube developer to get the keys you need to access the Gdata
feeds.
I think that the Cupcake release will support a new intent for playing
YouTube videos with the YouTube app, but it will be some time before
that
We are getting ready to post some new pages in the SDK area that will
cover this information.
If you can't wait for that, do a search on this forum for codecs. It
has been covered a number of times.
On Jan 28, 2:33 am, Tom vmspa...@gmail.com wrote:
Hi All
I want to know what are the Audio and
If you are really ambitious, you can download the Cupcake source,
unhide all the new API's and build the SDK yourself. However, that is
a topic for a different list.
On Jan 27, 5:58 am, Jean-Baptiste Queru j...@google.com wrote:
You can't. You'll have to wait for an SDK built from the Cupcake
This message is off-topic, this forum is for application developers.
Try the android-framework list.
On Jan 26, 10:53 pm, bardshen bard.s...@gmail.com wrote:
Dear Sirs:
when i try to build the Android source code download from internet
use repo. i meet the following problem:
Would you please post a bug with specifics? Thanks!
On Jan 26, 12:03 pm, Tim Bray timb...@gmail.com wrote:
The section "Recording Media Resources"
of http://code.google.com/android/toolbox/apis/media.html seems to be
out of date and wrong. I got working code
in development on the Cupcake branch at
android.git.kernel.org.
On Jan 26, 11:31 am, benmccann benjamin.j.mcc...@gmail.com wrote:
I'm happy to hear future releases will support the ability to stream
audio being recorded. Any ETA on this?
On Dec 30 2008, 9:58 am, Dave Sparks davidspa...@android.com wrote
The camera application in the Cupcake branch does it somehow. You
could try looking at the code in packages/apps/Camera.
On Jan 26, 4:38 pm, GiladH gila...@gmail.com wrote:
Hey,
Is there a way to identify which of the MediaStore images has been
taken
on 'this' device, as opposed to pictures
It is not possible to access call audio in the G1. This is a function
of the firmware in the radio and DSP and is currently not supported.
It is possible that future devices may enable this functionality, but
at the moment it is not part of a planned release.
On Jan 24, 2:32 am, javame_android
The camera service has no concept of foreground activity. It simply
gives the camera to the app that requests it. If another app currently
owns the camera, it is notified when the camera is stolen.
I don't know all the rationale for that design decision. It's probably
not the way I would have
a chance to close it.
On Fri, Jan 23, 2009 at 9:08 AM, Dave Sparks davidspa...@android.comwrote:
The camera service has no concept of foreground activity. It simply
gives the camera to the app that requests it. If another app currently
owns the camera, it is notified when the camera
We do not support native code on Android at this time, but we have
plans to publish a native SDK soon.
On Jan 22, 2:03 am, MRK infoto...@gmail.com wrote:
I am creating an Android application which uses the JMF (SIP, RTP,
JAIN). So i downloaded the JMF source code for some adhoc change to my
Right now, the answer is no. Most cameras require that you go to
preview mode before you can take a picture so that the image processor
can grab some frames for auto-focus, white balance, etc.
I'll see if we can get a change into Cupcake that allows you to start
preview without a surface. That
You need to call startPreview() before takePicture. You also need to
supply a PictureCallback function to receive the encoded JPEG. By
passing null, you are telling the camera service you don't want the
final JPEG image.
On Jan 21, 2:50 am, ANDREA P andrewpag...@gmail.com wrote:
I want to use
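The sequence described above (start preview first, and supply a JPEG `PictureCallback` or no image comes back) might look like this. A sketch only; real code should also handle camera errors and release the camera in the activity lifecycle.

```java
import android.hardware.Camera;
import android.view.SurfaceHolder;

// Sketch: preview must be running before takePicture(), and a JPEG
// callback must be supplied to receive the encoded image.
public class CaptureSketch {
    public static void capture(SurfaceHolder previewHolder) throws java.io.IOException {
        final Camera camera = Camera.open();
        camera.setPreviewDisplay(previewHolder);
        camera.startPreview(); // required before takePicture()

        camera.takePicture(
            null, // shutter callback (optional)
            null, // raw callback (returns null on the G1 anyway)
            new Camera.PictureCallback() {
                public void onPictureTaken(byte[] jpeg, Camera cam) {
                    // jpeg holds the encoded image; write it to a file here
                    cam.release();
                }
            });
    }
}
```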
by calling auto-focus first. You don't want to move
the camera until after the shutter callback.
On Jan 21, 1:59 am, mobilek...@googlemail.com
mobilek...@googlemail.com wrote:
Could you list the proper sequence as I'm having hard time working it
out! Thanks
On Jan 21, 3:21 am, Dave Sparks davidspa
I suspect that your problem is in some details that you haven't given
us yet.
How many media players are you creating at the same time?
On Jan 20, 10:47 pm, ena enu1...@gmail.com wrote:
On Jan 21, 8:23 am, Dave Sparks davidspa...@android.com wrote: What is the
format of the data in the WAVE
Camera.autoFocus(cb);
where cb is a callback function you supply that tells you focus is
successful or not.
On Jan 20, 5:27 am, mobilek...@googlemail.com
mobilek...@googlemail.com wrote:
Hi,
My app is struggling to take focused shots. Is there a built in
facility that sets an auto-focus
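Combining `autoFocus()` with the earlier advice to shoot only after focus completes, a focus-then-capture helper might look like this sketch (illustrative names, not from the thread):

```java
import android.hardware.Camera;

// Sketch: wait for the autoFocus callback before calling takePicture().
public class FocusThenShoot {
    public static void shoot(Camera camera, final Camera.PictureCallback jpegCallback) {
        camera.autoFocus(new Camera.AutoFocusCallback() {
            public void onAutoFocus(boolean success, Camera cam) {
                // success is a hint; some apps capture anyway after a failed focus
                cam.takePicture(null, null, jpegCallback);
            }
        });
    }
}
```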
No, this is not supported.
On Jan 20, 3:57 am, jalandar jagtap...@gmail.com wrote:
is it possible to take photo with emulator's camera?, if the pc(on
emulator is there) having web cam
thank you
build successfully the
eclipse
plugin?
Thanks a lot
Breno
On Jan 17, 5:44 am, Dave Sparks davidspa...@android.com wrote:
OK, now I see where you're going with it. :)
What you want is coming in Cupcake. There is a streaming interface
for audio input and output that gives you
At this time, there is no mechanism to get at the raw audio after it
is decoded.
On Jan 19, 11:03 am, Valeria vscarba...@gmail.com wrote:
Hi everyone,
I'm developing a multimedia player and I want to create an equalizer
in it , but I can't find any information about how to do it. Could
to where this code is located (ie: what package)?
-peter
have you written up this FAQ?
On Jan 7, 6:50 pm, Dave Sparks davidspa...@android.com wrote:
There is no plan to support javax.sound. I guess I need to write up a
media FAQ because this question gets asked repeatedly.
Cupcake has support for