The source for the entire project is at source.android.com.
On Nov 25, 1:22 am, Amit [EMAIL PROTECTED] wrote:
Hello friends,
Can you please tell me where I can get the source code of the tools
provided in the SDK? I need the source code for DDMS and Traceview.
Any help appreciated.
Thanks
Amit
The metadata retriever currently doesn't support that level of
detail.
On Nov 24, 12:09 pm, AP08 [EMAIL PROTECTED] wrote:
I checked android.provider.MediaStore.Audio.Media, but it can provide
only limited data about an audio file. I want to obtain data such as
bit rate; how do I get that?
Bluetooth is a big category. What feature(s) do you need?
On Nov 25, 1:26 am, supernova [EMAIL PROTECTED] wrote:
Any idea when a version with BT support will be released?
I need to choose a mobile platform for a project and BT support is
required.
On Nov 24, 13:45, Mark Murphy [EMAIL
The media framework does support RTSP with RTP payloads. The OpenCore
code includes an RTSP/RTP client built on sockets that will handle
buffering and transport controls.
It does not work in the emulator environment due to some firewall
issues in the emulator network layer.
On Nov 24, 12:30 pm,
I think you are looking for the following method in
android.content.pm.PackageManager:
public abstract Drawable getActivityIcon(Intent intent)
On Nov 24, 9:03 pm, cse.tlin [EMAIL PROTECTED] wrote:
Hi davidsparks,
Can I get the drawable icon of the YouTube app from another app?
I want to offer
This is unlikely to work because the OpenCore engine isn't set up to
play from a pipe. It plays from a local file, which allows it to seek
within the file, or from an HTTP or RTSP/RTP stream, which requires
content formatted specifically for streaming.
A future version of the SDK will support
IPC binder calls within the same process are treated as local C++
calls, i.e. they occur synchronously on the same thread.
I'm not sure what you are asking in the second part of your question.
Is this a request for dependencies in the Market, e.g. Application A
requires Application B to be
You must always call prepare() before seekTo(). The player engine
needs to read the first part of the stream to identify the stream type
and required codecs and read the content index (if it exists).
On Nov 26, 2:55 pm, David Given [EMAIL PROTECTED] wrote:
I'm trying to play streaming music.
There is no support for GSM 6.1 and no plans to support it at this
time.
On Nov 26, 9:14 am, Wiggles [EMAIL PROTECTED] wrote:
Is there any way to get .wav (GSM 6.1) audio files to play on Android?
Or is there any way to get this feature added in later updates?
I get my voicemails for work
I don't expect this issue to be resolved soon.
On Nov 26, 5:36 am, Jérémie Gavard [EMAIL PROTECTED] wrote:
Thanks for this clarification.
Is this emulator issue planned to be resolved soon?
I am trying to develop an application around video streaming.
On Nov 25, 10:58 pm, Dave Sparks
The G1 preview format is YUV 420 semi-planar (U and V are subsampled
by 2 in both X and Y). The Y plane is first, followed by UV pairs - I
believe the U sample comes first in the pair.
Technically it's YCbCr 420 semi-planar, but very few people use that
term.
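The layout described above reduces to simple index arithmetic. A minimal sketch in plain Java (Yuv420sp is a hypothetical helper class, not part of the SDK, and it assumes the U sample leads each pair as the post says):

```java
public class Yuv420sp {
    // Hypothetical helpers for a YUV 420 semi-planar buffer: a full-res
    // Y plane followed by interleaved UV pairs subsampled by 2 in X and Y.
    public static int frameSize(int width, int height) {
        return width * height * 3 / 2; // Y plane + half-size chroma plane
    }

    public static int yIndex(int width, int x, int y) {
        return y * width + x;
    }

    // Assumes the U sample comes first in each pair, as described above.
    public static int uIndex(int width, int height, int x, int y) {
        return width * height + (y / 2) * width + (x / 2) * 2;
    }

    public static int vIndex(int width, int height, int x, int y) {
        return uIndex(width, height, x, y) + 1;
    }
}
```

For a 320x240 preview frame this gives a 115200-byte buffer, with chroma starting at byte 76800.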
On Nov 26, 6:27 pm, dmanpearl
The current audio API's would make this very challenging. Look for
improvements in a future SDK.
On Nov 29, 12:10 pm, Ameer Ashanti [EMAIL PROTECTED] wrote:
Is there anyone developing a guitar tuner for the android platform? or
would that even be possible?
The phone application uses the RingtoneManager to play ringtones, so
it definitely works in that application. That doesn't mean it doesn't
have a bug, but I would suggest you check your code first.
On Nov 29, 3:41 pm, Selmi [EMAIL PROTECTED] wrote:
hi, i hope there is someone who has some
The documentation for the ToneGenerator class clearly states that it
is for the purpose of generating tones on the near end. This is so the
user hears the tones when they push the digits on the keypad.
If you want to generate DTMF tones on the far end, you need to use the
PhoneManager API.
On
I can't think of any alternative solution at this time. The code that
handles this is in the OpenCore engine and not something you can
address with a Java API.
Perhaps this is something we can address in a future release.
On Nov 28, 1:22 pm, David Given [EMAIL PROTECTED] wrote:
David Given
There is a speech reco engine in Android 1.0, but it is limited to use
in the voice dialer. The source code is available on
source.android.com.
On Nov 27, 6:18 am, Hui Tianshu (Risker) [EMAIL PROTECTED]
wrote:
Does anyone know of any recent updates for voice recognition in Android
Decoding a single MP3 for a background audio track is not too bad. On
the G1, OGG is a better format, less memory overhead and startup
latency, plus you get seamless looping which MP3 does not do.
I don't recommend using compressed audio like MP3 or OGG if you have
lots of sound effects, because
It's on the TO-DO list, but not targeted for any release at this
time.
On Dec 1, 12:37 pm, AP08 [EMAIL PROTECTED] wrote:
Thank you for the response Dave. Would you know of any plans to
support these ?
On Nov 25, 12:15 pm, Dave Sparks [EMAIL PROTECTED] wrote:
The metadata retriever
MP3 (CBR and VBR up to 320Kbps), M4A (AAC LC, AAC, AAC+, enhanced AAC
+), OGG, 3GP (AMR-NB and AMR-WB), WAVE (8/16-bit PCM) and MIDI
(SMF0/1, XMF0/1, RTTTL/RTX, OTA, iMelody).
On Dec 2, 12:14 am, Jatin [EMAIL PROTECTED] wrote:
Hello,
Can you please let me know the list of the music file format
The next SDK release will have a feature for grabbing a video
thumbnail.
You can't grab the video frame buffer because it isn't mapped into the
application memory space. I suspect an app with the right privileges
could take a screenshot, but I don't know the details of how that
works.
On Dec 3,
It's not easy to debug code we haven't seen. Can you post a snippet?
On Dec 3, 1:42 pm, ryan84c [EMAIL PROTECTED] wrote:
Hi,
I'm trying to develop an Android app that will allow users to take
pictures. I've tried just about every sample out there, but one
problem I keep having is all my
I'm sure this is an artifact of the way we handle video overlays -
they are treated differently than a regular surface. If I understand
the behavior, I would agree that it's a bug.
As a workaround, you will probably need to tear down the VideoView.
For camera preview, you should be able to get
If you want a video player app for your G1, there are several
available for free in the Market.
If you are looking for sample code, check out the SDK sample code.
There are also several workable snippets posted in various messages on
this list.
On Dec 3, 9:19 pm, Jatin [EMAIL PROTECTED] wrote:
The black screen is probably the result of the 2D engine trying to
composite from an empty camera preview frame buffer.
The video push buffer surfaces were a late addition to SurfaceFlinger
to facilitate the use of video hardware pipelines. I'm not surprised
there are some rough edges because it
The reason this isn't in 1.0 is because we didn't want to ship a half-
baked API. We have a solution coming soon in the form of a flexible
base layer that supports push and pull models. For example, we have
built an InputStream object on top of it that we will offer as sample
code.
In the
This list is for discussing application development. Discussion of
porting, extending, and debugging the Android source code should take
place on the open source mailing lists (android-platform, android-
framework, android-porting, or android-kernel).
On Dec 4, 11:50 pm, Yogi [EMAIL PROTECTED]
Android does not have support for OpenAL at this time. The G1 and
iPhone chipsets are not the same, and even if they were, the software
stacks are different.
On Dec 8, 8:15 am, reillyse [EMAIL PROTECTED] wrote:
Hi All,
I've been searching for 3D audio support on the Android
platform.
at 12:12 AM, Dave Sparks [EMAIL PROTECTED] wrote:
The reason this isn't in 1.0 is because we didn't want to ship a half-
baked API. We have a solution coming soon in the form of a flexible
base layer that supports push and pull models. For example, we have
built an InputStream object on top
I'm not a Java developer, but this seems like a problem in your
manifest.
On Dec 6, 1:37 pm, sam [EMAIL PROTECTED] wrote:
I'm on the 1.0 SDK and I'm having the exact same problem... I spent a
few hours on the forums googling, but the only suggestion I can find
is to make sure you call
The HTTP session isn't controlled by the Java application, the media
framework has its own HTTP client.
The Java proxy object for the network session is the MediaPlayer
object which is owned by VideoView. You could retain a reference to
the VideoView object across the orientation change, so that
when that might happen.
On Dec 9, 6:55 am, reillyse [EMAIL PROTECTED] wrote:
ok, so is there any way to do 3D audio on the G1 ?
regards
Sean
On Dec 8, 4:41 pm, Dave Sparks [EMAIL PROTECTED] wrote:
Android does not have support for OpenAL at this time. The G1 and
iPhone chipsets
I learned something new! Thanks, Dianne.
On Dec 8, 7:24 pm, Dianne Hackborn [EMAIL PROTECTED] wrote:
You can use Activity.onRetainNonConfigurationInstance() to transfer an
active object across activity instances. Just be careful that the object
doesn't continue referencing the old activity.
MediaPlayer m = new MediaPlayer();
m.setDataSource("rtsp://rtsp.yourserver.com/stream.mp3");
m.prepare();
m.start();
You probably want to call prepare() from something other than your UI
thread, because it may take a while. Alternatively, you
can call prepareAsync() and call start()
You want to register to receive Intent.ACTION_HEADSET_PLUG from
package android.content. Check out:
http://code.google.com/android/reference/android/content/Intent.html
On Dec 10, 2:38 am, eric [EMAIL PROTECTED] wrote:
Hello,
Anyone know how to detect if the earphone has been plugged in?
You want some code like this to play from raw resources:
AssetFileDescriptor afd = getResources().openRawResourceFd(R.raw.test);
if (afd != null) {
    mp.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(),
        afd.getLength());
} else {
    throw new IOException("Unable to open test file");
}
Do you really mean YouTube player, or are you referring to the media
player?
In any case, the emulator is using the ARM-optimized software codecs
running in QEMU ARM emulation on your workstation. The performance is
not going to be spectacular.
On the G1, the AVC codec runs on the DSP, so the
I'm beginning to grok the fullness... :)
You could use the ALARM stream, though the user might have silenced
alarms and then nothing will be heard.
We are adding the ability to send events to the music player to tell
it to play, pause, skip, etc. to support AVRCP. This will come in a
release in
The length of the notification sound determines the length. You can
select persistent if you want the sound to play until the user
dismisses it.
You can cancel a notification, so you could conceivably send yourself
a delayed message to cancel a notification after a period of time.
On Dec 12,
We are aware of the issues in the Market and they will be addressed
soon.
On Dec 12, 7:42 am, Jeff jlb...@gmail.com wrote:
Is there a way that a developer can speak to someone at google?
Is there a way that a developer can make a request by email? e.g.,
removing ratings that contain abusive
Some of the ringtones contain special metadata to make them loop.
These sounds will play forever and aren't appropriate for
notifications. Unless you want to really annoy your users. :)
The notifications directory has sounds that don't loop.
On Dec 13, 12:02 am, elDoudou
http://code.google.com/android/reference/android/content/Intent.html#ACTION_MEDIA_SCANNER_SCAN_FILE
On Dec 14, 2:56 pm, jphdsn jph...@gmail.com wrote:
hi,
how do I put images in
MediaStore.Images.Media.INTERNAL_CONTENT_URI
thanks
If you're asking how to use the G1, you should post your question in
android-discuss. This forum is about programming for Android, the
software platform that runs on the G1.
Having said that, I think you're looking for the touch tone dialer
which is the little JKL tab at the bottom of the screen
When you are in a call, press the JKL tab at the bottom of the screen
to bring up the DTMF dial pad. This will generate DTMF tones over the
radio as well as generate tones for the user to hear locally (unless
the user has disabled the local tones in the Sound/Display settings).
On Dec 15, 8:49
The radio firmware generates the tones for the far end. The local
tones are generated algorithmically on the app processor. See
ToneGenerator.
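Generating a DTMF tone algorithmically just means summing the digit's low- and high-group sine frequencies, e.g. '1' = 697 Hz + 1209 Hz. A minimal plain-Java sketch (DtmfSynth is illustrative, not an Android API; the frequencies come from the DTMF standard):

```java
public class DtmfSynth {
    // Synthesize 16-bit PCM samples for a DTMF digit by summing its two
    // component sine waves at equal amplitude.
    public static short[] synth(double lowHz, double highHz,
                                int sampleRate, int numSamples) {
        short[] out = new short[numSamples];
        for (int n = 0; n < numSamples; n++) {
            double s = 0.5 * Math.sin(2 * Math.PI * lowHz * n / sampleRate)
                     + 0.5 * Math.sin(2 * Math.PI * highHz * n / sampleRate);
            out[n] = (short) (s * Short.MAX_VALUE * 0.8); // leave headroom
        }
        return out;
    }
}
```

On a device you would hand a buffer like this to an audio output API rather than ToneGenerator, which already does this for you on the near end.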
On Dec 15, 10:55 pm, Mihai mihai...@gmail.com wrote:
Hi,
I would also be interested in something like this - so I would like
to ask if the G1 has
It sounds like your application is not releasing its media player
resources. When you're done playing a sound, you need to call the
MediaPlayer.release() method. If you're playing a lot of sounds
rapidly, the garbage collector won't be able to keep up.
Arguably, the runtime shouldn't reboot. I
The media server does not have access to your application's data
directory for security reasons. If you want the media server to play a
file in your data directory, you need to open the file in your app and
use the setDataSource(fd) method to pass the open file descriptor to
the media player.
Video telephony was not a priority for the markets that we launched in
this year. OpenCore will have an H.324M stack in 2009.
However, VT is a complex function that requires close cooperation
between RIL and media stack. Even though Android will have an open
source VT stack, there will still be a
The media scanner automatically extracts metadata from any file that
it recognizes on the SD card. Can you be more explicit about your use
case?
On Oct 24, 6:45 am, CiprianU ch3l...@gmail.com wrote:
Hi guys,
Can you tell me how can I extract the ID3 tags from an mp3 file, using
Android of
I can only guess that the RTSP client in OpenCore didn't like
something it found in the DESCRIBE response. I'm not sure that it can
handle broadcast mode. Maybe someone from PV will be able to provide
some insight.
Did you get an error message back from the MediaPlayer through
onErrorListener?
On
setDataSource() with offset is for playing an embedded media file. In
other words, there must be a valid WAVE header at the specified
offset. An example use case is a resource file that contains many WAVE
files with a table of contents at the beginning.
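That table-of-contents scheme is just offset arithmetic. A hypothetical sketch (SoundBankToc and its bank layout are illustrations, not an SDK facility): given the byte lengths of complete WAVE files concatenated after a fixed-size TOC, compute the (offset, length) pair to pass to setDataSource(fd, offset, length):

```java
public class SoundBankToc {
    // Hypothetical bank layout: [TOC of tocSize bytes][wave 0][wave 1]...
    // Each entry is a complete WAVE file, so the player sees a valid
    // RIFF/WAVE header at the returned offset.
    public static long[] locate(long tocSize, long[] lengths, int index) {
        long offset = tocSize;
        for (int i = 0; i < index; i++) {
            offset += lengths[i];
        }
        return new long[] { offset, lengths[index] };
    }
}
```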
You can call seekTo() to start playback at
No, there is no way to play a segment of a media file.
On Dec 19, 9:52 am, Kenneth Loafman kenneth.loaf...@gmail.com wrote:
Is there a way to set the length of the segment to play like
setDataSource() has? I'm not seeing it.
...Thanks,
...Ken
On Fri, 19 Dec 2008 09:03:30 -0800 (PST), Dave
See this thread for using media player to play from APK resource
files:
http://groups.google.com/group/android-developers/browse_thread/thread/6668898856f8f090
On Dec 22, 1:20 pm, Toothy Bunny hongkun...@gmail.com wrote:
Hi All,
After searching developer group, I found out the problem might
1. The Music Player stops playing when a call comes in and resumes
after the call is complete. With Cupcake, the music player will slowly
ramp up the volume after the call is completed.
2. The streaming audio interface is more flexible than InputStream.
You can build an InputStream interface on
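The slow volume ramp in item 1 is easy to picture on raw PCM. A hypothetical sketch of a linear fade-in (not the actual player code):

```java
public class VolumeRamp {
    // Linearly scale 16-bit PCM from silence up to full volume over the
    // first rampLen samples; samples past the ramp are left untouched.
    public static void applyRamp(short[] pcm, int rampLen) {
        int n = Math.min(rampLen, pcm.length);
        for (int i = 0; i < n; i++) {
            pcm[i] = (short) ((long) pcm[i] * i / rampLen);
        }
    }
}
```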
The MediaPlayer doesn't support streaming at all. You can get away
with a pseudo-streamed MP3 because the MP3 format was intended to be
broadcast and the MP3 parser is a bit more forgiving about it than the
other parsers.
On Dec 23, 4:21 am, Aldo Neto tumo...@gmail.com wrote:
Hi,
I developed
, 2008 at 4:37 PM, Dave Sparks davidspa...@android.com wrote:
The MediaPlayer doesn't support streaming at all. You can get away
with a pseudo-streamed MP3 because the MP3 format was intended to be
broadcast and the MP3 parser is a bit more forgiving about it than the
other parsers
I assume you mean an EQ insert in the final audio output. If so, the
answer is no, this is not part of the Cupcake release.
Cupcake provides support for streaming microphone input to a Java app
and streaming audio from a Java app into the audio mixer.
On Dec 24, 3:25 am, Aasha
It works for me - I have been using it extensively in the last few
weeks.
On Dec 23, 10:33 pm, develop code developcod...@gmail.com wrote:
Hi,
I tried the above modifications, but I am not getting the logs from the
libraries (PV player). Has the PV logger enabling method changed? Any other methods to
There is no support for javax.sound in Android and there are no plans
to support it. We will have support for streaming audio in a future
release.
On Dec 29, 1:11 am, Lei poohd...@gmail.com wrote:
I'm stuck on this.
Help me please.
On Dec 26, 2:23 pm, Lei poohd...@gmail.com wrote:
Hi, all
This forum is for application development. Try asking your questions
in android-framework.
On Dec 25, 3:28 am, m.developer.software
m.developer.softw...@gmail.com wrote:
Hi,
What does DecodeFireWallPackets() do in pvmf_jitter_buffer_node.cpp
and why is this required? What is the concept
It's probably not really streaming audio. Some people are working
around the issue by tailing the file as it is being written.
On Dec 30, 5:03 am, FranckLefevre flas...@gmail.com wrote:
The application Phone Recorder, available in the Market, already
does this pretty well.
I don't know if
There is no support for speech recognition in the current SDK.
On Dec 29, 10:26 pm, michael michael.liu...@gmail.com wrote:
hi all
Is this ability already provided now?
You are running a virtual Linux system on your workstation. It only
has access to the file systems that are mounted, which include the
fixed images required to boot and run the device and an optional
virtual SD card image.
If you are really ambitious, you could modify the emulator code and
There are no plans for exposing in-call audio to the apps processor.
In-call audio is controlled by the radio and typically not accessible
to the apps processor.
On Dec 26 2008, 10:00 pm, StevenS shearer_ste...@hotmail.com wrote:
If I'm reading the API documentation correctly, neither the
I haven't looked at imeem, but one way to get around the issue is
using an HTTP proxy on the device. The proxy server could be buffering
up the next stream while the current stream is playing.
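The proxy workaround boils down to running a tiny local HTTP server that hands the player already-buffered bytes. A minimal desktop-Java sketch using the JDK's built-in HttpServer (the class name, /stream path, and in-memory buffer are all illustrative; a real proxy would fetch and buffer the upstream stream incrementally):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class BufferingProxy {
    private final byte[] buffered; // stands in for prefetched stream data
    private HttpServer server;

    public BufferingProxy(byte[] prefetched) {
        this.buffered = prefetched;
    }

    // Start serving on an ephemeral localhost port; the player would then
    // be pointed at http://127.0.0.1:<port>/stream over plain HTTP.
    public int start() throws IOException {
        server = HttpServer.create(new InetSocketAddress("127.0.0.1", 0), 0);
        server.createContext("/stream", exchange -> {
            exchange.getResponseHeaders().set("Content-Type", "audio/mpeg");
            exchange.sendResponseHeaders(200, buffered.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(buffered);
            }
        });
        server.start();
        return server.getAddress().getPort();
    }

    public void stop() {
        server.stop(0);
    }
}
```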
On Dec 30 2008, 11:37 pm, Dan McGuirk mcgu...@gmail.com wrote:
Hi,
I'm wondering if anyone knows how
sample code for trying
this proxy server workaround for playing audio?
Thanks
On Jan 2, 6:32 pm, Dave Sparks davidspa...@android.com wrote:
I haven't looked at imeem, but one way to get around the issue is
using an HTTP proxy on the device. The proxy server could be buffering
up the next
processor.
On Jan 2, 4:06 pm, mashpl...@gmail.com mashpl...@gmail.com wrote:
This is a mistake. There are many reasons why exposing in-call audio
to the apps processor is a good idea. Please reconsider your position
on this.
Kind Regards,
Vince
On Jan 3, 1:27 am, Dave Sparks davidspa
. Are there plans to update the API to allow more
flexibility? I wouldn't really want to put a lot of effort into
developing and maintaining this kind of scheme just to throw it away
in a few months if the API is improved.
On Jan 2, 10:32 am, Dave Sparks davidspa...@android.com wrote:
I haven't
that if the phone app can handle
incoming/outgoing calls, then it should be possible to extend it to
include IVR-like features.
Is there a more-appropriate forum to discuss this 'issue' /
requirement / feature request ?
Thanks,
Steven.
On Jan 3, 8:36 am, Dave Sparks davidspa...@android.com wrote
There is no plan to support javax.sound. I guess I need to write up a
media FAQ because this question gets asked repeatedly.
Cupcake has support for streaming PCM audio in and out of Java. It
also supports static buffers, i.e. load a buffer with sound data and
trigger (one-shot) or loop. Both
I believe this was deliberately left out of the code for 1.0. I'm not
aware of any plans to add it in Cupcake. I suggest you file a feature
request.
On Jan 7, 7:09 am, Blake B. bbuckle...@yahoo.com wrote:
I'll ping the group one last time. Can anyone confirm that this is
not possible?
I
I'll be the first to admit that our error reporting is bad right now.
Most likely is that it's unsupported file format or the file itself is
corrupt.
On Jan 7, 2:59 am, manoj manojkumar.m...@gmail.com wrote:
Hello friends,
I am trying to play some media files which are located in SDCard.
, but the
resulting functionality mentioned here appears quite significant for
many of us - although I cannot judge if it also addresses Dan
McGuirk's needs. Or am I confused about what is coming up in the short
term?
Regards
On Jan 6, 11:49 pm, Dave Sparks davidspa...@android.com wrote:
I don't
The G1 camera driver currently ignores the preview size and forces it
to 320 x 240.
On Jan 6, 11:19 am, Omar omarta...@gmail.com wrote:
You can do:
Camera.Parameters p = c.getParameters();
p.setPreviewSize(width, height);
c.setParameters(p);
There is no API for this.
On Jan 6, 7:30 am, Skywalker rumatah...@gmail.com wrote:
I need to play incoming raw audio data on the phone's earpiece speaker
(not through the back loudspeaker).
I have not found any API for this. :(
Help, please...
You can convert the raw YUV to RGB and draw it on the surface
yourself. There is no API to send encoded frames directly to a
decoder and have them displayed, and there are no plans to support
this.
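The per-pixel conversion mentioned above can be done with standard integer BT.601-style math. A hypothetical sketch (YuvToRgb is not an SDK class; the fixed-point coefficients are the commonly used video-range ones):

```java
public class YuvToRgb {
    // Convert one YCbCr (video-range) sample triple to packed ARGB using
    // common BT.601 fixed-point coefficients.
    public static int toArgb(int y, int u, int v) {
        int c = y - 16;
        int d = u - 128;
        int e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    private static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }
}
```

Running this over every pixel of a preview frame and drawing the result to your own surface is the workaround described above.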
On Jan 6, 4:34 am, iblues iblues@gmail.com wrote:
Hi all,
In my application development,
You can download the Android source code from source.android.com and
build your JNI libraries against the gcc toolchain for testing.
On Jan 5, 5:19 pm, blues bluescapt...@gmail.com wrote:
I have read all the posts about JNI. And I know JNI is not officially
supported and Google is working on the
There is a video player widget called VideoView for full screen video
with optional transport controls. Alternatively you can write your own
code around the MediaPlayer object.
On Jan 7, 2:14 pm, Delmarc intact...@gmail.com wrote:
I am working on a game where cut-scenes happen... but I have yet
MediaPlayer is a high level abstraction for OpenCore. There are no
immediate plans to expose any of OpenCore's lower level API's to Java.
On Jan 8, 1:43 am, vishy s vishy.s1...@gmail.com wrote:
Hi folks,
I am trying find the relation between android media api's available
there is no word about (low-level) audio mixing or MediaPlayer
in
http://android.git.kernel.org/?p=platform/frameworks/base.git;a=blob_...
So, will AudioTrack and MediaPlayer play along nicely?
Thanks
On Jan 8, 4:07 am, Dave Sparks davidspa...@android.com wrote:
That refers to the AudioTrack
You want to broadcast android.media.action.IMAGE_CAPTURE intent.
On Jan 9, 9:55 am, fala70 fal...@gmail.com wrote:
Hi guys,
Does somebody know how I can launch the camera capture application from
my application? I tried to inject a key event without success using the
CAMERA key. If the user
There is an interaction between the screen orientation and the camera
that causes problems for portrait mode. There will be a platform fix
in a future release, however it's possible that some devices will not
be able to support portrait mode.
On Jan 9, 5:37 am, jarkman jark...@gmail.com wrote:
would like to know how to integrate components like OpenMAX
compliant codecs, parsers, protocols.. Is there any document or link
that can guide me?
Thanks and regards,
-Vishwa
On Jan 9, 8:56 am, Dave Sparks davidspa...@android.com wrote:
MediaPlayer is a high level abstraction for OpenCore
There is no platform support for this yet. You should take this up in
android-platform.
On Jan 8, 11:58 am, jas_h jasleen_pah...@yahoo.com wrote:
Hi,
Is there a FM receiver or transmitter application available on the
Android platform?
What is preferred - HCI or I2C?
What about RDS?
You left out a key detail - what is the frame size?
The specs you quoted are for a G1 device - the emulator is probably
not quite that good. In addition to running soft codecs in ARM
emulation, it's also doing color space conversion, scaling, and
rotation in ARM emulation.
On Jan 9, 4:22 pm,
You should be able to write a file in your app's private data
directory.
On Jan 10, 2:19 pm, hmmm akul...@mail.ru wrote:
Hi,
I can see that MediaRecorder class successfully performs audio recording when
I specify an output file on the SD card, such as /sdcard/newaudio.3gpp
But when I
resolution. I want to use another
method: use the default camera application and listen for the new
picture added to the image provider. For that I am looking for a way to
launch the camera application from my activity.
Does somebody know how I can do it?
On Jan 9, 19:39, Dave Sparks davidspa...@android.com wrote
Off the top of my head, I think you need to call createThreadEtc with the
flag to indicate that your thread will call into Java.
On Jan 9, 2:00 pm, redlight9...@gmail.com redlight9...@gmail.com
wrote:
i am trying to make callbacks to my android application from a native
C thread using JNI. however
I'd have to look at the code, but it doesn't surprise me. The phone
app/lock screen is a tricky bit of code and I don't think it was
intended for the ringtone to play while the screen is off.
Is there a reason you chose the Ringtone class to play your sound?
There are other options e.g.
If it won't play from the SD card, it isn't going to stream either. I
read the thread you referenced in your original message and it leaves
out a lot of details. Information below is for H.264 AVC codec on the
G1:
Performance is rated for AVC baseline profile Level 1.3. A
conservative bit rate
The Cupcake SDK will include a way of specifying the image quality.
On Jan 12, 8:52 am, Dave Sparks davidspa...@android.com wrote:
I'll have to look into it, but there should be an extra you can put in
in the intent to specify image quality. If not, then we should add it.
On Jan 10, 1:38 am
Video recording is not supported in SDK 1.0.
On Jan 14, 1:55 pm, ANDREA P andrewpag...@gmail.com wrote:
I want to make a program that recording a video in Android
There is an example here:
http://code.google.com/intl/it-IT/android/toolbox/apis/media.html
The class used is
I am pretty sure that won't work. Why do you want to record a bunch of
small audio files without dropping samples?
On Jan 14, 7:52 pm, flatmax flat...@gmail.com wrote:
Hi there,
Has anyone managed to record audio to small files without dropping
samples between ?
perhaps it is possible to
The takePicture() function captures an image. The callback occurs when
the picture has been captured and encoded. You aren't seeing the
callback in your app because you haven't started preview mode. It
cannot be used to capture an image from another application.
Even if this did work, it doesn't
We have never tested that scenario on the G1 or the emulator and I
would be surprised if it worked. The hardware video decoder can only
support one decode at a time, which means the second stream would fall
back to a software codec. I'm not saying it can't work, but if it
doesn't, we probably
OpenCore is the media playback and authoring engine used to render
most (but not all) of the media content for Android. As an application
developer, you don't access it directly, you access it through the
MediaPlayer interface.
On Jan 16, 11:18 pm, Tez earlencefe...@gmail.com wrote:
Hi,
Can
a sound level meter. For privacy reasons, we don't want
the audio lying around on the disk.
We could do it on the fly without recording to disk, however I don't think
that is possible with the sdk ... is it ?
Matt
On Fri, Jan 16, 2009 at 12:16 PM, Dave Sparks davidspa...@android.com wrote:
I
I was just guessing that maybe the OP's use case wasn't actually
recording small files, but rather he was running into a limitation in
the framework and searching for a workaround.
On Jan 16, 3:35 pm, Dan Bornstein danf...@android.com wrote:
On Thu, Jan 15, 2009 at 5:16 PM, Dave Sparks davidspa
Camera.autoFocus(cb);
where cb is a callback function you supply that tells you focus is
successful or not.
On Jan 20, 5:27 am, mobilek...@googlemail.com
mobilek...@googlemail.com wrote:
Hi,
My app is struggling to take focused shots. Is there a built in
facility that sets an auto-focus
No, this is not supported.
On Jan 20, 3:57 am, jalandar jagtap...@gmail.com wrote:
Is it possible to take a photo with the emulator's camera, if the PC
the emulator runs on has a webcam?
thank you