Further clarification:
I was under the impression it is possible to download the java
source code AND the C source code and build them all.
It is possible to download the open source code and build for the
emulator. If you want the code to run on a specific device, you need
additional
No, this is not supported in SDK 1.0.
On Feb 6, 8:34 am, Sundog sunns...@gmail.com wrote:
Is it possible to piggyback the audio stream or the microphone and
get raw sample data from it? Can anyone point me to some
documentation?
Thanks.
You need to tell the media scanner that you have added a new file. See
http://code.google.com/android/reference/android/media/MediaScannerConnection.html
On Feb 6, 4:38 am, Rishi kaurari...@gmail.com wrote:
When i added a media file to the sdcard an update in the MediaProvider
database is not
that same game is working fine on iphone with huge sound files in
term of size also
On Fri, Feb 6, 2009 at 12:04 PM, Dave Sparks davidspa...@android.com wrote:
Suggest you try running top to find out what's hogging the CPU.
On Feb 5, 9:22 pm, suhas gavas suhas.ga...@gmail.com wrote:
Hi
A few of our developers use Eclipse as a front-end for gdb. I recall
that the setup is a bit tricky. Maybe someone can post the magic
formula.
I use gdb myself, but then I still use vi and makefiles. IDEs are for
wimps. :)
On Feb 7, 9:22 am, Sergey Ten sergeyte...@gmail.com wrote:
Hi,
I am
The files are not stored, they are streamed into a temporary memory
buffer.
What kind of file are you trying to stream? If it's an MP4 file, you
need to make sure that the 'moov' atom comes before the 'mdat' atom.
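A minimal sketch of the atom-order check described above (plain Java, not from the original thread; the box-walking helper and its names are invented for illustration). MP4 boxes are a 4-byte big-endian size followed by a 4-byte type tag, so a linear walk over the top-level boxes is enough to see whether 'moov' precedes 'mdat':

```java
import java.nio.ByteBuffer;

// Sketch: walk the top-level MP4 boxes and report whether 'moov'
// precedes 'mdat' (required for streaming/progressive playback).
public class AtomOrderCheck {
    public static boolean moovBeforeMdat(byte[] mp4) {
        ByteBuffer buf = ByteBuffer.wrap(mp4); // big-endian by default
        long moovPos = -1, mdatPos = -1;
        int pos = 0;
        while (pos + 8 <= mp4.length) {
            long size = buf.getInt(pos) & 0xFFFFFFFFL;
            String type = new String(mp4, pos + 4, 4);
            if (type.equals("moov") && moovPos < 0) moovPos = pos;
            if (type.equals("mdat") && mdatPos < 0) mdatPos = pos;
            if (size < 8) break; // malformed or 64-bit size; stop here
            pos += size;
        }
        return moovPos >= 0 && mdatPos >= 0 && moovPos < mdatPos;
    }

    // Build a tiny fake box (size + type, no payload) for illustration.
    public static byte[] box(String type) {
        byte[] b = new byte[8];
        ByteBuffer.wrap(b).putInt(8).put(type.getBytes());
        return b;
    }

    public static byte[] concat(byte[] a, byte[] b) {
        byte[] out = new byte[a.length + b.length];
        System.arraycopy(a, 0, out, 0, a.length);
        System.arraycopy(b, 0, out, a.length, b.length);
        return out;
    }
}
```

Real files also use 64-bit extended sizes (size field of 1), which this sketch simply stops at.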
On Feb 9, 3:08 am, AliBaba kanul1...@gmail.com wrote:
Hi All,
I am trying to
First, the surface type needs to be push buffers. In your Preview
constructor, add the following:
getHolder().setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
Second, you need to tell the media player where to display the video.
You have a line commented out:
This is an issue that will be fixed in the Cupcake release.
On Feb 8, 11:31 pm, jj jagtap...@gmail.com wrote:
hello everybody
I am capturing an image from my app using:
Intent i = new Intent("android.media.action.IMAGE_CAPTURE");
but the captured image is very small (25*50)
Why this
I'm pretty sure this is due to the way the emulator handles UDP
packets. There is an outstanding bug about this, but no one has had
time to work on it.
On Feb 9, 10:59 pm, Harishkumar V harishpres...@gmail.com wrote:
Michael,
using browser running in the android in emulator mode, i
Can you elaborate a bit more on how you are playing the file? Did you
write your own video player?
On Feb 9, 2:57 pm, KC grssmount...@gmail.com wrote:
I have a video file in 3gpp format and I can use QuickTime to play it.
But when tried with the SDK Windows emulator, I got the error msg:
On the G1, no data is returned - only a null pointer. The original
intent was to return an uncompressed RGB565 frame, but this proved to
be impractical.
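As a side note on the RGB565 format mentioned above, here is a small sketch (plain Java, invented for illustration, not code from the thread) of expanding a 16-bit RGB565 pixel into a packed ARGB8888 int, which is the usual first step when handing such a frame to anything expecting 8-bit channels:

```java
// Sketch: expand a 16-bit RGB565 pixel into packed ARGB8888.
// The low bits are replicated so pure white maps to 0xFFFFFF.
public class Rgb565 {
    public static int toArgb8888(int pixel565) {
        int r5 = (pixel565 >> 11) & 0x1F;
        int g6 = (pixel565 >> 5)  & 0x3F;
        int b5 = pixel565 & 0x1F;
        int r8 = (r5 << 3) | (r5 >> 2); // 5 -> 8 bits
        int g8 = (g6 << 2) | (g6 >> 4); // 6 -> 8 bits
        int b8 = (b5 << 3) | (b5 >> 2); // 5 -> 8 bits
        return 0xFF000000 | (r8 << 16) | (g8 << 8) | b8;
    }
}
```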
On Feb 9, 3:57 pm, Xster xyxyx...@gmail.com wrote:
Hi,
Our university is intending to use the Android as a platform for
mobile image
, Dave Sparks davidspa...@android.com wrote:
Can you elaborate a bit more on how you are playing the file? Did you
write your own video player?
On Feb 9, 2:57 pm, KC grssmount...@gmail.com wrote:
I have a video file in 3gpp format and I can use QuickTime to play it.
But when tried
No, there is no way to do this in SDK 1.0.
On Feb 10, 9:48 am, eliak...@gmail.com eliak...@gmail.com wrote:
hello,
I want to create an application that can change one's voice during a
call in real time
is there a way to do that in android?
can you point me to the right package?
thanx
It's on the roadmap for Cupcake.
On Feb 10, 6:44 pm, clark clarkd...@gmail.com wrote:
How about SDK 1.1? Or 1.2? Any idea where on the roadmap this feature
stands?
On Feb 6, 10:18 am, Dave Sparks davidspa...@android.com wrote:
No, this is not supported in SDK 1.0.
On Feb 6, 8:34 am
such large
images into (java) application memory but I'm hoping the Android
architecture can accommodate this in the not too distant future.
Regards
On Feb 10, 7:01 pm, Dave Sparks davidspa...@android.com wrote:
On the G1, no data is returned - only a null pointer. The original
intent
as well? Any plan
for MP4 video (if it's not doing it), in light of the trend that more and
more codec chip vendors are supporting it.
-KC
On Tue, Feb 10, 2009 at 9:56 AM, Dave Sparks davidspa...@android.com wrote:
The problem is that your video path is on your host machine. You have
A list of media file formats and codecs supported by Android can be
found here:
http://developer.android.com/guide/appendix/media-formats.html
You received this message because you are subscribed to the Google
Groups Android Developers
You can't play a resource file using VideoView by passing a pathname.
The file doesn't exist; it's just a binary blob inside the APK file.
If you create a content provider, you could pass a URI to setVideo().
I'm not entirely sure that will work because the video file might be
compressed inside
We have no plans to support those formats. Android manufacturers
always have the option of including other file formats and codecs if
the demand is there.
On Feb 11, 5:57 pm, waterblood guoyin.c...@gmail.com wrote:
Does Google has any plan for other format support , as avi, rm?
On Jan 15,
SDK 1.0 only has support for one camera.
When we have demand for a second camera from an Android partner, we'll
add a new API so that you can select the camera.
On Feb 11, 1:35 am, Link link.li...@gmail.com wrote:
hi, all
i wonder how to get camera object in android, if there are two or more
The codec is AMR-NB with an 8KHz sample frequency. In the Cupcake
release we will provide access to the raw 16-bit PCM stream so you can
do your own encoding or signal processing.
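Once that raw 16-bit PCM stream is available, a typical first step for your own signal processing is converting the little-endian byte stream into signed samples. A minimal sketch (plain Java, invented for illustration; assumes 16-bit little-endian mono PCM):

```java
// Sketch: turn a little-endian 16-bit PCM byte stream into signed
// samples and compute a simple RMS level from them.
public class PcmLevel {
    public static short[] toSamples(byte[] pcm) {
        short[] out = new short[pcm.length / 2];
        for (int i = 0; i < out.length; i++) {
            // low byte is unsigned, high byte carries the sign
            out[i] = (short) ((pcm[2 * i] & 0xFF) | (pcm[2 * i + 1] << 8));
        }
        return out;
    }

    public static double rms(short[] samples) {
        double sum = 0;
        for (short s : samples) sum += (double) s * s;
        return Math.sqrt(sum / samples.length);
    }
}
```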
On Feb 11, 9:18 am, g1bb corymgibb...@gmail.com wrote:
Hello,
Is anyone else experiencing poor playback quality
There is no support for thumbnail extraction in SDK 1.0. It's coming
in Cupcake as an adjunct to video record.
On Feb 11, 7:30 am, Freepine freep...@gmail.com wrote:
Opencore has a frame and metadata utility, and there is also an API as
android.media.MediaMetadataRetriever.captureFrame()
in
.) So perhaps rather than a
built-in native signal processing *kernel* I am here thinking of a
built-in native signal processing (JIT) *compiler*. ;-)
Regards
On Feb 11, 11:03 am, Dave Sparks davidspa...@android.com wrote:
I'm talking about deprecating the raw picture callback that has never
I looked over the code and didn't see anything obvious. You won't see
anything in the log unless an error occurs - we try to minimize
logging in production code.
Run adb bugreport and take a look at the kernel log. You should see
something like this:
<6>[ 820.265000] adsp: opening module
This is not the appropriate list for your questions.
There are lots of threads about this in android-framework. Search for
OMX hardware codecs. There is also a guide to integrating OMX codecs
in the OpenCore project.
On Feb 12, 7:57 pm, susanner zsusan...@163.com wrote:
Dear all
Is there any
programmer to make good use of the vector operators to replace the
most costly Android for-loops and conditional branches.
Regards
On Feb 12, 5:29 am, Dave Sparks davidspa...@android.com wrote:
I think we'll be able to give you something that will meet your needs.
It's always a balancing act
Can you supply some links so we can try to figure out what's wrong?
On Feb 12, 12:15 pm, jz0o0z floresje...@gmail.com wrote:
Update: After a little more testing I found some rtsp links that do
play in 1.1, but other links, which I verified are still active and
were working before are giving
I'm pretty sure that OpenCore is going to reject the mms URI.
On Feb 13, 8:57 pm, Rob Franz rob.fr...@gmail.com wrote:
I believe this is WMA on the other end. Does this present a problem?
On Feb 13, 2009 11:13 PM, Rob Franz rob.fr...@gmail.com wrote:
Hi all
I'm trying to get an RTSP
You want something like this in your activity:
import android.media.MediaScannerConnection;
import
android.media.MediaScannerConnection.MediaScannerConnectionClient;
private static class MediaScannerNotifier implements
MediaScannerConnectionClient {
private Context mContext;
private
There are a lot of fixes to SoundPool coming in the Cupcake release.
I need to check on the crash you mentioned - I don't recall seeing
that before and it should give you an error, not crash. The range is
dependent on the ratio of the sample rate of the source and the
hardware output.
On Feb
;
Then scatter log statements around the code in various places:
Log.d(TAG, "This will output to the log");
On Feb 14, 11:28 pm, Ash ashwin.disco...@gmail.com wrote:
thanx for reply... the above code after making changes as mentioned by
Dave Sparks
does not show any error... when i found
The G1 does not support recording uplink or downlink audio.
On Feb 14, 6:20 am, Shawn_Chiu qiuping...@gmail.com wrote:
Hello, buddies
It's about android.media.MediaRecorder.
I want to implement a feature to record the conversation, both voice
from speaker and microphone. But I tried on G1
I believe we were able to do the fixes for SoundPool without changing
the public API. There are no plans to deprecate it at this time.
On Feb 16, 6:05 am, Blake B. bbuckle...@yahoo.com wrote:
Great idea, Jon. Thanks for sharing the code. I'm about to start
work on sound in my game, so I'll
microphone
no matter whether the AudioSource is MIC or DEFAULT. Actually, they do
the same thing.
Could other sources, such as the speaker or both speaker and mic, be
supported in a future release?
BR
Shawn
On Feb 16, 7:12 am, Dave Sparks davidspa...@android.com wrote:
The G1 does not support recording
Please don't cross-post. This question isn't appropriate for the
application developer forum.
On Feb 16, 4:01 am, getandroid sampath...@gmail.com wrote:
Hi,
As mentioned audio stops after some random number of times when
played from either Music/Video player. After some debugging, I found
This list is for application developers. Please post questions about
source code in one of the open source forums (android-porting, android-
platform, or android-framework).
Short answer: There are no plans to publish source to any of the
Google properties at this time.
On Feb 15, 8:10 pm,
The media player can play 16-bit WAVE files, but only if the format
type is PCM and not the extended format type. I've been meaning to fix
the OpenCore WAVE parser to handle extended format, but it's not a
high priority right now.
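The PCM-versus-extended distinction above comes down to the format tag in the WAVE file's 'fmt ' chunk: plain PCM is tag 1, while WAVE_FORMAT_EXTENSIBLE is 0xFFFE. A hedged sketch of reading that tag (plain Java, invented for illustration; `minimalWav` is a test helper, not a real file writer):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: scan a RIFF/WAVE header for the 'fmt ' chunk and return its
// format tag (1 = plain PCM, 0xFFFE = WAVE_FORMAT_EXTENSIBLE).
// All chunk fields are little-endian.
public class WaveFormatTag {
    public static int formatTag(byte[] wav) {
        ByteBuffer buf = ByteBuffer.wrap(wav).order(ByteOrder.LITTLE_ENDIAN);
        int pos = 12; // skip "RIFF", riff size, "WAVE"
        while (pos + 8 <= wav.length) {
            String id = new String(wav, pos, 4);
            int size = buf.getInt(pos + 4);
            if (id.equals("fmt ")) {
                return buf.getShort(pos + 8) & 0xFFFF;
            }
            pos += 8 + size + (size & 1); // chunks are word-aligned
        }
        return -1; // no fmt chunk found
    }

    // Build a minimal header with the given tag, for illustration only.
    public static byte[] minimalWav(int tag) {
        ByteBuffer buf = ByteBuffer.allocate(28).order(ByteOrder.LITTLE_ENDIAN);
        buf.put("RIFF".getBytes()).putInt(20).put("WAVE".getBytes());
        buf.put("fmt ".getBytes()).putInt(8).putShort((short) tag);
        return buf.array();
    }
}
```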
On Feb 16, 6:39 pm, herain herainw...@gmail.com wrote:
I tried
I think your confusion probably comes from the phrase "plays a series
of audio URI's". AsyncPlayer is just a simple helper class for
playing audio files that runs on its own thread, instead of on the UI
thread.
The first time you call play, it will start playing a sound. If you
call it a
by cool edit which saved a pcm file as windows
PCM format.
I guess it doesn't contain any extended format type as you mentioned.
On Feb 17, 10:57 am, Dave Sparks davidspa...@android.com wrote:
The media player can play 16-bit WAVE files, but only if the format
type is PCM
There is no way to synchronize the two players so that they start and
stop synchronously. It involves multiple IPC calls - the best you can
do is probably +/- 100 msecs.
We plan to support shoutcast and improve streaming in general in a
future release.
On Feb 16, 11:13 pm, Dilli
This code shouldn't even compile, this line is wrong:
mCamera.takePicture(null, mPictureCallback); // requires 3 callback functions
Camera is only supported in landscape mode. Cupcake release will
unofficially support portrait mode (there will be no API for it, but
I'll probably put some sample
if the card has been removed and reinserted?
On Feb 14, 2:01 pm, Dave Sparks davidspa...@android.com wrote:
You want something like this in your activity:
import android.media.MediaScannerConnection;
import
android.media.MediaScannerConnection.MediaScannerConnectionClient;
private static
This topic has been covered many times. See this thread for one
example:
http://groups.google.com/group/android-developers/browse_thread/thread/d68364976e5d98ff/733eea4a1195527e?lnk=gstq=native+support#733eea4a1195527e
On Feb 17, 10:09 pm, Android Groups wqhl.m...@gmail.com wrote:
I'm also
No, I don't believe OpenCore can play raw m4v files, only
containerized mp4 and m4a files.
On Feb 18, 6:17 am, Dilli dilliraomca...@gmail.com wrote:
Hi all
I am developing a simple application to play m4v files
while try to play the m4v files it shows exception
E/QCvdecH264( 31):
Voice search uses a private Google API optimized for search queries.
We are not making those API's public at this time.
On Feb 18, 5:10 am, Rob Franz rob.fr...@gmail.com wrote:
Ok let me ask differently then...is it possible to access the part that
takes the incoming speech, analyzes the
AsyncPlayer is just a helper class on top of MediaPlayer. It is not
going to help you with gapless playback.
I can see how the statement "Plays a series of audio URIs" is causing
confusion. It does not maintain a list of URI's to play. It just plays
a file until it reaches the end or it is told to
This question should be directed to android-framework.
On Feb 18, 12:40 am, Nasam forum.nami...@gmail.com wrote:
hi
I was going through the window manager of android. I wonder if is it
possible to replace this window manger with out own window manger. Is
this supported? what depencies are
setOutputFile() expects a path to where you want the file stored.
You can take a look at the source to SoundRecorder in the open source
tree for some working code:
http://android.git.kernel.org/?p=platform/packages/apps/SoundRecorder.git;a=summary
On Feb 24, 4:43 pm, benmccann
Please move this question to the android-framework list. This list is
for application developers.
On Feb 25, 10:30 pm, Vishwanatha S vishy.s1...@gmail.com wrote:
Dear All
I am trying to integrate my codec on the ARM side using my own OMX core.
Now I would like to test it on the Android
for the example. This would be a much better example than the
one in the docs that won't compile and implies content must first be
added to a database:
http://developer.android.com/guide/topics/media/index.html
On Feb 24, 8:03 pm, Dave Sparks davidspa...@android.com wrote:
setOutputFile() expects
SoundPool issues were fixed in Cupcake. The fixes were dependent on
other changes to the audio system and it was considered too risky to
port those fixes back to the 1.x branch. We haven't released a Cupcake
SDK yet.
Other have had success with SoundPool by setting the maxStreams to a
large
);
recorder.prepare();
recorder.start();
On Feb 26, 12:14 am, Dave Sparks davidspa...@android.com wrote:
You can get the path to external storage (e.g. SD card) with
Environment.getExternalStorageDirectory(). This is world read/
writable.
Alternatively, each application has its own
,
If I record using this Soundrecorder app, it creates a .3gpp file.
But this file is not getting listed in the Music app.
I have tried restarting the emulator.
Any idea what could be the reason for this?
Thanks
~
On 2/27/09, Dave Sparks davidspa...@android.com wrote:
I confess, I don't
The parameter is currently ignored. You should use 0 (default) for
now.
On Feb 28, 10:48 am, clark clarkd...@gmail.com wrote:
Can anyone fill me in as to the possible values that can be passed to
the SoundPool constructor for the srcQuality parameter?
I see that it is an int, but no
You can't get good timing from SoundPool. If you're talking about a
music application where a few msecs of jitter affects the musical
feel, you're going to need to mix your own streams to get the timing
accuracy.
We are planning on adding an API in a future release for music
applications,
The image capture intents for the 1.0 and 1.1 releases only allow for
small images intended for email or MMS attach. The next SDK release
for Cupcake will add support for setting the image size.
On Mar 2, 9:00 pm, Ondra Zahradnik ondra.zahrad...@gmail.com wrote:
Hello I am trying to take
You need to tell the mediaplayer where to display the video with
setDisplaySurface(). Check out the media demo apps on
developer.android.com.
On Mar 4, 11:45 pm, Nithin nithin.war...@gmail.com wrote:
hi,
I tried a simple mediaplayer application, just to run a .3gp file.
First, i put the .3gp
The only audio format supported on G1 is AMR format (raw .AMR file).
On Mar 4, 1:18 pm, zeeshan genx...@gmail.com wrote:
Hi,
can anyone tell me what is the default format of android recording.
i have recorded an audio clip but dont know how can i check its
extention?
i am using this
);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile("test.3gpp");
recorder.prepare();
recorder.start();
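A quick way to answer the "what format did I actually record?" question above is to sniff the file's magic bytes: a raw AMR file starts with "#!AMR\n", while a 3GPP/MP4 container has an 'ftyp' box at offset 4. A small sketch (plain Java, invented for illustration, not from the thread):

```java
import java.util.Arrays;

// Sketch: identify a recorded audio file by its leading magic bytes.
public class AudioMagic {
    private static final byte[] AMR_MAGIC = "#!AMR\n".getBytes();

    public static String sniff(byte[] head) {
        if (head.length >= 6 &&
            Arrays.equals(Arrays.copyOf(head, 6), AMR_MAGIC)) return "amr";
        if (head.length >= 8 &&
            new String(head, 4, 4).equals("ftyp")) return "3gpp/mp4";
        return "unknown";
    }
}
```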
On Feb 26, 12:14 am, Dave Sparks davidspa...@android.com wrote:
You can get the path to external storage (e.g. SD card
If you are trying to play this on a G1, the frame size is too large.
The H.264 codec is base profile up to HVGA (480x320).
It's pointless to encode at VGA when the screen is HVGA - you're
wasting half your bandwidth for pixels you will never see.
On Mar 6, 1:42 am, manoj
The H.264 codec in the G1 is baseline profile Level 1.3. Maximum frame
size is 480x320 (happily the same dimensions as the screen).
Recommended maximum bit rate is 384Kbps. Maximum frame rate is 30 fps.
It does not support B-frames (consistent with baseline profile). If
you stay within these
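The G1 limits quoted above can be collapsed into a simple pre-flight check on clip parameters before attempting playback. A sketch (plain Java, invented for illustration; the class and method names are not from any Android API):

```java
// Sketch: the G1 H.264 baseline decoding limits quoted above --
// max 480x320 frame (either orientation), 384 kbps, 30 fps.
public class G1VideoLimits {
    public static boolean fits(int width, int height, int kbps, int fps) {
        boolean sizeOk = (width <= 480 && height <= 320)
                      || (width <= 320 && height <= 480); // portrait or landscape
        return sizeOk && kbps <= 384 && fps <= 30;
    }
}
```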
a frame every few seconds from
outbound video stream and show a little image on screen?
On Feb 11, 11:14 pm, Dave Sparks davidspa...@android.com wrote:
There is no support for thumbnail extraction in SDK 1.0. It's coming
in Cupcake as an adjunct to video record.
On Feb 11, 7:30 am, Freepine
I believe screen capture is disabled in production devices for
security reasons.
On Mar 5, 7:49 pm, volk...@aol.com volk...@aol.com wrote:
I'm having trouble getting a screenshot. I installed the SDk,
Eclipse, the drivers, enable usb debugging, like the instructions say.
I open DDMS and it
Filing a bug is the way to get things fixed. Most Google Android
engineers do not read this list. They do respond to bug reports
though.
On Mar 7, 3:01 pm, Stoyan Damov stoyan.da...@gmail.com wrote:
On Sat, Mar 7, 2009 at 8:58 PM, strazzere str...@gmail.com wrote:
Your interaction with the
If your app has the permission MODIFY_AUDIO_SETTINGS you can route
output to the earpiece using the setRouting API in AudioManager.
On Mar 9, 8:15 am, cht caoht...@gmail.com wrote:
for the OEM, on the product line,we need to test the earpiece whether
it works well.
so I want to write a
Is this on a G1 or on the emulator?
On Mar 8, 10:18 pm, manoj manojkumar.m...@gmail.com wrote:
Hi friends,
I have developed a video playing application.
I have a problem playing a video file.
Actually it plays well (both audio and video are coming).
But nothing is visible in the
You cannot alter the preview frames as they are being displayed.
Camera preview uses a private shared interface between the camera and
SurfaceFlinger to display preview frames. It would be too costly to
pass this data across two process boundaries and a Java VM.
You might be able to get away
Call the release() method on the Camera object and set it to null. In
onResume, create a new Camera object.
On Nov 22, 1:23 pm, joshbeck [EMAIL PROTECTED] wrote:
Here is what my app does:
It opens the camera.
It draws a preview to a surface.
Now, if the user pauses the app,
is the camera
See android.hardware.Camera.setDisplayOrientation (int degrees)
This is the approved way to set the camera orientation as of V2.2.
Please note that this only works for still images, videos will still
record in landscape orientation.
On Jul 15, 5:00 am, Vincent y.ikeda.asa...@gmail.com wrote:
Try m.youtube.com, this works on other Android devices. I don't have a
Hero to test with.
On Jul 14, 12:18 pm, Anthoni anthoni.gard...@gmail.com wrote:
Hello,
I am trying to find a URL that conforms to the proper RTSP protocol
that Android will understand. I've various ones and added them
Progressive streaming is like progressive download except that the
media file is partially cached in memory rather than writing to
permanent storage.
On Jul 13, 1:21 pm, Michel m.co...@nfb.ca wrote:
On top of that my question is what is HTTP progressive streaming
standing for?
Is that a nick
The media player currently does not support https.
On Jul 13, 7:18 pm, zhao zhaoyang...@gmail.com wrote:
I am trying to stream video over https from Android browser. If the
video url is http, everything works fine. But when I switch the url to
https, no video can be played. I tried 2 methods
only see half (width wise)
the video. Noticed this on a few others I've tried as well, so not
sure if it's a problem with the Hero or what :(
Regards
Anthoni
On Jul 15, 9:59 pm, Dave Sparks davidspa...@android.com wrote:
Try m.youtube.com, this works on other Android devices. I don't have
triggerClip() was designed to play synchronized sound effects for
musical games like JetBoy.
If you just want to play random sound effects, I would use SoundPool
instead.
On Jul 30, 5:53 am, kk kkostia...@gmail.com wrote:
Hi all,
I'm using JetPlayer in order to add some audio to a game I'm
Try this:
mp.prepare();
mp.seekTo(0);
mp.start();
And get rid of your onPreparedListener. It is unnecessary since you
are calling prepare().
On Apr 27, 1:20 am, Sudha sudhaker...@gmail.com wrote:
Hi,
I have a requirement to play several sounds many times in my game so
instead of creating
I assume this in the emulator. I believe the issue is that the
emulator does not forward the UDP packets you need for the RTP
session. This should work on a real device.
On Apr 27, 12:28 am, awwa awwa...@gmail.com wrote:
I'm trying to play streaming video(*.3gp) with android SDK 1.5 pre.
I
OK, so it sounds like audio is being produced by the kernel driver.
I just looked at your code, and I think you need to call read() once
to pass in your first input buffer.
On Apr 24, 6:04 pm, Steven_T gwb...@126.com wrote:
hi Dave Sparks:
thank you for reply!
I didn't disable audio input
I believe this is a known limitation of the emulator. There is a
feature request to allow for more sample rates, but no one is actively
working on it. The source code is available if someone wants to take
it on.
On Apr 25, 2:36 pm, szabolcs szabolcs.vr...@gmail.com wrote:
Dave, Yoni,
Thank
Use SoundPool.
On Apr 26, 9:34 am, BlackLight blacklight1...@gmail.com wrote:
I have other problem now. Lets say I have 10 buttons (0-9), when user
press button program should play short (0.3-0.5 secs) sound. I added
them as wav resources. Now I see that each MediaPlayer creates its own
Encode in an mp4 or m4a file.
On Apr 28, 12:35 pm, Moto medicalsou...@gmail.com wrote:
Could I somehow trick the player on playing something of this format?
I know that there is support for AAC encoded files but just how?
Thanks!
Moto!
AudioRecord gives you access to 16-bit PCM audio from the microphone
and AudioTrack gives you a way to output 16-bit PCM audio to the
output device.
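The 16-bit PCM buffers that AudioTrack consumes can be built directly in Java. A hedged sketch (invented for illustration; the 8000 Hz rate and 440 Hz pitch are arbitrary example values) that generates a sine tone in that sample format:

```java
// Sketch: build a buffer of signed 16-bit PCM samples (the format
// AudioTrack plays) containing a sine tone.
public class ToneBuffer {
    public static short[] sine(int sampleRate, double freqHz, double seconds) {
        short[] buf = new short[(int) (sampleRate * seconds)];
        for (int i = 0; i < buf.length; i++) {
            double t = (double) i / sampleRate; // time of this sample
            buf[i] = (short) (Math.sin(2 * Math.PI * freqHz * t) * Short.MAX_VALUE);
        }
        return buf;
    }
}
```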
On Apr 28, 8:50 am, intbt tacbe...@gmail.com wrote:
Thanks, I think AudioTrack may be what I am looking for to read the
codec output???
SoundPool has too much jitter for a serious music application. If you
want to control the jitter, you need to output to a single AudioTrack.
On Apr 28, 8:27 am, Marco Nelissen marc...@android.com wrote:
On Mon, Apr 27, 2009 at 5:50 PM, rookie1_1998
eric.yongjun.c...@gmail.com wrote:
I need
Do you have a stack trace from the log?
On Apr 30, 4:51 pm, petunio juanjosegilmen...@hotmail.com wrote:
Hi
I am finally testing my application on a G1 and even though it works
fine on the emulator, it crashes on the G1
it crashes when it does:
setContentView(R.layout.mylayout);
the xml
Android does not support playing two video streams at the same time.
On Apr 30, 1:48 am, N V nithi...@gmail.com wrote:
Hi to all
I am playing 2 videos(.mpg4) at time... Some times its works
fine, But some times give
error like Cannot Play the Video Can any One tell me Why its
Can you repro this with the camera application?
On May 1, 6:22 am, blindfold seeingwithso...@gmail.com wrote:
I found that the old bug reported
inhttp://code.google.com/p/android/issues/detail?id=1578
where only a power cycle brings back the camera still persists with
the official Cupcake
of Proguard perhaps has anything to do with
it, because things seem stable until I prepare a release APK.
Regards
On May 1, 5:00 pm, Dave Sparks davidspa...@android.com wrote:
Can you repro this with the camera application?
On May 1, 6:22 am, blindfold seeingwithso...@gmail.com wrote:
I found
Voice recognition is a technology. You need an application to make use
of it, for example the voice dialer.
On May 1, 11:17 am, Yash Patel yashjpa...@gmail.com wrote:
HI,
does any one know How to turn on Voice Recognition on Emulator. or is it
required to have phone or dev phone to test
Did you include android.permission.CAMERA in your manifest?
On May 1, 3:21 pm, Jason Proctor juvat...@gmail.com wrote:
(resend from different address, see if it makes it this time.)
is video recording supported in 1.5?
i got it mostly working with the Haykuro 1.5 ADP image - the video
file
What is the error?
On May 1, 5:18 pm, Yash Patel yashjpa...@gmail.com wrote:
I mean to say Speech Recognization. I tried to create one small application
but it gives me error.
Thanks
Yash Patel
On Fri, May 1, 2009 at 4:54 PM, Dave Sparks davidspa...@android.com wrote:
Voice
No, you have always needed a camera permission to access the camera.
It's new to the MediaRecorder API because we didn't add video support
until 1.5.
On May 1, 3:59 pm, Jason Proctor juvat...@gmail.com wrote:
nope, never needed it. is the requirement new?
i'll give it a go, thanks.
(looks
,
potentialResults);
// Start the Recognition Activity
startActivityForResult(intent, RESULT_SPEECH);
}
catch(Exception ex) {
ex.printStackTrace();
}
}
On Fri, May 1, 2009 at 9:23 PM, Dave Sparks davidspa
if we are not
using speech recognition to perform web searches?
Thanks,
Jose Luis.
On 4 mayo, 20:46, Dave Sparks davidspa...@android.com wrote:
This intent is handled by the Google Voice Search application. Do you
have it installed?
On May 4, 6:12 am, Yash Patel yashjpa...@gmail.com
You need to call setPreviewDisplay() and pass in a SurfaceView before
you call prepare().
On May 6, 8:45 am, Anders Nilsson Plymoth lanils...@gmail.com wrote:
Hi,
Does anyone know how to use the MediaRecorder to API to capture video?
I am writing an application where I want to be able to
You need to format the mp4 file for streaming. This means that the
moov atom must precede the mdat atom in the file.
On May 7, 5:05 am, N V nithi...@gmail.com wrote:
Hi to all...
I tried for video streaming in sdk 1.5... The video
format .mp4... But it gives error
like This video
There is only one mic on all current Android devices; why would you
want to record stereo?
On May 7, 3:43 am, l hx lihongxia8...@gmail.com wrote:
can we using channels 2 when recording audio?
On Sat, May 2, 2009 at 12:54 AM, Jean-Michel jmtr...@gmail.com wrote:
Hi there,
Looks like sipdroid
This is a limitation of the hardware, the preview size and encoded
size must be the same.
I'm not sure how you were able to change the preview size though. I'd
like to know the code sequence you used, because it's not supposed to
be possible.
On May 6, 11:11 am, Jason Proctor
Wait, when you say corruption, you really mean that there's a mismatch
between the metadata and the actual frame size, is that correct?
On May 7, 11:17 am, Jason Proctor ja...@particularplace.com wrote:
i don't change it, it gets changed by the Author Driver presumably
to avoid colliding with
You need to call the read() method.
On May 15, 3:15 pm, benmccann benjamin.j.mcc...@gmail.com wrote:
Any ideas?
Thanks,
Ben
On May 15, 1:02 am, benmccann benjamin.j.mcc...@gmail.com wrote:
Hi,
I'm trying to figure out how to use theAudioRecordclass. I created
a callback with a
No, there is no API for this.
On May 18, 12:56 pm, Flying Coder av8r.st...@gmail.com wrote:
Hi,
Is there any way to tell if an app is currently using the speaker
(playing music or generating other sounds)? Specifically, I'd like to
detect if an alarm clock is going off (not only the one
This is a hardware-dependent feature. Frankly, I don't see any value
in it because the display devices don't have 24-bit support.
On May 19, 4:57 am, Edware littlenew1...@gmail.com wrote:
Dear Sir,
As I know, Android only supports RGB 16-bit color depth format. Could
Android play 24-bit color