only see half (width-wise) of
the video. I noticed this on a few others I've tried as well, so I'm not
sure if it's a problem with the Hero or what :(
Regards
Anthoni
On Jul 15, 9:59 pm, Dave Sparks davidspa...@android.com wrote:
Try m.youtube.com, this works on other Android devices. I don't have
triggerClip() was designed to play synchronized sound effects for
musical games like JetBoy.
If you just want to play random sound effects, I would use SoundPool
instead.
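A minimal SoundPool sketch of that suggestion (this needs the Android runtime, so it is illustrative only; `R.raw.click` is a hypothetical sound resource):

```java
// Sketch only: requires the Android runtime and a res/raw/click resource.
import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

public class EffectPlayer {
    // Up to 4 simultaneous streams, mixed into the music stream.
    private final SoundPool pool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
    private final int clickId;

    public EffectPlayer(Context context) {
        // load() is asynchronous; the sound may not be ready immediately.
        clickId = pool.load(context, R.raw.click, 1);
    }

    public void playClick() {
        // left volume, right volume, priority, loop (0 = none), playback rate
        pool.play(clickId, 1.0f, 1.0f, 1, 0, 1.0f);
    }
}
```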
On Jul 30, 5:53 am, kk kkostia...@gmail.com wrote:
Hi all,
I'm using JetPlayer in order to add some audio to a game I'm
See android.hardware.Camera.setDisplayOrientation (int degrees)
This is the approved way to set the camera orientation as of V2.2.
Please note that this only works for still images, videos will still
record in landscape orientation.
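A sketch of that call for a portrait activity (needs the Android runtime, API level 8+, so it won't run standalone):

```java
// Sketch only: rotates the camera *preview* 90 degrees for a portrait
// activity; as noted above, recorded video still comes out landscape.
import android.hardware.Camera;

public class PortraitPreview {
    public static Camera openPortraitCamera() {
        Camera camera = Camera.open();
        // Available from API level 8 (Android 2.2) onward.
        camera.setDisplayOrientation(90);
        return camera;
    }
}
```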
On Jul 15, 5:00 am, Vincent y.ikeda.asa...@gmail.com wrote:
Try m.youtube.com, this works on other Android devices. I don't have a
Hero to test with.
On Jul 14, 12:18 pm, Anthoni anthoni.gard...@gmail.com wrote:
Hello,
I am trying to find a URL that conforms to the proper RTSP protocol
that Android will understand. I've tried various ones and added them
Progressive streaming is like progressive download except that the
media file is partially cached in memory rather than writing to
permanent storage.
On Jul 13, 1:21 pm, Michel m.co...@nfb.ca wrote:
On top of that, my question is: what does HTTP progressive streaming
stand for?
Is that a nick
The media player currently does not support https.
On Jul 13, 7:18 pm, zhao zhaoyang...@gmail.com wrote:
I am trying to stream video over https from Android browser. If the
video url is http, everything works fine. But when I switch the url to
https, no video can be played. I tried 2 methods
HTTP progressive and RTSP for 3GPP/MPEG-4 streams.
On May 22, 10:57 am, Emil semil...@gmail.com wrote:
Thanks for the answer.
So which video streaming formats does the Android G1 support?
Emil
On May 22, 4:39 am, Dave Sparks davidspa...@android.com wrote:
The G1 does not support
This is supported, it definitely should not crash even if there's a
problem. Where's the stack trace for the crash?
On Apr 29, 8:30 am, Zhubham sahilz...@gmail.com wrote:
Hi People,
I need your help regarding streaming in android. I read some of the
previous discussions but I have a few
when I call phone. Any
suggestions? Many thanks.
On May 22, 9:36 am, Dave Sparks davidspa...@android.com wrote:
You asked specifically about Bluetooth SCO before. SCO is an 8KHz mono
audio channel compressed down to a low bit-rate stream. To get a
simulation of what it sounds like, try
SCO? What's A2DP? I'm not clear about that. Does that mean if I'm
using an A2DP Bluetooth headset, all audio will automatically play through
the Bluetooth headset? And not all Bluetooth headset hardware supports
A2DP. Is that right?
On May 22, 9:36 am, Dave Sparks davidspa...@android.com wrote:
You asked
I can't tell from your code snippet what you are trying to do.
However, I suggest you try using SoundPool. It's designed for this
kind of use.
On May 21, 12:53 am, Sukitha Udugamasooriya suk...@gmail.com wrote:
Hi,
In my application I'm playing a 1-second sound clip when the user
clicks on a
application can't play in
bluetooth headset?
The G1 does not support Windows Media streaming formats, only local
file playback.
On May 21, 1:23 am, semil103 semil...@gmail.com wrote:
Hello,
I would like to know if I can view streamed WMV video format using G1
device (not the developer phone).
I understand that WMV format is not
We don't support sending app processor audio over SCO. The audio
quality would be very poor.
On May 20, 6:51 am, jianwei kevin@gmail.com wrote:
Hi all,
I met a problem of bluetooth headset. I want to switch audio playing
to bluetooth headset when bluetooth headset is paired. I found
No, there is no API for this.
On May 18, 12:56 pm, Flying Coder av8r.st...@gmail.com wrote:
Hi,
Is there any way to tell if an app is currently using the speaker
(playing music or generating other sounds)? Specifically, I'd like to
detect if an alarm clock is going off (not only the one
This is a hardware-dependent feature. Frankly, I don't see any value
in it because the display devices don't have 24-bit support.
On May 19, 4:57 am, Edware littlenew1...@gmail.com wrote:
Dear Sir,
As I know, Android only supports RGB 16-bit color depth format. Could
Android play 24-bit color
(buffer)
}
When I've done the latter I get buffer overflow exceptions.
You need to call the read() method.
On May 15, 3:15 pm, benmccann benjamin.j.mcc...@gmail.com wrote:
Any ideas?
Thanks,
Ben
On May 15, 1:02 am, benmccann benjamin.j.mcc...@gmail.com wrote:
Hi,
I'm trying to figure out how to use the AudioRecord class. I created
a callback with a
You need to format the mp4 file for streaming. This means that the
moov atom must precede the mdat atom in the file.
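One common way to put the moov atom ahead of the mdat atom is a remux pass. With a modern ffmpeg (my tool choice, not one named in this thread) that is a single copy operation:

```shell
# Remux without re-encoding, moving the moov atom to the front of the
# file so a player can start streaming before the whole file arrives.
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
```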
On May 7, 5:05 am, N V nithi...@gmail.com wrote:
Hi to all...
I tried video streaming in SDK 1.5... The video
format is .mp4... But it gives an error
like 'This video
The mic on all current Android devices is mono; why would you want to record
stereo?
On May 7, 3:43 am, l hx lihongxia8...@gmail.com wrote:
can we use 2 channels when recording audio?
On Sat, May 2, 2009 at 12:54 AM, Jean-Michel jmtr...@gmail.com wrote:
Hi there,
Looks like sipdroid
This is a limitation of the hardware, the preview size and encoded
size must be the same.
I'm not sure how you were able to change the preview size though. I'd
like to know the code sequence you used, because it's not supposed to
be possible.
On May 6, 11:11 am, Jason Proctor
Wait, when you say corruption, you really mean that there's a mismatch
between the metadata and the actual frame size, is that correct?
On May 7, 11:17 am, Jason Proctor ja...@particularplace.com wrote:
i don't change it, it gets changed by the Author Driver presumably
to avoid colliding with
You need to call setPreviewDisplay() and pass in a SurfaceView before
you call prepare().
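A sketch of that call order (needs the Android runtime and a live SurfaceView, so it is illustrative only; the encoder choices are mine):

```java
// Sketch only: the key point from the thread is that setPreviewDisplay()
// must come before prepare(). Other setters follow MediaRecorder's
// required order: sources, then output format, then encoders.
import android.media.MediaRecorder;
import android.view.SurfaceHolder;

public class VideoCapture {
    public static MediaRecorder startRecording(SurfaceHolder holder, String path)
            throws Exception {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(path);
        recorder.setPreviewDisplay(holder.getSurface()); // before prepare()
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```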
On May 6, 8:45 am, Anders Nilsson Plymoth lanils...@gmail.com wrote:
Hi,
Does anyone know how to use the MediaRecorder API to capture video?
I am writing an application where I want to be able to
,
potentialResults);
// Start the Recognition Activity
startActivityForResult(intent, RESULT_SPEECH);
}
catch(Exception ex) {
ex.printStackTrace();
}
}
On Fri, May 1, 2009 at 9:23 PM, Dave Sparks davidspa
if we are not
using speech recognition to perform web searches?
Thanks,
Jose Luis.
On 4 mayo, 20:46, Dave Sparks davidspa...@android.com wrote:
This intent is handled by the Google Voice Search application. Do you
have it installed?
On May 4, 6:12 am, Yash Patel yashjpa...@gmail.com
Can you repro this with the camera application?
On May 1, 6:22 am, blindfold seeingwithso...@gmail.com wrote:
I found that the old bug reported
in http://code.google.com/p/android/issues/detail?id=1578
where only a power cycle brings back the camera still persists with
the official Cupcake
of Proguard perhaps has anything to do with
it, because things seem stable until I prepare a release APK.
Regards
On May 1, 5:00 pm, Dave Sparks davidspa...@android.com wrote:
Can you repro this with the camera application?
On May 1, 6:22 am, blindfold seeingwithso...@gmail.com wrote:
I found
Voice recognition is a technology. You need an application to make use
of it, for example the voice dialer.
On May 1, 11:17 am, Yash Patel yashjpa...@gmail.com wrote:
HI,
does anyone know how to turn on Voice Recognition on the Emulator, or is it
required to have a phone or dev phone to test
Did you include android.permission.CAMERA in your manifest?
On May 1, 3:21 pm, Jason Proctor juvat...@gmail.com wrote:
(resend from different address, see if it makes it this time.)
is video recording supported in 1.5?
i got it mostly working with the Haykuro 1.5 ADP image - the video
file
What is the error?
On May 1, 5:18 pm, Yash Patel yashjpa...@gmail.com wrote:
I mean to say Speech Recognition. I tried to create one small application
but it gives me an error.
Thanks
Yash Patel
On Fri, May 1, 2009 at 4:54 PM, Dave Sparks davidspa...@android.com wrote:
Voice
No, you have always needed a camera permission to access the camera.
It's new to the MediaRecorder API because we didn't add video support
until 1.5.
On May 1, 3:59 pm, Jason Proctor juvat...@gmail.com wrote:
nope, never needed it. is the requirement new?
i'll give it a go, thanks.
(looks
Do you have a stack trace from the log?
On Apr 30, 4:51 pm, petunio juanjosegilmen...@hotmail.com wrote:
Hi
I am finally testing my application on a G1, and even though it works
fine on the emulator, it crashes on the G1.
It crashes when it does:
setContentView(R.layout.mylayout);
the xml
Android does not support playing two video streams at the same time.
On Apr 30, 1:48 am, N V nithi...@gmail.com wrote:
Hi to all
I am playing 2 videos (.mp4) at a time... Sometimes it works
fine, but sometimes it gives an
error like 'Cannot Play the Video'. Can anyone tell me why it
Encode in an mp4 or m4a file.
On Apr 28, 12:35 pm, Moto medicalsou...@gmail.com wrote:
Could I somehow trick the player on playing something of this format?
I know that there is support for AAC encoded files but just how?
Thanks!
Moto!
AudioRecord gives you access to 16-bit PCM audio from the microphone
and AudioTrack gives you a way to output 16-bit PCM audio to the
output device.
On Apr 28, 8:50 am, intbt tacbe...@gmail.com wrote:
Thanks, I think AudioTrack may be what I am looking for to read the
codec output???
SoundPool has too much jitter for a serious music application. If you
want to control the jitter, you need to output to a single AudioTrack.
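Mixing into one streaming AudioTrack, as suggested, might look like this (a sketch needing the Android runtime; mixing pre-decoded PCM clips into one buffer per write is my simplification):

```java
// Sketch only: one streaming AudioTrack fed from a single thread keeps
// timing jitter bounded by the write cadence rather than by per-sound
// playback start latency.
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class PcmMixer {
    private static final int RATE = 44100;
    private final AudioTrack track = new AudioTrack(
            AudioManager.STREAM_MUSIC, RATE,
            AudioFormat.CHANNEL_CONFIGURATION_MONO,  // 2009-era constant
            AudioFormat.ENCODING_PCM_16BIT,
            AudioTrack.getMinBufferSize(RATE,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT),
            AudioTrack.MODE_STREAM);

    public void start() { track.play(); }

    // Call repeatedly from one thread with the next pre-mixed buffer.
    public void writeMixed(short[] mixed) {
        track.write(mixed, 0, mixed.length);
    }
}
```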
On Apr 28, 8:27 am, Marco Nelissen marc...@android.com wrote:
On Mon, Apr 27, 2009 at 5:50 PM, rookie1_1998
eric.yongjun.c...@gmail.com wrote:
I need
Try this:
mp.prepare();
mp.seekTo(0);
mp.start();
And get rid of your onPreparedListener. It is unnecessary since you
are calling prepare().
On Apr 27, 1:20 am, Sudha sudhaker...@gmail.com wrote:
Hi,
I have a requirement to play several sounds many times in my game so
instead of creating
I assume this is in the emulator. I believe the issue is that the
emulator does not forward the UDP packets you need for the RTP
session. This should work on a real device.
On Apr 27, 12:28 am, awwa awwa...@gmail.com wrote:
I'm trying to play streaming video(*.3gp) with android SDK 1.5 pre.
I
OK, so it sounds like audio is being produced by the kernel driver.
I just looked at your code, and I think you need to call read() once
to pass in your first input buffer.
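The read() advice as a sketch (needs the Android runtime and the RECORD_AUDIO permission, so it is illustrative only):

```java
// Sketch only: AudioRecord is a pull model - no data flows until you
// call read(), which is why the empty-buffer symptom above occurs.
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public class MicReader {
    public static void captureSome() {
        int rate = 8000;
        int minBuf = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                rate, AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf);
        short[] buffer = new short[minBuf / 2];
        rec.startRecording();
        for (int i = 0; i < 10; i++) {
            int n = rec.read(buffer, 0, buffer.length); // pulls PCM samples
            // ... process the n samples just read ...
        }
        rec.stop();
        rec.release();
    }
}
```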
On Apr 24, 6:04 pm, Steven_T gwb...@126.com wrote:
hi Dave Sparks:
thank you for reply!
I didn't disable audio input
I believe this is a known limitation of the emulator. There is a
feature request to allow for more sample rates, but no one is actively
working on it. The source code is available if someone wants to take
it on.
On Apr 25, 2:36 pm, szabolcs szabolcs.vr...@gmail.com wrote:
Dave, Yoni,
Thank
Use SoundPool.
On Apr 26, 9:34 am, BlackLight blacklight1...@gmail.com wrote:
I have another problem now. Let's say I have 10 buttons (0-9); when the user
presses a button, the program should play a short (0.3-0.5 sec) sound. I added
them as wav resources. Now I see that each MediaPlayer creates its own
Is this on the emulator? If so, it may be a limitation of the
emulator.
On Apr 24, 3:25 am, szabolcs szabolcs.vr...@gmail.com wrote:
Steven,
Thank you for your reply.
I know AudioRecord works with a sample frequency of 8000Hz. In my
initial post I was asking why this frequency is the ONLY
Did you enable audio input in the emulator?
On Apr 23, 6:48 pm, Steven_T gwb...@126.com wrote:
hi Dave Sparks:
I have changed 50 frames to 400 frames, it doesn't work.
then I set bufferSizeInBytes to 100 to init the AudioRecord object,
and set the update period to 400,
it doesn't work
I suspect the problem is the interval you chose: 50 frames @ 8KHz is
6.25 msecs. Your app is not going to be able to handle a callback
every 6.25 msecs. Try something more reasonable like 50 msecs (400
frames) and see if that works.
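The arithmetic behind that recommendation is easy to check: the callback interval in milliseconds is frames / sample rate x 1000. A tiny helper (the class and method names are mine):

```java
// Converts an AudioRecord notification period in frames to milliseconds.
// 50 frames at 8 kHz is 6.25 ms (far too often for a Java callback);
// 400 frames at 8 kHz is a more manageable 50 ms.
public class CallbackMath {
    public static double framesToMillis(int frames, int sampleRateHz) {
        return 1000.0 * frames / sampleRateHz;
    }

    public static void main(String[] args) {
        System.out.println(framesToMillis(50, 8000));  // 6.25
        System.out.println(framesToMillis(400, 8000)); // 50.0
    }
}
```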
On Apr 23, 1:56 am, Steven_T gwb...@126.com wrote:
hello
.
what is more, how about the support for Flash video (FLV)? Do you know
anything about that? And will it be supported for streaming?
tainy
On Apr 23, 7:59 am, Dave Sparks davidspa...@android.com wrote:
Progressive streaming using HTTP is well-supported.
RTSP support isn't great yet
This forum is for application developers to discuss supported features
in the SDK. If you want to discuss Android framework, then please take
this to android-framework.
On Apr 22, 7:40 pm, david 1 david...@gmail.com wrote:
OK, thanks. BTW, does OpenCore-2.01 or 2.02 support it?
david
This is not possible. Telephony audio is handled by the baseband
processor and not accessible to the application processor.
On Apr 22, 3:27 pm, Jens K. jens.k...@googlemail.com wrote:
Hi there,
we're a group of students and trying to implement some kind of voice
cryptography over a gsm
streaming? I found no place saying that is not
supported, but someone said video streaming is not available by now.
and if it will be supported, what format will be ok for streaming?
thanks!
tainy
On Apr 16, 3:40 am, Dave Sparks davidspa...@android.com wrote:
AAC inside an MP4 file is fine
That is correct. The encoders are only available through the OpenCORE
author engine, which does not currently support a streaming
interface.
On Apr 21, 1:53 pm, j jac...@gmail.com wrote:
I would like to use the new AudioRecord class to record in AMR-NB
format. But android.media.AudioFormat
There have been no changes in RTSP support for Cupcake other than a
couple of minor bug fixes.
On Apr 21, 2:46 am, caijing jcai2...@gmail.com wrote:
Hi,
I am doing development on RTSP streaming on Cupcake. I want to know more
clearly about it, so my question is: which version of 3GPP PSS has been
I think we're still working on sample code for the SDK.
On Apr 20, 11:05 am, j jac...@gmail.com wrote:
Thank you Dave. Is there any sample code utilizing AudioRecord that I
can refer to?
My goal is to write the audio to a RTP network stream.
On Apr 16, 1:21 pm, Dave Sparks davidspa
No, it only supports raw PCM.
On Apr 20, 2:36 pm, j jac...@gmail.com wrote:
I can see from documentation that AudioRecord supports
PCM 16 bit per sample
and
PCM 8 bit per sample
Does it support things like u-law and a-law like the Java Media
Framework?
Thanks!
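Since AudioRecord only yields linear PCM, a u-law conversion would have to be done by hand; a G.711 mu-law encoder is small enough to write directly. A sketch (the class and method names are mine; the constants are the standard G.711 bias and clip values):

```java
// Standard G.711 mu-law companding of a 16-bit linear PCM sample.
public class UlawCodec {
    private static final int BIAS = 0x84;   // 132, standard G.711 bias
    private static final int CLIP = 32635;

    public static byte linearToUlaw(int pcm) {
        int sign = (pcm < 0) ? 0x80 : 0x00;
        int magnitude = Math.min(Math.abs(pcm), CLIP) + BIAS;
        // Find the segment (exponent): position of the highest set bit.
        int exponent = 7;
        for (int mask = 0x4000; (magnitude & mask) == 0 && exponent > 0;
                exponent--, mask >>= 1) {
        }
        int mantissa = (magnitude >> (exponent + 3)) & 0x0F;
        // mu-law bytes are stored inverted.
        return (byte) ~(sign | (exponent << 4) | mantissa);
    }

    public static void main(String[] args) {
        System.out.println(linearToUlaw(0) & 0xFF);      // 255
        System.out.println(linearToUlaw(32767) & 0xFF);  // 128
        System.out.println(linearToUlaw(-32768) & 0xFF); // 0
    }
}
```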
We are working on getting this fixed in the Cupcake SDK.
On Apr 16, 1:27 pm, Jason Proctor ja...@redfish.net wrote:
right -- i got an error calling -
recorder.setVideoSource (MediaRecorder.VideoSource.CAMERA)
- which seemed to indicate that there isn't a shim for the video
recording the
No, this would be impractical to do in Java.
On Apr 15, 11:16 pm, Sheado chad...@gmail.com wrote:
Howdy,
Does anybody know of a way to directly access the Video Encoders
provided by the (android.media.MediaRecorder) API? I'd like to make
changes to the raw camera data before it gets
I'm not clear what you mean by full size.
VideoView displays as large as possible without distorting the aspect
ratio. For all releases through Cupcake, we assume the display is
HVGA. If video is to fill the screen, it must be 1.333 aspect ratio.
For CIF or QCIF format, the aspect ratio is
The native code was throwing IOException even in 1.0, but it wasn't a
checked exception in Java. We felt it was best to expose it as a
checked exception since it can occur under normal circumstances when
another application or service has control of the camera.
On Apr 14, 11:10 pm, Tom Gibara
channel?
I'm pretty sure that this will not work. There is a limit of one video/
camera SurfaceView active at a time.
On Apr 15, 3:52 am, iblues iblues@gmail.com wrote:
Hi,
I am creating an application where I have 2 Surface Views (1 each in
top and bottom half of the screen) along with a button.
This is a limitation in the baseband processor - this is a feature the
hardware vendor would have to support.
On Apr 15, 2:01 am, Selem necrod...@gmail.com wrote:
On Apr 6, 6:39 pm, Dave Sparks davidspa...@android.com wrote:
You can't. The G1 firmware does not support it.
Dave, do you mean
We don't have an API for progressive download yet. Technically, the
website should say progressive streaming, but from a file authoring
perspective there is no distinction between the two.
RTSP support is only available for MPEG-4 file formats. There is no
support for raw AMR, AAC, or MP3
MediaPlayer states are protected by mutexes in the native layer.
However, the playback complete is an asynchronous event that comes on
a binder worker thread. It's possible that you could call reset() in
the window after the native media player service has posted a playback
complete message to
No, this is not supported. If we ever do support it, it would require
a special permission that the user would have to grant, and it would
break for DRM content for obvious reasons.
On Apr 10, 1:44 am, devi prasad dpras...@gmail.com wrote:
I want to develop an app that lets one intercept raw
The Windows Media codecs are not part of OpenCORE. They require a
separate license which is typically negotiated between the hardware
manufacturer and the codec supplier.
On Apr 9, 4:49 am, l hx lihongxia8...@gmail.com wrote:
now, OpenCORE cannot play WMA audio. Is that right? And
You can't. The G1 firmware does not support it.
On Apr 6, 6:00 am, Dilli dilliraomca...@gmail.com wrote:
Hi, I am developing a recorder application.
It works fine,
but when a phone call comes it records only a one-way conversation;
the caller's voice is not recorded.
I found audio sources as
This is not the appropriate forum for this question. I suggest you try
in android-platform.
On Apr 2, 10:57 pm, Chen Yang sunsety...@gmail.com wrote:
Resent, it seems that the thread to android-discuss is lost.
-- Forwarded message --
From: Chen Yang sunsety...@gmail.com
Yes, ANRs occur because the task doesn't respond quickly enough to
user input. This can be caused by a background thread hogging the CPU.
I assume you are referring to the Pictures app (which will be called
Gallery in Cupcake). We have made some improvements in Cupcake and will
continue to work on
You can access it via the Messaging (SMS/MMS) application by clicking
on attach and selecting record audio. I believe there's also an app in
market that exposes a launcher icon for it.
On Apr 2, 3:36 am, david david...@gmail.com wrote:
hi everyone,
I can see 'Sound Recorder' under
AMR recording does work. Try the SoundRecorder application, which can
be accessed by using the attach function in the Messaging app.
On Apr 2, 1:01 am, ambrosehua huang.p...@zte.com.cn wrote:
It seems that Android doesn't support AMR recording currently, which
is actually different from audio
to out folder.
Where can I get the class file.
I need to import the android.media.MediaMetadataRetriever in my test
AP.
Thanks~~
-wei
On Mar 8, 2:29 am, Dave Sparks davidspa...@android.com wrote:
No, we're just adding support for extracting a static thumbnail.
What is your use case
It's pretty obvious that the path to your video file is located on
your Windows workstation. You have to remember the emulator is running
Linux in an emulation of an ARM processor. It doesn't have access to
the files on your workstation. You need to copy the video file to the
SD card emulation
to? A summary would also be appreciated. Many thanks in
advance.
I've raised this question
at https://mail.google.com/mail/?hl=enzx=1mf4sjv3oovloshva=1#starred/1
I'm sorry if it troubles you again. Please reply just here. Thanks again.
david
2009/4/3 Dave Sparks davidspa
Agreed. It takes about 100 msecs to spin up the audio output once it
goes to standby. You will see something like this in the log:
W/AudioFlinger( 35): write blocked for 103 msecs
If it's taking several seconds, there must be something else involved.
On Apr 1, 11:51 am, Marco Nelissen
Have you looked at the VideoView widget?
On Apr 1, 2:41 am, jaimin jaiminmeht...@gmail.com wrote:
hi,
i have a problem playing a video file. The video file plays nicely, but I
want to play the video file in the whole emulator (size). Right now the
video file is playing, but at a small size.
so any
Looks like you are repeatedly calling the MediaPlayer.start() method
when it is in an uninitialized state.
On Apr 1, 12:42 am, Ramesh uthir...@gmail.com wrote:
Hi,
I am trying to use the media player. It's working fine, but in the log I am
getting this kind of error
Can anybody tell me why this error
Cupcake is based on OpenCORE 1.0. OpenCORE 2.0 was integrated into the
master branch, not Cupcake. You can find updated docs in the external/
opencore project.
Please bear in mind that while the VT low-level stack is included in
the OpenCORE 2.0 release, no work has been done to integrate with
This is not currently possible.
On Mar 25, 11:11 am, denzel dimitri.deroc...@gmail.com wrote:
Hello,
I want to receive the data from the preview of the Camera, but without
having to set a SurfaceView with a SurfaceHolder in the view of my
application.
I want to do something like this:
The current G1 software uses the OpenCORE software codecs except for H.
264 where the hardware codec is used.
On Mar 25, 3:38 am, wangxianjian8311 wangxianjian8...@163.com wrote:
hi all!
I want to know whether the G1 uses the PV OMX core in OpenCORE, if I do
not have any hardware codec.
No, there is no built-in package to convert from AMR to MP3 or WAVE.
If you want to do this in the current SDK, you'll need to write your
own code to do the conversion.
On Mar 18, 2:44 am, zeeshan genx...@gmail.com wrote:
Hi Android Experts,
I want to convert recorded AMR sound into mp3 or
Just in case there is any confusion, your APK is not going to contain
the MP3 files that were loaded on the virtual SD card in the emulator.
If you want to play MP3's on the device's SD card, you need to install
them there, either by copying them from your workstation, downloading
them from the
We are planning to move to a model where most of the apps are built
against the SDK. Unfortunately, we're not quite there yet.
I expect to release a better camera sample application in the next
SDK. This one will be buildable against the SDK.
On Mar 16, 8:59 am, Hans hkess...@gmail.com wrote:
We have a number of issues with metadata that need to be sorted out.
Unfortunately, it will have to wait until the next major release.
Unless I'm mistaken, micro-thumbnails are generated by the music
player for album art. This is just an optimization to improve list
flinging operations in the
There are no public classes for decoding AMR to raw PCM. 3GPP is just
a thin wrapper around the raw AMR stream.
On Mar 13, 4:01 pm, benmccann benjamin.j.mcc...@gmail.com wrote:
Any ideas on the easiest way to get the raw data from a file recorded
by the MediaRecorder class? I am going to have
You should be asking this question in android-framework - this list is
for application developers.
However, the short answer is that we are removing this API in the next
major release. In fact, the entire audio architecture is going to get
a new treatment.
We are planning on a replacement for
I don't know of anyone who has tested a video or camera preview
surface on a ScrollView. I suspect that it won't work on a G1. There
is a known issue using either of these on a TabView.
I derived my own CameraView widget from SurfaceView and placed it in
an XML layout. Works just fine.
On Mar
I've never tried an MP3 that big - it's quite possible that it won't
work. I think the largest file we ever tested was a one hour MP3 mix
file. I think it was about 70 MB.
On Mar 11, 6:25 pm, g1bb corymgibb...@gmail.com wrote:
For the record, here is the code:
MediaPlayer player = new
I can't comment on T-Mobile's obligations to you - that is determined
by the terms of the agreement you signed when you signed up. The
Android team does have liaisons who work with the carriers and device
manufacturers and we receive bug reports from them.
On Mar 11, 2:56 pm, Joseph G
hiraya...@gmail.com wrote:
Hi David
will you officially support the camera API in portrait mode?
On Wed, Feb 18, 2009 at 1:08 AM, Dave Sparks davidspa...@android.com wrote:
This code shouldn't even compile, this line is wrong:
mCamera.takePicture(null, mPictureCallback); // requires 3
now volume
adjustment via the volume buttons is unworkable.
Thanks.
On Feb 2, 11:35 pm, Dave Sparks davidspa...@android.com wrote:
The volume adjustment is context sensitive. You can tell which volume
is being adjusted by the volume display.
If YouTube or the music
If your app has the permission MODIFY_AUDIO_SETTINGS you can route
output to the earpiece using the setRouting API in AudioManager.
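A sketch of that routing call (needs the Android runtime, the MODIFY_AUDIO_SETTINGS permission, and the 1.x-era setRouting() API, which was deprecated in later releases):

```java
// Sketch only: forces output to the earpiece for earpiece testing,
// using the early AudioManager routing API mentioned above.
import android.content.Context;
import android.media.AudioManager;

public class EarpieceRouter {
    public static void routeToEarpiece(Context context) {
        AudioManager am =
                (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        // mode, routes to enable, mask of routes being changed
        am.setRouting(AudioManager.MODE_NORMAL,
                AudioManager.ROUTE_EARPIECE, AudioManager.ROUTE_ALL);
    }
}
```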
On Mar 9, 8:15 am, cht caoht...@gmail.com wrote:
for the OEM, on the product line we need to test whether the earpiece
works well,
so I want to write a
Is this on a G1 or on the emulator?
On Mar 8, 10:18 pm, manoj manojkumar.m...@gmail.com wrote:
Hi friends,
I have developed a video playing application.
I have a problem playing a video file.
Actually it plays well (both audio and video are coming).
But nothing is visible in the
The H.264 codec in the G1 is baseline profile Level 1.3. Maximum frame
size is 480x320 (happily the same dimensions as the screen).
Recommended maximum bit rate is 384Kbps. Maximum frame rate is 30 fps.
It does not support B-frames (consistent with baseline profile). If
you stay within these
a frame every few seconds from
outbound video stream and show a little image on screen?
On Feb 11, 11:14 pm, Dave Sparks davidspa...@android.com wrote:
There is no support for thumbnail extraction in SDK 1.0. It's coming
in Cupcake as an adjunct to video record.
On Feb 11, 7:30 am, Freepine
I believe screen capture is disabled in production devices for
security reasons.
On Mar 5, 7:49 pm, volk...@aol.com volk...@aol.com wrote:
I'm having trouble getting a screenshot. I installed the SDk,
Eclipse, the drivers, enable usb debugging, like the instructions say.
I open DDMS and it
Filing a bug is the way to get things fixed. Most Google Android
engineers do not read this list. They do respond to bug reports
though.
On Mar 7, 3:01 pm, Stoyan Damov stoyan.da...@gmail.com wrote:
On Sat, Mar 7, 2009 at 8:58 PM, strazzere str...@gmail.com wrote:
Your interaction with the
If you are trying to play this on a G1, the frame size is too large.
The H.264 codec is baseline profile up to HVGA (480x320).
It's pointless to encode at VGA when the screen is HVGA - you're
wasting half your bandwidth for pixels you will never see.
On Mar 6, 1:42 am, manoj
);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile("test.3gpp");
recorder.prepare();
recorder.start();
On Feb 26, 12:14 am, Dave Sparks davidspa...@android.com wrote:
You can get the path to external storage (e.g. SD card
You need to tell the MediaPlayer where to display the video with
setDisplaySurface(). Check out the media demo apps on
developer.android.com.
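In the public SDK the surface is attached with MediaPlayer.setDisplay(SurfaceHolder); a sketch (needs the Android runtime and a live SurfaceHolder, so it is illustrative only):

```java
// Sketch only: a MediaPlayer plays audio fine without a surface,
// but video needs somewhere to draw.
import android.media.MediaPlayer;
import android.view.SurfaceHolder;

public class VideoPlayback {
    public static MediaPlayer play(String path, SurfaceHolder holder)
            throws Exception {
        MediaPlayer mp = new MediaPlayer();
        mp.setDataSource(path);
        mp.setDisplay(holder);  // tell the player where to render video
        mp.prepare();
        mp.start();
        return mp;
    }
}
```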
On Mar 4, 11:45 pm, Nithin nithin.war...@gmail.com wrote:
hi,
I tried a simple mediaplayer application, just to run a .3gp file.
First, i put the .3gp
The only audio format supported on G1 is AMR format (raw .AMR file).
On Mar 4, 1:18 pm, zeeshan genx...@gmail.com wrote:
Hi,
can anyone tell me what is the default format of android recording.
I have recorded an audio clip but don't know how I can check its
extension.
I am using this
The image capture intents for the 1.0 and 1.1 releases only allow for
small images intended for email or MMS attach. The next SDK release
for Cupcake will add support for setting the image size.
On Mar 2, 9:00 pm, Ondra Zahradnik ondra.zahrad...@gmail.com wrote:
Hello I am trying to take