You should be able to use the pvplayer_engine_test and run test 851.
adb shell "cd /sdcard; pvplayer_engine_test -test 851 851 -source rtsp://my.web.server.file.3gp"
-Ravi
On Mar 30, 8:12 am, MMF android...@gmail.com wrote:
Hi Friends,
Can anyone please let me know what is the procedure to
You are mixing two different things. You are creating a low-level core
system service, and then trying to connect to it as if it is a high-level
application service. If you are using service manager to publish it, you
need to use service manager (in Java ServiceManager) to access it.
On Mon,
There is technically no limit to the number of nodes that can be
managed. So far, we have not envisioned the player engine directly
marshalling more than 3 nodes. However, take the case of
streaming, in which we use the streaming manager node. This node in
itself manages 4 other nodes.
What is the advantage and special feature of Binder in Android?
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups
android-framework group.
To post to this group, send email to android-framework@googlegroups.com
To
-- Forwarded message --
From: vishal bhoj vishalb...@gmail.com
Date: Tue, Mar 31, 2009 at 11:39 AM
Subject: web widgets
To: android-platf...@googlegroups.com, android-beginn...@googlegroups.com
Hello all,
Are there any web widgets for Android? If not, does Google have plans to
There is a lot of post-processing, especially in audio, such as
equalization and multichannel down-mixing. Typically there should be an
interface for the user to select which post-processing to apply and the
parameters for it. If we put them in the MIO, it will be hard to
dynamically control whether to load a
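As a rough illustration of the kind of user-facing selection interface described above (all names here are invented for the sketch, not taken from OpenCORE), a post-processing stage with settable parameters might look like:

```java
// Hypothetical sketch of a user-selectable audio post-processing stage.
public class PostProcDemo {
    interface AudioPostProcessor {
        // Set a named parameter, e.g. a gain value or an EQ band level.
        void setParameter(String key, float value);
        // Process 16-bit PCM samples in place.
        void process(short[] pcm);
    }

    // A trivial gain stage standing in for an equalizer/down-mixer.
    static class Gain implements AudioPostProcessor {
        private float gain = 1.0f;

        public void setParameter(String key, float value) {
            if ("gain".equals(key)) gain = value;
        }

        public void process(short[] pcm) {
            for (int i = 0; i < pcm.length; i++) {
                // Scale and clamp to the 16-bit range.
                int v = Math.round(pcm[i] * gain);
                pcm[i] = (short) Math.max(Short.MIN_VALUE,
                         Math.min(Short.MAX_VALUE, v));
            }
        }
    }

    public static void main(String[] args) {
        AudioPostProcessor pp = new Gain();
        pp.setParameter("gain", 0.5f);
        short[] samples = {1000, -2000};
        pp.process(samples);
        System.out.println(samples[0] + " " + samples[1]); // 500 -1000
    }
}
```

Keeping selection and parameter control behind such an interface, outside the MIO, would let the chain be reconfigured dynamically.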
Here is a link for you : http://www.open-binder.org/
On Mar 31, 4:11 pm, bai.luo bai@zte.com.cn wrote:
What is the advantage and special feature of Binder in Android?
Hi Ravi,
I set up an Ubuntu server on my PC. I am running test case number 851 just
like you mentioned.
Using Wireshark, I observed that all the RTP packets are being received, but
they are not reaching the emulator.
The buffered percentage is always 0.
Can you tell me how you could
Hi rktb,
Thanks for your response.
Now I am trying to record .amr by using test_pvauthorengine with '-
audio amrtestinput.amr -output /sdcard/1.amr -test 11 11', but it always
fails with 'Audio Config File not available!!! Specify -audioconfigfile
filename'. What does the config file look
Hmm... this is unfamiliar territory, and I can't find it either :). I
will make an action item on my end to get this documented.
For now, you can look at the API LoadAudioConfiguration() that
attempts to read the config file. From there, it looks like the
file requires 3 lines of data:
1st
Hi,
Is there an issue with the sdcard not being recognized on master
latest? From the log, I also see the media scanner service scans only
the internal directory; I don't see the external directory.
yend...@yendurid630:~/oha_android$ grep MediaScanner /tmp/loge.txt
03-31 03:12:09.977 579 579
Google has acknowledged that there is a known issue with emulator and
RTSP streaming. There have been various attempts by folks to get this
working, but only a few have been successful.
For instance, I can do RTSP streaming from my emulator when I am on a
particular network. But, with the same
Yes, I think it's broken, both on cupcake and master. My understanding is
that vold (the volume-management daemon, which replaced mountd) is
looking for a file that is not installed by the default configuration.
I've filed http://b.android.com/2335
JBQ
On Tue, Mar 31, 2009 at 3:38 AM, rktb
Thanks, that's very helpful!
I've heard mention of a feature called DPI-independence (or some
such)... is that something coming up in Donut, or later? What would
that give us? I'm looking to figure out what's needed long-term to
make apps run across as wide a range of devices as possible.
Hi,
Part of the resolution independence support is already in Cupcake.
This gives you a proper scaling of the dpi unit to pixels (1 dip at
160 density == 1 pixel, 1 dip at 240 density == 1.5 pixels), as well
as pre-scaling/auto-scaling of drawables and density-specific resources
(res/layout-120dpi/,
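The scaling rule quoted above (1 dip at 160 density == 1 pixel, 1 dip at 240 density == 1.5 pixels) boils down to simple arithmetic; this standalone sketch (class and method names are my own, not framework APIs) shows the assumed formula px = dip * density / 160:

```java
// Sketch of the dip-to-pixel scaling rule described above.
public class DipToPx {
    // Convert density-independent pixels to physical pixels
    // at the given screen density (dpi). 160 dpi is the baseline.
    static float dipToPx(float dip, int densityDpi) {
        return dip * densityDpi / 160f;
    }

    public static void main(String[] args) {
        System.out.println(dipToPx(1f, 160)); // 1.0
        System.out.println(dipToPx(1f, 240)); // 1.5
    }
}
```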
Thanks JBQ. Got around it.
-Ravi
On Mar 31, 4:55 am, Jean-Baptiste Queru j...@android.com wrote:
Yes, I think it's broken, both on cupcake and master. My understanding is
that vold (the volume-management daemon, which replaced mountd) is
looking for a file that is not installed by the default
I have fixed the master build to include OpenCORE now. The basic
functionality is up and running. There is still a known issue with the
sdcard not being detected as mentioned by JBQ above.
The changes are in
https://review.source.android.com/Gerrit#change,9446
But, the ServiceManager is not an exported Java API at all.
Thanks
Yi
On Tue, Mar 31, 2009 at 12:48 AM, Dianne Hackborn hack...@android.com wrote:
You are mixing two different things. You are creating a low-level core
system service, and then trying to connect to it as if it is a high-level
It is not in the SDK, and as a rule applications should not be directly
accessing system services. You'll note that there are tons of system
services in the standard android platform, and they all have appropriate SDK
APIs for calling them (and the Context.getSystemService() API to allow apps
to
So it seems that I only have the following choices:
1. Write my own JNI to access my service.
2. Hack the ApplicationContext to add my service into getSystemService.
I also need to build my own service class into the android.app package. In my
client, I will call getSystemService to collect my service
Could you please explain more what you are trying to do? If you are trying
to add a service with a public API for applications to use, one approach you
can take is to make a shared library that apps request with uses-library
which has APIs for retrieving and calling the service. That shared
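A minimal sketch of the uses-library route mentioned above, with an entirely made-up library name (the library itself would also need to be registered on the device before apps can request it):

```xml
<!-- Client app's AndroidManifest.xml. "com.example.myservicelib" is a
     hypothetical shared library name, not a real one. -->
<application android:label="MyClient">
    <uses-library android:name="com.example.myservicelib" />
</application>
```

The shared library would then expose ordinary Java APIs that internally retrieve and call the service over Binder.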
I'm trying to relay some information (events) between Android application
and a native application. The idea is to have a background native process
running as a service that can be accessed by Android applications through
binder interface. The native process is written in C and it will be a lot
Okay so just do what I suggested, add your own shared library for accessing
it. You are going to need to do that anyway, since you will at least need
to have the binder interface somewhere for someone to link to, or if you
weren't going to use the binder surely you would have something besides a
Dianne,
BTW, should we make the binder.c in the service_manager directory into a
library, so that people who write native services in C can reuse the code?
What do you think?
Yi
On Tue, Mar 31, 2009 at 4:41 PM, Yi Sun beyo...@gmail.com wrote:
Dianne,
Thank you for the hint. I will try this
The changes are in, and master now builds with OpenCORE.
Thanks Ravi (and team)
JBQ
On Tue, Mar 31, 2009 at 12:22 PM, rktb yend...@pv.com wrote:
I have fixed the master build to include OpenCORE now. The basic
functionality is up and running. There is still a known issue with the
sdcard not
Sorry I have never actually looked at that code; I have only dealt with the
C++ code.
On Tue, Mar 31, 2009 at 5:11 PM, Yi Sun beyo...@gmail.com wrote:
Dianne,
BTW--- should we make the binder.c in the service_manager directory into a
library? So that people who write native services in C can
Many thanks for your quick reply; I'm anxiously awaiting further
info.
david
2009/3/31 rktb yend...@pv.com
Hmm... this is unfamiliar territory, and I can't find it either :). I
will make an action item on my end to get this documented.
For now, you can look at the API
2009/3/31 waterblood guoyin.c...@gmail.com
1. As a multi-window system, each window will hold two buffers
(front buffer, back buffer in the surface). Are all the buffer sizes
determined by the window size or the display panel size? If its size
is the same as the window's size, will the surface
Hello folks,
When we try to implement orientation in the AP layer,
we can set the android:screenOrientation parameter in the
AndroidManifest.xml file or call the setRequestedOrientation() method of
the Activity class.
Can any expert tell me: when I set the
android:screenOrientation parameter, which process will
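For reference, the manifest route described above looks like this (the activity name is hypothetical):

```xml
<!-- In AndroidManifest.xml; ".VideoActivity" is a made-up example. -->
<activity android:name=".VideoActivity"
          android:screenOrientation="landscape" />
```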
On Apr 1, 10:10 am, Dianne Hackborn hack...@android.com wrote:
2009/3/31 waterblood guoyin.c...@gmail.com
1. As a multi-window system, each window will hold two buffers
(front buffer, back buffer in the surface). Are all the buffer sizes
determined by the window size or the display panel
What are you trying to achieve?
JBQ
On Tue, Mar 31, 2009 at 7:23 PM, UJ ujhu...@gmail.com wrote:
Hello folks,
When we try to implement orientation in the AP layer,
we can set the android:screenOrientation parameter in the
AndroidManifest.xml file or call the setRequestedOrientation() method of
the Activity
int foo()
{
    try
    {
        // do something.
    }
    catch (...)
    {
    }
    ...
}
It seems the above code can't pass the build in the Android environment. Is
there any way to use it in Android?
Thanks.
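One likely cause, offered as an assumption rather than a confirmed diagnosis: the Android platform build of that era compiled C++ with exceptions disabled by default, so a module that uses try/catch would have to opt back in through its makefile, along these lines:

```make
# Hypothetical Android.mk fragment: re-enable C++ exceptions for this
# module only (the platform default is -fno-exceptions).
LOCAL_CPPFLAGS += -fexceptions
```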
OK... it is more involved than I originally thought. Below is the
format:
/*
* Some of the compressed inputs need a log file to go along with
the bitstream. This logfile is provided
* using the -audiologfile or -videologfile commandline
option. In case of audio and video
2009/3/31 waterblood guoyin.c...@gmail.com
If so, the Layer must have information about its position on the display.
But I only see it being created as in the code below:
Layer* layer = new Layer(this, display, client, id);
status_t err = layer->setBuffers(client, w, h, format, flags);
so the layer
The preview and recording interfaces of Camera have been separated.
Camera interface:
frameworks/base/include/ui/Camera.
MediaRecorder interface:
frameworks/base/include/media/mediarecorder.h
MediaRecorder only has one stop() interface, which means stopping both
recording and preview.
If we want to use only stop
Thanks for your answer, first of all.
I originally thought the data in 'amrtestinput.amr' should be raw
data, just as PCM data. But if it was compressed as .amr already, it doesn't
matter either, I think. The key point is that the recording procedure works
well; PV does not (and should not) care about
Hi Ravi,
Thanks for the info.
You had mentioned that in some networks it didn't work. But when I
start the emulator it uses port 5554. When I did a netstat, it
showed that the protocol was TCP. Since RTP packets are streamed using
UDP, does it mean that this port doesn't allow UDP packets