you create two cfg files to register your module?
Normally, the cfg file should be pushed to /system/etc, but /etc should be a
symbolic link to the same directory, if I remember correctly.
BTW, as JBQ pointed out, perhaps we need to move to the android-platform list to
continue the discussion.
-Freepine
2009
I think you can write a customized MIO which accepts AMR streams and
interacts with DSP directly.
http://android.git.kernel.org/?p=platform/external/opencore.git;a=blob;f=doc/mio_developers_guide.pdf;h=9a8314fa3d13fa79f591e917fc48bebf22fd47d5;hb=419084f1ef3f188cdf3ea3e5b13ba8664f32ff59
On Thu,
You might want to take a look at SoundPool class and its corresponding
implementation in native layer, which could be a good reference if the class
itself doesn't satisfy your needs:)
http://developer.android.com/reference/android/media/SoundPool.html
On Thu, Jul 16, 2009 at 3:16 PM, sandy8531
in pv_player_node_registry.cpp and
oscl_shared_library.cpp to see what's happening.
-Freepine
2009/7/14 shadow yuri.bul...@gmail.com
Hi Freepine,
I've tried both variants for tracing, including log file dumping.
--
Best regards,
Yuri
On Jul 13, 21:04, Freepine freep...@gmail.com wrote:
Hi, if you mean you didn't see the output of
printf("PVGetInterface()\n"); in logcat, perhaps you can try to
replace it with LOGE:)
-Freepine
On Mon, Jul 13, 2009 at 6:19 PM, shadow yuri.bul...@gmail.com wrote:
Yes, I've tried that; my library was loaded, but no method has been
called from
So you didn't realize that pv2way is part of Opencore 2.0? ;)
PV stands for Packet Video, the original contributor of Opencore. It
includes pv2way, pvauthor and pvplayer.
On Wed, Jun 24, 2009 at 12:10 PM, Chris gamza...@gmail.com wrote:
As far as I know, 'pv2way' is the video telephony engine based on
Not clear how you defined g_surface, but yes, the surface type should
be LayerBuffer for video playback, which can be specified by the flag
ePushBuffers
at creation time.
On Wed, Jun 17, 2009 at 5:08 PM, dglushko dglus...@gmail.com wrote:
Hi,
I have the following native source code that
playback.
On Thu, Jun 11, 2009 at 5:42 PM, Freepine freep...@gmail.com wrote:
If you don't want to modify the mp4 recognizer plugin to ignore avi files,
perhaps you can try to register the avi recognizer plugin before the mp4
recognizer (e.g. put avi on top of the mp4 items in the pvplayer.cfg file).
On Wed
How did you add the logs into the mp4 recognizer? By printf? Or using an android
macro (e.g. LOGE) directly? If you were using the PV logging mechanism, did you
get a logger object first?
e.g.
iLogger = PVLogger::GetLoggerObject("PVPlayerEngine");
It would be helpful if you pasted your code snippet.
On Tue, Jun
It seems currently there aren't any logs inside recognizer/plugins :) You can
write 8 into pvlogger.txt, then it will capture all logs.
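For reference, the pvlogger.txt contents discussed in this thread would look like one of the two lines below (the syntax is inferred from the posts here; pvlogger_users_guide.pdf is the authoritative source):

```text
8
```

captures all logs at level 8, while a line like `8,PVPlayerEngine` restricts tracing to the player engine logger.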
On Mon, Jun 8, 2009 at 7:14 PM, Devaraj devara...@gmail.com wrote:
As we enter 8,PVPlayerEngine in pvlogger.txt to get the traces
from PVPlayerEngine what
#define LOG_NDEBUG 0
is required to enable LOGV.
On Tue, Jun 9, 2009 at 8:38 PM, Dev devara...@gmail.com wrote:
Hi,
I have added the logs using android macros LOGV(); directly.
Thanks and Regards,
-Devaraj
On Tue, Jun 9, 2009 at 5:56 PM, Freepine freep...@gmail.com wrote:
How did you
If the FD stays unchanged during the playback session, perhaps you can also pass
it to the MIO via the PvmiCapabilityAndConfig interface. You can check how the
video display info gets passed to AndroidSurfaceOutput.
On Tue, Jun 9, 2009 at 5:00 PM, manish android.mm@gmail.com wrote:
Hi
I saw discussion
Basically, pv_omxmastercore loads all omx cores registered in .cfg files
when OMX_Init() is invoked. Then the decoder node can locate the corresponding
omx component for specific media content via OMX_GetComponentsOfRole and
OMX_GetHandle.
There are a couple of documents about interaction with
Hi, then you probably want to read below files, too:)
pvmf_omx_basedec_node.cpp
http://android.git.kernel.org/?p=platform/external/opencore.git;a=blob;f=nodes/pvomxbasedecnode/src/pvmf_omx_basedec_node.cpp;h=00951a98d7eb0dd4b84b19c66517f372ce5a96a5;hb=642e1d2b4da40c6dcb79b52bc68222ee018e77b2
PV code is part of AOSP, under external/opencore after syncing from the
android repository. And you can also view it from
http://android.git.kernel.org/?p=platform/external/opencore.git;a=summary
There are several documents
http://groups.google.com/group/android-porting/browse_thread/thread/19502fce6473068b#
I guess it's the most popular question about opencore:) So it would be nice
if it could be recorded in pvlogger_users_guide.pdf
Its reference count gets increased by the copy constructor or operator= when
returned from createInstance().
On Tue, May 19, 2009 at 4:15 PM, xie yili@gmail.com wrote:
hi Dianne~~
thanks for your answer~~ I have implemented the camera HAL with v4l2,
although I used the below source, I still
http://groups.google.com/group/android-porting/browse_thread/thread/19502fce6473068b#
On Tue, May 19, 2009 at 5:57 PM, Devaraj devara...@gmail.com wrote:
Hi All,
How to get the traces in opencore components like fileparser, nodes,
recognizer, codecs for the video playback use case?
As
and composer nodes..
Please help me in using both the nodes in same application. I need
info on how to configure the nodes
On May 7, 6:04 pm, Freepine freep...@gmail.com wrote:
Hi, I don't know what you are trying to accomplish with the nodes:)
Perhaps
you can give more details.
On Wed
You can remove the statement that adds the audio sink in playerdriver.cpp and
try again. (line 642,
I guess you can write a customized source node for it.
On Wed, May 6, 2009 at 2:03 AM, goran goranr...@gmail.com wrote:
Hi,
I would like to use the PVPlayer to decode and render audio and video
data. However the data to be decoded is not contained in a 3gp or mp4
file nor is it part of an
Hi, I don't know what you are trying to accomplish with the nodes:) Perhaps
you can give more details.
On Wed, May 6, 2009 at 2:33 PM, jai jaisreeshanmu...@gmail.com wrote:
Thanks for the information Freepine.
Actually I am interested in using the parser and composer nodes in my
application
of using GDB ? Any one here tried this successfully
??
Thanks in advance !
Tejas
On Mon, Apr 27, 2009 at 7:29 PM, Freepine freep...@gmail.com wrote:
make ENABLE_PV_LOGGING=1
It will build the android platform with opencore logging enabled.
On Mon, Apr 27, 2009 at 1:45 PM, Dev devara...@gmail.com
To support a new file format in opencore, basically you'll need:
1) A parser to retrieve track data and related track info from underlying files;
2) A node to connect the parser into the player engine framework;
3) A recognizer to identify the corresponding file format;
4) Then register the node and
Is your lib built with android toolchain?
2009/4/16 DkRd pieter.schelfh...@gmail.com
libAlEngine.so is the shared library I wish to use in my player
implementation AlPlayer.cpp. The library is pushed to the out folder
and it is there after the first block in the make file (I checked the
to use that library to make it
work.
/sdcard is mounted as noexec, so you can't execute the test app from there.
You can mv it to /data and try again.
BTW: I think pvplayer_engine_test is built into system image by default?
/system/bin/pvplayer_engine_test
On Tue, Apr 7, 2009 at 3:04 PM, rk raj.10...@gmail.com wrote:
But that is
A possible way is to use SurfaceComposerClient to create a surface and set
its z-order big enough if you just want to play with it.
1. int pid = getpid();
2. sp<SurfaceComposerClient> videoClient = new SurfaceComposerClient;
3. sp<Surface>
startThreadPool() will spawn a new thread into the thread pool which talks
with the binder driver, while joinThreadPool() will put the calling thread
itself into the pool.
It seems there is no API to control the maximum number of binder threads in
the pool, and sometimes the driver will tell the
Yes, you can run pvplayer_engine_test from the console directly. It's a native
test application and is built into system.img by default currently. You can
read the below document for more information:
Maybe you need to check the include path or something else. I was able to link
libmedia in a native app without any tricks:)
On Fri, Feb 27, 2009 at 7:47 PM, Nikhil nikhi...@gmail.com wrote:
Sorry, just to correct myself, it's not a linking error but a
compilation error.
Native MediaPlayer member
And they are all derived from well-defined interfaces, so replacing them
won't impact 3rd-party applications:)
On Sun, Feb 22, 2009 at 11:53 AM, Freepine freep...@gmail.com wrote:
I guess the entire media framework includes not only opencore, but also
media api, audio flinger and perhaps part
The video MIO node is created in external/opencore/android/playerdriver.cpp.
I think you can replace AndroidSurfaceOutput with your MIO node there.
Did you leverage the surface defined in the framework for video display in your
MIO node? If not, it might not work well with the rest of the system:)
On Fri, Feb
For playback, there are ogg, midi and PV players. For recording, currently
only the PV recorder is available.
You can refer to MediaPlayerService's implementation for details:
frameworks/base/media/libmediaplayerservice
And can you provide more information about the conflicts from HAL with
Please refer to the code in external/opencore/android/android_audio_output.cpp
http://android.git.kernel.org/?p=platform/external/opencore.git;a=blob;f=android/android_audio_output.cpp
And you
Ah? Then how to remove a thread? I know it's off-topic, just curious:)
On Fri, Feb 13, 2009 at 9:58 PM, rktb yend...@pv.com wrote:
Looks like the OP removed the thread. !!
On Feb 13, 7:46 am, Freepine freep...@gmail.com wrote:
Please refer to the code in external/opencore/
android
it jumpInPool(). :)
On Feb 11, 10:52 pm, Freepine freep...@gmail.com wrote:
Yep. I think that explains why sometimes my test app aborted during
exit... Thanks:) And I guess you were saying
IPCThreadState::Self()->stopProcess(), right? I thought joinThreadPool()
would put the calling thread
wrote:
Thanks to Freepine and Dave,
After modifying MediaScanner.java and MediaFile.java,
the music player can scan the .aac file and show it in the player list.
But when I choose the .aac file to play, it still can't work.
From your suggestion, I need to do some work to extract metadata
processes alive? (just like media
server:)
On Thu, Feb 12, 2009 at 2:22 PM, Dave Sparks davidspa...@android.com wrote:
Minor technicality, you should call proc->joinThreadPool() on exit to
ensure that all the binder worker threads have terminated.
On Feb 11, 9:18 pm, Freepine freep...@gmail.com
If aac files didn't appear in SONGS of the music library, you will need to
change the media scanner to not filter out .aac files.
BTW, did you register aac recognizer plugin in recognizer registry?
shortage. I did a
simple test to kill the test app while playback was ongoing and found that
the music stopped immediately. So it seems the binder driver did a good job of
decreasing the reference count of the player object when the client app was killed.
-Freepine
On Mon, Feb 9, 2009 at 4:57 AM, hdandroid hdandr
)
Even if the component is moved to the OMX IDLE state, the user interface can
show that the player is PAUSED. There is no change in UI behavior with
this approach.
On Feb 8, 6:33 pm, Freepine freep...@gmail.com wrote:
I think Ravi has made a good point and as far as I know staying in PAUSED
state after
There are several documents available in git; not sure if there are detailed
instructions about extending Opencore with new nodes:)
http://android.git.kernel.org/?p=platform/external/opencore.git;a=tree;f=doc;h=dc6682b55061c156c1641d564e72cfb00a8ecb82;hb=d8b443ddaa386ed85ba31fbd663c40423a8d4ded
You can look into the source code of PVMFOMXAudioDecNode. Or maybe
PVMFOMXBaseDecNode in the latest opencore 2.0 release.
It's called when you invoke System.loadLibrary(xxx) in the java layer. The
Dalvik VM interprets it and finally invokes the C++ function nativeLoad(),
which actually loads the native library and calls JNI_OnLoad.
On Wed, Jan 21, 2009 at 4:08 PM, srini amul srinia...@yahoo.com wrote:
when JNI_OnLoad()
You can take a look at MediaPlayerService.cpp in
frameworks/base/media/libmediaplayerservice.
It seems the android guys are doing some refactoring:) And you can find in the
latest master branch, they even changed AndroidSurfaceOutput from constructor
injection to setter injection. But it seems AndroidSurfaceOutput isn't an
interface? I am really curious what the below code is prepared for :)
@@
, freepine freep...@gmail.com wrote:
You mean your case is similar to the URLs of HTTP download or RTSP
playback in Opencore, right?
If so, I think you might need to implement a customized source node to
retrieve encoded data from the modem and integrate it into the opencore framework.
You
can
of PVMFDownloadManagerNode in
external/opencore/nodes/pvdownloadmanagernode.
Then you can configure the node registry to create your customized node when
it recognizes some special URL like modem://xxx...
Just my 2 cents,
-freepine
On Thu, Jan 8, 2009 at 12:43 PM, Girish htgir...@gmail.com wrote:
Hi Dave
I think it depends:).
Are you trying to implement something like progressive download in opencore
player?
If so, I think you might not want to grant the access permission of
downloaded files only to the default music player?
Will it be played as ringtone or inserted to MMS? :)
-freepine
On Thu
and also depends
on the concrete requirements.
-freepine
On Thu, Dec 25, 2008 at 2:41 PM, rktb yend...@pv.com wrote:
Hi freepine,
I would like to read more about the permissions and access before
posting any educated question. Can you point me to any code or doc? If
not, read further
You need to implement an AAC file recognizer and register it in recognizer
registry to enable AAC playback officially.
On Wed, Dec 17, 2008 at 5:08 PM, Wei twc0...@gmail.com wrote:
hi Ravi,
Thanks for your response.
I modified PVMF_FORMAT_UNKNOWN to PVMF_AACFF and it works fine for
aac