Oh? Will it provide a two-way engine too?
On Jan 9, 12:04 pm, rktb <[email protected]> wrote:
> Very soon :-)
>
> On Jan 9, 8:21 am, freepine <[email protected]> wrote:
>
> > Thanks, Dave. That's great.
> > Is there a planned timeline for it?
>
> > > On Fri, Jan 9, 2009 at 11:16 AM, Dave Sparks <[email protected]> wrote:
>
> > > > This sounds a little like an H.324M stack. If so, you should know that
> > > > Packet Video is planning to supply an H.324M stack for OpenCore.
>
> > > On Jan 7, 11:05 pm, freepine <[email protected]> wrote:
> > > > You mean your case is similar to the URLs used for HTTP download or RTSP
> > > > playback in OpenCore, right?
> > > > If so, I think you might need to implement a customized source node to
> > > > retrieve the encoded data from the modem and integrate it into the
> > > > OpenCore framework. You can refer to the implementation of
> > > > PVMFDownloadManagerNode in external/opencore/nodes/pvdownloadmanagernode.
>
> > > > Then you can configure the node registry to create your customized node
> > > > when it recognizes a special URL like "modem://xxx"...
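> > > > To make the idea concrete, here is a rough sketch of scheme-based node
> > > > dispatch. Everything below is illustrative only, not the actual OpenCore
> > > > registry API; in OpenCore the registration goes through the player
> > > > engine's node registry rather than a hand-rolled map like this:
> > > > =======================================
> > > > #include <cstdio>
> > > > #include <map>
> > > > #include <string>
> > > >
> > > > // Illustrative stand-in for a PVMF-style data source node.
> > > > struct SourceNode {
> > > >     virtual ~SourceNode() {}
> > > >     virtual void Start(const std::string& url) = 0;
> > > > };
> > > >
> > > > // Custom node that would pull encoded frames out of the modem driver.
> > > > struct ModemSourceNode : public SourceNode {
> > > >     virtual void Start(const std::string& url) {
> > > >         // Here you would open the modem channel and begin feeding
> > > >         // encoded frames to the downstream decoder node.
> > > >         std::printf("modem source started for %s\n", url.c_str());
> > > >     }
> > > > };
> > > >
> > > > typedef SourceNode* (*NodeFactory)();
> > > > static SourceNode* CreateModemNode() { return new ModemSourceNode(); }
> > > >
> > > > // Toy "node registry": URL scheme -> source-node factory.
> > > > static std::map<std::string, NodeFactory> gRegistry;
> > > >
> > > > static SourceNode* CreateNodeForUrl(const std::string& url) {
> > > >     const std::string scheme = url.substr(0, url.find("://"));
> > > >     std::map<std::string, NodeFactory>::iterator it = gRegistry.find(scheme);
> > > >     return (it != gRegistry.end()) ? it->second() : 0;
> > > > }
> > > >
> > > > int main() {
> > > >     gRegistry["modem"] = CreateModemNode;                    // registration step
> > > >     SourceNode* node = CreateNodeForUrl("modem://channel0"); // custom URL -> custom node
> > > >     if (node) {
> > > >         node->Start("modem://channel0");
> > > >         delete node;
> > > >     }
> > > >     return 0;
> > > > }
> > > > =======================================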
>
> > > > Just my 2 cents,
> > > > -freepine
>
> > > > On Thu, Jan 8, 2009 at 12:43 PM, Girish <[email protected]> wrote:
>
> > > > > Hi Dave and freepine,
>
> > > > > In my case, the encoded data for "my native C code" is received from
> > > > > the modem, i.e. it is nothing but an encoded frame.
>
> > > > > But if I pass this encoded frame data to the application, the media
> > > > > player can't decode it, as it expects either a URL or a file stream.
>
> > > > > This scenario is close to the case of passing a URL to the media
> > > > > player.
>
> > > > > In the URL case as well, does the modem get the data from the network
> > > > > and send it to the MMFW, or to the application? How is this handled?
>
> > > > > Below is my understanding of the Android architecture.
>
> > > > > ----------------------------------------
> > > > >  Application
> > > > > ----------------------------------------
> > > > >  Application framework
> > > > > ----------------------------------------
> > > > >  Native applications
> > > > >
> > > > >   ********************        ********
> > > > >   * my_native_c_code * =====> * MMFW *
> > > > >   ********************        ********
> > > > >
> > > > > ----------------------------------------
> > > > >  Kernel
> > > > > ========================================  <-- MODEM-APPLICATION interface
> > > > >  MODEM
> > > > > ----------------------------------------
>
> > > > > Please clarify if my understanding is wrong.
>
> > > > > Regards
> > > > > Girish
>
> > > > > On Jan 7, 9:15 pm, freepine <[email protected]> wrote:
> > > > > > I wrote a simple native app for testing only. With it, PV can
> > > > > > construct the node graph and start the data flow correctly, but it
> > > > > > obviously can't display video frames on the UI :) You can dump the
> > > > > > decoded output to a file and analyze it manually, or play it with a
> > > > > > YUV player. Just FYI.
> > > > > > BTW, OpenCore itself comes with a unit test framework under
> > > > > > external/opencore/engines/player/test. You can build it by
> > > > > > uncommenting its make file in external/opencore/Android.mk.
>
> > > > > > =======================================
> > > > > > // Header locations vary between Android releases; the ones below are
> > > > > > // what I had around the 1.x timeframe.
> > > > > > #define LOG_TAG "mediaplayer_test"
> > > > > > #include <unistd.h>
> > > > > > #include <utils/Log.h>
> > > > > > #include <utils/ProcessState.h>
> > > > > > #include <media/mediaplayer.h>
> > > > > > #include <ui/SurfaceComposerClient.h>
> > > > > > #include <ui/ISurfaceComposer.h>
> > > > > >
> > > > > > using namespace android;
> > > > > >
> > > > > > int main(int argc, char** argv)
> > > > > > {
> > > > > >     sp<ProcessState> proc = ProcessState::self();
> > > > > >     proc->startThreadPool();
> > > > > >
> > > > > >     MediaPlayer mediaplayer;
> > > > > >     if (argc > 1)   // a clip path was passed on the command line
> > > > > >     {
> > > > > >         LOGI("set datasource: %s", argv[1]);
> > > > > >         mediaplayer.setDataSource(argv[1]);
> > > > > >     }
> > > > > >     else
> > > > > >     {
> > > > > >         LOGI("set default datasource: /data/test.mp4");
> > > > > >         mediaplayer.setDataSource("/data/test.mp4");
> > > > > >     }
> > > > > >
> > > > > >     // Create a small surface for the video output.
> > > > > >     sp<SurfaceComposerClient> client = new SurfaceComposerClient;
> > > > > >     int pid = getpid();
> > > > > >     sp<Surface> surface(client->createSurface(pid, 0, 176, 144,
> > > > > >             PIXEL_FORMAT_OPAQUE,
> > > > > >             ISurfaceComposer::eFXSurfaceNormal | ISurfaceComposer::ePushBuffers));
> > > > > >     mediaplayer.setVideoSurface(surface);
> > > > > >
> > > > > >     mediaplayer.prepare();
> > > > > >     mediaplayer.start();
> > > > > >     for (int i = 0; i < 10; i++)
> > > > > >     {
> > > > > >         sleep(1);
> > > > > >         LOGI("playing, %d seconds\n", i);
> > > > > >     }
> > > > > >     mediaplayer.stop();
> > > > > >     LOGI("quitting...");
> > > > > >     return 0;
> > > > > > }
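> > > > > > And for the file-dump route mentioned above, a minimal sketch is just
> > > > > > appending each decoded buffer to a raw file (this assumes YUV420 planar
> > > > > > output; the function and variable names are only illustrative):
> > > > > > =======================================
> > > > > > #include <cstdio>
> > > > > >
> > > > > > // Append one decoded YUV420 planar frame to a raw dump file. Most YUV
> > > > > > // players ask for the width/height/format when opening the file.
> > > > > > static void DumpYuv420Frame(const char* path, const unsigned char* frame,
> > > > > >                             int width, int height)
> > > > > > {
> > > > > >     const int frameSize = width * height * 3 / 2;  // Y plane + U/4 + V/4
> > > > > >     FILE* fp = std::fopen(path, "ab");  // append: whole clip in one file
> > > > > >     if (fp == NULL) return;
> > > > > >     std::fwrite(frame, 1, frameSize, fp);
> > > > > >     std::fclose(fp);
> > > > > > }
> > > > > > =======================================
> > > > > > Then pull the dump file off the device and open it in any raw YUV viewer.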
>
> > > > > > On Thu, Jan 8, 2009 at 10:02 AM, Dave Sparks <[email protected]> wrote:
>
> > > > > > > That's not a scenario we plan to support.
>
> > > > > > > At the very least, you need a thin Java app that interacts with all
> > > > > > > the APIs that are not available in native code. If you want a pure
> > > > > > > native app, you are going to end up writing a lot of native shims to
> > > > > > > talk to Java code.
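> > > > > > > To illustrate what such a shim can look like, here is a minimal
> > > > > > > sketch, assuming a thin Java class (com.example.shim.NativeBridge,
> > > > > > > purely hypothetical) that owns an android.media.MediaPlayer and hands
> > > > > > > it down to native code, which then drives it back through JNI:
> > > > > > > =======================================
> > > > > > > #include <jni.h>
> > > > > > >
> > > > > > > // Native side of the hypothetical shim. The Java side would declare
> > > > > > > //   native void playFromNative(MediaPlayer mp, String path);
> > > > > > > extern "C" JNIEXPORT void JNICALL
> > > > > > > Java_com_example_shim_NativeBridge_playFromNative(JNIEnv* env,
> > > > > > >         jobject /*thiz*/, jobject mediaPlayer, jstring path)
> > > > > > > {
> > > > > > >     // Look up the MediaPlayer methods we want to call back into.
> > > > > > >     jclass cls = env->GetObjectClass(mediaPlayer);
> > > > > > >     jmethodID setDataSource = env->GetMethodID(cls, "setDataSource",
> > > > > > >                                                "(Ljava/lang/String;)V");
> > > > > > >     jmethodID prepare = env->GetMethodID(cls, "prepare", "()V");
> > > > > > >     jmethodID start   = env->GetMethodID(cls, "start", "()V");
> > > > > > >     if (!setDataSource || !prepare || !start) return;
> > > > > > >
> > > > > > >     // Drive the Java-side player from native code.
> > > > > > >     env->CallVoidMethod(mediaPlayer, setDataSource, path);
> > > > > > >     if (env->ExceptionCheck()) { env->ExceptionClear(); return; }
> > > > > > >     env->CallVoidMethod(mediaPlayer, prepare);
> > > > > > >     if (env->ExceptionCheck()) { env->ExceptionClear(); return; }
> > > > > > >     env->CallVoidMethod(mediaPlayer, start);
> > > > > > > }
> > > > > > > =======================================
> > > > > > > The same pattern works in the other direction too, i.e. a Java app
> > > > > > > calling down through a native method to hand your modem data in.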
>
> > > > > > > On Jan 7, 2:51 am, Girish <[email protected]> wrote:
> > > > > > > > Dear all,
>
> > > > > > > > In my requirement, I have to access the OpenCORE MMFW from my
> > > > > > > > native C code for decoding video data. Is it possible to access
> > > > > > > > it this way?
>
> > > > > > > > Does the OpenCORE MMFW expose any APIs for accessing the decoders,
> > > > > > > > like initdecoder, decode_one_frame, deinitdecoder? Which file
> > > > > > > > exposes this kind of API? Please give some clues.
>
> > > > > > > > I also want to display the decoded data on the UI. Is this
> > > > > > > > possible?
>
> > > > > > > > Regards
> > > > > > > > Girish