Your requirement seems to be to play raw H.264 NAL streams that are
received and assembled in a Java app. If you want to reuse the OpenCore
framework, you need to write a new source node and file parser to receive
data from the Java side. Since OpenCore runs as a standalone process, you
will need some IPC mechanism for that (Binder, shared memory, ...). To make
OpenCore launch your plugins, you can define a specific URL scheme for this
case, for example,

p2pvideo://......
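
Just to illustrate the Java side, here is a rough sketch under some
assumptions: the p2pvideo:// scheme, the socket name "p2p_nal_feed", and a
LocalSocket as the IPC channel are all placeholders I picked for the example;
a custom source node in your OpenCore build would have to parse the URL and
connect to that socket to pull the assembled NAL units. None of this works on
a stock framework.

import android.media.MediaPlayer;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import java.io.IOException;
import java.io.OutputStream;

public class P2PNalFeeder {
    // Assumed IPC endpoint name; the custom OpenCore source node would have
    // to connect to this same local socket after parsing it out of the URL.
    private static final String SOCKET_NAME = "p2p_nal_feed";

    // Run this on a worker thread: it blocks until the native side connects,
    // then streams assembled NAL units prefixed with Annex-B start codes.
    public void serveNalUnits(Iterable<byte[]> nalUnits) throws IOException {
        LocalServerSocket server = new LocalServerSocket(SOCKET_NAME);
        LocalSocket peer = server.accept();
        OutputStream out = peer.getOutputStream();
        try {
            for (byte[] nal : nalUnits) {
                out.write(new byte[] { 0, 0, 0, 1 }); // Annex-B start code
                out.write(nal);
            }
        } finally {
            out.close();
            peer.close();
            server.close();
        }
    }

    // Hand the custom URL to MediaPlayer; the framework passes it down to
    // PVPlayerEngine, which would only accept it if a source node registered
    // for the "p2pvideo" scheme exists in your modified OpenCore build.
    public void startPlayback(MediaPlayer player) throws IOException {
        player.setDataSource("p2pvideo://" + SOCKET_NAME);
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            public void onPrepared(MediaPlayer mp) {
                mp.start();
            }
        });
        player.prepareAsync();
    }
}

Shared memory or Binder would work the same way; the only fixed piece is that
the URL scheme is what routes the request to your source node.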


On Thu, Mar 5, 2009 at 4:11 AM, Hun <[email protected]> wrote:

>
> An original video is split into many substreams, and they are
> assembled back to the original stream at the mobile. So, it is just a
> single H264 video stream. I'm simply using multiple sites as a video
> source.
>
> I wonder what would be the easiest way to achieve this plan.
> Hun
>
> On Mar 4, 5:28 am, Prajnashi S <[email protected]> wrote:
> > What's an assembled stream? Do you mean you receive H264 streams from
> > multiple sites, then play them together (display each stream in a
> > separate window)?
> >
> >
> >
> > On Wed, Mar 4, 2009 at 2:57 PM, Hun <[email protected]> wrote:
> >
> > > Hi,
> >
> > > I am developing a peer-to-peer video streaming application for
> > > Android, and I have been around this discussion for two months. Still
> > > I haven't been able to find a good approach for my application.
> >
> > > Basically, I have multiple sources in the network (as peers). Each of
> > > them sends a substream of a video to an Android mobile. A P2P
> > > application (I plan to develop) is running on the mobile, and
> > > assembles these substreams into a single stream. Then, the P2P
> > > application passes the assembled stream to a media player for
> > > playback.
> >
> > > For my purpose, neither "local file playback" nor "RTSP/HTTP
> > > streaming" works. I tried to send an MP4 clip from the multiple
> > > sources, but I couldn't remove "gaps" between the playback of clips.
> >
> > > Next, I have been thinking about creating a local MP4 file by
> > > encapsulating the assembled stream. I don't think it is a good idea,
> > > because 1) MP4 encapsulation would have to be done in real time, and
> > > since the file keeps growing, fileOpen() won't work, and 2) it is
> > > just an unnecessary, awkward step.
> >
> > > Finally, I came to the conclusion that I should write a new dataSource
> > > node that will be used by PVPlayerEngine. This node will get input
> > > from my P2P app (I think this app should reside on top of the Java
> > > API) and pass the assembled stream to the next node (decoder node?).
> >
> > > The assembled stream will be a train of H264 NAL units (created by
> > > the x264 encoder). I believe the AVC decoder included in OpenCore can
> > > handle pure H264 NAL units, since any format parser (such as the MP4
> > > parser) returns one or more NAL units corresponding to a single video
> > > frame when the input is H264-encoded video.
> >
> > > Please share your thoughts on my plan. Can someone provide a better,
> > > or strikingly easy solution?
> >
> > > Hun
> >
> > --
> > -- Prajnashi S
> >
>


-- 
-- Prajnashi S
