I have an existing multi-RTSP-source application that records video to disk and 
streams both saved and live video over our own HTTP protocol. I am now trying 
to add HTTP Live Streaming for portable devices. I got to the point where all 
the connections are happening and the index files and .ts files are created, 
but the .ts files containing the H.264 video are not playable. I don't know 
what I am doing wrong.
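For context, the playlist my web server generates follows the usual HLS shape, roughly like this (the durations and segment names here are placeholders, not my actual values):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST
```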

I have an environment and scheduler, plus a MPEG2TransportStreamFromESSource 
instance fed by an ESSource modeled after DeviceSource, and a class that 
inherits from MediaSink as the sink. Frames are pushed into the source and 
everything flows, with 188-byte packets coming out the other side. The event 
loop calls my sink, and I append the 188-byte packets to the buffer that 
represents the .ts file (it is all in memory). The web server creates the 
index file (.m3u8 playlist) and the client comes back for the segment file. 
In Firefox on Windows it downloads the file to disk; on the iPad it tries to 
play the video and fails.

Do I need to wrap the ES in RTP first and then go to TS?


_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
