> Continuing question 1, I see that deliverFrame() is called by two callers:
> doGetNextFrame(), which is called by the sink object, and deliverFrame0(), which
> is called by the event loop when signalNewFrameData() emits an event. In my case,
> I never call signalNewFrameData(), hence deliverFrame0() is never run.
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] how to make latency as low as possible
> My question is this.
> 1. How the sink object decide its timing of fetching data from the source?
It doesn’t. Instead, your video source object (if it’s programmed correctly)
decides when to deliver a new frame of data (by arranging for “deliverFrame()”
to get called, i.e., in handling an event that announces the availability of
new frame data).
played with MPlayer)
Thank you!
Xin
Mobile: +86 186-1245-1524
Email: x...@vscenevideo.com
QQ: 156678745
From: Ross Finlayson
Date: 2017-01-16 20:12
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] how to make latency as low as possible
You need to copy data to “fTo” and call “FramedSource::afterGetting(this)” only
*once*, for each NAL unit that you deliver. (Your code seems to be doing this
multiple times for each delivery; this is wrong.)
In other words, each call to “doGetNextFrame()” must (eventually) lead to the
delivery of exactly one NAL unit.
sr/lib/libBasicUsageEnvironment.so.1
#6 0x76dfe224 in BasicTaskScheduler0::doEventLoop(char volatile*) ()
from /mnt/nfs/target/usr/lib/libBasicUsageEnvironment.so.1
#7 0xbc44 in main (argc=1, argv=0x7efcfdc4) at ../rtspd_main.cpp:74
Again, thanks for your patience.
Xin
> For the start codes, I have eliminated them all. My camera's working pattern
> is like this: when encoding a key frame, it outputs SC+SPS+SC+PPS+SC+Frame;
> when encoding a non-key frame, it outputs SC+Frame. So, after I eliminated the
> SC, what I sent to the sink object is SPS+PPS+Frame for key frames.
Have I missed something here?
Xin
From: Ross Finlayson
Date: 2017-01-13 01:30
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] how to make latency as low as possible
Your data source class (“FramedSource” subclass) looks mostly OK; however, I
suspect that your problem is that you are not using the correct ‘framer’ class
for your H.264 video ‘frames’ (in reality, H.264 NAL units).
Look at your implementation of the “createNewStreamSource()” virtual function
in your “OnDemandServerMediaSubsession” subclass.
Thank you for reading this far.
Xin Liu
From: Ross Finlayson
Date: 2017-01-11 20:04
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] how to make latency as low as possible
I’m currently a lurker on this list, but hope to eventually include RTSP
streaming in our educational presentation Mac app, iQPresenter. Latency is a
major factor. We currently produce movies, create HLS variants and segments
using Compressor. Talk about latency: “HLS” is a misnomer; round-trip latency
is considerable.
Our server software - by itself - contributes no significant latency. I
suspect that most of your latency comes from the interface between your camera
and our server. (You didn’t say how you are feeding your camera’s output to
our server; but that’s where I would look first.)
Ross Finlayson