I was discussing with a TV station engineer some sort of disturbance
he's seeing in a video feed which crosses a section of our network.
The feed crosses a blend of fiber and Part 101 microwave, and it had been
working fine for several years until this problem suddenly cropped up
about a month ago.
His words, emphasis mine:
"We are seeing PCR clocking intolerance in our television data streams
(~19.392685 Mbps, plus overhead; PCR is sent at a defined interval, at
least once every 40ms, for each of five embedded streams with a drift
tolerance of <10mHz and a /*jitter error of <25us*/ per ETSI TR 101 290),"
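For what it's worth, the PCR he's talking about rides in the MPEG-TS adaptation field, so you can pull it straight out of a capture and check that 40ms rule yourself. A rough sketch, assuming you have raw 188-byte TS packets and have already identified the PID; the function names are mine, not from any tool:

```python
# Rough sketch: extract PCR values from raw 188-byte MPEG-TS packets and
# check the "PCR at least every 40ms" rule. Field layout per ISO/IEC
# 13818-1; PID and helper names here are illustrative only.

def parse_pcrs(ts_bytes, pid):
    """Return 27 MHz PCR values carried on one PID."""
    pcrs = []
    for off in range(0, len(ts_bytes) - 187, 188):
        pkt = ts_bytes[off:off + 188]
        if pkt[0] != 0x47:                      # TS sync byte
            continue
        if (((pkt[1] & 0x1F) << 8) | pkt[2]) != pid:
            continue
        afc = (pkt[3] >> 4) & 0x3               # adaptation_field_control
        if afc not in (2, 3):                   # no adaptation field present
            continue
        if pkt[4] == 0 or not (pkt[5] & 0x10):  # zero-length field or no PCR_flag
            continue
        b = pkt[6:12]                           # 33-bit base + 6 reserved + 9-bit ext
        base = (b[0] << 25) | (b[1] << 17) | (b[2] << 9) | (b[3] << 1) | (b[4] >> 7)
        ext = ((b[4] & 0x01) << 8) | b[5]
        pcrs.append(base * 300 + ext)           # combined value in 27 MHz ticks
    return pcrs

def max_pcr_gap_ms(pcrs):
    """Largest spacing between consecutive PCRs, in milliseconds."""
    gaps = [(b - a) / 27000.0 for a, b in zip(pcrs, pcrs[1:])]
    return max(gaps) if gaps else None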
I know jack-all about TV broadcasting, but I told him that in my world,
packet-to-packet delay variation of less than 1 millisecond is considered
perfect, and asked, "do I understand you correctly that you really need
clock signals transmitted across the network with less than 25
/micro/seconds of jitter?" He seems to feel that yes, that is the case.
Is this guy
mistaken? I can't believe whatever converts the TV signal to ethernet
and back wouldn't have at least some minimal jitter buffer.
Even if he's right... how do you even test that? A Wireshark capture
will have a timestamp attached to each packet, and that _is_ displayed in
microseconds, but how precise could that be in real life? I mean,
hypothetically, by the time a frame gets copied to a mirrored switch
port, hits my ethernet card, and passes through the whole software stack
to get into Wireshark, couldn't that have introduced 25us worth of new
variance?
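That said, if you do trust the timestamps, the math on the capture side is simple. A minimal sketch of measuring packet delay variation against an ideal constant-rate arrival grid (nothing TS-specific, just capture timestamps in seconds):

```python
# Sketch: peak-to-peak packet delay variation from capture timestamps,
# measured against a best-fit constant-rate arrival grid. Only as good
# as the timestamps you feed it -- a mirror port plus the host's software
# stack can easily add variance of its own.

def pdv_us(timestamps):
    """Peak-to-peak deviation from ideal constant-rate arrivals, in microseconds."""
    if len(timestamps) < 2:
        return 0.0
    t0 = timestamps[0]
    period = (timestamps[-1] - t0) / (len(timestamps) - 1)  # mean inter-arrival
    devs = [t - (t0 + i * period) for i, t in enumerate(timestamps)]
    return (max(devs) - min(devs)) * 1e6
```

With hardware timestamping (or a dedicated capture card) a number like this is meaningful at the microsecond level; with plain libpcap timestamps on a busy host, probably not.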
What about MEF OAM statistics? Would that be precise enough?
More than anything I'm shocked at the assertion about the required
precision. I feel like on a one-way transmission like TV they could add
a half second delay to accommodate jitter or retransmissions and nobody
watching at home would ever know the difference. But I'm _also_ curious
about how you would check that assuming you had to.
-Adam
--
AF mailing list
[email protected]
http://af.afmug.com/mailman/listinfo/af_af.afmug.com