Them preverts tend to do that.

From: Steve Jones 
Sent: Tuesday, December 10, 2019 2:53 PM
To: AnimalFarm Microwave Users Group 
Subject: Re: [AFMUG] Testing ridiculous jitter constraints

I don't know a lot of those words, but we dealt with a radio station complaining 
of the same type of nonsense; any disturbance would cause momentary dead air. 
Turns out he didn't actually know what he was doing and was running with no 
buffer. Got him to change his components finally after convincing him that a 
buffer would compensate for drift.  
He's been happy ever since, with the exception of the times his devices get 
hijacked, because apparently security of internet-facing communication devices 
doesn't matter. For a while he had a ton of SIP traffic, which I'm assuming was 
some commie calling another commie to talk about commie stuff.

On Tue, Dec 10, 2019 at 3:41 PM <[email protected]> wrote:

  They have to buffer in an elastic store to be able to do this.  
  Similar to pseudowire for T1.  

  Then he has to sync the output of his buffer with GPS or colorburst or WWVB 
or some other external reference on his end.  
  Nobody can sync that tight over the internet; even dedicated ethernet isn't 
good enough for that.  
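
For what it's worth, the elastic-store idea can be sketched in a few lines of 
Python -- a toy buffer-then-play-out-on-a-stable-clock model, not any real 
vendor's implementation (the packet count, spacing, and jitter range below are 
made up for illustration):

```python
import random

def dejitter(arrivals, interval, depth):
    """Toy elastic store: packets arrive with network jitter but are
    released at a fixed interval paced by a stable local clock.
    `arrivals` is a list of (arrival_time_s, seq); `depth` is how many
    packets must be buffered before playout starts."""
    arrivals = sorted(arrivals)
    start = arrivals[depth - 1][0]        # playout clock starts once the buffer fills
    out = []
    for i, (t, seq) in enumerate(arrivals):
        due = start + i * interval        # release slot is fixed by the local clock
        out.append((seq, due if t <= due else None))  # None = arrived too late
    return out

# Packets nominally 543 us apart with up to +/-200 us of network jitter.
random.seed(1)
pkts = [(i * 543e-6 + random.uniform(-200e-6, 200e-6), i) for i in range(20)]
released = [due for _, due in dejitter(pkts, 543e-6, depth=4)]
# Output spacing is now a constant 543 us regardless of the input jitter,
# at the cost of a few packets (~2 ms here) of added delay.
```

A deeper buffer rides out more jitter and drift at the cost of more delay, 
which for one-way broadcast nobody at home would notice.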

  From: Adam Moffett 
  Sent: Tuesday, December 10, 2019 2:32 PM
  To: [email protected] 
  Subject: [AFMUG] Testing ridiculous jitter constraints



  I was discussing with a TV station engineer some sort of disturbance he's 
seeing in a video feed which crosses a section of our network.  This is 
crossing a blend of fiber and part 101 microwave, and it's been working fine 
for several years until suddenly their problem cropped up about a month ago. 

  His words, emphasis mine:


  "We are seeing PCR clocking intolerance in our television data streams 
(~19.392685 Mbps, plus overhead; PCR is sent at a defined interval, at least 
once every 40ms, for each of five embedded streams with a drift tolerance of 
<10mHz and a jitter error of <25us per ETSI TR 101 290), "

  I know jack-all about TV broadcasting, but I explained that in my world 
packet-to-packet delay variation of less than 1 millisecond is considered 
perfect, and asked, "do I understand you correctly that you really need clock 
signals transmitted across the network with less than 25 microseconds of 
jitter?" He seems to feel that yes, that is the case.  Is this guy mistaken?   
I can't believe whatever converts the TV signal to ethernet and back wouldn't 
have at least some minimal jitter buffer.
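
  For a sense of scale (my arithmetic, not his, and assuming the common 
encapsulation of 7 x 188-byte TS packets per UDP datagram, which may not match 
his feed): at ~19.39 Mbps the packets are roughly half a millisecond apart, so 
25 us is under 5% of the nominal spacing.

```python
# Back-of-the-envelope packet spacing for a ~19.39 Mbps transport stream.
# Assumes the common 7 x 188-byte TS packets per UDP payload; the actual
# encapsulation on this feed could differ.
rate_bps = 19.392685e6               # ATSC transport stream rate
ts_packet_bits = 188 * 8             # one MPEG-TS packet
payload_bits = 7 * ts_packet_bits    # typical UDP payload: 7 TS packets

spacing_s = payload_bits / rate_bps
print(f"packet spacing: {spacing_s * 1e6:.1f} us")            # ~542.9 us
print(f"25 us as a fraction of spacing: {25e-6 / spacing_s:.1%}")
```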

  Even if he's right....how do you even test that?  A wireshark capture will 
have a time attached to each packet, and that is displayed in microseconds, but 
how precise could that be in real life?  I mean hypothetically, by the time a 
frame gets copied to a mirrored switch port, hits my ethernet card, and passes 
through the whole software stack to get into Wireshark, couldn't that have 
introduced 25us worth of new variance?  
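
  If you do trust the capture timestamps, the arithmetic itself is trivial: 
take the inter-arrival deltas and their deviation from the nominal spacing. A 
rough sketch (the nominal spacing and the toy timestamp list are made-up 
illustrations, not a real capture):

```python
def interarrival_jitter(timestamps, nominal):
    """Given capture timestamps (seconds) and the nominal packet spacing,
    return the worst-case deviation of any inter-arrival gap from nominal:
    a crude packet delay variation figure."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return max(abs(d - nominal) for d in deltas)

# Toy capture: 543 us nominal spacing, with one packet arriving 30 us late.
ts = [0.0, 543e-6, 1086e-6 + 30e-6, 1629e-6]
print(f"worst PDV: {interarrival_jitter(ts, 543e-6) * 1e6:.0f} us")  # 30 us
```

The measurement-chain question stands, though: software timestamping through a 
mirror port can plausibly add variance on the same order as what you're trying 
to measure, so the numbers this spits out are only as good as the capture.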

  What about MEF OAM statistics?  Would that be precise enough?

  More than anything I'm shocked at the assertion about the required precision. 
 I feel like on a one-way transmission like TV they could add a half-second 
delay to accommodate jitter or retransmissions and nobody watching at home 
would ever know the difference.  But I'm also curious about how you would check 
it, assuming you had to.

  -Adam


------------------------------------------------------------------------------
  -- 
  AF mailing list
  [email protected]
  http://af.afmug.com/mailman/listinfo/af_af.afmug.com



