Dear all,

The following question popped up on IRC:

22:08 < roh> is there some documentation how the minLatency() call in osmo-trx 
works?
22:10 < roh> it looks to me like the values we use for usrp1 and lms are not 
correct/cargocult

I agree; the values are likely wrong for the non-UHD devices.

Interestingly, they are auto-tuned at runtime; see the following piece in
Transceiver.cpp:

          // if underrun hasn't occurred in the last sec (216 frames) drop
          //    transmit latency by a timeslot
          if (mTransmitLatency > mRadioInterface->minLatency()) {
            if (radioClock->get() > mLatencyUpdateTime + GSM::Time(216,0)) {
              mTransmitLatency.decTN();
              LOG(INFO) << "reduced latency: " << mTransmitLatency;
              mLatencyUpdateTime = radioClock->get();
            }
          }

However, that block only applies to devices with TX_WINDOW_USRP1 set, that is,
the USRP1, B100 and B2xx devices.
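
To make the mechanism a bit more concrete, here is a small toy model of the
adjust-up/adjust-down behaviour as I read it from Transceiver.cpp.  This is
not actual osmo-trx code; all names and numbers are illustrative only.  The
real code works on GSM::Time, grows the transmit latency on underrun, shrinks
it after 216 underrun-free frames, and uses minLatency() only as the floor of
that adjustment, and as far as I can tell the whole block is skipped for
devices that do not use the USRP1-style TX window:

  #include <cstdio>

  int main(void)
  {
      const int min_latency_tn = 2;   /* assumed floor, stand-in for minLatency() */
      int tx_latency_tn = 10;         /* current transmit latency, in timeslots */
      int last_update_frame = 0;

      for (int frame = 0; frame < 1000; frame++) {
          /* pretend the device reports an underrun around frame 300 */
          bool underrun = (frame == 300);

          if (underrun) {
              /* not feeding samples fast enough: add one full frame (8 TN) */
              tx_latency_tn += 8;
              last_update_frame = frame;
              printf("frame %d: underrun, latency now %d TN\n",
                     frame, tx_latency_tn);
          } else if (tx_latency_tn > min_latency_tn &&
                     frame > last_update_frame + 216) {
              /* ~1s (216 frames) without underrun: shave off one timeslot */
              tx_latency_tn--;
              last_update_frame = frame;
              printf("frame %d: reduced latency to %d TN\n",
                     frame, tx_latency_tn);
          }
      }
      return 0;
  }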

In fact, I cannot find any user of the minLatency() method outside the context
of TX_WINDOW_USRP1, and hence I think it doesn't matter what kind of magic
value the LMS driver supplies?

So at least I conclude:
* minLatency() is only ever used on USRP1, B100 and B2xx, and on those
  platforms it merely serves as the lower bound for the transmit latency,
  which is dynamically adjusted at runtime

Regards,
        Harald
-- 
- Harald Welte <[email protected]>           http://laforge.gnumonks.org/
============================================================================
"Privacy in residential applications is a desirable marketing option."
                                                  (ETSI EN 300 175-7 Ch. A6)
