Dear USRP and GnuRadio folks,

I'm not sure to whom I should address this question, since I don't know
whether the problem arises in the GnuRadio code, the drivers, or the USRP
hardware itself.

I was attempting to run an extremely simple GRC flowgraph (attached to this
email) in which the input antenna of the USRP is grounded, the signal is
taken in with a UHD: USRP Source block, the samples' magnitudes are squared
and fed through a 100,000-point moving average, and the result is displayed
on the screen in dB.
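
In case the GRC file doesn't survive the list, the flowgraph is roughly
equivalent to the following Python (an untested sketch; the empty device
address, and the probe sink standing in for the on-screen display, are my
placeholders):

from gnuradio import gr, blocks, uhd

class noise_probe(gr.top_block):
    """Grounded antenna -> |x|^2 -> 100k-point moving average -> dB."""
    def __init__(self, samp_rate):
        gr.top_block.__init__(self)
        # Empty device address: UHD grabs the first USRP it finds.
        self.src = uhd.usrp_source(
            "", uhd.stream_args(cpu_format="fc32", channels=range(1)))
        self.src.set_samp_rate(samp_rate)
        mag_sq = blocks.complex_to_mag_squared()
        # Moving average over 100,000 points, scaled by 1/N to get the mean.
        avg = blocks.moving_average_ff(100000, 1.0 / 100000)
        to_db = blocks.nlog10_ff(10, 1, 0)  # 10*log10(x), i.e. dB
        self.probe = blocks.probe_signal_f()  # stands in for the GUI sink
        self.connect(self.src, mag_sq, avg, to_db, self.probe)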

As I altered the sampling rate, I observed a very strange thing: the
averaged noise readings were, by and large, increasing smoothly with the
sampling rate, except at certain rates (specifically 30*2^k, for
k = 0, 1, 2, 3, ...). At those sampling rates the noise level would shoot
up by about 20 dB.
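
(If anyone wants to reproduce the sweep, something like this should do it,
reusing the top block above; the settle time, and the assumption that the
rates are in samples per second, are mine:)

import time

for k in range(6):
    fs = 30e3 * 2 ** k          # assumed units: S/s
    tb = noise_probe(fs)
    tb.start()
    time.sleep(5)               # let the 100k-point average settle
    print("%10.0f S/s -> %7.2f dB" % (fs, tb.probe.level()))
    tb.stop()
    tb.wait()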

I'm attaching a graph of some of the values I got, where the big jumps are
marked with their respective sampling rates.

Can someone explain why these jumps occur? My guess is that when I adjust
samp_rate, the hardware selects a front-end filter from a filter bank and
combines it with decimation; maybe at certain samp_rates (30*2^k,
k = 0, 1, 2, 3, ...) it sits right at a threshold edge where the noise is
worse. Or maybe the problem is entirely at the other end, and GnuRadio has
a bug in its assignment of samp_rate values?
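
(One quick check of that last possibility would be to compare each
requested rate with what the device reports back; a sketch, again using
the top block above:)

tb = noise_probe(30e3)
for k in range(6):
    fs = 30e3 * 2 ** k
    tb.src.set_samp_rate(fs)
    print("requested %.0f S/s, device reports %.0f S/s"
          % (fs, tb.src.get_samp_rate()))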

Two attachments included. Comments much appreciated.

Thanks and regards,
Ale

Attachment: noise_level_unexpected.grc

