I had reason to return to using the old
spectrometer of CASPER Tutorial 3. This
spectrometer is old enough to be useful
for diagnostics with new designs, but I
noticed a problem when viewing the FFT
spectra. With an input of a 1Vpp sine
wave at 5, 10, 15, 20, 25 or 30 MHz,
the amplitude of the peak in the FFT
spectrum changes in two ways: during a
single run and between runs. I have
been trying to pin down the source of
this variation because I want to
calibrate the vertical (amplitude)
axis of the spectrum obtained in the
tutorial.

For example, during a run the amplitude
either rises to almost four times its
original value or drops to a quarter of
its original value. If I wait for a
long time, the amplitude seems to rise
slowly. If I shut down the spectrometer
and all the instruments except the
ROACH board, and then repeat the run at
a later time, the readings may be
different by a factor of five or even
ten. This is making calibration very
difficult. If this is normal behaviour
for the spectrometer of Tutorial 3,
then I may be forced to give up an easy
calibration and use a switch for all
measurements so I can monitor the drift
during a run.
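One way to monitor the drift during a run is simply to log the FFT peak amplitude at regular intervals. This is a minimal Python sketch; `read_spectrum` is a hypothetical callable standing in for however you fetch the accumulated spectrum from the ROACH (e.g. over katcp):

```python
import time
import numpy as np

def log_peak_drift(read_spectrum, n_reads=60, interval_s=10.0):
    """Record (elapsed_seconds, peak_amplitude) pairs so the drift
    during a run can be plotted and, if need be, divided out."""
    t0 = time.time()
    samples = []
    for _ in range(n_reads):
        spec = np.asarray(read_spectrum())
        samples.append((time.time() - t0, float(spec.max())))
        time.sleep(interval_s)
    return samples
```

Plotting the logged peaks against time would at least show whether the factor-of-four changes are a slow ramp or sudden jumps, which are caused by different things.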

The apparatus consists of the ROACH 1
board, an old Bee2 ADC board, a 1 GHz
oscillator for the ADC clock, an old
Wavetek oscillator for the input sine
wave, and a blocking capacitor at the
input of the ADC board. The oscillator
does not give a perfect sine wave. Is
the drift normal behaviour for the FFT
spectra?
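As a first sanity check on the vertical axis, the peak one should see for an ideal, coherently sampled 1 Vpp tone can be computed offline. This is a minimal numpy sketch, assuming the ADC samples at 1 GSa/s from the 1 GHz clock and ignoring the tutorial design's PFB, quantisation and accumulation scaling:

```python
import numpy as np

fs = 1e9   # assumed sample rate: 1 GSa/s from the 1 GHz ADC clock
f0 = 5e6   # input tone frequency
A = 0.5    # 1 Vpp sine wave -> 0.5 V amplitude
N = 1000   # chosen so f0 lands exactly on bin f0 / (fs / N) = 5

t = np.arange(N) / fs
x = A * np.sin(2 * np.pi * f0 * t)

spec = np.abs(np.fft.rfft(x))
peak_bin = int(np.argmax(spec))
peak_amp = spec[peak_bin]

# For a coherently sampled sine the peak magnitude is N * A / 2,
# i.e. 1000 * 0.5 / 2 = 250 here.
print(peak_bin, peak_amp)
```

A stable reference number like N * A / 2 at least separates the FFT arithmetic from the signal chain when chasing the drift.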
