Hello,
I am investigating the accuracy and reliability of my spectrometer
design, so I set up a test vector generator (TVG) that feeds a known
sequence (a counter, in fact) into the PFB/FFT. The test vector is set
up so that it has exactly the same period as the FFT length, so that
the spectrum should be static.
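
For concreteness, here is a minimal numpy sketch of the behavior I
expect (the FFT length of 1024 is just for illustration, and this
ignores the PFB front end and fixed-point quantization):

    import numpy as np

    fft_len = 1024                        # illustrative FFT length
    # Counter test vector whose period exactly equals the FFT length
    stream = np.arange(8 * fft_len) % fft_len
    frames = stream.reshape(-1, fft_len)  # eight consecutive FFT frames
    spectra = np.abs(np.fft.fft(frames, axis=1))
    # Every frame sees the identical ramp, so every spectrum is identical
    assert np.allclose(spectra, spectra[0])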
I find that when I run the design at the speed I compiled it for, a
1024 MHz ADC clock and a 256 MHz FPGA clock, I get the expected
spectrum. However, if I change the frequency, even by just a few MHz,
the spectrum changes very slightly (by about the least significant
bit). The spectrum is still static, just slightly off from what it
should be. Is this behavior expected?

My best guess is that the DCM on the ADC clock is configured optimally
for the compiled speed, and thus the clock does not lock as well when
run at a different speed. Is there anything that can be done about
this? It seems like a major hassle to have to make different builds
just to accommodate different sampling clocks.

By the way, I am sending a sync pulse through at regular intervals
that precisely match the integration time (I verified in simulation
that the sync pulse occurs exactly when it should to keep every TVG
spectrum identical), so it is not a matter of the design free-running
for long periods of time.
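
As a sanity check on that timing, the arithmetic is just that the sync
period in FPGA clocks must correspond to a whole number of FFT frames
(the accumulation length below is an assumed value for illustration,
not my actual build parameter; the demux factor follows from
1024 MHz / 256 MHz):

    fft_len = 1024                 # FFT length in samples (illustrative)
    demux = 4                      # ADC samples per FPGA clock (1024/256 MHz)
    n_accs = 2**14                 # spectra per integration (assumed value)

    # One spectrum consumes fft_len samples, i.e. fft_len/demux FPGA clocks,
    # so syncing every n_accs spectra lands exactly on a frame boundary.
    sync_period_clks = n_accs * (fft_len // demux)
    assert (sync_period_clks * demux) % fft_len == 0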
Glenn
