I am trying to implement a DVB-T2 receiver's P1 symbol detection block
using C++ in GNU Radio. From non-real-time MATLAB code, I can see that the
correlation peaks during the P1 symbols are roughly 50 to 100 times larger
than during the rest of the stream, although the rise is not steep across
consecutive samples. However, I am having trouble implementing this as a
streaming algorithm that operates on blocks of data (as a real receiver
would) rather than on the entire capture at once (in GNU Radio, that is).
The problem is
I want to detect when the signal correlation reaches its
peak-plateau region. While the overall trend in correlations does rise, the
consecutive values may fluctuate in both positive and negative directions
due to noise/fading effects. I've tried a few rudimentary approaches:
1. Average the correlations over a window and check whether the current
average exceeds a lagged sample from the average's history by a threshold,
and call that a rise. This does indicate a rise to some degree, but it is
not very good at peak detection.
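Roughly, approach 1 boils down to something like this (a simplified, standalone sketch outside the GNU Radio scheduler; the class name and the window/lag/threshold parameters are placeholders, not the actual block code):

```cpp
#include <cstddef>
#include <deque>
#include <numeric>

// Approach 1: compare the current moving average of correlation magnitudes
// against a lagged copy of that average.
class rise_detector {
public:
    rise_detector(size_t avg_len, size_t lag, float threshold)
        : d_avg_len(avg_len), d_lag(lag), d_threshold(threshold) {}

    // Feed one correlation magnitude; returns true while a rise is detected.
    bool push(float corr) {
        d_window.push_back(corr);
        if (d_window.size() > d_avg_len)
            d_window.pop_front();
        float avg = std::accumulate(d_window.begin(), d_window.end(), 0.0f)
                    / d_window.size();
        d_history.push_back(avg);
        if (d_history.size() > d_lag + 1)
            d_history.pop_front();
        if (d_history.size() <= d_lag)
            return false; // not enough history yet
        // "Rising" means the newest average beats the lagged one by threshold.
        return avg - d_history.front() > d_threshold;
    }

private:
    size_t d_avg_len, d_lag;
    float d_threshold;
    std::deque<float> d_window;  // samples currently being averaged
    std::deque<float> d_history; // past averages, for the lagged comparison
};
```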
2. Keep a history of consecutive slopes in my block and take a short
average. On the plateau the fluctuations should cancel out and leave a
value near 0, but this doesn't seem to happen; the results are not
satisfactory. The individual slopes are also not pronounced enough to
stand out from the noise.
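For reference, approach 2 is essentially the following (again a standalone sketch; the window length is arbitrary, and the mean slope is what I was hoping would settle near 0 on the plateau):

```cpp
#include <cmath>
#include <cstddef>
#include <deque>
#include <numeric>

// Approach 2: average the last N consecutive slopes (first differences).
// Near a plateau the mean slope should approach zero; in practice noise
// keeps it from settling cleanly, as described above.
class slope_tracker {
public:
    explicit slope_tracker(size_t n) : d_n(n), d_have_prev(false), d_prev(0.0f) {}

    // Feed one correlation magnitude; returns the current mean slope.
    float push(float corr) {
        if (d_have_prev) {
            d_slopes.push_back(corr - d_prev); // consecutive slope
            if (d_slopes.size() > d_n)
                d_slopes.pop_front();
        }
        d_prev = corr;
        d_have_prev = true;
        if (d_slopes.empty())
            return 0.0f;
        return std::accumulate(d_slopes.begin(), d_slopes.end(), 0.0f)
               / d_slopes.size();
    }

private:
    size_t d_n;
    bool d_have_prev;
    float d_prev;                // previous sample, for the first difference
    std::deque<float> d_slopes;  // history of consecutive slopes
};
```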
Possibly I need some combination of a smoothing filter and a differentiator
filter. Can you point me to a sample implementation of something similar,
or suggest some more ideas I could experiment with?
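For concreteness, one version of that idea is a single-pole IIR smoother followed by a first difference (this is only a guess at what might work, not a known-good P1 detector; alpha is a tuning parameter, not a DVB-T2 constant):

```cpp
// Smooth the correlation magnitude with a single-pole IIR filter, then take
// the first difference of the smoothed value. The smoothing suppresses
// sample-to-sample noise, so the differentiated output stays positive through
// the rise and decays back toward zero on the peak plateau.
class smooth_diff {
public:
    explicit smooth_diff(float alpha)
        : d_alpha(alpha), d_state(0.0f), d_prev(0.0f), d_primed(false) {}

    // Feed one correlation magnitude; returns the smoothed derivative.
    float push(float corr) {
        if (!d_primed) {
            d_state = corr; // seed the filter to avoid a startup transient
            d_prev = corr;
            d_primed = true;
        }
        d_state += d_alpha * (corr - d_state); // y[n] = y[n-1] + a*(x[n]-y[n-1])
        float diff = d_state - d_prev;         // differentiator on smoothed data
        d_prev = d_state;
        return diff;
    }

private:
    float d_alpha;
    float d_state; // IIR smoother state
    float d_prev;  // previous smoothed value
    bool d_primed;
};
```

A plateau could then be declared when the smoothed derivative falls back below a small threshold after having been clearly positive, which avoids comparing raw consecutive samples at all.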
Discuss-gnuradio mailing list