Hi all,

I am trying to use the ofdm_tx.grc and ofdm_rx.grc to transmit data from one 
N210 controlled by a host to another N210 controlled by a separate host. 
Eventually I want to build an 8x8 MIMO system using OFDMA. To verify that the 
correct data is being received, I routed all of the decoded payload data into a 
file sink. The file I transmitted contains 10e6 random values in the range 
0-255, stored as bytes, and I captured at least 5x that amount on the receive 
side to make sure I got all the data. I imported the received data and the 
known transmitted data into Matlab and performed a cross-correlation. The 
result looks reasonable in that it shows multiple peaks (see figure below). 
However, when I actually calculated the BER, I never got more than 20% of the 
bits correct. The behavior was strange: roughly 20% of the bits would be 
correct in a row, and then the rest would be completely wrong. I know the 
ofdm_tx.grc flowgraph inserts metadata
into the signal, and my theory is that this metadata is also being decoded, so 
the received file won't exactly match what was sent because of the extra 
information. My understanding might be completely off, so any clarification 
would be greatly appreciated! I just want a reliable way to calculate the BER 
between what is transmitted and what is received.
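In case it helps frame the question, here is a rough sketch of what I am 
trying to do, written in Python/NumPy rather than Matlab. The function name 
and the search_len parameter are just my own; the idea is to locate the known 
transmitted bytes inside the longer received capture via cross-correlation, 
then XOR the aligned bytes and count differing bits:

```python
import numpy as np

def align_and_ber(tx, rx, search_len=100000):
    """Locate tx inside rx via cross-correlation on a short window,
    then compute the bit error rate over the overlapping span.
    Returns (ber, offset)."""
    tx = np.asarray(tx, dtype=np.uint8)
    rx = np.asarray(rx, dtype=np.uint8)
    # Zero-mean a snippet of the known data and slide it along the capture;
    # the argmax of the correlation gives the starting offset of tx in rx.
    snippet = tx[:min(search_len, len(tx))].astype(np.float64)
    snippet -= snippet.mean()
    rxf = rx.astype(np.float64) - rx.mean()
    corr = np.correlate(rxf, snippet, mode='valid')
    offset = int(np.argmax(corr))
    # Compare only the region where the two streams overlap.
    n = min(len(tx), len(rx) - offset)
    # XOR the aligned bytes and count the differing bits.
    errs = int(np.unpackbits(tx[:n] ^ rx[offset:offset + n]).sum())
    return errs / (8 * n), offset
```

This of course assumes the received payload is a clean, contiguous copy of the 
transmitted bytes after some unknown offset; if the decoder interleaves header 
metadata into the stream (my theory above), the alignment would break 
partway through, which might explain the runs of correct bits followed by 
garbage.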

Thank you for your time!!


Sarah Tran
Electrical Engineer
USRP-users mailing list
