Hi,

I notice that the "time_spec" returned in the metadata by the "recv" method of an
rx_streamer does not correspond exactly to the length of the received buffers.
For example, when receiving buffers of 16384 samples at Fs = 362319 Hz, each
buffer should last (16384+1)/362319 = 0.0452225 seconds, whereas the time_spec
difference between two successive calls to recv is 0.0483315 and sometimes
0.033402 seconds (with a mean value equal to 0.0452225).
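
In case it is useful, this is roughly the loop I use to observe this (a sketch
only: the device arguments are placeholders, my real setup differs):

#include <uhd/usrp/multi_usrp.hpp>
#include <complex>
#include <iostream>
#include <vector>

int main()
{
    const double rate = 362319.0;

    // Placeholder device arguments; my actual setup differs.
    uhd::usrp::multi_usrp::sptr usrp =
        uhd::usrp::multi_usrp::make(std::string(""));
    usrp->set_rx_rate(rate);

    uhd::stream_args_t stream_args("fc32", "sc16");
    uhd::rx_streamer::sptr rx_stream = usrp->get_rx_stream(stream_args);

    uhd::stream_cmd_t cmd(uhd::stream_cmd_t::STREAM_MODE_START_CONTINUOUS);
    cmd.stream_now = true;
    rx_stream->issue_stream_cmd(cmd);

    std::vector<std::complex<float>> buff(16384);
    uhd::rx_metadata_t md;
    double prev_time = 0.0;
    bool have_prev = false;

    for (int i = 0; i < 100; i++) {
        const size_t num_rx = rx_stream->recv(&buff.front(), buff.size(), md, 1.0);
        const double t = md.time_spec.get_real_secs();
        if (have_prev) {
            // Difference between the time_spec of two successive buffers,
            // to be compared with the duration num_rx / Fs of one buffer.
            std::cout << "delta = " << (t - prev_time) << " s" << std::endl;
        }
        prev_time = t;
        have_prev = true;
    }

    rx_stream->issue_stream_cmd(uhd::stream_cmd_t::STREAM_MODE_STOP_CONTINUOUS);
    return 0;
}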

My first idea was that the time_spec returned by the recv method was the time
stamp of the first sample of the buffer, but I realize that this is not the
case. Is that correct?

As a consequence, how can I know the number of lost samples when an overflow
occurs?
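
For now my idea would be something along these lines (the helper is mine, not a
UHD function), assuming the time_spec really is the time stamp of the first
sample of each buffer:

#include <uhd/types/time_spec.hpp>
#include <cmath>

// My own helper (not part of UHD): estimate how many samples were dropped
// between the previous buffer and the current one.
static long long estimate_lost_samples(const uhd::time_spec_t &prev_time,
                                       size_t prev_num_rx,
                                       const uhd::time_spec_t &curr_time,
                                       double rate)
{
    // Time stamp the current buffer should have if nothing was lost.
    const uhd::time_spec_t expected =
        prev_time + uhd::time_spec_t::from_ticks(
                        static_cast<long long>(prev_num_rx), rate);
    const double gap_secs = (curr_time - expected).get_real_secs();
    // A positive value means a gap, i.e. dropped samples.
    return std::llround(gap_secs * rate);
}

I would call this with the metadata of two successive recv calls, and also
check md.error_code against uhd::rx_metadata_t::ERROR_CODE_OVERFLOW, but I am
not sure this is reliable given the jitter described above.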

thanks.

matis


