Hello everyone,

I am trying to program an Iris mote to transmit all packets at
maximum power, but the received signal strength (RSSI) observed at a
receiver makes me doubt that the packets are actually being
transmitted at full power. Here is my setup.

One transmitter mote transmits packets, and a second mote connected to
a computer through the serial port displays the RSSI (taken from the
packet metadata) of the received packets. The motes are placed on a
desk roughly 1.2 m apart. When programming the Iris motes I used the
flag -DRF230_DEF_RFPOWER=0x0 (no spaces around the '='). With this
setup, the observed RSSI on the receiver side is around -80 dBm. I
have used the following relation to convert from the raw value to dBm:

dBm = -91 + 1 * (raw - 1)

When repeating the same experiment with two MicaZ motes, the observed
RSSI on the receiver side is around -60 dBm. For the MicaZ motes I
used the flag -DCC2420_TXPOWER=0x1F and the following relation to
convert from raw values to dBm:

dBm = (raw > 127 ? raw - 256 : raw) - 45

The datasheets for the AT86RF230 and CC2420 state maximum transmission
powers of +3 dBm and 0 dBm respectively. Given that the radio chip on
the Iris mote transmits at +3 dBm compared to 0 dBm for the MicaZ, why
is there such a significant difference between the received signal
strengths when everything else has been kept the same? I know that the
RSSI measurements in these chips are not very accurate, but a 20 dB
difference is quite significant, and the chip that should transmit at
higher power appears to perform worse than its weaker cousin. We have
observed similar behaviour outdoors with a clear line of sight.

Any insight, comment or explanation would be much appreciated. I am
using a recent CVS checkout of tinyos-2.x.

Regards
_______________________________________________
Tinyos-help mailing list
[email protected]
https://www.millennium.berkeley.edu/cgi-bin/mailman/listinfo/tinyos-help