There is a standard calculation for "free-space path loss" that estimates how much a signal fades over distance:

Loss = 96.6 + 20*log10(d) + 20*log10(f)  dB

        d = distance in miles
        f = frequency in gigahertz

So if you know your EIRP (transmit power plus antenna gain, in dBm), you can estimate the signal strength at distance d: just subtract the path loss from the EIRP, and add any receive-antenna gain.
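
As a quick sanity check, here's a little Python sketch of that math (the function names are mine, and the 30 dBm / 6 dBi numbers in the example are just made-up link values, not from the book):

    import math

    def free_space_path_loss_db(d_miles, f_ghz):
        # Loss = 96.6 + 20*log10(d) + 20*log10(f), d in miles, f in GHz
        return 96.6 + 20 * math.log10(d_miles) + 20 * math.log10(f_ghz)

    def received_signal_dbm(eirp_dbm, rx_gain_dbi, d_miles, f_ghz):
        # Received signal = EIRP - path loss + receive antenna gain
        return eirp_dbm - free_space_path_loss_db(d_miles, f_ghz) + rx_gain_dbi

    # 1 mile at 2.4 GHz: 96.6 + 0 + 7.6 = ~104.2 dB of path loss
    print(free_space_path_loss_db(1, 2.4))       # ~104.2
    # e.g. 30 dBm EIRP, 6 dBi receive antenna -> ~ -68.2 dBm at the receiver
    print(received_signal_dbm(30, 6, 1, 2.4))    # ~ -68.2

One handy consequence: since the loss grows as 20*log10(d), every extra 6 dB in the link budget roughly doubles your free-space range.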


-Rob





PS: I got the equation from the Cisco Press book "Deploying License-Free Wireless Wide-Area Networks" - a good generic resource (not Cisco-specific).




At 12:36 PM 1/5/2004 -0800, you wrote:
I know this question is very vague, but I still wonder if there is an answer
to it.

Is there a theoretical formula that links antenna gain (in dB) to the increase
in maximum range of a wireless signal? I understand that in theory RF range is
infinite, but I cannot figure out whether there is a correlation between
antenna gain and signal range at a given frequency. Common logic says the
range will increase as the antenna gain improves, but by how much?

A formula for electric field strength goes something like E=9500*power/distance
(I might be wrong about this), but it does not take into account the frequency
of the signal.

P.S. I am not concerned with terrain configuration, weather conditions, and
other such factors at this time.
