We are obviously WAY off topic, but...

You are generally correct; however, your model is a little too simple.

First, the source is not truly a point.  The size varies with frequency and
antenna arrangement, but it can be on the order of an inch or more.  Not a
huge source, but at distances of a few feet or less it makes a big
difference in the calculations.
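
As a rough sketch of why source size matters close in, here is a hypothetical
comparison of an ideal point source against the same power spread along a short
line radiator.  The one-inch antenna length and the distances are made-up
illustrative numbers, not measurements:

```python
import math

def point_intensity(p, r):
    """Ideal isotropic point source: power p spread over a sphere of radius r."""
    return p / (4 * math.pi * r ** 2)

def line_intensity(p, length, r, n=1000):
    """Crude extended source: split total power p over n segments of a line
    radiator and sum the inverse-square contributions, measured broadside
    at distance r from the antenna's midpoint."""
    dp = p / n
    total = 0.0
    for i in range(n):
        x = -length / 2 + (i + 0.5) * length / n  # segment position along antenna
        d = math.hypot(x, r)                      # segment-to-observer distance
        total += dp / (4 * math.pi * d ** 2)
    return total

# Ratio of extended-source to point-source intensity: noticeably below 1
# when r is comparable to the antenna size, converging toward 1 farther out.
for r_inches in (0.5, 1, 6, 36):
    ratio = line_intensity(1.0, 1.0, r_inches) / point_intensity(1.0, r_inches)
    print(r_inches, round(ratio, 4))
```

At half an inch the two models disagree by roughly 20% in this toy setup; by a
few feet they agree to a fraction of a percent, which is the point above.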

Second, at least with the high-frequency digital phones, transmission is a
lot less omnidirectional than you might think; the pattern is usually more
of a double cone.  This not only changes the calculations, it partially
explains why small adjustments in orientation can make a difference in
actual exposure.

Third, and this was pointed out in an IEEE Spectrum article, most relatively
new phones are partially shielded on the comm side.  It is easy to think of
the shield as casting a 'shadow' - i.e., the further you are from the comm
side, the larger the 'shadow' of muted signal strength.  But this model is
not necessarily correct.  Radio waves are EMR, which is fundamentally
photons.  Photons don't act like tiny bowling balls; their behavior follows
the QED model.  Because of this, an area close to the shielding might be
very well protected, while an area just a little further away is bombarded
with 'scatter'.  The classic double-slit experiment, which demonstrates the
wave/particle 'duality' of photons, is a good example of this.
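
To see why a 'shadow' need not fade smoothly with angle, here is a quick
two-slit interference sketch (relative intensity goes as
cos^2(pi * d * sin(theta) / lambda)).  The wavelength and slit spacing are
made-up illustrative numbers, not anything from the article:

```python
import math

lam = 0.16  # meters; roughly the wavelength of a ~1.9 GHz digital phone carrier
d = 0.5     # meters; a hypothetical spacing between two unshielded gaps

# Relative two-slit intensity vs angle off the forward direction.
# Note it does NOT fall off monotonically: bright and dark fringes alternate,
# so a nearby spot can be quieter than one a little further off-axis.
for deg in (0, 5, 10, 15, 20):
    theta = math.radians(deg)
    i_rel = math.cos(math.pi * d * math.sin(theta) / lam) ** 2
    print(deg, round(i_rel, 3))
```

The dark fringe near 9 degrees in this toy geometry is the 'well protected'
spot; the bright fringe beyond it is the 'scatter'.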

Last, power is just part of the equation.  The EMR not only has to be
transmitted, it has to be received.  Think about radio: the Newtonian model
for light would suggest that the closer you are to the transmitter, the
stronger the signal.  This is generally true, but it falls apart at the
margins (really close and really far).  For example, if you are too close to
an AM station it can be very difficult to tune it in, yet you might be able
to pick up the signal half a continent away - well out of line of sight.

Of course, these factors have a lot to do with wavelength.  In some ways,
higher frequency makes EMR more predictable (FM is generally line of sight
and does not usually bounce mysteriously around the world).  But in other
ways it gets more confusing.  Think of a road mirage (it looks like a pool
of water on the road ahead): that is visible light (fairly high-frequency
EMR) taking the shortest-time path to you.  Instead of a straight line from
source to you, the path is the shortest trip through the cool air down to
the road, then a faster run through the layer of warm air just above it.
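
The mirage can be sketched with Snell's law: cooler air is denser and has a
slightly higher refractive index, so a ray grazing into the hot low-index
layer above the asphalt bends away from the vertical, and past the critical
angle it reflects back up.  The index values below are rough illustrative
figures, not measured data:

```python
import math

n_cool = 1.000293  # typical air at moderate temperature
n_hot = 1.000230   # hypothetical hot layer just above the road surface

# Total internal reflection occurs when sin(theta) > n_hot / n_cool,
# with theta measured from the vertical (so grazing rays have theta near 90).
theta_crit = math.degrees(math.asin(n_hot / n_cool))
grazing = 90 - theta_crit
print(f"rays within about {grazing:.2f} degrees of horizontal bend back up")
```

Even this tiny index difference yields a usable critical angle, which is why
the 'pool' only appears when you look at the road at a very shallow angle.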

ALL THAT SAID, I readily concede that, in general, potential power drops
very rapidly with distance...
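
For the idealized point source, "very rapidly" means inverse-square: the
radiated power spreads over the surface of a sphere (area 4*pi*r^2), so ten
times the distance gives one hundred times less intensity.  A quick sketch
(the 0.6 W figure is just an illustrative handset power, not from this
thread):

```python
import math

def intensity(power_watts, r_inches):
    """Intensity at distance r from an ideal isotropic point source:
    total power divided by the area of a sphere of radius r."""
    return power_watts / (4 * math.pi * r_inches ** 2)

ratio = intensity(0.6, 1) / intensity(0.6, 10)
print(round(ratio, 6))  # prints 100.0 - 10x the distance, 100x less intensity
```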

Best Regards,
-jjf

-----Original Message-----
From: John E. Christ III [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, June 28, 2000 7:01 PM
To: Palm Developer Forum
Subject: Re: cancer causing devices (was Re: QualComm PDQ Phone)

[SNIP]

Actually, since cell phones are essentially point-source omnidirectional
antennas (you don't have to aim them, right?), the electromagnetic field
disperses equally in three dimensions; the power spreads over the surface of
a sphere, so the intensity of the field drops by distance to the second
power.  That means the intensity at 10 inches away is 100 times less than at
one inch.  Think about that the next time you've got that phone jammed up
against your ear :)

Electromagnetic Field Theory 101 was such a long time ago.  Makes my head
hurt to think about it :)

John E. Christ III
UTA Software Centre
Tampa, FL.

-- 
For information on using the Palm Developer Forums, or to unsubscribe, please see 
http://www.palmos.com/dev/tech/support/forums/
