Below is the long version of why you should put all APs on the /same/
channel if they have overlapping coverage.
Another practical reason is the limitations of the client-side
radio/software: e.g., scanning channels 1-11, a client will latch onto
the first 'reasonable'-seeming signal instead of looking through the
rest of the channels to find a much better one.
I had first-hand experience of this a few weeks ago when I tried, at my
office, the new Meraki feature that allows its radios to operate on
different channels. Everything went to hell, especially with the Mac
laptop clients, which decided the AP at the other end of the office was
the preferred AP.
Much better is the 'band-steering' support that identifies a client as
802.11a-capable and stops accepting 802.11b/g connections from it. Our
incredibly cluttered building airspace has heaps of 2.4GHz traffic but
almost no 5GHz, so this has been a real boon once we got past some
initial 'my saved (2.4GHz) connection no longer works' problems.
-------- Original Message --------
Subject: [IP] The 'Wi-Fi At Conferences' Problem
Date: Sun, 11 Oct 2009 10:11:47 -0400
From: David Farber <[email protected]>
Reply-To: [email protected]
To: ip <[email protected]>
References: <[email protected]>
Begin forwarded message:
From: Jim Thompson <[email protected]>
Date: October 11, 2009 10:08:21 AM EDT
To: [email protected]
Cc: Brett Glass <[email protected]>
Subject: Re: [IP] Re: The 'Wi-Fi At Conferences' Problem
Dave, for IP, as you desire.
On Oct 10, 2009, at 3:47 AM, David Farber wrote:
Begin forwarded message:
From: Brett Glass <[email protected]>
Because our ISP has developed BSD UNIX-based traffic shaping and
anti-bandwidth hogging software for its own networks, we have also
deployed it for conferences. It does a good job of keeping each
user's bandwidth use to a sane level, and is capable of
administratively blocking activities that have no place on a shared
conference LAN.
We also use RF tricks. One of the best is to use 802.11a, which has
lots of non-overlapping channels. (You can even use alternate
channels, especially if you use the lower, indoor-only ones.) Also,
when they process 802.11a, the chipsets also tend to be better at
filtering out unwanted signals on nearby frequencies. 5 GHz signals
also penetrate walls poorly, which is a good thing; it allows
frequency reuse and helps to avoid interference from sources outside
the building or in other parts of the venue.
Hmm,
First, an apology. My previous message was poorly-edited. My
concentration and memory recall seem somewhat less than I'd like since
my late August "open heart" surgery. I've tried to improve below.
Let's start with what was.
To my knowledge, none of the 802.11b radios formerly on the market had
enough selectivity to eliminate in-band signals from a nearby radio
operating on an adjacent (25MHz away) or alternate (50MHz away) channel.
The minimum IEEE spec for an 802.11b receiver is 35dB of adjacent (25MHz
away) channel rejection. The IEEE doesn't publish specs for alternate
channel rejection on 802.11b, but I can tell you that the best designs
(in terms of ACR) are the "old" super-het receivers. Intersil
(subsequently Conexant)'s Prism2/2.5 is good for about 41dB of ACR;
the Lucent/Orinoco design was similar. Alternate channel rejection is
perhaps 20dB "better" with these chipsets. (Note that I have real-world
experience with both, courtesy of Vivato.)
This means that if you're operating (your Prism 2/2.5 or Orinoco
chipset radio) on channel 11, a signal on channel 1 will be 60dB down
from where it would be if your radio were on channel 1.
Free space path loss (**) in the first meter @ 2.4GHz is -41dB. Let's
say you've got a garden-variety radio that puts up 32mW (15dBm) of tx
power, and ignore antenna gain for now (so 0dBi antennas on both
radios). 15dBm - 41dB = -26dBm, and 15dBm - 60dB = -45dBm. These are
the in-channel 'noise power' of the alternate-channel radio for the
chipsets detailed above, for adjacent and alternate channel operation.
Notice that this is at least 55dB (> 200,000:1) above the thermal noise
floor (let's say -100dBm for now). Translated: you've significantly
lowered your SINR. Moreover, this signal level is at least 40dB
(10,000:1) higher than the 802.11b CCA threshold, so you'll quiesce
the entire cell (no matter which radio was trying to operate) whenever
one transmits. It's worse than this: if one radio was receiving,
you've destroyed the incoming packet(s).
Thus, in 802.11b, even if you pick "1 and 11", you're going to end up
with significant in-channel power from any operation on the alternate
channel by a close-by transmitter. This is why you can't "co-locate"
APs in the same band on a rooftop or tower. Now try, just *try* to
discuss this in a rational manner with most "wireless ISPs".
Now, if you add any antenna gain to the equation, then unless you
manage to find really special antennas (or use channel filters, etc.),
the gain of both antennas affects the above in a negative way.
At the end of the day, a 10m separation with small (2.2dBi) antennas
was often practical with 802.11 (pre-b) and 802.11b. By the time you
move from 1m to 10m, the free space path loss has increased from -41dB
to -60dB, which is "barely enough". (Do the math!)
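The free-space numbers above can be checked with the standard FSPL
formula; here is a minimal sketch (the 41dB and 60dB figures in the
text are rounded):

```python
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

TX_DBM = 15.0  # the garden-variety 32 mW radio, 0 dBi antennas

for dist in (1.0, 10.0):
    loss = fspl_db(2.4e9, dist)
    print(f"{dist:>4} m: FSPL {loss:.1f} dB, received {TX_DBM - loss:.1f} dBm")
# 1 m gives ~40 dB of loss (the text rounds to 41) and ~-25 dBm received;
# 10 m gives ~60 dB of loss and ~-45 dBm received, matching the figures above.
```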
So the minimum (and essentially optimal) 802.11b AP separation with
2dBi antennas on channels 1 and 11, in free-space, is around 10m. Any
closer and you're raising the noise floor in any AP or client that is
receiving a packet. Note that a client between the two APs may be on
either channel. (In 802.11, the client gets to pick which AP to
associate with.) Where there is one, there can be two, and these two
might end up on different APs (and different channels), and as a
result, interfere with each other's frames. The answer here (and
with 802.11g/802.11a networks) is not the little cells on a hexagonal
layout that you so often see in the literature, but rather larger
cells of co-channel APs, laid out on the same hexagonal pattern. This
minimizes the 'edge cases' just described and lets CCA (see below) do
its job.
This situation is (much) worse today with 802.11g and 802.11a. Due to
their (now universal) direct-conversion architectures, and some
interesting properties of OFDM, most 802.11g and 802.11a receivers
can't muster even the 35dB of ACR that the IEEE specifies for 802.11b
when operating in the 802.11g / 802.11a modes on adjacent channels,
and for these reasons, the ACR specification was reduced for 802.11g
and 802.11a receivers.
Brett is incorrect (to the limits of my knowledge) that 802.11a
equipment has superior ACR compared to 802.11g or 802.11b gear. In
the IEEE specification, ACR varies by modulation rate, but the minimum
ACR for 802.11g/a at 6Mbps is 16dB (alternate is 16dB more, for 32dB).
This goes down to -1dB (yes, -1dB) ACR @ 54Mbps (with alternate
channel rejection at 54Mbps down to 15dB.) The chipset vendors I've
spoken with during past jobs have stated that they might be able to
get a dB or two better than the spec (and we've not begun to discuss
tolerance issues), but the point is moot: you have to deal with the
clients that walk in the conference-room door, and these are likely to
be bottom-of-the-barrel minimum-spec units, if for no other reason
than that the manufacturer was shaving pennies.
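To see what those minimum-spec numbers mean in practice, here is a
hedged sketch: the ACR figures are the IEEE minimums quoted above, and
the 41dB / 1m path-loss figure is from the earlier free-space example.

```python
# In-channel leakage from a nearby (1 m, ~41 dB free-space loss) 15 dBm
# transmitter on the adjacent channel, at the IEEE minimum adjacent-
# channel-rejection (ACR) figures quoted in the text.
TX_DBM = 15.0
PATH_LOSS_DB = 41.0

acr_minimums_db = {
    "802.11b (any rate)": 35.0,
    "802.11g/a @ 6 Mbps": 16.0,
    "802.11g/a @ 54 Mbps": -1.0,  # yes, minus one dB
}

for mode, acr_db in acr_minimums_db.items():
    leak_dbm = TX_DBM - PATH_LOSS_DB - acr_db
    print(f"{mode}: in-channel leakage ~{leak_dbm:.0f} dBm")
```

At the 54Mbps minimum the "rejection" is actually negative: the
adjacent-channel signal arrives in-channel essentially unattenuated,
while a minimum-spec 802.11b receiver would have knocked it down to
around -61dBm.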
Brett is correct that there is much more opportunity for separating
APs into different sub-bands of the spectrum allocated for 802.11a.
This spectrum covers from 5.150 - 5.350 GHz and 5.470-5.825 GHz (U-NII
rules) and/or 5.725 - 5.850 GHz (ISM rules). There are some
restrictions here: 5.150 - 5.250 GHz is for 'indoor-only' operation
(23dBm EIRP, fixed antennas only), and operation between 5.250 - 5.350
and 5.470 - 5.725 GHz is limited to 30dBm EIRP and must support both
dynamic frequency selection (DFS) and transmit power control (TPC).
These restrictions serve to 'limit' the applicability of these
frequencies to outdoor use, but most conferences are indoors.
So, assuming a conforming AP can be operated as low on the 5.150 -
5.250 GHz band as possible (without violating the restriction that any
out of band emission be no higher than -27dBm/MHz from the band edge
(*)), and assuming that operation is under the FCC (USA) regulatory
domain, the lowest available channel is "36" with a center frequency
of 5.180GHz. We've already seen (above) that alternate-channel
operation (50MHz away) **at 6Mbps** results in a mere 32dB of ACR
(IEEE minimum), and moving beyond a 60MHz spread is going to put the
next AP in the middle of 5.250 - 5.350, operating on either "56"
(5.280GHz) or "60" (5.300GHz, better). Now we've used 2 'sub-bands'
for two APs.
Assuming your (potential) clients all support operation in the 'new'
5.470 - 5.725 GHz band, there are opportunities for 3, or possibly 4
more APs that won't interfere with each other, or nearby clients on an
adjacent 'sub-band'. There is potential for one more 'cell' in the
5.725-5.825 GHz band,
provided it doesn't interfere with operation in the band directly
below; likely operations will occur on 5.805 GHz (channel 161). So yes,
you can support 6 (perhaps 7) more "cells" using these bands, but
first the gear that comes in the door (in customers' hands) has to
support 802.11a, and in particular has to support the bands you're
interested in furnishing. Note that the iPhone does not support
802.11a, and that, to date, no Android or Windows Mobile phone does,
either. Many of the "netbooks" that are so popular these days do not
support 802.11a, either.
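The channel arithmetic above follows the standard 5 GHz numbering rule
(center frequency = 5000 + 5 * channel number, in MHz); that convention
is well known but not stated in this message, so here is a small sketch:

```python
def center_mhz(channel: int) -> int:
    """Standard 5 GHz channel numbering: center = 5000 + 5 * n MHz."""
    return 5000 + 5 * channel

# The channels named above, and their center frequencies:
for ch in (36, 56, 60, 161):
    print(f"channel {ch}: {center_mhz(ch)} MHz")

# A 60 MHz spread up from channel 36 (5180 MHz) reaches 5240 MHz, still
# at the top of the 5.150 - 5.250 sub-band, so the next AP has to land
# in 5.250 - 5.350:
print(center_mhz(36) + 60)  # 5240
```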
The thing you really have to understand is this: in wireless, it is a
sin to throw what would have been an otherwise correctly-received
packet on the floor (or destroy it as outlined above). You really
can't afford to have a nearby AP think the air is clear (because it's
off-channel) and then stomp on the reception of some nearby AP or
client.
(As an aside, if you think about it, since "traffic shaping" works by
dropping already-received frames, its use with 802.11 is also a sin.
There are a plethora of other issues with "traffic shaping", and the
IP list has been over the discontinuities in statements similar to
Brett's, "administratively blocking activities that have no place on a
shared conference LAN".)
In co-channel operations, there is a function called Clear Channel
Assessment (CCA) that holds any given radio from transmitting if either
a) there is a strong signal using the wireless medium, or
b) there is a somewhat weaker signal, which is clearly an 802.11
signal (that is, "this" radio has decoded the preamble), above some
threshold.
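The two-part CCA rule just described can be sketched as follows; note
that the -62dBm energy-detect and -82dBm preamble-detect thresholds are
the commonly cited 802.11 OFDM figures for a 20MHz channel, assumed
here for illustration, not numbers given in this message:

```python
# Clear Channel Assessment sketch. The two thresholds below are the
# commonly cited 802.11 OFDM defaults for a 20 MHz channel; they are
# assumptions for illustration, not figures from the message above.
ED_THRESHOLD_DBM = -62.0  # case (a): any strong energy on the medium
PD_THRESHOLD_DBM = -82.0  # case (b): a weaker but decoded 802.11 preamble

def cca_busy(rssi_dbm: float, preamble_decoded: bool) -> bool:
    """Return True if the radio must hold off transmitting."""
    if rssi_dbm >= ED_THRESHOLD_DBM:
        return True                      # (a) strong signal, any kind
    if preamble_decoded and rssi_dbm >= PD_THRESHOLD_DBM:
        return True                      # (b) recognized 802.11 signal
    return False

print(cca_busy(-50.0, False))  # True: strong energy, even if undecodable
print(cca_busy(-75.0, True))   # True: weaker, but a decoded preamble
print(cca_busy(-75.0, False))  # False: weak non-802.11 energy is ignored
```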
We did a bit of work at Vivato to make several radios (13 in the 11b
product, 6 in the 11g product) synchronize via the CCA signal, such
that if any of them was receiving a packet, none of the others would
transmit. This is the key to making a "conference room" network
really fly. Multiple, independent APs, all on the same channel, which
are able to co-ordinate operation between themselves in a highly-
similar manner. Given the higher rates of 802.11g (54Mbps translates
to about 27Mbps of throughput for a single client, but this degrades
with increased loading by additional clients), or, better, 802.11n,
it's entirely possible to have the available 'in-room' bandwidth exceed
the typical 'Internet connectivity' bandwidth for most situations.
(E.g., a single cell of 802.11g APs and a single cell of 802.11a APs
could match the available bandwidth of a DS-3 exiting the conference
center.)
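The back-of-the-envelope behind that DS-3 claim, using the ~50% MAC
efficiency rule of thumb from the text:

```python
# Rough check of the DS-3 claim above: one 802.11g cell plus one 802.11a
# cell, each at a 54 Mbps PHY rate with ~50% MAC efficiency (the text's
# "54 Mbps translates to about 27 Mbps of throughput" rule of thumb,
# single-client best case; it degrades with load).
PHY_RATE_MBPS = 54.0
MAC_EFFICIENCY = 0.5
DS3_MBPS = 44.736             # payload rate of a DS-3 line

cell_throughput = PHY_RATE_MBPS * MAC_EFFICIENCY          # ~27 Mbps
in_room = 2 * cell_throughput                             # one g + one a cell
print(in_room, ">=", DS3_MBPS, "->", in_room >= DS3_MBPS)  # 54.0 >= 44.736 -> True
```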
Another issue at Vivato was that the 'antennas' for these radios were
both highly directional and pointed in different directions, so the
actual signal level might not be high enough to set CCA at a given
client, but any client transmitting would ruin the reception (at
the AP) of any other. If you look at the patent art out of Vivato,
one of the big ideas was "complementary" beamforming, which was a
method to raise the signal level "high enough" at the rest of the
clients to keep them from transmitting (and ruining the inbound
packet). See:
<http://www.ll.mit.edu/asap/asap_03/6-page_Papers/tarokh_ASAP.pdf>
When you consider that channel models out of AT&T Bell Research and
Harvard show that the 54Mbps 'rate' of 802.11g / 802.11a is only
achievable out to a radius of 8 meters in a NLOS (non line-of-sight)
environment, and that to maximize throughput to/from a room-full of
conference attendees, you want to be able to send (and receive) at the
highest rate possible, the desirability of an architecture designed to
do same seems inevitable. As a reminder, a 'radius of 8m' will cover
a (square) room where the walls are no farther apart than 37 feet: a
square inscribed in a 16m-diameter circle has sides of 16m/sqrt(2),
about 11.3m, or 37 feet. However,
with 802.11n, range can be as much as doubled (even through walls)
compared to the same operation using conventional IEEE 802.11 OFDM, if
the environment supports "sufficient" orthogonality in the channel
matrix. (You can think of this as 'scattering', see below.)
However, 802.11n also supports an optional 40MHz wide 'channel' that
can be used to increase bandwidth (and therefore modulation rate), and
this plays against all the channel spacing discussion above.
I have ideas on how to make the above work (sans any back-end
'co-ordination' such as that found in several "Wi-Fi switch" products),
if anyone wants to get in touch. It requires hardware modifications to
the APs, Gigabit Ethernet uplinks, and a bit of software, but nothing
else.
It also helps to know your equipment and use quality gear. When I
attended David Isenberg's conference a couple of years back, the
folks who were running the network used Linksys routers, but with the
upstream network cables plugged into the 4 port switch rather than
into the upstream port of the NAT router.
I believe they did this -- in a spirit of extreme "network purism"
-- so that they could avoid doing NAT and give every client a
public, static IP. (You can't turn NAT off on the Linksys routers,
so they couldn't just give each router a subnet.)
I don't know which Linksys model was used at Isenberg's conference
(and Brett doesn't say), but for at least some models of Linksys
wireless APs, you can disable NAT. Moreover, giving each router its
own subnet seems like either overkill or a waste to me.
An interesting idea in theory, but a bad one in practice. Not only
did this architecture fail to prevent broadcast packets from being
relayed between the APs, jamming the network; it also triggered a
known bug in the Linksys routers that caused packets to be reflected
back from the access points.
Broadcast isn't necessarily bad, or even poor network design. If your
back-end network (hopefully a switched Ethernet) can't deal with the
frames uplinked from the APs, then something is very wrong, or at a
minimum, very misconfigured.
This paper from TI touches on the topic of ACR and gives additional
detail to what I've written above.
<http://focus.ti.com/pdfs/bcg/80211_acr_wp.pdf>
Meru also seems to have a clue about these subjects.
As for Brett's assertion that 802.11b/g signals "penetrate" walls more
than 802.11a signals, the research doesn't tend to support him. Yes,
there is an approximately 3dB absorption-loss difference between 2.4GHz
and 5GHz through walls, which is approximately linear with frequency,
but scattering is at least as important as absorption, and the
scattering due to the signal passing through walls is not
appreciably different at 2.4GHz than at 5GHz. Further, 3dB, while "a
lot" (2x or 1/2x, depending on how you wish to view it), is probably
not enough to 'separate' rooms in the manner Brett implies,
other than in 'edge' cases. Countering this is the simple fact that a
given-size antenna is "larger" at 5GHz than at 2.4GHz by the same
(frequency-linear) amount, and the antenna gains at both ends add,
though these 'larger' antennas are, of necessity, more 'directive'.
For more on this, see:
<http://www.triquint.com/prodserv/tech_info/docs/WJ/Indoor_prop_and_80211.pdf>
but note that the paper states in conclusion:
"We suggest that there is no intrinsic impairment of 5.2-5.8 GHz
propagation vs. 2.4-2.5 GHz propagation in office/light manufacturing
environments, and thus no intrinsic impediment to roughly equivalent
deployment of 802.11a and 802.11b wireless LAN systems. Note,
however, that this does not imply equivalence of practical systems:
higher frequencies suffer higher losses in cables and circuit boards,
and low-cost devices may suffer from reduced gain or lower output
power at higher frequencies. Further, to achieve equivalent collecting
area, higher-frequency antennas become more directional, which may be
inconvenient for end-users."
Some of these factors may account for the effects Brett describes,
rather than (lack of) "penetration".
Jim
(*) note to non-RF engineers, this is HARD work.
(**) yes, I'll live in my own state of sin, and speak about this as
though it were reality rather than merely a model.
-------------------------------------------
Archives: https://www.listbox.com/member/archive/247/=now
RSS Feed: https://www.listbox.com/member/archive/rss/247/
Powered by Listbox: http://www.listbox.com
_______________________________________________
SoCalFreeNet.org General Discussion List
To unsubscribe, please visit:
http://socalfreenet.org/mailman/listinfo/discuss_socalfreenet.org