802.11a

When the 802.11 standard was finalized in 1997, it was clear that the initial standard's 2 Mbps data rates were lower than the market was likely to accept. Very soon thereafter, the IEEE created two task groups to define faster refinements to 802.11 wireless networking. The 802.11a and 802.11b task groups were formed at almost the same time, but the 802.11a concept was more ambitious, and even though both 802.11a and 802.11b were finalized in 1999, it took more than a year longer for 802.11a hardware to reach the market. Indeed, the 802.11a equipment market was just beginning to hit critical mass in late 2002.

The reasons are not hard to explain. 802.11b was intended to be backwards compatible with the initial 802.11 standard. It uses the same frequency band as 802.11 and has much technology in common with it. Companies with existing 802.11 products were able to upgrade their products quickly to 802.11b. 802.11a, on the other hand, was a radical rethinking of the 802.11 idea, especially at the lowest, physical layer of data transfer, where the radio waves and modulation schemes happen.

802.11a completely re-makes the network physical layer. It operates in the 5 GHz portion of the radio spectrum, literally more than twice the frequency of 802.11. Radio technology at 5 GHz is only distantly related to radio technology at 2.4 GHz. The parts and assembly techniques used are incompatible.

Perhaps even more radical differences lie in the frequency management and modulation schemes used by the two standards. Frequency management is a scheme for controlling where in the frequency spectrum a radio's signal exists at any given moment. Modulation is a scheme for imposing data on the radio signal. 802.11 uses two different spread spectrum techniques for frequency management, Frequency Hopping Spread Spectrum (FHSS) and Direct Sequence Spread Spectrum (DSSS). A different modulation scheme is used in 802.11 for each of the two supported bit rates. 802.11b improves a little on DSSS to boost its bit rate to 11 Mbps.

802.11a systems, by contrast, use a frequency management system with the intimidating name Orthogonal Frequency Division Multiplexing (OFDM). OFDM allows eight different bit rates, from 6 Mbps to 54 Mbps, by combining four different modulation schemes with different coding rates. An 802.11a connection will use the highest data speed possible given current conditions between the two 802.11a devices. This speed is chosen automatically.
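For the curious, the eight 802.11a rates and the modulation/coding pair behind each can be laid out in a short sketch. The rate table is standard 802.11a; the selection function below is a simplified illustration of "use the highest rate possible," not a real driver algorithm.

```python
# The eight 802.11a data rates (Mbps) and the modulation scheme plus
# coding rate that produces each one. Four modulations, eight rates.
RATES_MBPS = {
    6:  ("BPSK",   "1/2"),
    9:  ("BPSK",   "3/4"),
    12: ("QPSK",   "1/2"),
    18: ("QPSK",   "3/4"),
    24: ("16-QAM", "1/2"),
    36: ("16-QAM", "3/4"),
    48: ("64-QAM", "2/3"),
    54: ("64-QAM", "3/4"),
}

def best_rate(usable_rates):
    """Pick the highest standard rate the link can currently sustain.

    'usable_rates' stands in for whatever rates the radio decides it
    can hold given current conditions -- a simplification, of course.
    """
    return max(r for r in RATES_MBPS if r in usable_rates)

print(best_rate({6, 12, 24, 36}))   # so-so conditions: 36 Mbps wins
```

Note that the two devices negotiate this automatically; users never pick a rate by hand.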

It's not necessary for you to digest or remember the heavy tech here. What matters is that 802.11a and 802.11b are radically different in how they work, so much so that there is zero compatibility between the two.

What is important is that 802.11a gives you speed. 54 Mbps is major, almost five times what 802.11b can develop under the best of circumstances.

The bad news (as always) is that 'circumstances' do matter. The range of 802.11a wireless networking is significantly shorter than that of 802.11b. This has less to do with the frequencies used (as many have said) than with the simple fact that high bit rates don't 'travel' as well over radio waves as lower bit rates do, power levels being equal. The quality of a high bit rate signal degrades faster with distance than that of a low bit rate signal. With more bits flying through the air per unit time, even minuscule noise pulses can corrupt a packet and force a resend. The quality of radio reception (in terms of signal-to-noise ratio) improves as transmitter and receiver move closer together, so the closer the two devices are, the higher the data rate that's possible. In part it's engineering, of course, but a lot of the problem is simple physics.

The practical consequences are that as you move an 802.11a client adapter farther from an 802.11a access point, the bit rate used will drop. On the fringes of reception, don't expect much more than the lowest rate of 6 Mbps. You'll get your best rates when the clients are in the same room with the access point, without any obstructions between the two. (Think of a conference room with an access point on the wall and everybody tapping away on wireless-equipped laptops at the main conference table.) Put any walls in the way, and 54 Mbps will drop to a lower rate. How much lower depends on the distance and the construction of the wall, and as always in fluky things like radio work, Your Mileage Will Vary.
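The fallback behavior described above can be sketched in a few lines. Be warned that the SNR thresholds here are made-up round numbers purely for illustration; real radios use vendor-specific values that vary with the hardware.

```python
# Sketch: how a client might fall back to lower 802.11a rates as signal
# quality degrades with distance. Thresholds are HYPOTHETICAL round
# numbers for illustration only; the rates themselves are standard.
FALLBACK = [          # (minimum SNR in dB -- hypothetical, rate in Mbps)
    (25, 54), (24, 48), (18, 36), (17, 24),
    (11, 18), (9, 12), (8, 9), (4, 6),
]

def rate_for_snr(snr_db):
    """Return the fastest rate whose (hypothetical) SNR floor is met."""
    for floor, rate in FALLBACK:
        if snr_db >= floor:
            return rate
    return None   # below even the 6 Mbps floor: out of range

print(rate_for_snr(30))   # strong signal, same room -> 54
print(rate_for_snr(10))   # fringes of reception -> 12
```

The shape of the table is the point: each wall or extra foot of distance shaves off SNR, and the radio quietly steps down the ladder.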

It probably doesn't matter to small office and home office people, but 802.11a is still strictly an American phenomenon. (Don't expect your laptop, with an 802.11a PC card stuck in it, to talk to European access points.) European governments have not set aside precisely the same spectrum space at 5 GHz as our American FCC has, and there is already a high-speed European wireless LAN standard called HIPERLAN/2. The IEEE has a couple of task groups working on ironing out differences between 802.11a and HIPERLAN/2 (as well as a few other non-US wireless protocols) so European and American high-rate wireless LAN hardware can interoperate at 5 GHz. This is being addressed in task groups 802.11h and 802.11j. See those topics for more information.

On the upside, interference from other devices is much less of a problem with 802.11a. The 5 GHz spectrum it uses was set aside for the Unlicensed National Information Infrastructure (U-NII) program (remember the Information Superhighway?), and it does not have to share radio space with cordless phones, microwave ovens, and Bluetooth smart gadgetry.

Costs on 802.11a hardware are still higher (significantly higher) than on 802.11b. Toward the end of 2002, 802.11a access points were hovering around $350, with 802.11a PCMCIA cards coming in at about $130 each. This was almost twice what 802.11b hardware was selling for at the same time.

Should you replace your existing 802.11b hardware with 802.11a hardware? No. (It's too early to tell, but 802.11g hardware may eventually allow you to add faster nodes to an existing 802.11b network without scrapping everything wholesale.) But if you're still scoping out your first wireless network, make sure to read Chapter 4, where I discuss the 802.11a vs. 802.11b decision in considerable detail. The decision isn't obvious, and it turns on more than just costs. (If you hate making decisions, 'dual-mode' hardware may soon make the a vs. b debate moot; more on this shortly.)

My expectation is that within two years, 802.11a hardware will be completely mainstream, and no more expensive than 802.11b hardware is as I write this. 802.11b hardware will still be available, but mostly for compatibility reasons, to allow people to add more nodes to existing 802.11b networks.

In the meantime, we have 'dual-mode' access points and residential gateways, which package independent 802.11b and 802.11a radios in a single box, allowing both 802.11b clients and 802.11a clients access to the same network. These first appeared late in 2002, and are no more expensive than 802.11a hardware. It is possible (we won't know until we get there!) that dual-mode hardware will eventually replace both 802.11a and 802.11b hardware, at least for access points and gateways. Experts on the chip end of the field tell me that a lot of research is leaning in the direction of integrating 802.11a and b in the same chipset, to bring down the cost of dual-mode hardware radically.

As if that weren't enough to consider, if 802.11g hardware comes to market cheaply enough, it has the potential to replace 802.11b completely, even in dual-mode hardware. See the 802.11g topic for more on that subject.



Jeff Duntemann's Drive-By Wi-Fi Guide
ISBN: 1932111743
Year: 2005
Pages: 181
