Knowing whether and how much and in what ways to trust a technology requires understanding that technology to some extent. I'm reminded of the old joke about the person who responds with wide-eyed surprise at the news that his checking account is overdrawn: 'But that's impossible! I still have checks in my checkbook!' Not too clear on the concept, we say with a grin. But be honest: How clear are you on the concept of network operation, much less network security? And more to the point, how clear on the concept do you have to be?
Using technology without being a hazard means taking on the responsibility of understanding it well enough to know where the hazards lie. There's a well-traveled urban legend about a newly retired man who finally achieves his dream of buying a 40-foot Winnebago RV so he can spend a year on the road seeing the country. Half an hour into his first trek down the Interstate, he pushes the cruise control button and heads back into the RV's kitchen to make himself a cup of coffee. Seconds later, the RV goes off the highway and into the ditch. In some versions of the legend, he sues Winnebago for not telling him that the cruise control wasn't the same as an automatic pilot. (In reality, he would be unlikely to survive such an adventure. Darwin knew what he was talking about.)
This story is a howler because we all know that only a knucklehead would assume that cruise controls can steer a vehicle. Faced with a computer full of complicated and obscure mechanisms, however, how confident are you that you understand what's a hazard and what isn't? Working a computer (especially a networked computer) and driving a car both require a certain amount of study and practice. You don't have to be a degreed expert to understand the hazards that networking carries with it. You must, however, make the effort to learn what the hazards are and what you must do to avoid them. Trusting the machine too much is worse than not trusting it at all.
The easiest way to do this is by reading books (like you're doing right now) and talking to smart people who have been down that road before you. I also recommend the short night courses in computer operation that you'll find at community colleges, public libraries, and senior centers.
However you do it, do it. Be as clear on the critical computing concepts as you can, so you know what parts of the system you can trust, and how much.
I was in high school when I heard the story (possibly, though not necessarily, an urban legend) of the student in machine shop who needed a couple of small C-clamps to hold two pieces of metal together while he drilled them. The C-clamp drawer was empty, so he went to the micrometer drawer and pulled out two micrometers, which he then used as clamps, at least until the instructor saw him. (I admit that if you've never used a micrometer this won't make sense at all; if you have, well, it's very funny. See Figure 11.1 to better understand how the two tools differ.)
Figure 11.1: C-Clamps and Micrometers.
Micrometers are not C-clamps, even though they're shaped a little like C-clamps and a naïve person might assume that in a pinch they might serve as C-clamps. However, if you trust a micrometer to do the job of a C-clamp, you run the risk of having your lashup fly apart on the drill press, scattering chunks of metal and micrometers in all directions. You can't trust a micrometer to do a C-clamp's job, and it's not the micrometer's fault when it fails.
This problem infects the Wi-Fi world, where a relatively simple security mechanism called Wired Equivalent Privacy (WEP) is often trusted to do many things it was never intended to do. Big portions of the Wi-Fi security drawer are not only empty but simply missing, so because WEP is at least present, people expect it to do everything.
This problem can be eased a little by addressing the first one: Know your technology at least well enough to know what problem it was intended to solve, and don't expect it to solve other problems just because it's handy.