Quantum Nonsense


Let's move on from relativity to quantum mechanics. Recently I had someone who was unwilling to make a forecast say to me:

"It's just like quantum mechanics. All I can give you is a probability."

Although the second part of his claim was most assuredly true, I am certain that it had absolutely nothing to do with quantum mechanics.

About 20 years after the relativity revolution, circa 1927, quantum mechanics burst upon the human race with equally momentous and unsettling effect.[6]

[6] Some date the origin of quantum mechanics back to Planck's work in the early 1900s, which was contemporaneous with that of Einstein. I use 1927, because the papers that Schrödinger published in 1926 were publicized in early 1927, giving us Schrödinger's Equation. That formalized things and really launched the revolution.

All you have to remember about quantum mechanics is all you have to remember about relativity. Neither theory replaces Newton's Laws. Whereas Einstein's Relativity Theory extends Newton's Laws into the domain of the very fast (velocities near the speed of light), quantum mechanics extends classical physics into the domain of the very small. That is, when we get down to subatomic dimensions, new rules come into play. That's when we need to use quantum mechanics. For everything else, the rules of quantum mechanics still apply, but the effects are so small that they are irrelevant.

It's important to realize that quantum mechanics is one of the most successful theories of all time. It has been able to explain a vast array of very counterintuitive things that we now take for granted in our everyday lives. Without it, we wouldn't understand how semiconductors work, which would make my use of a Pentium processor to write this book somewhat moot. It goes right to the matter of explaining why atoms are stable, without which, as John Walker points out, your whole day would be ruined. So while we daily deal with technology that derives from quantum mechanics, we rarely see phenomena that directly exhibit quantum effects. It's a subtle point, perhaps.

The reason it took so long to discover both bodies of knowledge is that we could not measure either stuff that went really fast or things that were really small much before the second half of the 19th century. Actually, it was the invention and perfection of the vacuum pump (an engineering feat) that facilitated measurement in both arenas. This also explains why the effects that required the application of either Einstein's theory or quantum mechanics were not observed; except for the conundrum about the wave-particle duality of light (known even in Newton's time), nothing in our plodding macroscopic world hinted that anything was "wrong."

The "It's just like quantum mechanics" quote reveals another interesting misconception. Because quantum theory depends on calculations involving probabilities, many people think that predictions based on quantum mechanics are somehow imprecise. The reality is just the opposite.

For example, we can determine α, the fine structure constant,[7] experimentally. Now this number is quintessentially a "modern" physics number: It is made up of, among other things, the charge on the electron, Planck's constant (see more on this later), and the speed of light. When you are measuring it by any method, you are doing quantum mechanics, and the theoretical predictions of the number involve some of the deepest applications of quantum theory we know. Yet we can do experiments that measure its value to about one part in 10⁸. Now that is pretty good in anybody's book.

[7] The fine structure constant comes up when considering the separation of lines observed when doing spectroscopy on the atoms of an element. Quantum mechanics evolved as physicists tried to explain the various separations for different elements; later, quantum theory was used to predict higher-order effects on the spectra when, for example, the atom in question was subjected to an electrical or magnetic field.

By contrast, G, the universal gravitational constant (a perfectly "classical" quantity known since the time of Newton), has been experimentally measured to only about one part in 10⁴. That's not bad either; it corresponds to 0.01 percent precision. Yet we know α with roughly ten thousand times more precision. Somewhat ironic, isn't it?
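If you want to check that comparison, here is a tiny back-of-the-envelope sketch in Python. The two inputs are simply the "one part in 10⁸" and "one part in 10⁴" figures quoted above, not new measurements of anything.

    # Compare the two precisions quoted in the text.
    alpha_relative_uncertainty = 1e-8   # fine structure constant: ~1 part in 10^8
    G_relative_uncertainty = 1e-4       # gravitational constant:  ~1 part in 10^4

    print(f"G precision as a percentage: {G_relative_uncertainty:.2%}")  # prints 0.01%
    ratio = G_relative_uncertainty / alpha_relative_uncertainty
    print(f"alpha is known about {ratio:,.0f} times more precisely")     # prints 10,000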

So much for the probabilistic nature of quantum mechanics and its relation to making predictions.

More Quantum Nonsense

Actually, my pet peeve is the frequent misuse of "Heisenberg's Uncertainty Principle." If you are interested in a particularly droll example of this, see Freddy Riedenschneider's monolog in the Coen brothers' movie, "The Man Who Wasn't There."[8] It is easy to see why Billy Bob Thornton got the chair after his lawyer tried to use the principle to convince (or confuse) a jury.

[8] USA Films, 2001.

I hear a common lament when someone is asked to make a difficult measurement:

"We're screwed. Heisenberg tells us we can't measure something without disturbing it."

Here's another example: Software people now talk about "Heisenbugs."

"Man, it took us weeks to track down that defect. Turned out to be a Heisenbug."

These are bugs that are very hard to eliminate because, in the process of trying to do so, we change the working of the program, and our original bug is further hidden by the actions of the debugging apparatus.
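To make that concrete, here is a minimal, purely illustrative sketch of a Heisenbug in Python (it is not taken from any real project): several threads update a shared counter without a lock, and the "debugging apparatus" is nothing fancier than an occasional print statement. The race tends to lose updates, and switching the instrumentation on shifts the thread timing enough that the symptom can change.

    # A deliberately buggy read-modify-write on a shared variable, with
    # optional "instrumentation" that changes the timing. All names and
    # iteration counts are made up for this sketch.
    import threading

    counter = 0

    def increment(iterations, instrumented=False):
        """Increment the shared counter without a lock (the bug)."""
        global counter
        for i in range(iterations):
            value = counter                                # read
            if instrumented and i % 200_000 == 0:
                print(f"  checkpoint: counter={counter}")  # the "debugging apparatus"
            counter = value + 1                            # write back; increments made by
                                                           # other threads in between are lost

    def run(instrumented=False):
        global counter
        counter = 0
        threads = [threading.Thread(target=increment, args=(500_000, instrumented))
                   for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return counter

    expected = 4 * 500_000
    print("plain run:       ", run(False), "of", expected)
    print("instrumented run:", run(True), "of", expected)
    # On a typical run the plain count falls well short of 2,000,000; with the
    # prints switched on the timing changes, and the shortfall can shrink,
    # grow, or vanish entirely, which is what makes Heisenbugs so maddening.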

What is really going on here?

Measuring Stuff

The fundamental issue is this: Can you measure something without at the same time disturbing the thing you are trying to measure? That is, when you perform the measurement, do you influence in some way the very thing that you are trying to determine? If so, then you have a problem, because your measurement will be contaminated by your perturbation of the system you are trying to measure.

Now, this is not an extremely "deep" problem. Medical diagnosticians have to deal with it all the time. They spend a lot of time and energy making assessment procedures as minimally invasive as possible. Yet we know that some people's blood pressure goes up the minute a cuff is put on their arm. Ergo, their measured blood pressure is higher than their normal resting blood pressure.

In software, we work very hard to make debuggers "non-intrusive." Nonetheless, sometimes the act of debugging changes something that causes the program to behave differently from when it is running without the debugger. Whether this is the fault of the program or the debugger is somewhat moot; in either case, the programmer has a big problem.

And in the 1920s, Elton Mayo discovered the Hawthorne Effect: he demonstrated that in studies involving human behavior, it is difficult to disentangle the behavior under investigation from the changes that invariably occur when the group under study knows it is being studied. That is why medical experiments today are performed using a double-blind methodology: neither the patients nor the administering doctors know who is getting the treatment and who is getting the placebo.

Note that these phenomena are perfectly "classical"; you don't need quantum mechanics or the Heisenberg Uncertainty Principle to explain them.

Before we delve more into Heisenberg, we might ask the following question: Is it possible to do any measurement, even a macroscopic one, that is totally "non-intrusive"? If I can find just one example, then I can debunk the idea that it is impossible.

So here's my example. I wake up in a hospital bed in a room I have never seen before. I want to figure out how large the room is. So I count the ceiling tiles. There are 16 running along the length, and 12 along the width. I know that ceiling tiles are standardized to be 1 foot by 1 foot. Hence, I know that the room measures 16 feet by 12 feet for an area of 192 square feet. Bingo! I have performed a measurement without even getting off my back, and I claim that I have not disturbed the room at all.[9]

[9] If you're a semanticist, you may claim that I have made an estimate, not a measurement. I respond by pointing out that every measurement is an estimate, in that it has uncertainty attached to it. For example, when you "measure" one of those ceiling tiles with a ruler and determine that it is "12 inches by 12 inches," do you really believe it is exactly so?

Applying Heisenberg

Where the Heisenberg Uncertainty Principle applies is in the atomic and subatomic realm. Basically, it posits that it is impossible, quantum-mechanically, to specify both the position and the momentum of a particle to arbitrary precision. If you want to make your knowledge of the particle's position more exact, then you will have less precision on its momentum, and vice versa.

To observe said particle, you have to "shine a light on it." But when you do so, the light itself gives the particle a kick: the more precisely you try to pin down its position, the more you disturb its momentum. So "non-intrusiveness" is impossible at quantum dimensions, and Heisenberg supplies you a formula to compute just how much the intrusion will affect your measurement.
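For the record, the standard textbook form of that formula (it is not spelled out in the text here) is the inequality

$$\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}$$

where Δx is the uncertainty in the particle's position, Δp is the uncertainty in its momentum, and ħ is Planck's constant divided by 2π. Squeeze one factor down and the other must grow.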

One caution: Heisenberg's Uncertainty Principle uses Planck's constant, which is very, very small. So small, in fact, that the effects the principle predicts become vanishingly tiny once you investigate anything larger than atomic and subatomic distances. That is, if you shine a light on an electron, you will affect it measurably. On the other hand, if you shine a light on the ceiling tiles of the hospital room, you affect them not at all. So using the Heisenberg Uncertainty Principle for macroscopic objects is just nonsense.
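A quick numerical sketch makes the mismatch in scales concrete. Assuming the standard values of ħ and the electron mass, and two sizes I have chosen purely for illustration (an electron confined to about an atom's width, and a one-kilogram ceiling tile located to within a micrometer), the minimum velocity uncertainties come out wildly different:

    # Minimum velocity uncertainty implied by delta_x * delta_p >= hbar / 2,
    # i.e. delta_v >= hbar / (2 * m * delta_x). The two scenarios are
    # illustrative choices, not examples taken from the text.
    hbar = 1.054_571_817e-34      # reduced Planck constant, in J*s
    m_electron = 9.109_383_7e-31  # electron mass, in kg

    def min_velocity_uncertainty(mass_kg, position_uncertainty_m):
        """Smallest delta_v allowed once the position is pinned down to delta_x."""
        return hbar / (2 * mass_kg * position_uncertainty_m)

    dv_electron = min_velocity_uncertainty(m_electron, 1e-10)  # confined to ~1 angstrom
    dv_tile = min_velocity_uncertainty(1.0, 1e-6)              # 1 kg tile, located to ~1 micrometer

    print(f"electron: delta_v >= {dv_electron:.1e} m/s")  # about 6e5 m/s: enormous
    print(f"tile:     delta_v >= {dv_tile:.1e} m/s")      # about 5e-29 m/s: utterly negligible

The bound on the electron is comparable to the speeds at which electrons actually move inside atoms; the bound on the tile is far below anything any instrument could ever detect, which is exactly the point.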



