Monday, April 29, 2013

History of Science -- Part Eighteen: Protons

Ernest Rutherford
Around 460 B.C., a Greek philosopher, Democritus, developed the idea of atoms. He asked this question: If you break a piece of matter in half, and then break it in half again, how many breaks will you have to make before you can break it no further? Democritus thought that it ended at some point, a smallest possible bit of matter. He called these basic particles of matter atoms.

For more than 2000 years nobody did anything to continue the explorations that the Greeks had started into the nature of matter. Not until the early 1800's did people begin again to question the structure of matter.

Then an English chemist, John Dalton, performed experiments with various chemicals that showed that matter seemed to consist of elementary lumpy particles (atoms). Although he did not know about their structure, he knew that the evidence pointed to something fundamental.

The German physicist Johann Wilhelm Hittorf studied electrical conductivity in rarefied gases. In 1869, he discovered a glow emitted from the cathode, the part connected to the negative terminal of a battery. In 1876, the German physicist Eugen Goldstein showed that the rays from this glow cast a shadow, and he dubbed the rays cathode rays.

During the 1870s, the English chemist and physicist Sir William Crookes developed the first cathode ray tube to have a high vacuum inside, thereafter called the “Crookes tube.” He then showed that the luminescent rays appearing within the tube carried energy and moved from the cathode to the anode, the part connected to the positive terminal of the battery. Furthermore, by applying a magnetic field, he was able to deflect the rays, thereby demonstrating that the beam behaved as though it were negatively charged.

In 1896, the British physicist J. J. Thomson, with his colleagues, performed experiments indicating that cathode rays really were unique particles, rather than waves, atoms, or molecules as was believed earlier. Thomson made good estimates of both the charge e and the mass m, finding that cathode ray particles, which he called "corpuscles," had perhaps one thousandth of the mass of the least massive ion known: hydrogen. He showed that their charge to mass ratio, e/m, was independent of cathode material. He further showed that the negatively charged particles produced by radioactive materials, by heated materials, and by illuminated materials were universal. The name electron was proposed for these particles by the Irish physicist George F. Fitzgerald.
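The scale of Thomson's discovery is easy to check with a few lines of arithmetic using modern values (which Thomson did not have; his estimates were rough). The constants below are rounded CODATA values:

```python
# Modern constants (rounded CODATA values); Thomson only had rough estimates.
e = 1.602e-19    # elementary charge, in coulombs
m_e = 9.109e-31  # electron mass, in kilograms
m_H = 1.673e-27  # mass of a hydrogen ion (a proton), in kilograms

charge_to_mass = e / m_e   # Thomson's e/m ratio, about 1.76e11 C/kg
mass_ratio = m_H / m_e     # hydrogen ion is ~1836 times heavier than the electron

print(charge_to_mass)
print(mass_ratio)
```

Thomson's guess of "one thousandth" the hydrogen mass was the right order of magnitude; the modern figure is about 1/1836.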

These cathode rays and tubes with cathodes and anodes enclosed in a high vacuum became the “high tech” of the first half of the twentieth century. The so called “vacuum tubes” had a cathode “emitter” that was heated by a light bulb-like filament to “boil off electrons.” For example, the 6L6 vacuum tube has a six-volt filament and the 12AT7 has a 12-volt filament. Between the cathode and the anode are one or more screens called “grids.” A triode has a single grid in addition to the cathode and anode. The “A” battery in old radios would provide the filament power and the “B” battery would provide a positive voltage to the anode or “plate” called B+. Vacuum tubes required several hundred volts B+ to operate.

There are vacuum tube “diodes” with only a cathode and an anode. They can’t be used to amplify a signal, but they can be used like one way valves ("valve" is also the British term for a vacuum tube) to change AC current into DC current.
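As a sketch of that one-way-valve behavior, here is a toy half-wave rectifier in Python (an idealized diode that ignores the real tube's forward voltage drop; the 60 Hz frequency and sample times are just illustrative numbers):

```python
import math

def ideal_diode(v):
    """One-way valve: conducts only when the anode side is positive."""
    return v if v > 0 else 0.0

# Sample roughly one cycle of 60 Hz AC, then rectify it.
t = [i / 1000 for i in range(17)]                 # 0 to 16 ms
ac = [math.sin(2 * math.pi * 60 * x) for x in t]  # AC input, swings negative
dc = [ideal_diode(v) for v in ac]                 # only the positive half passes
```

The output is pulsed DC; in a real radio power supply a filter capacitor would smooth those pulses.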

Another form of cathode ray tube or CRT applied a very large negative voltage to the cathode, over a thousand volts in some cases, and then surrounded the sides of a funnel shaped tube with a grounded anode. The electron beam would fly down the middle of the tube and hit a fluorescent screen at the end. The beam would be bent by magnetic fields in a TV and the CRT is called a “picture tube.”

Comparing mass and charge to the simplest atom of all, the hydrogen atom (with its one electron and one proton) soon convinced scientists that this cathode ray consisted of a component of an atom, and that these electrons could be detached from the atom by heat (and, as additional studies showed, by large electric fields and also the photoelectric effect explained by Einstein.)

Since atoms are electrically neutral, it was obvious that an atom must have some positive charged material to offset the negative charge of the electron.

I’ve described before the various theories of a “rice pudding” like atom and that Ernest Rutherford arrived at the correct structure of the electrons spinning around the nucleus. So what is the nucleus made of?

Was the nucleus one solid substance? Further, since we know electrons are very light compared to even the hydrogen nucleus, what forms the nucleus of larger atoms such as carbon, oxygen, iron, or gold?

One theory was that the more massive (and “larger”) atoms were made up of a combination of hydrogen atoms. This concept of a hydrogen-like particle as a constituent of other atoms was developed over a long period. As early as 1815, William Prout proposed that all atoms are composed of hydrogen atoms (which he called "protyles"), based on a simplistic interpretation of early values of atomic weights called “Prout's hypothesis,” which was disproved when more accurate values were measured. (Prout had failed to account for neutrons, which we'll hear about in another chapter.)

Ernest Rutherford was the son of James Rutherford, a farmer, and his wife Martha Thompson, originally from England. James had emigrated to New Zealand from Perth, Scotland, "to raise a little flax and a lot of children." Ernest was born at Spring Grove (now Brightwater), near Nelson, New Zealand.

He studied at Havelock School and then Nelson College and won a scholarship to study at Canterbury College, University of New Zealand where he was president of the debating society, among other things. After gaining his BA, MA and BSc, and doing two years of research during which he invented a new form of radio receiver, in 1895 Rutherford was awarded an "1851 Exhibition Scholarship" to travel to England for postgraduate study at the Cavendish Laboratory, University of Cambridge.

Under the inspiring leadership of J. J. Thomson he managed to detect radio waves at half a mile and briefly held the world record for the distance over which electromagnetic waves could be detected, though when he presented his results at a meeting in 1896, he discovered he had been outdone by another lecturer by the name of Marconi.

In 1898 Thomson offered Rutherford the chance of a post at McGill University in Montreal, Canada. Rutherford accepted, which meant that in 1900 he could marry Mary Georgina Newton to whom he had become engaged before leaving New Zealand. In 1907 Rutherford returned to Britain to take the chair of physics at the University of Manchester.

Along with Hans Geiger and Ernest Marsden in 1909, he carried out the Geiger–Marsden experiment, which demonstrated the nuclear nature of atoms. It was Rutherford's interpretation of this data that led him to formulate his model of the atom in 1911 — that a very small charged nucleus, containing much of the atom's mass, was orbited by low-mass electrons.

In 1917, Rutherford proved that the hydrogen nucleus is present in other nuclei, a result usually described as the discovery of the proton. Rutherford had earlier learned to produce hydrogen nuclei as a type of radiation produced as a result of the impact of Alpha particles on hydrogen gas, and recognized them by their unique penetration signature in air and their appearance in scintillation detectors.

(Alpha particles are two protons and two neutrons bound together: the nucleus of a helium atom stripped of its two electrons. Alpha particles were known before anyone understood the structure of the atom and the nucleus. So-called Beta particles are actually high speed electrons (or their antimatter twins, positrons). These “rays” were first discovered in the study of radioactivity and X-rays, which was also part of the discovery of the atom and its components.)

Rutherford knew hydrogen to be the simplest and lightest element and was influenced by Prout's hypothesis that hydrogen was the building block of all elements. The discovery that the hydrogen nucleus is present in other nuclei as an elementary particle led Rutherford to give the hydrogen nucleus a special name as a particle, since he suspected that hydrogen, the lightest element, contained only one of these particles.

He named this new fundamental building block of the nucleus the proton, after the Greek word for "first", πρῶτον. Plus, Rutherford also had in mind the word protyle as used by Prout.

The single proton in the nucleus of hydrogen was easy to understand. But what about helium and heavier atoms such as carbon or oxygen? You see, the problem is: what would hold the nucleus together? Recall that we know that like charges repel. So what holds the two protons in helium together in the nucleus, not to mention the six protons in carbon, the eight in oxygen, or the 82 in lead?

To overcome the repulsion of the electric force would require a force even stronger than the electric force. And if this force was so strong, why hadn't it been detected before? Was it a force that was only present in the nucleus of an atom?

Some of the answers hinted at another particle in the nucleus as well as a new force. Up until this point, only gravity and electromagnetic force were known. The study of the nucleus led to a discovery of a third force called the “strong nuclear force.” However, that has to wait for the 1930s and the discovery of the neutron to really be understood. And that leads to the discovery of a fourth fundamental force, also present in the nucleus.

So many questions. So many chapters. Another coming your way.

Sunday, April 28, 2013

Feeling Rosey

I was at a concert last night and I ran into Bret Michaels back stage. He was about to go on and perform “Every Rose Has Its Thorn.” I asked if they were going to use any keyboards, and Bret said they planned to just do guitar and drums. I saw a big Hammond B-3 and some Leslies in the wings and said if they’d roll that onto the stage I’d just provide a mellow “G – C (add a ninth).” Bret said that would be cool, so I went on with them for that one song.

We got a standing ovation. I really couldn’t see too well over the tall Hammond, but I think people were throwing flowers on the stage.

As I walked off, Axl Rose grabbed me by the arm. He insisted that I play drums for “Paradise City” since he'd fired Steven Adler.

I explained I didn’t know how to play drums, but Axl demanded and said it is an easy beat. So I started out with the simple bass – snare with a double bass beat every fourth measure and a sort of triple bass off the beat in the last bar.

Everything was going so well … then the song began to pick up. All the players kept turning to me and motioning for me to pick it up and add some more … I don’t know what … I guess “drums.”

I was clueless. I told Axl I didn’t play drums. Why didn’t he believe me? Then the audience began to throw things at me and they weren't flowers. I ducked a tomato as a whole cabbage sailed past my shoulder. Now I really wished I’d taken those drum lessons my parents insisted I take.

But nooooooooo, I wouldn't listen to my parents. I wanted to be a rebel. I had to take piano lessons. Now what was I gonna do?

Then I woke up!

History of Science -- Part Seventeen: Bell’s Theorem

John Stewart Bell
By 1935, the basic form of quantum mechanics was clear. Schrödinger’s equation was the new universal law of motion. Although it was only required for objects on the atomic scale, quantum theory presumably governed the behavior of everything. The earlier physics, by then called “classical” (sort of like the ancient Greek theories), became the much easier-to-use approximation for macroscopic behavior. Nonetheless, quantum science has crept into modern technology in day-to-day objects such as transistors, microprocessors, and lasers, items we take for granted today.

Although quantum theory works perfectly, it seems to imply something weird, almost absurd. That has been a problem for scientists and philosophers alike.

Yet quantum theory underlies all physics, which underlies all other sciences. So is all this theory built on sand? Copenhagen insists, and most scientists pragmatically accept, that the formulas simply work, and if they work, what’s the problem?

(A minor note: mathematics is not based on physical science. Therefore, math does escape the paradox and does not require interpretation. It is still an unfettered and unrestricted tool, although a fellow by the name of Gödel proved his “Incompleteness Theorem” for mathematics in 1931, which is a sort of “Heisenberg Uncertainty” for math. But mathematical processes still are sharp tools for examining what we don’t know and proving what we can know. Not all is lost to crystal balls and stochastic processes.)

Schrödinger, himself, was bothered by the Copenhagen interpretation. He boiled his concerns down into a thought experiment, now quite famous, called “Schrödinger’s Cat.” It attempts to put the microscopic, quantum world into our macro-world and show the craziness and absurdity of what it implies.

I won’t describe the “cat in the box” story, but Google will quickly provide the details. This experiment has echoed down through the history of quantum theory and led to many an explanation, refutation, and even the title for a book. Stephen Hawking once said, “When I hear about Schrödinger’s Cat, I reach for my gun.”

That quotation is taken out of context and doesn’t get into the subtleties of what Hawking has said about it, so time for some more Google-ing. You can spend the better part of the day following the cat / rabbit trail.

As for another often-misquoted scientist, Albert Einstein, he said, “I think that a particle must have a separate reality independent of the measurements. That is, an electron has spin, location, and so forth even when it is not being measured. I like to think the moon is there even if I am not looking at it.”

Einstein rejected the Copenhagen interpretation and spent his lifetime in friendly jousting with Niels Bohr in an attempt to disprove Copenhagen.

Quantum theory has an atom being either a spread-out wave or a concentrated particle. If, on the one hand, you detect it in a single box (or through a single slit), you show it to be a compact particle. On the other hand, it can participate in an interference pattern that shows it to be an extended wave — an apparent contradiction. But the theory is protected from refutation by the Heisenberg uncertainty principle, which shows that checking to see through which slit an atom comes kicks it hard enough to blur any interference pattern. So you thus can’t demonstrate a contradiction.
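The wave-versus-which-path tradeoff can be sketched numerically. In the toy Python model below (made-up slit spacing, wavelength, and screen distance, using the small-angle approximation), adding the two slit *amplitudes* produces interference fringes, while adding *probabilities*, as when the slit is known, produces none:

```python
import numpy as np

# Hypothetical two-slit setup (illustrative numbers only).
d, lam, L = 1e-3, 500e-9, 1.0               # slit spacing, wavelength, screen distance (m)
y = np.linspace(-2e-3, 2e-3, 2001)          # positions across the screen (m)
phase = 2 * np.pi * d * y / (lam * L)       # path-difference phase, small-angle approximation

# Both slits open, no which-path information: add amplitudes, then square.
fringes = np.abs(1 + np.exp(1j * phase)) ** 2      # oscillates between 0 and 4

# Which-path measured: add probabilities instead. The pattern is flat.
no_fringes = np.abs(1 + 0j) ** 2 + np.abs(np.exp(1j * phase)) ** 2
```

Checking which slit the atom came through destroys the fringes, exactly as the uncertainty principle guarantees.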

To argue that quantum theory led to an inconsistency and was therefore wrong, Einstein attempted to show that even though an atom participated in an interference pattern, it actually came through a single slit. To demonstrate this he had to evade the uncertainty principle. (Ironically, Heisenberg attributed his original idea for the uncertainty principle to a conversation with Einstein.)

Einstein presented his explanation at the 1927 Solvay conference. Niels Bohr then rose to point out a flaw in Einstein’s reasoning. With simple algebra, Bohr was able to show that the uncertainty principle would foil Einstein’s demonstration.

Three years later, at another conference, Einstein proposed an ingenious thought experiment claiming to violate a version of the uncertainty principle. This one stumped Bohr and he had a sleepless night. By morning, however, he was able to embarrass Einstein by showing that Einstein’s experiment actually violated Einstein’s own general theory of relativity. A humbled Einstein went home from the conference to concentrate on general relativity, his theory of gravity, or so Bohr assumed.

Four years later, in 1935, Einstein wrote a paper with two young colleagues, Boris Podolsky and Nathan Rosen. This paper hit Bohr like a “bolt out of the blue.” The paper, now famous as “EPR” for the three authors, did not claim that quantum theory was wrong, just that it was incomplete. They designed a thought experiment intended to reveal what they believed to be inadequacies of quantum mechanics. To that end they pointed to a consequence of quantum mechanics that its supporters had not noticed.

According to quantum mechanics, under some conditions, a pair of quantum systems may be described by a single wave function, which encodes the probabilities of the outcomes of experiments that may be performed on the two systems, whether jointly or individually.

For example, there are “twin-state” particles, such as two photons created together and moving in opposite directions. Quantum theory predicted a particular state for each photon depending on the state of the other. Basically, they are described by a single wavefunction, just like the split electron in two boxes I described in a previous chapter.

So what if the two photons traveled some great distance before one is observed? Then, once observed, the wavefunction would collapse to a particular state. But that implied the other photon, now some great distance away, would also have to "decohere" to the opposite state. How would the far-away photon, denied instantaneous communication by relativity, obtain the opposite polarization? How would it “know” that its twin had been “observed”?

In modern terminology we say that the two particles are “entangled”; and it is called “quantum entanglement.” Einstein called it “spooky actions at a distance” and proposed a much simpler explanation that defied quantum theory, namely that the states of the two particles were established when they were emitted and didn’t require an observer or any long distance communications. Recall, he didn’t believe “God plays dice.”

These ideas are often explored in a set of experiments with polarized light. The original paper talked of a complex combination of position and momentum of particles instead of photons of light, but the polarization explanation is considered equivalent and much easier to understand, even to a layperson.

Bohr refuted EPR in a paper, but, like the Copenhagen interpretation, it did leave one with a feeling that something was missing; the explanation seemed more philosophical than scientific, and the argument went on.

Einstein seems in later years to have given up a bit on his view. Although he continued to search for the secrets of the universe, he did write to a friend later, “I have second thoughts. Maybe God is malicious.”

I suspect the continual agreement of all experiments with the descriptions and equations of quantum theory eventually wore Einstein down. (Not that some theories proposed in the intervening years didn’t turn out to be incorrect or incomplete, but the core of Schrödinger’s equations never was found in error … ever.)

The argument was laid to rest after 1964, when John Stewart Bell published his theorem. Bell’s theorem has been called “the most profound discovery in science in the last half of the twentieth century.” It rubbed physics’ nose in the weirdness of quantum mechanics. As a result of Bell’s theorem and the experiments it stimulated, a once “philosophical” question has now been answered in the laboratory.

There is universal connectedness. Einstein’s “spooky interactions” in fact do exist. There is some sort of connection between two particles in certain situations that can exist over great distances and seemingly defy relativistic speed constraints.

These “entanglements,” as modern science calls them, are part of the new frontier of physics. Bell’s theorem lays to rest, once and for all, Einstein’s argument about local hidden variables. No: matter and behavior at the atomic level actually are probabilistic, and cause-and-effect has a new interpretation.
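Bell's result can be sketched with the polarization example. For entangled photons, quantum mechanics predicts a correlation of cos 2(a − b) between analyzer angles a and b; any Einstein-style local hidden-variable theory must keep the so-called CHSH combination of four such correlations between −2 and 2. A minimal Python check of the quantum prediction (the angles below are the standard fringe-maximizing settings):

```python
import math

def E(a, b):
    """Quantum correlation for polarization-entangled photons at analyzer angles a, b."""
    return math.cos(2 * (a - b))

# Analyzer settings (radians) that maximize the CHSH combination.
a1, a2 = 0.0, math.pi / 4
b1, b2 = math.pi / 8, 3 * math.pi / 8

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
# Local hidden variables require |S| <= 2; quantum mechanics predicts
# 2 * sqrt(2), about 2.83, and laboratory experiments agree.
```

That gap between 2 and 2√2 is the whole argument: it is measurable, and the measurements side with quantum theory.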

Both Einstein and Bohr died before Bell’s explanation. We are sure Bohr would have predicted the experimental result confirming quantum theory. It is not clear what Einstein would have predicted had he seen Bell’s proof. He said he believed that quantum theory’s predictions would always be correct. How would he feel if the predicted result was an actual demonstration of what he denied as “spooky actions”?

The universal connectedness predicted by quantum theory (“Thou canst not stir a flower / Without troubling a star”) is now demonstrated routinely in the laboratory. So, is "non-locality" incompatible with fundamental relativity? Or is space-time folded? That is just one speculated result of Bell's theorem.

It supports wild speculations. It also, like the rest of quantum mechanics, just works. Now let’s use it to dig deeper into the nucleus.

Saturday, April 27, 2013

History of Science -- Part Sixteen: Copenhagen Interpretation

Werner Heisenberg
The meaning of Newton’s mechanics is clear. It describes a “clockwork universe.” It needs no “interpretation.” Einstein’s relativity is surely counterintuitive, but no one interprets relativity. We get used to the idea that moving clocks run slow or that space is non-Euclidean. It’s harder to accept the quantum theory premise that observation creates the reality observed. That requires interpretation.

Physics is supposed to be about the physical. Now we find we need to interpret what physics has found (and proved), almost like Daniel interpreting King Nebuchadnezzar’s dream. Physics is supposed to be about material things, not supernatural, mental, or spiritual!

Filling that need for interpretation, within a year after Schrödinger’s equation, the “Copenhagen interpretation” was developed at Bohr’s institute in Copenhagen with Niels Bohr as its principal architect. Werner Heisenberg, his younger colleague, was the other major contributor. Although there is no “official” version, there is a general understanding of three principles that make up the interpretation.

Copenhagen softens the interpretation of “observation,” describing it as taking place whenever a microscopic, atomic-scale, object interacts with the macroscopic. When a piece of photographic film or a sensor “observes” the photon, or when a Geiger counter clicks in response to a particle entering a discharge tube, etc., we say the event has been observed.

The three pillars of Copenhagen consist of:

1) The probability interpretation of the wavefunction.

I’ve explained this previously. I described the waviness in a region (technically the absolute square of the wavefunction) as the probability that the object will be found in that region. This is central to the Copenhagen interpretation.

But, remember, it isn’t like the likelihood of winning (or losing) the lottery. In the “two boxes” experiment the object actually was in both boxes until the wavefunction is collapsed by an observation. You might suppose you could consider yourself to have won or lost the lottery before the winning number is drawn, but that isn’t what’s going on here. It is sort of as if you have both won and lost … you’ve collected the money and you’ve torn up the losing ticket … until the winning number is drawn.

This is hard to grasp. That’s why I keep repeating it. To quote another physicist, Pascual Jordan, one of the founders of quantum theory, “Observations not only disturb what is being measured, they produce it.” That’s the essence of the “superposition state.”

While classical physics is strictly deterministic, quantum mechanics tells of the ultimate randomness in Nature. At the atomic level, God does play dice.

Physicist John Wheeler puts it concisely, “No microscopic property is a property until it is an observed property.” Bohr said, “There is no quantum world. There is only an abstract quantum description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature.” Some would be even less tactful and just say, “Shut up and calculate.”

Einstein rejected Bohr’s attitude as defeatist, saying he came to physics to discover what is really going on, to learn “God’s thoughts.” Schrödinger also rejected the Copenhagen interpretation.

Would Bohr actually deny that a goal of science is to explain the natural world? Perhaps not; he once said, “The opposite of a correct statement is an incorrect statement, but the opposite of a great truth may be another great truth.”

Is there another “great truth” out there still to be discovered? Is this just an episode of the “X Files”? Yeah, I sometimes think that. So, am I Fox Mulder or Dana Scully? What about you? Are you the believer or the skeptic?

2) The Heisenberg uncertainty principle.

Bohr’s assistant, Heisenberg, showed that any demonstration to refute the Copenhagen interpretation’s claim of observer-created reality would be frustrated.

His principle describes a fundamental limit to measurement. If one chooses to measure the position of a particle very accurately, then one can’t measure the velocity accurately. And, vice versa: attempts to measure the velocity accurately limit the ability to measure the location.

For example, to observe an atom with a microscope, one must bounce a photon off the atom so that it is reflected into the microscope. But the photon will change the motion or position of the atom. The measurement has limits … and it is not just that the scientist needs to build a better measurement tool. It is not the tool at fault; it is the principle. Sure, the impact of measuring either position or motion will change the other. But it is really a fundamental issue, not just a statement about the experimental apparatus.

Again Planck’s constant appears. That is the true limit. As you design an experiment to measure either property precisely, you disturb the other property, and the product of the two measurement uncertainties is always at least on the order of Planck’s constant. (That’s a non-precise statement of Heisenberg’s equation. It is the “delta,” or uncertainty, of the position measurement times the delta of the momentum measurement (momentum being mass times velocity) that must be greater than or equal to a fixed fraction of Planck’s constant.) If you reduce the delta of one of the measurements, that must cause the uncertainty or delta of the other to increase.
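In symbols, with Δx the position uncertainty and Δp the momentum (mass times velocity) uncertainty, Heisenberg's relation reads:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2},
\qquad \text{where } \hbar = \frac{h}{2\pi}
```

Here h is Planck's constant; shrinking Δx forces Δp to grow, and vice versa, no matter how good the apparatus.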

Like most of these quantum theories, Heisenberg’s uncertainty principle has also been proven time and time again by all kinds of ingenious experiments.

Heisenberg’s uncertainty applies to other quantum characteristics too, such as energy or spin. The uncertainty principle can also be derived directly from the Schrödinger equation.

The uncertainty principle comes from the wave-like nature of quantum systems. It states a fundamental property of those systems; it is not a statement about the observational success of current technology.

This basic limitation to both experiment and interpretation is now well accepted, regardless of other views and theories. Yet it isn’t quite enough to prevent certain contradictions in accepted quantum theory.

That leads to the third principle:

3) Complementarity

Copenhagen invokes the “complementarity” principle to confront a spooky aspect of observation: the instantaneous collapse of an object’s wavefunction everywhere by an observation anywhere.

Bohr understood that the two possibilities, an object being both a spread-out, probabilistic wave and a real particle when observed, are contradictory. So he proposed a concept called “complementarity,” which allows the contradiction by requiring that we consider only one aspect at a time. That’s what we mean when we say the wavefunction collapses upon observation: the contradictory view disappears, leaving us with a physical object. This covers both Schrödinger’s equation and the observed dual nature of light (electromagnetic waves and photons) as well as matter (electrons and atoms), which also shows wave-like properties.

Recall that the spread-out wave could cover a lot of space. So the collapse to a physical object would suddenly put all the matter, or all the information, in one specific location. If the wavefunction really were spread-out matter, then that matter (or even energy or information) would have to move faster than the speed of light to collect in one place. As crazy as everything we’ve talked about so far may be, we don’t think that happens. As kooky as the probability wave interpretation seems to be, it would be downright impossible to believe the object is actually smeared out across space. Remember, there is only the wave, not the wave and the particle; it is just the wave … until observation. Then there is just the particle.

What we are left with is the view that all we can understand is the results of our experiments. Stop worrying about what “really” happens. That we can’t know. We just know the click of the Geiger counter, the trail in a cloud chamber, the flick of a meter, the view in the microscope, the computer printout of the sensor data … all macroscopic phenomena.

This almost appears to eliminate all physical objects at the atomic level. It appears that all we have are the measurements of our instruments and the design of our experiments. We can’t speak of atoms directly because we can’t observe them directly like we can a rock or water.

(This is no longer true. IBM labs produced one of the first images of actual atoms and, with a scanning tunneling microscope, even moved individual xenon atoms around and spelled out “IBM.” So this “we can’t see or know” explanation is getting a bit thin. Modern, young physicists are continually challenging these explanations, even if they have been accepted as Gospel for almost one hundred years.)

This calls back Newton’s response when asked just what gravity “is.” He answered, hypotheses non fingo, which means, “I make no hypotheses.” He didn’t claim to explain gravity. His equations simply predicted how it works. Einstein did extend that, giving great insight into the nature of space and time. Will some new “Einstein” eventually describe what is really going on in quantum physics?

Essential to the Copenhagen interpretation is a clear separation of the quantum microworld from the classical macroworld. May we some day eliminate the need for the Copenhagen interpretation in favor of a true story of Nature?

Only time will tell. So far we haven’t. Copenhagen is the primary explanation used today. Sure, there are others, like the “Many Worlds” explanation so popular with science fiction writers, and several more, but Copenhagen has stood the test of time and it is what is taught today.

So what about Einstein? Did he try to come up with an interpretation? You bet. That’s next.

Friday, April 26, 2013

History of Science -- Part Fifteen: Interpreting Schrödinger

Schrödinger speculated that an object’s waviness was the smeared out object itself. Where, for example, the electron fog is densest, the material of the electron is most concentrated. The electron itself would thus be smeared over the extent of its waviness. The waviness of one of the states of the hydrogen electron might then morph smoothly into another state without the quantum jumping that Schrödinger detested.

This reasonable-seeming interpretation of waviness is wrong. This is the start of the real mystery. When one looks at a particular spot, one finds either a whole object or no object at that spot. For example, an alpha particle emitted from a nucleus might have a waviness extending over kilometers. But as soon as a Geiger counter detects an alpha, there is a whole alpha right there inside the counter and nowhere else. All the waviness is suddenly concentrated at the one spot where the particle is observed. If the particle were actually spread out throughout the wave, then it would have to collect at one point almost instantly, which would require traveling faster than the speed of light. That idea just doesn’t work.

Once the modern interpretation of his wavefunction was known, Schrödinger stated that he was sorry that he had anything to do with quantum theory. What his equation predicts is something a lot crazier than “those damn quantum jumps.”

The modern, accepted interpretation is that the waviness in a region is the probability of finding the object in that location. Be careful! It is not the probability of the object being there!

The object was not there before you found it there!!

Your happening to find it caused it to be there!!!

This is tricky and the essence of the weirdness of quantum mechanics — yet it works … every time!!!!

It is all about probability. It was, in fact, only a few months after Schrödinger announced his equation that Max Born realized that the waviness in a region is probability, the probability for the whole object being found in that region. Once the particle is detected, the probability becomes “1” at that point and “0” everywhere else. We say that the probability wavefunction “collapses” upon detection.
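Born’s probability rule and the “collapse” can be sketched numerically. Here is a minimal, illustrative Python simulation (the wave packet, grid, and numbers are invented for the example, not from any real experiment): the absolute square of a discretized wavefunction is normalized into a probability density, a single detection is sampled from it, and the distribution then becomes a spike at the detected point.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 1-D wavefunction on a grid: a Gaussian wave packet (illustrative values).
x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]
psi = np.exp(-(x - 1.5) ** 2 / 4.0) * np.exp(1j * 2.0 * x)

# Born's rule: the probability density is |psi|^2, normalized so it totals 1.
prob = np.abs(psi) ** 2
prob /= prob.sum() * dx

# "Detection": draw a single grid point with those probabilities.
hit = rng.choice(len(x), p=prob * dx)

# "Collapse": afterward the probability is 1 at the hit and 0 everywhere else.
collapsed = np.zeros_like(prob)
collapsed[hit] = 1.0 / dx   # a narrow spike whose total probability is 1
```

Before detection the atom has some chance of turning up anywhere the packet extends; after detection, an immediate repeat of the measurement would find it at the same spot with certainty.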

Max Born was born in 1882 in Breslau, Germany. He participated in all the great discoveries of both relativity and quantum mechanics in the first half of the twentieth century. He pioneered the use of matrices and other advanced mathematics in both relativity and quantum theory and was closely associated with many of the people I’ve mentioned in this history. He worked with J. J. Thomson in Cambridge on atomic particle scattering, and he added to the fundamental principles first proposed by Einstein. He formulated the dynamics of crystal lattices, and he shared the Nobel Prize in Physics in 1954 with Walther Bothe for his probabilistic interpretation of quantum mechanics. Perhaps Born’s greatest contribution was his work introducing matrices to physics and quantum calculations. (A matrix (plural: matrices) is a rectangular array of numbers, symbols, or expressions arranged in rows and columns, sort of like a crossword puzzle.)

Prior to his work, matrices were rarely used by physicists, but he showed their powerful capabilities on exactly the type of problems being worked on in quantum mechanics. Today it seems odd not to use matrices in physics, but in the early years of the last century they were much more the province of mathematicians than of physicists.
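The key property matrices bring is that, unlike ordinary numbers, they do not commute: A times B need not equal B times A, and that non-commutativity is what matrix mechanics exploits. A small NumPy sketch, using a finite truncation of the harmonic-oscillator position and momentum matrices (units with ħ = m = ω = 1; the actual theory uses infinite matrices, so this is only an illustration):

```python
import numpy as np

# Finite truncation of the harmonic-oscillator matrices of matrix mechanics
# (units with hbar = m = omega = 1; the real theory uses infinite matrices).
N = 8
a = np.diag(np.sqrt(np.arange(1.0, N)), 1)   # "lowering" ladder operator
X = (a + a.T) / np.sqrt(2.0)                 # position as a matrix
P = 1j * (a.T - a) / np.sqrt(2.0)            # momentum as a matrix

# Matrices do not commute: X @ P differs from P @ X.
comm = X @ P - P @ X

# In the full infinite theory, [X, P] = i*hbar*I -- Heisenberg's famous
# commutation relation.  The truncated matrices reproduce the value i on every
# diagonal entry except the last, an artifact of cutting them off at N rows.
```

That a product of physical quantities could depend on the order of multiplication was exactly the strange feature Heisenberg stumbled on and Born recognized as matrix algebra.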

So, if the wavefunction is about probability, you might say, “The waveform is a probabilistic prediction of where the particle is.” No, we actually interpret it that the particle is sort of physically spread out over the entire waveform … but not really … and the act of detecting the specific location causes the wavefunction to collapse.

Trust me, it is not easier to understand from the equations than from this explanation. It is mind-blowing no matter how you think about it. That’s the really weird part I spoke of earlier. Schrödinger’s equation will accurately predict the location and motion (and a lot in addition to that such as energy and spin), but it predicts it as a region with probabilities of detection. Upon actual detection, things seem to suddenly change. We say the wavefunction collapses.

Here is a little experiment that has actually been performed thousands of times. Any wave can be reflected. Semitransparent mirrors reflect part of a wave and allow the rest to go through, just like a window pane that lets light in but also reflects some. A semitransparent mirror for light is also semitransparent for photons — the particle nature of light. We can also construct semitransparent mirrors for atoms. After all, matter also has the dual wave-particle nature. Encountering such a mirror, an atom’s wavefunction splits into two wavepackets.

We can use such a construct to send half of the wavepacket into one box and half into another. Now the wavefunction is trapped in two boxes. Holding an atom in a box pair without disturbing its wave function would be tricky, but it can be done. So, which box is the atom actually in? You might say the probability is 50% for each box, and you would be right. If we put a detector into each box, sure enough, it will be in one or the other. Note that, after detection, the odds are now 100% it is in the box detected and 0% it is in the empty box.

But, without putting a detector in each box, if you open a little hole in both boxes and shine the wavefunction onto a detection screen, you’ll get the familiar interference pattern. That implies that “something” is in each box. Is it the atom? No, when we measure (or detect) in each box, we find the atom in only one box … so what causes the interference? Something must be in both boxes if we perform the second experiment with the little holes. (Note carefully that you must perform one or the other experiment. You can’t do both. If you put a detector in the box, the wavefunction collapses, and now the atom will only be in one box … no interference pattern!)
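The arithmetic of the two experiments can be sketched with a toy calculation (all the numbers below are illustrative, not from any real apparatus). With a detector in each box, the probabilities are the squared amplitudes, 50/50. Without detectors, the two half-wavepackets add as complex amplitudes on the screen, and the cross term in |ψ₁ + ψ₂|² produces the fringes; adding the probabilities separately, as which-box knowledge forces, removes that cross term.

```python
import numpy as np

# The atom's wavefunction split equally between two boxes: amplitude
# 1/sqrt(2) in each (illustrative toy model).
amp = np.array([1.0, 1.0]) / np.sqrt(2.0)

# Experiment 1: a detector in each box.  Detection probabilities are the
# squared amplitudes -- 50/50 -- and each trial finds the whole atom in one box.
p_box = np.abs(amp) ** 2

# Experiment 2: open a hole in both boxes; the two wavepackets overlap on a
# screen some distance away.  Intensity is |psi1 + psi2|^2, whose cross term
# oscillates with the path difference: interference fringes.
d, k = 1.0, 2.0 * np.pi                   # box separation, wavenumber (arbitrary)
s = np.linspace(-2.0, 2.0, 401)           # positions along the screen
r1 = np.hypot(s - d / 2, 3.0)             # path length from box 1
r2 = np.hypot(s + d / 2, 3.0)             # path length from box 2
psi1 = amp[0] * np.exp(1j * k * r1)
psi2 = amp[1] * np.exp(1j * k * r2)
fringes = np.abs(psi1 + psi2) ** 2        # oscillates between ~0 and ~2
no_fringes = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # flat: no cross term
```

The `fringes` curve swings between bright and dark bands; the `no_fringes` curve is a featureless constant. The only difference is whether the amplitudes or the probabilities were added.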

So, with the second experiment it seems to make more sense to say that part of the atom is in each box. That would explain the interference pattern when that experiment is performed. But what about the first experiment, where you actually detect the atom? That shows it is all in one box.

The most accurate way to describe this in non-mathematical English is to say that a physical thing was in two places at the same time. The quantum mechanical term for such a situation is “superposition state.” And that is the state that “collapses” (we now say “decoherence”) when the particle is detected. An object in two places at once is counter-intuitive and inevitably confusing.

"Superposition" seems even stranger than the dual nature of waves and particles. How can something be in two (or a lot more) places at once. Yet that is the interpretation of many of the experiments. A wave is spread-out. Now we've reached the most fundamental strange and weird property of Quantum Theory. Yet it has been proven true tens of thousands of times and is the basis for about one-third of our current economy. It works! Strange or not.

To quote Paul Dirac, who made fundamental contributions to the early development of both quantum mechanics and quantum electrodynamics, and first predicted the existence of antimatter (without which the Star Ship Enterprise would be dead in the water):

The general principle of superposition of quantum mechanics applies to the states [that are theoretically possible without mutual interference or contradiction] … of any one dynamical system. It requires us to assume that between these states there exist peculiar relationships such that whenever the system is definitely in one state we can consider it as being partly in each of two or more other states. The original state must be regarded as the result of a kind of superposition of the two or more new states, in a way that cannot be conceived on classical ideas. Any state may be considered as the result of a superposition of two or more other states, and indeed in an infinite number of ways. Conversely any two or more states may be superposed to give a new state …

This is the modern explanation of the double-slit experiment and the two boxes experiment I just described.

Anton Zeilinger, an Austrian quantum physicist and professor of physics at the University of Vienna, famous for his work in “quantum information” and particle “entanglement” referring to the prototypical example of the double-slit experiment, has elaborated regarding the creation and destruction of quantum superposition:

"[T]he superposition of amplitudes … is only valid if there is no way to know, even in principle, which path the particle took. It is important to realize that this does not imply that an observer actually takes note of what happens. It is sufficient to destroy the interference pattern, if the path information is accessible in principle from the experiment or even if it is dispersed in the environment and beyond any technical possibility to be recovered, but in principle still "out there." The absence of any such information is the essential criterion for quantum interference to appear.

Does this make sense? Of course not. Does Nature’s fundamental law, the Schrödinger equation, give only a probability? Einstein felt that there must be an underlying deterministic explanation. “God does not play dice with the Universe,” is his often-quoted remark. (Bohr told him not to tell God how to run the Universe.)

But randomness was not Einstein’s most serious problem with quantum mechanics. What disturbed Einstein and Schrödinger, and disturbs many people today, is quantum mechanics’ apparent denial of ordinary physical reality — or, what may be the same thing, its need to include the observer in the physical description. That just doesn’t seem to make sense.

Newton’s laws didn’t need an observer. The famous philosophical question, “If no one hears a tree falling in the forest, does it make a sound?” is typically answered, “Yes!” Why would you need an observer to hear the sound for the sound to occur? That seems like crazy talk and bad science. Yet it is the observer that seems to cause the waviness to collapse into a single location. WHAT ?!!?

If you’re not a bit baffled at this point, then you missed the point. Richard Feynman, who understood quantum mechanics as well as anyone ever did, said, “Nobody understands quantum mechanics.”

Yet this is the most tested and most successful theory in the history of theories, and one estimate puts one-third of our economy on a quantum-mechanical foundation, from lasers to transistors to MRI machines to a future of quantum computers. (The last of these does not exist yet.)

Quantum mechanics isn’t just hard to understand because the math is hard … although the math is plenty hard … it is hard to understand because it seems to not fit our reality at all. I can conceive of quantum steps in electron energy, that isn’t too hard to understand. The dual wave-particle nature … well maybe I can fit my little brain around that concept. But this is the final straw.

This is so weird that Schrödinger actually resorted to a good old thought experiment to either explain it or to prove it wrong. It is called "Schrödinger’s Cat." You should Google it sometime. But don’t blame me if it keeps you up at night or gives you nightmares when you do sleep. Welcome to my world!

Thursday, April 25, 2013

History of Science -- Part Fourteen: The Wavefunction

The letter "psi" is the 23rd letter in the Greek alphabet. In physics it is most often used as the symbol for Schrödinger's wave equation.

The Schrödinger equation plays the role of Newton's laws and conservation of energy in classical mechanics — it predicts the future behavior of a dynamic system. It is a wave equation in terms of the wavefunction, which predicts analytically and precisely the probability of events or outcomes. The detailed outcome is not strictly determined, but given a large number of events, the Schrödinger equation will predict the distribution of results.
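For reference, the equation being described can be written down. In its simplest one-dimensional, time-dependent form, for a particle of mass m moving in a potential V(x), it reads:

```latex
i\hbar \frac{\partial \psi(x,t)}{\partial t}
  = -\frac{\hbar^2}{2m} \frac{\partial^2 \psi(x,t)}{\partial x^2} + V(x)\,\psi(x,t)
```

Given ψ at one instant, the equation determines ψ at all later times — exactly the deterministic role Newton’s laws play for classical trajectories. The probabilistic element enters only when a measurement is made.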

In the last episode we learned that Schrödinger and Heisenberg simultaneously discovered the equations that describe motion for tiny particles such as electrons and atoms. Although there was some initial confusion, Schrödinger quickly proved they were equivalent. In Heisenberg’s formulation operators incorporate a dependency on time, but the state vectors are time-independent.

This stands in contrast to the Schrödinger picture, in which the operators are constant and the states evolve in time. The two pictures differ only by a change of basis with respect to time-dependency, which is the difference between active and passive transformations. In other words, they are just two different views of the same phenomenon.

Heisenberg did, however, have a point about the physical aspect of Schrödinger’s theory. What was waving in Schrödinger’s matter wave? The mathematical representation of the wave is called the “wavefunction.” In some very real sense, the wavefunction of an object is the object. In quantum theory there is no atom in addition to the wavefunction of the atom. There is just the wavefunction.

But what, exactly, is Schrödinger’s wavefunction physically? At first, Schrödinger didn’t know, and when he speculated, he was wrong. So, let’s just look at some wavefunctions that the equations tell us exist. That’s what Schrödinger did.

The essentials of quantum mechanics can be seen with the wavefunction of a simple little thing moving along in a straight line. It could be an electron or an atom, for example. To be general, we usually refer to an “object.”

A couple of years before Schrödinger’s vacation inspiration, Compton showed that photons bounced off electrons as if they were each tiny billiard balls. On the other hand, to display interference, each and every photon or electron had to be a widely spread-out thing. Each photon, for example, had to go through both slits in the double-slit experiment. How can any object be both compact and spread-out too? Well, a wave can be either compact or spread out. (But, of course, it can’t be both at the same time.)

The waveform of a moving atom might look much like ripples, or a series of waves, a “wave packet,” moving on water. A wave equation, the one for water waves or matter waves, can describe a spread-out packet with many crests, or a compact packet with only a few crests, or even a single crest moving along.

For big things, objects much larger than atoms, Schrödinger’s equation just turns into Newton’s universal equations of motion. Schrödinger’s equation governs not only the behavior of electrons and atoms, but also the behavior of everything made of atoms — molecules, baseballs, and planets. Given an initial wavefunction, it tells what the wavefunction will be like later. It’s the new universal law of motion. Newton’s equation is just an approximation for big things.

Schrödinger’s equation says a moving object is a moving packet of waves. But what is waving? Think of these analogies — Schrödinger no doubt did:

At a stormy place in the ocean, the waves are big. Let’s call that a region of large “waviness.” The boom of a drum, on its way to you from a distant drummer, is where the air pressure waviness is large, where the sound is. The bright patch where the sunlight hits the wall, the region of large electric field waviness, is where the light is. Waviness somehow tells where something is. It might seem reasonable to carry this notion over to the quantum case.

The waviness of a packet of quantum waves is large where the amplitude of the waves is large. Perhaps that is where the object is. (In quantum theory, the technical expression for the waviness is the “absolute square of the wavefunction.” By squaring we make the “negative” troughs add to the “positive” crests instead of subtracting since any number, when squared, is now positive.) The square of the wavefunction finds common use in quantum theory as a probability.
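The squaring step is easy to see numerically. Here is a minimal Python sketch (the packet below is invented for illustration): the wavefunction itself swings negative in its troughs, but its absolute square is everywhere non-negative and can be normalized into a probability density.

```python
import numpy as np

# A wave packet with many crests; its value swings negative in the troughs.
x = np.linspace(-8.0, 8.0, 1601)
dx = x[1] - x[0]
psi = np.cos(5.0 * x) * np.exp(-x ** 2 / 4.0)

# The "waviness": the absolute square of the wavefunction.  Squaring turns the
# negative troughs positive, so crests and troughs both count as presence.
waviness = np.abs(psi) ** 2
waviness /= waviness.sum() * dx   # normalize into a probability density
```

The waviness is largest where the envelope of the packet is largest, regardless of whether the wave there happens to be at a crest or a trough.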

Not only is there a wavefunction for a moving atom, but we know parts within the atom move too. The electrons are in orbits around the nucleus. Early on, Schrödinger calculated the wavefunction of the single electron within the hydrogen atom and duplicated Bohr’s results for the experimentally observed hydrogen spectrum — without needing Bohr’s arbitrary assumptions. He was elated. He thought he had gotten rid of quantum jumps. He was wrong!

Modern descriptions of the orbits of electrons often refer to an “electron cloud.” That is, rather than a specific electron in orbit, it is more like a cloud. And it is a cloud … a “probability cloud.” You can visualize the waviness as clumps of fog. The fog is densest where the waviness is largest. Pictures such as these provide chemists with insight into how atoms and molecules bind with each other.

The wavefunction, being the object itself, actually includes everything knowable about an object, the velocity of an atom or its rate of spin, for example. So I’ve suggested that the waviness perhaps tells where the object is. It’s not quite like that … but close. But what exactly is the waviness?

Well, I see we’ve run out of time. You’ll just have to wait for the next chapter to learn of the modern interpretation of the wavefunction … just what it is!

Tuesday, April 23, 2013

History of Science -- Part Thirteen: Erwin Schrödinger

By the early 1920s physicists had accepted the fact that, depending on the experimental setup, matter as well as light could be displayed either as compact lumps or as widely spread-out waves. Few pretended to understand this seeming contradiction. The significance of this came a few years later with the Schrödinger equation. But Erwin Schrödinger wasn’t looking for significance. He saw de Broglie’s matter waves as a way to get rid of Bohr’s “damn quantum jumps.” Out of disbelief comes truth!

Erwin Schrödinger, the only child of a prosperous Viennese family, was an outstanding student. As an adolescent he became intensely interested in the theater and in art. Both were areas of rebellion against the bourgeois society of late nineteenth-century Vienna. Schrödinger himself rejected the Victorian morality of his upbringing. Throughout his life he channeled much energy into intense romances, his lifelong marriage notwithstanding.

After serving in the First World War as a lieutenant in the Austrian army on the Italian front, Schrödinger started teaching at the University of Vienna. About this time he embraced the Indian mystical teaching Vedanta, but always kept his philosophical leaning apart from his physics.

In 1927, just after his spectacular work in quantum mechanics, he was invited to Berlin University as Planck’s successor. With Hitler’s coming to power in 1933, Schrödinger, though not Jewish, left Germany. After visits to England and the United States, he incautiously returned to his native Austria. He was in trouble. His leaving Germany established his opposition to the Nazis. He escaped to Italy and spent the rest of his career at the School of Theoretical Physics in Dublin, Ireland.

Despite the successes of the early quantum theory, often based on Bohr’s quantum rule, Schrödinger rejected a physics where electrons moved only in “allowed orbits” and then, without cause, abruptly jumped from one orbit to another. He was outspoken:

You surely must understand, Bohr, that the whole idea of quantum jumps necessarily leads to nonsense. It is claimed that the electron in a stationary state of an atom first revolves periodically in some sort of an orbit without radiating. There is no explanation of why it should not radiate; according to Maxwell’s theory, it must radiate. Then the electron jumps from this orbit to another one and thereby radiates. Does the transition occur gradually or suddenly? … And what laws determine its motion in a jump? Well, the whole idea of quantum jumps must simply be nonsense.

Schrödinger credits Einstein’s “brief but infinitely far-seeing remarks” for calling his attention to de Broglie’s speculation that material objects could display a wave nature. The idea appealed to Schrödinger. Waves might evolve smoothly from one state to another. Electrons would not need to orbit without radiating. He might get rid of Bohr’s “damn quantum jumps.”

Willing to amend Newton’s laws to account for the quantum behavior of small objects, Schrödinger nevertheless wanted a description of the world that had electrons and atoms behaving reasonably. He would seek an equation governing waves of matter. It would be new physics, a guess that would have to be tested. Schrödinger would seek the new universal equation of motion. The old classical physics would be merely the good approximation for large objects.

From the position and motion of a tossed stone at one moment, Newton’s law predicts its future position and motion. Similarly, from a wave’s initial shape, a wave equation predicts its shape at any later time. It describes how the ripples spread from the spot where a tossed pebble hits the water, or how waves propagate on a taut rope.

However, the one wave equation that works for waves of water, light, and sound doesn’t work for matter waves. Water, light, and sound waves move at a single speed determined by the medium in which the wave propagates. Sound, for example, moves at 330 meters per second in air. The wave equation Schrödinger sought had to allow matter waves to move at any speed, because electrons, atoms — and baseballs — move at any speed (at least up to the limit of the speed of light).
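The contrast can be made precise with the dispersion relations (frequency ω versus wavenumber k). For sound, every wavelength travels at the one speed v fixed by the medium. For a free matter wave, combining Planck’s E = ħω and de Broglie’s p = ħk with the kinetic energy E = p²/2m gives a wave whose speed depends on its momentum:

```latex
\text{sound:}\quad \omega = v\,k
  \;\Rightarrow\; \frac{\omega}{k} = v \quad (\text{one fixed speed})
\qquad
\text{matter:}\quad \omega = \frac{\hbar k^2}{2m}
  \;\Rightarrow\; v_{\text{group}} = \frac{d\omega}{dk} = \frac{\hbar k}{m} = \frac{p}{m}
  \quad (\text{any speed})
```

A wave packet built from such matter waves travels at the group velocity p/m, which is just the classical speed of the particle, whatever that speed happens to be.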

Now the story gets good, this could be on daytime television: The breakthrough came during a mountain vacation with a girlfriend in 1925. His wife stayed home. To aid his concentration, Schrödinger brought with him two pearls to keep noise out of his ears. Exactly what noise he wished to avoid is not clear. Nor do we know the identity of the girlfriend, nor whether she was inspiration or distraction. Schrödinger kept discreetly coded diaries, but the one for just this period is missing.

In four papers published within the next six months, Schrödinger laid down the basis of modern quantum mechanics with an equation describing waves of matter. Almost all the puzzles of the early quantum theory seemed resolved. The work was immediately recognized as a triumph. Einstein said it sprang from “true genius.” Planck called it “epoch making.” Schrödinger himself was delighted to think that he had gotten rid of quantum jumping. He wrote:

It is hardly necessary to point out how much more gratifying it would be to conceive a quantum transition as an energy change from one vibrational mode to another than to regard it as a jumping of electrons. The variation of vibrational modes may be treated as a process continuous in space and time and enduring as long as the emission process persists.

(The Schrödinger equation is actually a non-relativistic approximation. That is, it holds only when speeds are not close to that of light. The conceptual issues are still with us in the more general case. It is simpler, clearer, and also customary to deal with the quantum situation in terms of the Schrödinger equation. And even though photons move at the speed of light, essentially everything applies equally to photons for purposes of understanding and visualizing.)

History is more complicated than the story I just told, and more acrimonious. Almost simultaneously with Schrödinger’s discovery, Bohr’s young postdoc, Werner Heisenberg, presented his own version of quantum mechanics. It was an abstract mathematical method for obtaining numerical results. It denied any pictorial description of what was going on. Schrödinger criticized Heisenberg’s approach. “I was discouraged, if not repelled, by what appeared to me a rather difficult method of transcendental algebra, defying any visualization.” Heisenberg was equally unimpressed by Schrödinger’s wave picture. In a letter to a colleague he stated, “The more I ponder the physical part of Schrödinger’s theory the more disgusting it appears to me.”

For a while it seemed that two intrinsically different theories explained the same physical phenomena, a disturbing possibility that philosophers had long speculated about. But within a few months, Schrödinger proved that Heisenberg’s theory was logically identical to his own, just a different mathematical representation.

The more mathematically tractable Schrödinger version is generally used today, although Heisenberg’s matrices are often applied to today’s latest quantum problems. Heisenberg’s concept of “commutators” is essential in current physics, with its focus on conservation laws and symmetry. You never know what will spark a new discovery. Plus, today’s physicists are well schooled in the matrix and group algebras used by these predecessors.

In the following essays, I’ll focus on Schrödinger’s equations and their interpretation. If Erwin thought he had brought sense to the nonsense of “quanta,” well … we’ll see how much sense this does make. Wave-particle duality is hard to understand, but there’s even more to come. Prepare to be truly astonished!

History of Science -- Part Twelve: Louis de Broglie

Louis de Broglie was Prince Louis de Broglie. His aristocratic family intended a career in the French diplomatic service for him, and young Prince Louis studied history at the Sorbonne. But after receiving an arts degree, he moved to theoretical physics. Before he could do much physics, World War One broke out, and de Broglie served in the French army at a telegraph station in the Eiffel Tower.

With the war over, de Broglie started work on his physics Ph.D., attracted he says, “by the strange concept of the quantum.” Three years into his studies, he read the recent work of the American physicist Arthur Compton. An idea clicked in his head. It led to a short doctoral thesis and eventually to a Nobel Prize.

Compton had, in 1923, almost two decades after Einstein proposed the photon, discovered, to his surprise, that when light bounced off electrons its frequency changed. This is not wave behavior. When a wave reflects from an object, each incident crest produces one reflected crest. The frequency of the wave therefore does not change in reflection from a stationary object. On the other hand, if Compton assumed that light was a stream of particles, each with the energy of an Einstein photon, he got a perfect fit to his data.

The “Compton effect” did it. Physicists now accepted photons. Sure, in certain experiments light displayed its spread-out wave properties and in others its compact particle properties. As long as one knew under what conditions each property would be seen, the photon idea seemed less troublesome than finding another explanation for the Compton effect. Einstein, however, still “a man apart,” insisted a mystery remained, once saying “Every Tom, Dick, and Harry thinks they know what the photon is, but they’re wrong.”

Graduate student de Broglie shared Einstein’s feeling that there was a deep meaning to light’s duality, being either extended wave or compact particle. He wondered whether there might be symmetry in Nature. If light was either wave or particle, perhaps matter was also either particle or wave. He wrote a simple expression for the wavelength of a particle of matter. This formula for the “de Broglie wavelength” of a particle is something every beginning quantum mechanics student quickly learns.
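That formula is λ = h/p: Planck’s constant divided by the particle’s momentum. A quick illustrative calculation (the speeds chosen here are arbitrary) shows why matter waves matter for electrons but are unobservable for baseballs:

```python
# de Broglie wavelength: lambda = h / (m * v).  The masses are standard values;
# the speeds are arbitrary, chosen only for illustration.
h = 6.626e-34                      # Planck's constant, J*s

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Wavelength (in meters) of a particle of matter."""
    return h / (mass_kg * speed_m_s)

electron = de_broglie_wavelength(9.109e-31, 1.0e6)   # ~7.3e-10 m, about atom-sized
baseball = de_broglie_wavelength(0.145, 40.0)        # ~1.1e-34 m, hopelessly small
```

An electron’s wavelength is comparable to atomic spacings, which is why crystals can diffract electrons; a baseball’s is smaller than any structure in nature, which is why we never notice the waviness of everyday objects.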

The first test of that formula came from a puzzle that stimulated de Broglie’s wave idea. If an electron in a hydrogen atom were a compact particle, how could it possibly “know” the size of an orbit in order to follow only those orbits allowed by Bohr’s by-now-famous formula?

The length of violin string required to produce a given pitch is determined by the whole number of half-wavelengths of vibration that fit along the length of the string. Similarly, if the electron was a wave, the allowed orbits might be determined by a whole number of electron wavelengths that fit around the orbit’s circumference. Applying this idea, de Broglie was able to derive Bohr’s ad hoc quantum rule.

(In the violin, it’s the material of the string that vibrates. What vibrates in the case of an electron “wave” was a mystery. It’s become an even deeper one.)
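De Broglie’s derivation takes one line. Requiring a whole number n of wavelengths to fit around a circular orbit of radius r, and substituting his wavelength formula λ = h/mv, gives:

```latex
n\lambda = 2\pi r
\quad\text{with}\quad \lambda = \frac{h}{mv}
\quad\Rightarrow\quad
mvr = n\,\frac{h}{2\pi} = n\hbar, \qquad n = 1, 2, 3, \ldots
```

which is exactly Bohr’s ad hoc quantization of angular momentum, now with a physical picture behind it.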

It’s not clear how seriously de Broglie took his conjecture. He certainly did not recognize it as advancing a revolutionary view of the world. In his own later words:

[H]e who puts forward the fundamental ideas of a new doctrine often fails to realize at the outset all the consequences; guided by his personal intuitions, constrained by the internal force of mathematical analogies, he is carried away, almost in spite of himself, into a path of whose final destination he himself is ignorant.

De Broglie took his speculation to his thesis adviser, Paul Langevin, famous for his work on magnetism. Langevin was not impressed. He noted that in deriving Bohr’s formula de Broglie merely replaced one ad hoc assumption with another. And de Broglie’s assumption, that electrons could be waves, seemed ridiculous.

Were de Broglie an ordinary graduate student, Langevin might have summarily dismissed his idea. But he was Prince Louis de Broglie. Aristocracy was meaningful, even in the French republic. So no doubt to cover himself, Langevin asked for a comment on de Broglie’s idea from the world’s most eminent physicist. Einstein replied that this young man has “lifted a corner of the veil that shrouds the Old One.”

Meanwhile, there was a minor accident in the laboratories of the telephone company in New York. Clinton Davisson was experimenting with the scattering of electrons from metal surfaces. While Davisson’s interests were largely scientific, the phone company was developing vacuum tube amplifiers for telephone transmissions, and for that the behavior of electrons striking metal was important.

Electrons usually bounced off a rough metal surface in all directions. But after the accident, in which a leak allowed air into his vacuum system and oxidized a nickel surface, Davisson heated the metal to drive off the oxygen. The nickel crystallized, essentially forming an array of slits. Electrons now bounced off in only a few well-defined directions. The discovery confirmed de Broglie’s speculations that material objects could also be waves.

I began these recent “Histories” with the first hint of quantum in 1900. It was a hint largely ignored. We now reach the state of physics in 1923 where scientists finally are forced to accept a wave-particle duality and the concept of quantum. A photon, an electron, an atom, a molecule — in principle any object — can be either compact or widely spread-out. You can choose which of these two contradictory features to demonstrate. The physical reality of an object depends on how you choose to look at it.

So we go from the strange concept of quantum energy in physics to the even weirder view that energy waves can be particles and solid objects can be waves. This duality of Nature is not the strangest thing in Quantum Mechanics. It’s strange and weird, that’s for sure, but there’s more to come — even stranger and weirder.


Since I really consider this series in my blog a “rough draft” of some ideas that need polish and editing, I still have time to go back and correct an oversight in Chapter One. The ancient Greeks recognized that the music from their stringed instruments was most pleasing when the various notes in a melody came from whole number fractions of the basic (or tonic) string. Out of that came modern music with octaves (when the string length is one-half), thirds, fourths, fifths, etc. This was reflected in their mathematics and what we call “rational numbers.”

The Greeks thought that all numbers were made up of fractions of whole numbers just like the pleasing music of the whole number fractions on stringed instruments.

A story is told that when the first mathematician realized that the square root of two was not a number made from a fraction, he was thrown overboard (literally) as some kind of heretic.

It has always been my hypothesis that it was this recognition of the connection between simple fractional string lengths and the pleasing esthetics of the musical result that demonstrated to the Greeks that concepts of Nature could be represented as simple formulas. That idea was the beginning of scientific thought.

So isn’t it interesting that the string lengths and half-wavelengths that impressed the Greeks appear again a couple of thousand years later in the most sophisticated science we know of?

That fact has always been a significant point to me. I probably won’t win the Nobel Prize for describing it. But it is something to think about — isn’t it? Who knew that science would be so advanced by “music.”

Monday, April 22, 2013

History of Science -- Part Eleven: The Atom

Niels Bohr grew up in a comfortable and respected family that nurtured independent thought. His father, a professor of psychology at Copenhagen University, was interested in philosophy and science and encouraged those interests in his two sons. Niels’s brother, Harald, eventually became an outstanding mathematician. Niels Bohr’s early years were supportive. Unlike Einstein, he was never the rebel.

In college in Denmark, Bohr won a medal for some clever experiments with fluids. But I’ll skip ahead to 1912 when, with his new Ph.D., Bohr went to England as a “postdoc,” a postdoctoral student.

By this time the atomic nature of matter was generally accepted, but the atom’s internal structure was unknown — actually, it was in dispute. Electrons, negatively charged particles thousands of times lighter than the atom, had been discovered some fifteen years earlier by J. J. Thomson. An atom, being electrically neutral, must somewhere have a positive charge equal to that of its negative electrons, and that positive charge presumably had most of the mass of the atom. How are the atom’s electrons and its positive charge distributed?

Thomson had made the simplest assumption, that the massive positive charge uniformly filled the atomic volume and the electrons — one in hydrogen and almost 100 in the heaviest known atoms — were distributed throughout the positive background like raisins in a rice pudding. (I personally call that the chocolate chip cookie theory, since I prefer cookies to pudding.) Theorists tried to calculate how various distributions of electrons might give each element its characteristic properties.

There was a competing model for the atom. Ernest Rutherford at the University of Manchester in England explored the atom by shooting alpha particles (helium atoms stripped of their electrons) through a gold foil. He saw something inconsistent with Thomson’s uniformly distributed positive mass. About one alpha in 10,000 would bounce off at a large angle, sometimes backwards. The rest of the particles seemed to pass through the thin gold without any deflection. The experiment was likened to shooting prunes through rice pudding — collisions with raisins could not knock a fast prune much off track. Rutherford concluded that his alpha particles were colliding with an atom’s positive charge, and that almost all the atom’s mass was concentrated in a small lump, a “nucleus.”

Why, then, didn’t the negative electrons, attracted by the positive nucleus, simply fall into it? For the same reason that the planets don’t crash down into the sun. They orbit the sun. Rutherford decided that electrons orbited a small, massive, positive nucleus.

There was a problem with Rutherford’s planetary model: instability. Since an electron is charged, it should radiate as it races around its orbit. Maxwell’s equations had shown that a charge in motion would radiate electromagnetic waves. Calculations showed that an electron should give off its energy as light and spiral down to crash into the nucleus in less than a millionth of a second.

Most of the physics community considered the instability of the planetary model a more serious problem than the rice pudding model’s inability to explain the rare large-angle deflections of Rutherford’s alpha particles. But Rutherford, a supremely confident fellow, knew his planetary model was basically right. Experiments trump philosophy!

When the young postdoc Bohr arrived in Manchester, Rutherford assigned him the job of explaining how the planetary atom might be stable. Bohr’s tenure in Manchester lasted only six months, supposedly because his support money ran out. But eagerness to get back to Denmark to marry the beautiful Margrethe likely shortened his stay. (After all, scientists are human just like the rest of us.) While teaching at the University of Copenhagen in 1913, Bohr continued to work on the stability problem.

How he got his successful idea is not clear. But while other physicists were trying to understand how the quantum of energy and Planck’s constant, ℎ, arose from the classical laws of physics, Bohr took an “ℎ is okay!” attitude. He just accepted quantization as fundamental. After all, it worked for Planck, and it worked for Einstein.

Bohr wrote a very simple formula that stated that “angular momentum,” the rotational motion of an object, could exist only in quantum units. If so, only certain electron orbits were allowed. And, most important, he wrote his formula so that there was a smallest possible orbit. By fiat, Bohr’s formula “forbids” an electron to crash into the nucleus. If his ad hoc formula was correct, the planetary atom was stable.
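In modern notation, Bohr’s rule can be stated in one line. (This is the standard textbook form of the condition, not Bohr’s original 1913 presentation.)

```latex
% Bohr's quantization condition: the angular momentum of an electron of
% mass m moving at speed v in a circular orbit of radius r can only be a
% whole-number multiple of h/2*pi (written as h-bar):
\[
  m v r \;=\; n \hbar , \qquad n = 1, 2, 3, \ldots
\]
% Since n can never be smaller than 1, there is a smallest allowed orbit
% (the "Bohr radius," about 0.0529 nm for hydrogen), and the electron is
% forbidden to spiral into the nucleus.
```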

Without more evidence, Bohr’s quantum idea would be rejected out of hand. But from his formula, Bohr could readily calculate all the energies allowed for a single electron orbiting a nucleus, that is, for the hydrogen atom. From those energies he could then calculate the particular frequencies of light that could be emitted from hydrogen atoms electrically excited in a “discharge,” something like a neon sign only with hydrogen instead of neon.

Those frequencies had been carefully studied for years, though Bohr was initially unaware of that work. Why only certain frequencies were emitted was a complete mystery. The spectrum of frequencies, unique to each element, presented a pretty set of colors. But were they any more significant than the particular patterns of a butterfly’s wings? Now, however, Bohr’s quantum rule predicted the frequencies for hydrogen with stunning accuracy — precise to parts in 10,000. But at this time, while Bohr had light quanta emitted by atoms, he, along with essentially all other physicists, still rejected Einstein’s compact photon.
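The calculation Bohr could do can be sketched in a few lines of Python. The numerical constants below (the 13.6057 eV hydrogen binding energy and the ≈1239.84 eV·nm photon energy-to-wavelength factor) are modern standard values, not figures from the original text:

```python
# Bohr model: the allowed energies of the hydrogen electron are
# E_n = -13.6057 / n^2 eV. A photon emitted in a jump from orbit n_hi
# down to n_lo carries the energy difference, and its wavelength follows
# from E = h*f = h*c / wavelength, with h*c ~ 1239.84 eV*nm.

RYDBERG_EV = 13.6057   # hydrogen ground-state binding energy, in eV
HC_EV_NM = 1239.84     # h*c expressed in eV*nm

def energy_level(n):
    """Energy of the n-th Bohr orbit in eV (negative = bound)."""
    return -RYDBERG_EV / n**2

def emission_wavelength_nm(n_hi, n_lo):
    """Wavelength of the light emitted when the electron drops from n_hi to n_lo."""
    photon_energy = energy_level(n_hi) - energy_level(n_lo)  # positive for a drop
    return HC_EV_NM / photon_energy

# The visible Balmer series: drops ending on orbit n = 2.
balmer = {n: round(emission_wavelength_nm(n, 2), 1) for n in (3, 4, 5)}
print(balmer)  # the n=3 -> 2 line comes out near the measured 656 nm red line
```

Running this reproduces the familiar red, blue-green, and violet hydrogen lines to within a fraction of a nanometer, which is the “stunning accuracy” described above.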

Some physicists nevertheless dismissed Bohr’s theory as “number juggling.” Einstein, however, called it “one of the greatest discoveries.” And others soon came to agree. No one understood why it worked. But work it did. And for Bohr that was the important thing. Bohr’s pragmatic “ℎ is okay!” attitude toward the quantum brought him quick success.

(Bohr's equations contained the value of ℎ divided by 2π. That is because of the relationship between angular momentum and a circle: there are 2π radians in the circumference of a circle, and the electrons orbited in circles. The ratio ℎ / 2π appears so often in modern formulas that a special symbol for it, called h-bar, was created: "ℏ."
ℏ = ℎ / 2π)

Contrast Bohr’s early triumph with his quantum ideas with Einstein’s long remaining “a man apart” in his belief in the almost universally rejected photon. I often wonder how the early experiences of these two men are reflected in their lifelong friendly debate about quantum mechanics.

(That story, however, is several chapters into the future History of Science. For the next episode we will learn of a possible explanation of Bohr’s formula and restrictions on electron orbits.)

Please forgive a little anticlimactic addition to an essay that is really complete, but I must say more about Niels Bohr:

Much later he conceived the principle of complementarity: that items could be separately analyzed as having contradictory properties, like behaving as a wave or a stream of particles. The notion of complementarity dominated his thinking on both science and philosophy. This principle attempts to explain the duality that everyone found so difficult to accept with Einstein’s proposal that light could act like particles.*

This is a history, so I should also add that, during the 1930s, Bohr gave refugees from Nazism temporary jobs at the Institute of Theoretical Physics at the University of Copenhagen, now known as the Niels Bohr Institute, which he founded. He provided them with financial support, arranged for them to be awarded fellowships from the Rockefeller Foundation, and ultimately found them places at various institutions around the world. After Denmark was occupied by the Germans, he had a dramatic meeting in Copenhagen with Heisenberg, who had become the head of the German nuclear energy project.

In 1943, fearing arrest, Bohr fled to Sweden, where he persuaded King Gustav V of Sweden to make public Sweden's willingness to provide asylum. He was then flown to Britain, where he joined the British Tube Alloys nuclear weapons project, and was part of the British team of physicists who worked on the Manhattan Project.

After the war, Bohr called for international cooperation on nuclear energy. He was involved with the establishment of CERN, and became the first chairman of the Nordic Institute for Theoretical Physics in 1957. He was also involved with the founding of the Risø DTU National Laboratory for Sustainable Energy.

He died in 1962 at 77 years of age.

*In physics, complementarity is a fundamental principle of quantum mechanics, closely associated with the Copenhagen interpretation which will be discussed in following chapters. It holds that objects governed by quantum mechanics, when measured, give results that depend inherently upon the type of measuring device used, and must necessarily be described in classical mechanical terms. Further, a full description of a particular type of phenomenon can only be achieved through measurements made in each of the various possible bases — which are thus complementary.

Sunday, April 21, 2013

What I Think

I spend a lot of time on Facebook. I suppose some people watch television when they want to take a break. I just don’t find much on TV these days that interests me – OK, I do like NCIS, but I get plenty of that in reruns on some station on the dial. I also watch the Cooking Channel a lot. I like both Chopped and Restaurant Impossible. But that’s only two nights a week. During the day and most nights I’m either cuddled up with a good book (or Kindle) or I’m walking the dog or taking a walk or just walking. I love the out-of-doors, especially the beauty that is Colorado.

But when I have a few minutes to sit down and drink some coffee or just relax, I always turn to FB. Mostly I view it on my phone except when I want to write something.

I’ve been busy for over a week now on a nice little tale of the history of science. At least, it is nice for me. I just love explaining science and sharing just how we got to today, complete with Facebook, servers, laptops, tablets, and smart phones. The journey is really interesting … at least to me.

I’ve spent a lot of time in a classroom. I’ve been a student for many, many hours of lecture and lab and homework. I’ve also stood in the front and tried to share the knowledge as best I could. These days I don’t lecture, except to a keyboard. I’m also back in class, although that is primarily via video and more text books, but the homework is just as hard (if not harder).

So why is FB attractive to me? What do I get out of it? Well, I like sharing my life with my friends, both past and present. I like to see the pictures and share the adventures and just have a feeling that I know a little better what my loved ones are up to.

I read all the comments with great interest. Some just share their lives with food and kids. A lot focus on music or current events or take the platform to present their personal political view or to denounce the views of those they don’t agree with. Some share the activities of their lives or their interests in travel or birds or scenery or whatnot. All that is very interesting to me.

Many of my friends and acquaintances, not to mention many of my family, are deep in faith and they freely share that too. That’s good. I understand clearly the people that have made God the center of their lives and their desire to share the Gospel message with others. They’re actually obeying one of God’s commandments when they do so. Many also share how they are living their lives for God’s glory in the works they do and the activities they support. I like to read about those charities and works too. They build me up to see the love in other people’s actions. God knows we need to be exposed to more love in other’s actions with all the hate that is floating around in the current events.

After all, the Gospel is free, it makes you free, and it is free for all. A theme I picked up in church today. I was thinking about how people’s lives are shaped by what happens in their life. That is, their future is often a reflection of their past.

Many people have had to deal with great tragedy in their life. For some, that tragedy strengthened their will like fire strengthens iron. For others the tragedy has become a theme that they carry like a heavy burden, making them all bent to the ground.

The Gospel is good news because it says you can lay that burden down. Faith is not a burden, but rather it lifts the burden.

Me, I’ve had no tragedy. I thought about what word is the opposite of “tragedy,” its antonym. I think, at least in my case, it is “luck.” I often consider myself the luckiest guy in the world.

I was born to fine and loving parents and grew up in a home full of love and laughter and music in a wonderful town in a wonderful state in a wonderful country. Throughout my life I’ve been lifted up by good events and happenings and my marriage has been a blissful union with the most loving and precious of wives and great family and children. I’ve had the freedom to explore the things that interest me most, while my professional life always provided support for my family and my interests and left over plenty for savings. Now I find myself retired with an excellent pension, good government support, and savings that let me continue to live a life of interest and loving relationships. I know that many, if not most people in the world are not so lucky … or blessed.

Oh, I’m lucky all right. However, not everything has been perfect, and I’ve had losses. But the positive always seems to exceed any kind of negativity in my life. I don’t even feel a need to document my losses, even though some have been great, for I have memories that compensate for those that are no longer in my life or those that the Lord has called home. No, those tragedies are not what have shaped my life.

Anyone who has read more than two of my postings on Facebook knows that my great delights are in science and music. Lots of other interests too, but those are the main ones.

I am a man of Faith and my beliefs are the rock that my life rests upon. I was just thinking this morning how lucky I was to get to baptize my son and also my wife. Me, myself, I was baptized by a friend with an audience of other friends. My Christian beliefs are part of what makes me who I am. They are integral to my family and our interactions. They are the strength of our lives that can survive even the greatest adversity, although there has been no need … at least so far.

I recently did some musical work for another friend who writes and directs plays and dances and other enjoyable works of art. It is such a pleasure to be involved in her artistic efforts. This time it is a dance recital with the theme of Jack and the Beanstalk as the Prodigal Son … an interesting and logical combination. I’ve completed the basic sound tracks and voice-overs, and we’re currently refining the music production and getting it all down on disk for the performance.

The story of the Prodigal Son is a very special story for me. Jesus tells it, so it is literally straight from the Lord’s mouth. It is a simple story, yet one that a person must read carefully. There are three characters. The father, the prodigal son, and the “good” son. There are so many lessons in the actions of all three, but the greatest lesson is the lesson of grace and of love.

These are lessons the world is much in need of, especially at this time of tragedy … while we await news of the next tragedy. How these events affect us and how they make us feel and, even more important, how they make us act is a test, a measure, an experiment to gauge our faith and our grace and our love. I just hope we don’t all fail the test … schoolteacher speaking here.

So, this is my dilemma. I have so many friends and family who, as is very obvious from their Facebook comments, live lives that are focused on their faith. So why aren’t my comments always about my faith? Is my love of science and music greater than my faith in Jesus Christ? That’s a good question.

I suppose that I could be like the Apostle Paul, not after his conversion, but before. Remember, he was the greatest of the Pharisees. He lived the law to the best of his ability. But, after the conversion, after the blessing of grace, he lived a law of freedom: Freedom to live a life under grace. He knew that his behavior toward God did not affect God’s behavior toward him. That is the freedom of the Gospel. God’s grace transforms us. It is not something we earn, but something given as a gift. We don’t deserve it, but it is freely given anyway.

So, I choose to exercise my freedom in God’s love by pursuing the things that I love. I worship God in the creation and in music. That’s why I do what I do and that’s why I write what I write.

I may be the prodigal son … wandering in the wilderness, but I know I have a loving Father that will take me in with open arms. I don’t think he will hold a little attempt to understand the wonder of His creation against me. I could spend all day studying his word, as I know many do. That is a wonderful way to spend your time. Moreover, I consider my studies of the wonders of the universe to also be studying His word, just in a different medium.

God forgive me if I err. And I know He will. Thank God.

History of Science -- Part Ten: Particles vs. Waves

Recall that Newton had a troubling time classifying light during his research. He thought it had some wave properties, but he finally decided light was a particle that he called a “corpuscle.” He postulated that, since light followed his laws of motion in both reflection and refraction, it had to be a stream of particles. Others thought it was a wave, but Newton's reputation won out and his view prevailed, at least in England. Finally Thomas Young proved conclusively that light was a wave with his double slit experiment. And then Maxwell showed that light was a complex electromagnetic wave, a combination of an electric field and a magnetic field. That was the state of the science at the turn of the twentieth century.

But God is more subtle than even all these great minds had concluded. It took another scientist to finally discover the surprising truth. Another troubling result that is part of Quantum Mechanics. Planck had already calculated that these tiny, atom-sized particles that made up all matter followed rules unlike any we had experienced in our “macro” world. Now more mystery and confusion will be added to QM.

His parents worried about mental retardation when this young boy was slow in starting to talk. Later, though he became an avid and independent student of things that interested him, his distaste for the rote instruction of the Gymnasium (High School) led to his not doing well. Asked to suggest a profession that he might follow, the headmaster confidently predicted, “It doesn’t matter, he’ll never make a success of anything.”

His parents left Germany for Italy after the family electrochemical business failed. The new business in Italy fared little better and soon the young man was on his own. He took the entrance exam to the Zurich Polytechnical Institute but did not pass. He was finally admitted the next year. On graduation, he was unsuccessful in trying for a position as Privatdozent. He had the same luck in applying for a teaching job at the Gymnasium. For a while he supported himself as a tutor for students having a tough time with high school. Eventually, through a friend’s influence, he got a job in the Swiss patent office.

His duties as Technical Expert, Third Class, were to write summaries of patent applications for his superiors to use in deciding whether an idea warranted a patent. He enjoyed the work, which did not take his full time. Keeping an eye on the door in case a supervisor came in, he worked on his own projects.

Initially, he continued on the subject of his doctoral thesis, the statistics of atoms bouncing around in a liquid. This work soon became the best evidence for the atomic nature of matter, something still debated at that time. He concluded that the apparent random movement of tiny dust motes suspended in water was caused by the microscopic jostling of individual atoms. This is called Brownian Motion. He was the first to understand the cause.

Then he was struck by a mathematical similarity between the equation for the motion of atoms and Planck’s radiation law. He wondered: Might light be not only mathematically like atoms, but also physically like atoms?

If so, might light, like matter, come in compact lumps? Perhaps the pulses of light energy emitted in one of Planck’s quantum jumps did not expand in all directions as Planck assumed. Could the energy instead be confined to a small region? Might there be atoms of light as well as atoms of matter?

He speculated that light is a stream of compact lumps, “photons” (a term that came later). Each photon would have an energy equal to Planck’s quantum (Planck’s constant times its frequency). Photons would be created when electrons emit light. Photons would disappear when light is absorbed.

Seeking evidence that his speculation might be right, he looked for something that would display a granular aspect to light. It was not hard to find. The “photoelectric effect” had been known for almost twenty years. Light shining on a metal could cause electrons to pop out.

The situation was messy. Unlike thermal radiation, where a universal rule held for all materials, the photoelectric effect for each substance was different. Moreover, the data was inaccurate and not particularly reproducible.

But never mind the poor data. Spread out light waves shouldn’t kick electrons out of a metal at all. Electrons are too tightly bound. While electrons are free to move about within a metal, they can’t readily escape it. We can “boil” electrons out of a metal, but it takes a very high temperature. We can pull electrons out of a metal, but it takes a very large electric field. Nevertheless, dim light, corresponding to an extremely weak electric field, still ejects electrons. The dimmer the light, the fewer the electrons. But no matter how dim the light, some electrons are always ejected.

The photoelectric effect was just what the twenty-five year old scientist needed. Planck’s radiation law implied that light came in packets, quanta, whose energy was larger for higher frequency light. If the quanta were actually compact lumps, all the energy of each photon could be concentrated on a single electron. A single electron absorbing a whole photon would gain a whole quantum of energy.

Light, especially high-frequency light with its high-energy photons, could then give electrons enough energy to jump out of the metal. The higher the energy of the photon, the higher the energy of the ejected electron. For light below a certain frequency, its photons would have insufficient energy to remove an electron from the metal, and no electrons would be ejected.

In 1905, the young scientist wrote, “According to the presently proposed assumption the energy in a beam of light emanating from a point source is not distributed continuously over larger and larger volumes of space, but consists of a finite number of energy quanta, localized at points of space which move without subdividing and which are absorbed and emitted only as units.”

This scientist believed what Max Planck did not. He believed in the quantum theory which all the others had ignored.

Assuming that light comes as a stream of photons and that a single electron absorbs all the energy of a photon, he applied the conservation of energy to the ejected electrons. If you plot the energy of the ejected electrons against the frequency of the light, the plot shows that photons with energy less than the binding energy of the metal do not kick any electrons out. That explains why the photoelectric effect is different for different metals. He hypothesized it was tied to the electron binding energy.

A striking aspect of his photon hypothesis is that the slope of the straight line in his graph is exactly Planck’s constant, "ℎ." Until this time, Planck’s constant was just a number needed to fit Planck’s formula to the observed thermal radiation. It appeared nowhere else in physics. Before this young scientist’s photon hypothesis, there was no reason to think the ejection of electrons by light had anything at all to do with the radiation emitted by hot bodies. This slope of the graph was the first indication that the quantum was universal.
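The energy balance behind that graph can be sketched numerically. The 2.3 eV work function below is an illustrative value, roughly that of sodium, and is my own example rather than a figure from the text:

```python
# Einstein's photoelectric relation: a photon of frequency f carries energy h*f.
# An electron absorbing one photon leaves the metal with kinetic energy
#     KE = h*f - W
# where W is the metal's binding ("work function") energy. If h*f < W,
# no electron comes out, no matter how bright the light. Plotting KE
# against f gives a straight line whose slope is Planck's constant h.

H_EV_S = 4.1357e-15  # Planck's constant, in eV*s

def ejected_ke_ev(frequency_hz, work_function_ev):
    """Kinetic energy of the ejected electron in eV, or None if none is ejected."""
    ke = H_EV_S * frequency_hz - work_function_ev
    return ke if ke > 0 else None

W_EXAMPLE = 2.3  # illustrative work function (roughly sodium), in eV

# Green light (~5.5e14 Hz): photon energy ~2.27 eV, just below the threshold.
print(ejected_ke_ev(5.5e14, W_EXAMPLE))   # None -> no electrons ejected
# Ultraviolet (~1.2e15 Hz): photon energy ~4.96 eV, well above it.
print(ejected_ke_ev(1.2e15, W_EXAMPLE))   # an electron with a few eV of energy
```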

Ten years after this work on the photoelectric effect, the American physicist Robert Millikan found that his formula in every case predicted “exactly the observed results.” Through careful experiment the quality of the photoelectric effect data was finally improved. Nevertheless, Millikan called the photon hypothesis leading to that formula “wholly untenable” and called the young scientist's suggestion that light came as compact particles “reckless.”

Millikan was not alone. The physics community received the photon postulate “with disbelief and skepticism bordering on derision.” Nevertheless, eight years after proposing the photon, the young scientist had gained a considerable reputation as a theoretical physicist for many other achievements and was nominated for membership in the Prussian Academy of Science. Planck, in his letter supporting that nomination, felt he had to defend the young scientist. “[T]hat he may sometimes have missed the target in his speculations, as, for example, in his hypothesis of light quanta, cannot really be held too much against him …” Remember, even Planck didn’t believe in quanta, although he had first thought of it.

Ultimately, in 1922, this scientist was awarded the Nobel Prize for his analysis of the photoelectric effect, yet the citation avoided explicit mention of the then seventeen-year-old, but still unaccepted, photon idea. A biographer later wrote, “From 1905 to 1923, he was a man apart in being the only one, or almost the only one, to take the light-quantum theory seriously.”

Though the reaction of the physics community to photons was, in a word, rejection, they were not just being pig-headed. Light was proven to be a spread-out wave. Light displayed interference. A stream of discrete particles could not do that.

Recall my description in a previous chapter of the History of Science of Young’s experiment with the double slit. Light coming through a single narrow slit illuminates a screen more or less uniformly, displaying the spreading nature of waves. Once a second slit is added a pattern of dark bands appears whose spacing depends on the spacing of the two slits. At those dark places, wave crests from one slit arrive together with wave troughs from the second slit. Waves from one slit thus cancel waves from the other. Interference demonstrates that light is a wave. There was no other possible conclusion.
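The band spacing described above follows a simple small-angle rule: for slits a distance d apart and a screen a distance L away, neighboring bright bands are separated by roughly λL/d. A quick sketch with illustrative numbers (the 633 nm red-laser wavelength and the geometry are my own example values, not from the text):

```python
# Double-slit interference: waves from the two slits reinforce where their
# path difference is a whole number of wavelengths, producing bright bands
# separated (for small angles) by  delta_y = wavelength * L / d.

def fringe_spacing(wavelength_m, slit_separation_m, screen_distance_m):
    """Approximate spacing between adjacent bright fringes, in meters."""
    return wavelength_m * screen_distance_m / slit_separation_m

# Illustrative setup: red laser light, slits 0.2 mm apart, screen 1 m away.
dy = fringe_spacing(633e-9, 0.2e-3, 1.0)
print(f"{dy * 1e3:.1f} mm")  # about 3.2 mm between bright bands
```

Note how the spacing depends on the slit separation d, which is why, as the paragraph says, the pattern shifts when the slits are moved.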

Nevertheless, the now Nobel Prize winning physicist held that the photoelectric effect showed light to be a stream of photons — tiny compact bullets. But how could tiny bullets produce the interference patterns seen with light?

One might suppose that the tiny bullets might bounce off each other and stack up non-uniformly, making the bands that were assumed to be an interference pattern, but that loophole was quickly closed by careful experiments that sent one photon at a time through the slits. This had to be repeated hundreds of times, but the ultimate effect was the buildup of an interference pattern.

A great mystery: Choosing to demonstrate interference, something explicable only in terms of waves, you could prove light to be a widely spread-out wave. However, by choosing a photoelectric demonstration, where a single electron absorbed a whole light quantum, you could prove light to be a stream of tiny compact objects. There seems to be an inconsistency. It appeared that light was playing with the experimenters and would act like a wave if tested as a wave and act like particles if tested as particles. But it has to be one or the other. Right?

Though the paradoxical nature of light disturbed the young scientist, he clung to his photon hypothesis. He declared that a mystery existed in Nature and that we must confront it. He did not pretend to resolve the problem.

And we do not pretend to resolve it now. The mystery is still with us a hundred years later. The implications of our being able to choose to prove either of two contradictory things extend beyond physics. It’s the quantum enigma. We now accept this dual nature of light, even though we still don’t understand it. Remember, we use experimental results to demonstrate the truth of theories. In this case, the various experiments seem to contradict each other, and we’re left with no choice but to accept the dual nature. Light manifests itself both as a wave and as a particle. This dual nature of light is disturbing, but — as the magician said — you ain't seen nothin' yet.

In 1906, the young scientist I’ve been talking about had discovered photons, firmly established the atomic nature of matter in his work on Brownian Motion, and formulated the theory of Relativity. Yes, the young man’s name was Albert Einstein. In that year he was promoted by the Swiss patent office to Technical Expert, Second Class.