Sunday, September 16, 2012

Breakthrough Thinking

I was helping Mark with some of his chemistry homework the other day. He was confused by the fact that, with the exception of Hydrogen and Helium, elements have a “complete” outer shell or orbit when it holds exactly eight electrons. That is the concept of valence in chemistry, and the inert or “noble” gases won’t combine with other elements precisely because they already have eight electrons in their outermost shell. Elements with only one electron in the outermost shell, like Lithium, Sodium, or Potassium, are very reactive because they are inclined to give up that single outer electron, and elements like Oxygen, Fluorine, or Sulfur only have to gain one or two to reach that “magic” completeness number, making them very active chemically. The columns of the Periodic Table contain elements with many characteristics in common precisely because each column matches an electron structure in the outermost shell.

For example, water, H2O, consists of two hydrogen atoms, each with one electron in its outer shell. They share these two electrons with Oxygen, forming what is called a covalent bond. That adds two electrons to Oxygen’s six outer electrons, and everyone is happy, and stable. But Mark didn’t understand how the heavier elements, with 16 or more electrons in total, could obey the limit of eight. The answer, as any freshman chemist can tell you, is subshells. The heavier elements have subshells, and it is the outermost subshell that is limited to eight.
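The bookkeeping above can be sketched in a few lines of code. This is my own illustration, not from any chemistry library, and it uses the naive 2-8-8 shell-filling pattern, which only holds up through calcium (Z = 20):

```python
# Naive shell-filling sketch: shells hold 2, 8, 8, 2 electrons in order.
# Valid only for the lighter elements (up to Z = 20); subshells are ignored.
SHELL_CAPACITIES = [2, 8, 8, 2]

def shells(atomic_number):
    """Return electron counts per shell for a neutral atom, naive model."""
    remaining = atomic_number
    result = []
    for cap in SHELL_CAPACITIES:
        if remaining <= 0:
            break
        filled = min(cap, remaining)
        result.append(filled)
        remaining -= filled
    return result

def valence_electrons(atomic_number):
    """Electrons in the outermost occupied shell."""
    return shells(atomic_number)[-1]

# Oxygen (Z = 8) has 6 valence electrons; each hydrogen (Z = 1) shares its
# one, bringing oxygen's outer shell to the "magic" eight in H2O.
print(shells(8))              # [2, 6]
print(valence_electrons(10))  # 8 -> neon, a noble gas, chemically inert
print(valence_electrons(11))  # 1 -> sodium, eager to give that electron up
```

Sodium and neon come out as the post describes: one loose valence electron versus a full, inert eight.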

(In elements called the “transition metals,” it is a little more complicated than that, and inner shells can provide “valence” electrons. The electrons that determine how an atom reacts chemically are those that travel farthest from the nucleus, that is, those with the most energy. Electrons in the inner subshells have less energy than those in outer subshells. This effect is great enough that the 3d (third orbit, subshell “d”) electrons have more energy than 4s (fourth orbit, subshell “s”) electrons, and are therefore more important in chemical reactions, hence making them valence electrons although they are not in the so-called valence shell. TMI. Let’s not go there!)
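The ordering the parenthetical describes follows the Madelung (n + l) rule taught in freshman chemistry: subshells fill in order of increasing n + l, with ties broken by the lower n. A small sketch (my own, for illustration) reproduces why 4s fills before 3d:

```python
# Madelung (n + l) rule sketch: list subshells in filling order.
# For each shell n, the allowed subshells have l = 0 .. n-1 (s, p, d, f).
SUBSHELL_LETTERS = "spdf"

def madelung_order(max_n=4):
    subshells = [(n, l)
                 for n in range(1, max_n + 1)
                 for l in range(n)
                 if l < len(SUBSHELL_LETTERS)]
    # Sort by n + l, breaking ties with the lower n.
    subshells.sort(key=lambda nl: (nl[0] + nl[1], nl[0]))
    return [f"{n}{SUBSHELL_LETTERS[l]}" for n, l in subshells]

print(madelung_order())
# 4s (n + l = 4) lands before 3d (n + l = 5), matching the post's point
# that the 3d electrons sit higher in energy than the 4s electrons.
```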

That led us to a discussion of the similarity between atoms and solar systems. The Rutherford model of the atom was very much copied from the solar system, but it was soon discovered to have many deficiencies. Niels Bohr modified the model to match the requirements of quantum physics, but many students still imagine the atom using the original Rutherford model. It is a nice analogy for students: we pretend atoms are like solar systems, with the nucleus playing the role of the sun and the electrons revolving around it like planets. It is a useful model, but not really very accurate. Models of the atom are actually very complicated and involve both quantum mechanics and probability. They are best expressed in pure mathematics.

(The atomic orbital model is the currently accepted model of the placement of electrons in an atom. It is sometimes called the wave mechanics model. In the atomic orbital model, the atom consists of a nucleus surrounded by orbiting electrons. These electrons exist in atomic orbitals, which are a set of quantum states of the negatively charged electrons trapped in the electrical field generated by the positively charged nucleus. The atomic orbital model can only be described by quantum mechanics, in which the electrons are most accurately described as standing waves surrounding the nucleus.)

(Despite the obvious analogy to planets revolving around the Sun, electrons cannot be described as solid particles. In addition, atomic orbitals do not closely resemble a planet's elliptical path in ordinary atoms. A more accurate analogy might be that of a large and often oddly-shaped "cloud" (the electron), distributed around a relatively tiny planet (the atomic nucleus). One difference is that some of an atom's electrons have zero angular momentum, so they cannot in any sense be thought of as moving "around" the nucleus as a planet does around the sun. Other electrons do have varying amounts of angular momentum.)
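The "cloud" picture can be made concrete with the hydrogen ground state, a standard textbook result (not spelled out in the post). Its wavefunction is spherically symmetric and carries zero angular momentum, exactly the case mentioned above:

```latex
\[
\psi_{1s}(r) = \frac{1}{\sqrt{\pi a_0^3}}\, e^{-r/a_0},
\qquad
|\psi_{1s}(r)|^2 = \frac{1}{\pi a_0^3}\, e^{-2r/a_0},
\]
```

where \(a_0 \approx 0.0529\) nm is the Bohr radius. The probability density depends only on the distance r from the nucleus, so this "orbital" is a fuzzy sphere rather than anything resembling a planetary orbit.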

After Mark left, I kept thinking, always a dangerous thing to do. That led me to one of my favorite thoughts on the progress of science: how theories are created to explain phenomena, and how better theories cause earlier explanations to be discarded. A successful theory does a good job of explanation, but when a better theory comes along and does a better job of explaining, it replaces the earlier theory, at least to a degree. Newton’s theory of gravity has not been completely discarded. It works in most instances, but must be corrected for very high velocities and very strong gravitational fields. Newton’s mathematics describes the orbit of the moon around the earth and the earth around the sun, but does not explain black holes.
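The boundary where Newton's description fails completely can even be put in numbers with the Schwarzschild radius, r_s = 2GM/c^2, the size below which a mass collapses into a black hole. A quick sketch (constants rounded; my own illustration, not from the post):

```python
# Schwarzschild radius: the scale at which Newtonian gravity breaks down
# entirely. Compress a mass inside this radius and a black hole forms.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Radius (in meters) of the event horizon for a given mass."""
    return 2 * G * mass_kg / C**2

SUN_MASS = 1.989e30  # kg
print(f"{schwarzschild_radius(SUN_MASS) / 1000:.2f} km")  # about 2.95 km
```

The Sun would have to be squeezed into a ball about three kilometers across before Newton's equations gave way entirely to Einstein's.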

For many years there was a theory of “ether,” which was supposed to be a clear substance that permeated all of space. Light was then supposed to be vibrations in the ether, just as sound is vibrations in air. Then the Michelson-Morley experiments, measuring the speed of light in different directions, failed to find any trace of the ether. Less than twenty years later, a young Albert Einstein developed a new theory that explained the experimental results. And ten years after that, Einstein’s general theory explained gravity and made parts of Sir Isaac Newton’s theory of gravity obsolete. So we often think of Einstein as the archetypal genius, and a genius he surely was!

As scientists develop theories and mathematical models, they can apply these models to existing experimental and observational facts. But often the models and equations carry broader interpretations and consequences. Some scientists have trouble with these new consequences and may even search for a different result. That is really the problem of paradigms: we think and model within strict structures, and our ideas are heavily influenced by our current society and level of understanding. That may be why most great creative breakthroughs come from young scientists. Older scientists (a category in which I fit) have the problem of paradigms filtering their thinking to match the status quo and years of established thoughts and theories. It is very hard to break through to new ideas.

Another example, again at the expense of Albert, is quantum physics. In 1905, Einstein explained certain features of the photoelectric effect by assuming that Planck's energy quanta were actual particles, which were later dubbed photons. This idea, in some ways, became the foundation for quantum theory. The situation changed rapidly in the years 1925–1930, when working mathematical foundations were found through the groundbreaking work of Erwin Schrödinger, Werner Heisenberg, Max Born, Pascual Jordan, and the foundational work of John von Neumann, Hermann Weyl, and Paul Dirac, and it became possible to unify several different approaches in terms of a fresh set of ideas. The physical interpretation of the theory was also clarified in these years after Werner Heisenberg discovered the uncertainty relations and Niels Bohr introduced the idea of complementarity.

Fundamental to all these discoveries and explanations was a probabilistic interpretation. You can't tell exactly where the electrons are. They exist in a sort of probabilistic or statistical "cloud." Quantum theory became highly non-deterministic. Einstein, older by that time, did not accept this probabilistic explanation. He agreed with the experimental evidence, but assumed there were simply variables that were not yet known. He called these the hidden variables and famously stated, "God does not play dice with the universe."

Was he just too old at that time to recognize the value of the accepted statistical model? Were his paradigms too strong to overcome? All the experimental results since then, over 80 years' worth, have continued to verify the results of Schrödinger and the others. No hidden variables have been found, and quantum effects like the "tunneling" of electrons have been verified and put to use in practical devices like tunnel diodes.
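Tunneling is itself a purely probabilistic prediction. For the standard textbook case of a particle facing a rectangular energy barrier (my own sketch, using the common thick-barrier approximation, not anything from the post), the chance of slipping through falls off exponentially with barrier width:

```python
import math

# Rectangular-barrier tunneling sketch: transmission probability
# T ~ exp(-2 * kappa * L) for a particle of energy E below a barrier of
# height V0 and width L (thick-barrier approximation).
HBAR = 1.0545718e-34    # reduced Planck constant, J*s
M_E  = 9.1093837e-31    # electron mass, kg
EV   = 1.602176634e-19  # joules per electronvolt

def tunneling_probability(energy_ev, barrier_ev, width_m):
    """Approximate probability an electron tunnels through the barrier."""
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A 1 eV electron hitting a 2 eV barrier one nanometer wide: classically
# impossible, quantum mechanically just improbable.
print(tunneling_probability(1.0, 2.0, 1e-9))
```

Halve the barrier width and the probability rises enormously, which is why tunnel diodes are built with extremely thin junctions.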

So, with all of Einstein's genius, it appears he got that one wrong. Well, no one is perfect, and it is often time to get out of the way of the younger generation and let them take over.

An interesting example of theories and their consequences comes from cosmology. Cosmology is the study of the origin and evolution of the universe. It is the one arena in which we can actually witness history. The pinpoints of starlight we see with the naked eye are photons that have been streaming toward us for a few years or a few thousand. The light from more distant objects, captured by powerful telescopes, has been traveling toward us far longer than that, sometimes for billions of years. When we look at such ancient light, we are seeing — literally — ancient times.

During the past decade, as observations of such ancient starlight have provided deep insight into the universe's past, they have also, surprisingly, provided deep insight into the nature of the future. And the future that the data suggest is particularly disquieting — because of something called dark energy.

This story of discovery begins a century ago with Einstein, who realized that space is not an immutable stage on which events play out, as Newton had envisioned. Instead, through his general theory of relativity, Einstein found that space, and time too, can bend, twist and warp, responding much as a trampoline does to a jumping child. In fact, so malleable is space that, according to the math, the size of the universe necessarily changes over time: the fabric of space must expand or contract — it can't stay put.

(Imagine a ball thrown into the air. It must either rise up under the force of the throw, or start to descend back to earth under the force of gravity. There is no equilibrium. The ball can’t float in the air.)

For Einstein, this was an unacceptable conclusion. He'd spent ten grueling years developing the general theory of relativity, seeking a better understanding of gravity, but to him the notion of an expanding or contracting cosmos seemed blatantly erroneous. It flew in the face of the prevailing wisdom that, over the largest of scales, the universe was fixed and unchanging.

Einstein responded swiftly. He modified the equations of general relativity so that the mathematics would yield an unchanging cosmos. A static situation, like a stalemate in a tug of war, requires equal but opposite forces that cancel each other. Across large distances, the force that shapes the cosmos is the attractive pull of gravity. And so, Einstein reasoned, a counterbalancing force would need to provide a repulsive push. But what force could that be?

Remarkably, he found that a simple modification of general relativity's equations entailed something that would have, well, blown Newton's mind: antigravity — a gravitational force that pushes instead of pulls. Ordinary matter, like the Earth or Sun, can generate only attractive gravity, but the math revealed that a more exotic source — an energy that uniformly fills space, much as steam fills a sauna, only invisibly (reminiscent of the ether) — would generate gravity's repulsive version. Einstein called this space-filling energy the “cosmological constant,” and he found that by finely adjusting its value, the repulsive gravity it produced would precisely cancel the usual attractive gravity coming from stars and galaxies, yielding a static cosmos. He breathed a sigh of relief.
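In symbols (a standard result of relativistic cosmology, not spelled out in the post): Einstein added the constant Λ to his field equations, and for a static, matter-filled universe its value is pinned exactly to the average matter density ρ:

```latex
\[
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda g_{\mu\nu}
  = \frac{8\pi G}{c^4} T_{\mu\nu},
\qquad
\Lambda = \frac{4\pi G \rho}{c^2}.
\]
```

The fine tuning is explicit in the second equation: pick any other value of Λ and the balance fails, and the model universe must expand or contract.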

(Again, think of the ball thrown up in the air. Perhaps you’ve seen the little trick with a column of air holding a ping-pong ball steady in midair. This works because the upward force of the moving air is balanced by the downward force of gravity, putting the ball in equilibrium. What Einstein proposed was an opposing force that would balance and “neutralize” the pull of gravity.)

A dozen years later, however, Einstein rued the day he introduced the cosmological constant. In 1929, the American astronomer Edwin Hubble discovered that distant galaxies are all rushing away from us. And the best explanation for this cosmic exodus came directly from general relativity: much as poppy seeds in a muffin that's baking move apart as the dough swells, galaxies move apart as the space in which they're embedded expands. Hubble's observations thus established that there was no need for a cosmological constant; the universe is not static.
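Hubble's discovery reduces to a strikingly simple relation, v = H0 × d: the farther away the galaxy, the faster it recedes. A sketch using roughly the modern value of the Hubble constant (the figure is my assumption, not a number from the post):

```python
# Hubble's law sketch: recession velocity is proportional to distance.
H0 = 70.0  # Hubble constant, km/s per megaparsec (approximate modern value)

def recession_velocity_km_s(distance_mpc):
    """Recession velocity of a galaxy at the given distance (megaparsecs)."""
    return H0 * distance_mpc

# A galaxy 100 megaparsecs away recedes at about 7000 km/s.
print(recession_velocity_km_s(100))  # 7000.0
```

It was precisely this proportionality, the same recession rate per unit distance in every direction, that matched general relativity's picture of space itself swelling like baking dough.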

Had Einstein only trusted the original mathematics of general relativity, he would have made one of the most spectacular predictions of all time — that the universe is expanding — more than a decade before it was discovered. Instead, he was left to lick his wounds, summarily removing the cosmological constant from the equations of general relativity and, according to one of his trusted colleagues, calling it his greatest blunder.

But the story of the cosmological constant was far from over. As Brian Greene has written, there is more to the story.

Fast forward to the 1990s, when we find two teams of astronomers undertaking painstakingly precise observations of distant supernovae — exploding stars so brilliant they can be seen clear across the cosmos — to determine how the expansion rate of space has changed over the history of the universe. These researchers anticipated that the gravitational attraction of matter dotting the night's sky would slow the expansion, much as Earth's gravity slows the speed of a ball tossed upward. By bearing witness to distant supernovae, cosmic beacons that trace the universe's expansion rate at various moments in the past, the teams sought to make this quantitative. Shockingly, however, when the data were analyzed, the teams found that the expansion rate has not been slowing down. It's been speeding up.

It's as if that tossed ball shot away from your hand, racing upward faster and faster. You'd conclude that something must be driving the ball away. Similarly, the astronomers concluded that something in space must be pushing galaxies apart ever more quickly. And after scrutinizing the situation, they have found that the push is most likely the repulsive gravity produced by a cosmological constant.
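In the language of standard cosmology (again, a textbook result rather than anything from the post), the tug of war shows up directly in the acceleration equation for the scale factor a(t) of the universe:

```latex
\[
\frac{\ddot a}{a}
  = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right)
    + \frac{\Lambda c^2}{3},
\]
```

where ρ and p are the density and pressure of ordinary matter and radiation. The first term decelerates the expansion; a positive cosmological constant in the second term accelerates it. The supernova data show the second term winning today.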

When Einstein introduced the cosmological constant, he envisioned its value being finely adjusted to exactly balance ordinary attractive gravity. But for other values the cosmological constant's repulsive gravity can beat out attractive gravity, and yield the observed accelerated spatial expansion, spot on. Were Einstein still with us, his discovery that repulsive gravity lies within nature's repertoire would have likely garnered him another Nobel Prize.

As remarkable as it is that even one of Einstein's "bad" ideas has proven prophetic, many puzzles still surround the cosmological constant: If there is a diffuse, invisible energy permeating space, where did it come from? Is this dark energy (to use modern parlance) a permanent fixture of space, or might its strength change over time? Perhaps most perplexing of all is a question of quantitative detail. The most refined attempts to calculate the amount of dark energy suffusing space miss the measured value by a gargantuan factor of 10^123 (that is, a 1 followed by 123 zeroes), the single greatest mismatch between theory and observation in the history of science.
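The size of that mismatch can be sketched as a back-of-the-envelope ratio. Both figures below are rough textbook values and are my assumptions, not numbers from the post: a naive Planck-scale estimate of the vacuum energy density versus the value inferred from the supernova observations.

```python
import math

# Order-of-magnitude sketch of the famous vacuum-energy mismatch.
RHO_VACUUM_NAIVE = 4.6e113   # J/m^3, naive Planck-scale estimate (rough)
RHO_DARK_ENERGY  = 6e-10     # J/m^3, roughly the observed value (rough)

mismatch = RHO_VACUUM_NAIVE / RHO_DARK_ENERGY
print(f"theory overshoots observation by about 10^{round(math.log10(mismatch))}")
```

With these rough inputs the ratio comes out near 10^123, the figure quoted above.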

This mismatch of the equations and the observations likely means that the equations need further adjustment. (Or could it be the measurements? Doubtful that they are off by that much.) So we have the march of science, from Newton to Einstein to ????. There is so much more to learn.

One reason I’m so excited about space travel and a strong advocate for NASA is the better environment for observation. If we had a world class observatory on the far side of the moon, sheltered from the radio waves of the earth and not blocked by several miles of air, but able to clearly observe the cosmos, what data would we collect? What young scientist would then have the idea, breaking loose from the paradigms and discovering whole new explanations for the world around us?

Originally written Feb. 11, 2011.
