Monday, December 8, 2008

Frank Tipler's God of the Multiverse - Part I

In this earlier post, I looked at just one chapter in Frank Tipler's remarkable book, The Physics of Christianity, having to do with the Virgin Birth of Jesus Christ, and the amazing scientific data that Prof. Tipler (who teaches physics at Tulane University) had amassed in support of what I had always considered an abstract theological doctrine. This time I would like to spend some time examining, in a series of posts, one of the themes that permeates the book: the nature of God---in particular, Tipler's view of God as the Ultimate Singularity in the multiverse.

Don't worry if those last five words left you grasping for a referent. One of the virtues of Prof. Tipler's book is its ability to stretch your mind and make you think about things in a way that you have never done before. By the time you finish this series, I hope you will begin to appreciate what I mean by that statement.

Trying to reduce the concept of God to words is a daunting task. Traditionally, there is the via positiva ("positive path"), which tries to describe God through a listing of His attributes: He is omnipotent, omniscient, all-loving, all-seeing, etc. Opposed to that is the via negativa, which denies that any combination of words can encompass the reality of God, and says that the most we mortals can manage is to say what God is not. (A famous medieval text espousing this manner of approaching God is The Cloud of Unknowing, which teaches that God can be experienced only through the heart, and not the mind.)

Frank Tipler espouses yet another approach: he comes to an understanding of God through the well-tested laws of physics and mathematics. And in doing so, he raises difficulties for both scientists and non-scientists.

Tipler's explanations are difficult for non-scientists, because appreciating what he claims God to be requires some grounding in mathematics and physics. But they are equally difficult for scientists, who know perfectly well the mathematics and physics that Tipler is using, yet who have always considered God to be metaphysical---literally, "beyond physics"---and therefore not describable or comprehensible in mathematical terms.

To both groups, scientists and non-scientists alike, I would say: Please keep an open mind as you read what follows. The non-scientists will need to be open to learning about concepts with which they may not be familiar. And the scientists will need to be open to viewing what might be familiar mathematical concepts in unfamiliar ways. And especially for the non-physicists: please bear with me in this first post of the series, which will be a highly condensed summary of the current state of our knowledge in physics; you may want to take it in small doses, and reread it on separate days until you can wrap your mind around the various concepts.

As for the physicists and other scientists who read this, please forgive me (and let me know in the comments) if I have erred or oversimplified in my presentation; it has been almost fifty years since I took physics (and calculus) in college, and my knowledge since then is what I have gained from trying to keep up with advances through popular accounts. With those caveats, let us begin.

Tipler starts with what ought to be familiar territory for everybody: the well-tested truths of science. Our current understanding of the physical world is shaped by three great theories. In the historical order of their exposition, they are: (a) Einstein's theory of general relativity (which was a refinement and extension of classical Newtonian mechanics); (b) the quantum mechanics of Bohr, Heisenberg, Schrödinger, and others; and (c) the so-called "Standard Model" of particle physics, which explains the basis for atoms, protons, electrons, neutrons and quarks.

These three theories have been at the forefront of physics because they have been tested and found accurate thousands, if not hundreds of thousands, of times. Each has been used to predict the existence of a feature or phenomenon before that feature or phenomenon was actually observed in nature. And no observation has ever been verifiably recorded and repeated which contradicted any one of the three theories.

To be more accurate, the three theories are really now just two, as I shall explain after laying a little more foundation. For now, I would ask the physicists to indulge me in continuing to compare and contrast the three theories.

If we were to rank the three theories instead by the scale of the phenomena they deal with, the order would be exactly reversed. The Standard Model deals with subatomic particles as small as quarks, which can never be observed in isolation, and which are the building blocks of the proton and neutron. Quantum mechanics as originally formulated explained the interactions between atomic particles on a microscopic scale; its reach into the subatomic world was gradually expanded until it merged into the Standard Model. The theory of relativity, on the other hand, works with and predicts phenomena occurring on a cosmological scale---the scale of the physical universe which surrounds us.

The great problem in physics today is how to unite these theories into a single framework---the so-called "Theory of Everything." The problem that everyone from Einstein onward has faced is how to harmonize the continuum of relativity theory with the quanta, or discrete packets, of energy that are found on the microscopic scale of the quantum world. Consider, for example, a single line:

_____________________________________________

Now imagine it extending as far as physically possible at both ends. If you pick two points on it and label the one on the left "zero" (0) and the one on the right "one" (1), then mathematics teaches us that there is an infinity of points on the line just between the two points you have labeled, no matter how close together or how far apart you have picked them.

That symbolizes the mathematics behind relativity: the line may be called a continuum, with an infinity of further (purely idealized) points between any two you happen to pick, no matter how close together.
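If you would like to see that made concrete, here is a tiny sketch of my own devising (it is only an illustration, and appears nowhere in Tipler's book), written in Python with exact fractions so that no rounding error intrudes. However many times you halve the interval, a brand-new point always turns up between the endpoints:

    from fractions import Fraction   # exact rational arithmetic, no rounding error

    a, b = Fraction(0), Fraction(1)  # the two points we labeled "zero" and "one"
    for step in range(10):           # ten steps only because we have to stop somewhere
        midpoint = (a + b) / 2       # a brand-new point, strictly between a and b
        print(step + 1, midpoint)    # prints 1/2, then 1/4, then 1/8, and so on
        b = midpoint                 # shrink the interval and look again

The loop could in principle run forever: the continuum never runs out of new points to offer.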

But we all know (or think we know) that nature itself is not constructed that way. Any physical, concrete line in nature has to be made up at bottom of individual atoms of the element(s) forming the line, and atoms are distinct, separate things. There is no continuum observable when you use a microscope to examine an actual, real-world line, but only (if it is powerful enough) a series of individual, discrete entities, which may be analogized to the quanta of quantum theory.

The problem of reconciling general relativity with quantum theory thus can be reduced to the problem of harmonizing the integers---discrete, individual units which identify the quanta of quantum theory---with the continuum of classical mechanics and relativity theory. (In his formal paper published in Reports on Progress in Physics, Prof. Tipler poses the problem exactly in this way. Which, he asks---the pure numbers or the continuum---underlies physical reality? If you are a physicist and take the trouble to read the full paper, his answer might surprise you.)

Another way of looking at the problem is to note that nature as we know it has four fundamental forces: the "strong" force (which keeps quarks so tightly bound together that they cannot be individually observed inside the particles which they form); the "weak" force, which manifests itself in beta decay, in which a neutron inside a radioactive nucleus decays into a proton while emitting an electron (and an antineutrino); the electromagnetic force, which binds the electrons into their (discrete) wave patterns around atomic nuclei; and gravity, which holds (for example) the planets in orbit around the Sun (yet which is many orders of magnitude weaker than the so-called "weak" force).

The theories of quantum mechanics and the Standard Model (which have now been combined into one theory) explain the first three forces, but cannot explain gravity. General relativity, on the other hand, does a superb job of explaining gravity, but cannot account for the other three forces. And ever since Einstein worked out his theory of relativity, and Bohr, Heisenberg and Schrödinger worked out the theory of quantum mechanics, scientists have been trying to unify the theories into a single "Theory of Everything," as noted above.

Instead, what happened historically is that quantum mechanics first absorbed special relativity. It incorporated Einstein's equivalence between matter and energy (E = mc²) to deal with the various ways that energy could manifest itself at the subatomic level, in accordance with the uncertainty introduced by Werner Heisenberg (see below). Later, it evolved into two branches that between them covered the first three fundamental forces (these were "quantum chromodynamics" for the strong force and "electroweak" theory for the weak and the electromagnetic forces). In the 1970s and '80s those two theories were then united under the rubric of what is called the "Standard Model." Thus it is actually the Standard Model that physicists are trying to unite today with general relativity into a theory of "quantum gravity." When physicists speak of uniting relativity with quantum theory, and thus combining gravity with the other three fundamental forces in one mathematical explanation, they use a verb coined for the purpose, and speak of "quantizing" relativity.

(As an aside, the latest attempt to unite the theories is known as "string theory." It relies on techniques from higher-dimensional mathematics, and would imply that the universe has as many as ten or eleven dimensions, in contrast to the three [four, counting time] with which we are familiar. Since string theory cannot yet be used to make testable predictions, Prof. Tipler dismisses it as unworthy of attention; besides, as we shall see, he has no need of it.)

We are almost there, and ready to introduce the insights of Professor Tipler. But first, you need to know about one more aspect of this problem of unification, which is the concept of a "singularity." In some ways, it is the simplest aspect of all to understand, but in other ways, it is the most abstract and difficult. Consider this simple equation:

2x = 3y

We have all dealt with such equations in high-school algebra classes. You will remember, perhaps, that such an equation has as many solutions as there are real numbers (like x = 3, y = 2; or x = 9, y = 6), that is to say, an infinity of solutions, because there is an infinity of real numbers. But now, what would happen if we replaced one of the real-number coefficients (that is what the numbers 2 and 3 are called in the above equation: a coefficient is a number that quantifies, or "quantizes" if you will, an unknown like x or y) with a coefficient of infinity?

∞x = 3y

(The sideways 8 is the traditional mathematical symbol for infinity, introduced by the English mathematician John Wallis in 1655.) This equation is not in present-day mathematical form, which would use summation notation, but that might scare non-mathematicians away. As a compromise, here is a slightly more standard way of writing it as an infinite sum:

x + 2x + 3x + 4x + 5x + . . . = 3y

In this notation, the ". . ." on the left means "continue the addition, increasing the coefficient by 1 with each new term, without limit, i.e., to infinity."
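For anyone who does want the compact, present-day form mentioned a moment ago, the same equation can be written (this is just standard summation notation, nothing peculiar to Tipler):

Σ n·x = 3y,  where the sum runs over n = 1, 2, 3, . . . without limit

Since 1 + 2 + 3 + . . . grows beyond any bound, the left-hand side is infinite for every nonzero value of x, which is precisely the trouble taken up next.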

Now what do we do? How can anyone solve such an equation? Stick in any nonzero value you like for x, and there is no way to add up all the terms on the left side, except to say that the total is infinity. But there is no real number y that can be multiplied by any other real number (including 3) to get infinity, so the equation has no solution. A mathematician, or a physicist, who encounters an infinite quantity in an equation can only throw up his hands---the equation has ceased to have meaning. It now contains what they call a "singularity"---a term with an infinite coefficient (which, as we have just seen, is the same as saying the equation has an infinite number of terms), and thus cannot relate to anything in the real world. As Prof. Tipler states (p. 11): "A singularity occurring in the laboratory would contradict observation: infinite physical quantities have never been observed. If singularities occur, they must occur outside the laboratory, outside of space and time altogether."

Such a problem was encountered with Einstein's theory of general relativity shortly after he published it. At the time (1916), scientists had no idea that the universe was expanding, but Einstein saw that his equations would not allow the universe to stand still: it had to be either expanding or contracting. Since that prediction contradicted what he thought was a known fact, he introduced an arbitrary factor into the equations to cancel out the predicted expansion, and to keep the universe in a static state of equilibrium. He called it the "cosmological constant."

But other scientists examining Einstein's mathematics soon found that the equilibrium artificially introduced by the cosmological constant was an extremely fragile one---so fragile, in fact, that it could be disturbed by the simple act of moving a single teacup from the cupboard to a table. Any such disturbance, and the universe would be expanding (or contracting) once again, despite the constant. And when the astronomer Edwin Hubble showed in 1929 that the universe was indeed expanding, as the equations of relativity had originally predicted, Einstein was embarrassed. He withdrew the cosmological constant, and called it his "greatest blunder."

But now a new problem arose. If the universe was expanding, that meant it had been smaller in the past. And if one extrapolated back far enough, using Einstein's equations, one quickly saw that there had to be a time when the whole universe was very small indeed, and that all of its matter and energy had been packed into a very tight space, so that its density (the amount of matter per unit of volume) was enormous. Indeed, there was no way scientists could see to prevent the density from growing toward infinity as the radius of the universe shrank toward zero. This was because the mathematics of relativity assumed that space and time were a continuum, as noted above: for any radius, no matter how small, there would be a radius that was smaller still on the continuum, and that would imply a density that was greater still. There was no way to call an arbitrary halt to the process, and so they were faced with that most dreaded of breakdowns: a singularity!
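To make the arithmetic concrete (my own illustration, not Tipler's): a ball containing a fixed amount of matter M within a radius r has a density of

ρ = 3M / (4πr³)

Hold M fixed and let r shrink toward zero, and ρ grows beyond any bound; on a continuum there is no smallest radius at which the process is forced to stop.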

A similar problem had been encountered in nineteenth-century physics when trying to construct equations that would explain how heat was radiated: the assumption of a continuum prevented the theory from predicting the actual numbers that were observed. The solution, found by German physicist Max Planck, was to "quantize" the radiation energy into discrete packets, or quanta. Instead of a theoretical continuum, without breaks of any kind, scientists had encountered the neatly packaged quanta of the particle world. In his "banner year" of 1905, Einstein published a paper applying the concept of quanta to explain the photon (particle) nature of light, and quantum theory was born. It was therefore natural for Einstein to expect, after he published his general theory of relativity and the singularities of gravitational collapse became evident, that he could eliminate them by "quantizing" the field of gravity in the same way that the electromagnetic field had been quantized earlier.
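As a reminder of what "quantizing" means in concrete terms (these formulas are standard physics, not anything special to Tipler's argument), Planck proposed that radiation of frequency ν could carry energy only in whole-number multiples of a smallest packet:

E = nhν,  with n = 1, 2, 3, . . .

where h is Planck's constant. Einstein's 1905 paper treated light itself as arriving in just such packets, the photons, each carrying energy E = hν.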

This would lead to an understanding of gravity itself as a wave/particle which, like photons of light, arrived at a point in discrete packets, rather than taking on all the infinitesimally varying values of a continuum. And indeed, two famous physicists---Richard Feynman and Steven Weinberg---accomplished just that feat (separately) in the 1960s. Proceeding from the concepts of the Standard Model, they developed equations which dealt with quantized gravity---only to discover that, in order to be consistent with the observed facts and to cancel out undesirable infinite expressions, the resulting equations entailed derivatives of arbitrarily high order. (You can read some of the Feynman approach---or order the book of his lectures---here. Physicists and mathematicians can look at the equations themselves [actually, the Lagrangian from which the equations are generated] beginning at page 914 of the Tipler article.)

(Technical note [skip if you wish]: A derivative measures a rate of change at a given instant. A derivative of a derivative, measuring the rate of change in an underlying rate of change [which is itself changing with time], is called a second-order derivative. The process stops when some rate of change becomes constant, because the derivative of a constant is zero. If no higher-order rate of change ever becomes constant, so that one can keep the process going indefinitely, with derivatives of derivatives of derivatives of . . . etc. ad infinitum, the order of the derivatives is said to be "arbitrarily high." Expressions with derivatives of arbitrarily high order are typical, once again, of those describing the continuum. The equations of electromagnetic quantum field theory, by contrast, were the familiar partial differential equations, of the same general kind as Schrödinger's original formulation of quantum wave mechanics. Such equations contain derivatives of at most second order, and thus a finite number of terms.)
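A standard textbook illustration of these orders (mine, not Tipler's): if x(t) is a particle's position at time t, then its velocity is the first-order derivative and its acceleration the second-order derivative,

v(t) = dx/dt,   a(t) = d²x/dt²

and if the acceleration happens to be constant, its own derivative is zero and the chain stops there. Schrödinger's equation for a single particle,

iℏ ∂ψ/∂t = −(ℏ²/2m) ∂²ψ/∂x² + V(x)ψ

likewise contains nothing higher than a second-order derivative.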

Feynman, Weinberg and all subsequent physicists until Tipler saw this development as most unwelcome, because it implied that their equations were simply describing the world of the continuum after all. And this could not be squared with the world of the quantum, because of Werner Heisenberg's famous "Uncertainty Principle."

It was Heisenberg who first developed the real-world consequences of energy being measurable only in discrete packets, or quanta. The logical result, as he showed mathematically, was a trade-off in the degree of knowledge one could obtain about a given particle traveling through space with a given quantum of energy. In traditional mechanics, the sum total of one's knowledge about a particle at any given instant was expressed in terms of its position (specified in physical coordinates) and its momentum (its mass multiplied by its velocity). Heisenberg showed that if energy came in quanta, then the more closely one observed a particle's position, the less one could know about its momentum at that position, and vice versa. Measurements of all of the parameters defining a particle (or a wave) could never be made arbitrarily precise, as they could be under a continuum hypothesis. And that is why Feynman and Weinberg felt that they had failed in quantizing gravity: their equations required an arbitrarily high number of terms, which implied (to them) that gravity could not be a quantum phenomenon.
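In its standard modern form (again, a textbook statement rather than anything peculiar to Tipler), the trade-off reads:

Δx · Δp ≥ ℏ/2

where Δx is the uncertainty in the particle's position, Δp is the uncertainty in its momentum, and ℏ is Planck's constant divided by 2π. Pin down one of the two more tightly, and the bound forces the other to spread out.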

Now enter Frank Tipler, who makes this remarkable assertion just 34 pages into his book:

Feynman, Weinberg, and most subsequent physicists have not accepted this unique theory of gravity because they could not accept its philosophical implications. All previous theories of physics have been built on equations called partial differential equations, which basically equate derivatives [rates of change] of various physical quantities. In the past, the fundamental equations have had no higher than second order derivatives, meaning that there were only a finite number of terms. We might not have been able to determine the initial conditions to feed into these equations with sufficient precision to predict the future---remember the uncertainty principle---but at least we could determine the equations themselves with certainty.

What Feynman and Weinberg really discovered was another and more fundamental limitation on human knowledge: not only can we not, even in principle, determine exactly the position and momentum of a particle, but we cannot even determine or write down, even in principle, the ultimate equations the particle will follow! In fact, if we consider an equation to have an infinite number of terms, there are no ultimate equations! This does not mean that the history of the particle . . . is not subject to, and completely determined by, physical law. It is, even in the Feynman-Weinberg theory. But we humans will never know this theory. As we shall see . . . our descendants in the far future will be able to understand this theory with ever-increasing precision, but not even they will completely understand it until the very end of time.

So according to Frank Tipler, the higher-order derivatives that show up in the Feynman-Weinberg equations of quantum gravity are not the sign of a theory-destroying singularity, but are instead a highly revealing window on the nature of scientific reality: they tend to prove the existence of the theoretical construct of a multiverse (the so-called "many-universes," or "many-worlds," explanation), first proposed by the physicist Hugh Everett in 1957 as an alternative way to understand the phenomena that quantum mechanics predicts. (I shall return to a more detailed explanation of the multiverse in my next post; don't worry if it sounds strange just now.) At the same time, the equations containing these higher-order derivatives tell us not only about our limitations as humans trying to understand the physical world, but also something about its future. Finally, as an added bonus, physicists can stop their seemingly endless quest for the Grand Unified Theory, because it has been sitting right under their noses for the past 45 years, and it will never become any more tractable than it is now, until after billions and billions of years, just before the world ends!

(All this comes just in the first 30-odd pages of the book, which is one of the many reasons I refer to it as one of the most remarkable books I have ever read.)

Many physicists, according to Prof. Tipler, do not want to accept the implications of the Feynman-Weinberg equations for quantum gravity, because the entire history of theoretical physics in the twentieth century was one of "renormalizing" such equations to eliminate the unwanted infinite number of terms. The Feynman-Weinberg equations, however, do not submit to such techniques (or at least, no more than do the equations for quantum electrodynamics or for the Standard Model). Instead, as John Baez says in the article I just linked, they turn into a hydra---the mythological beast that sprouted one or more new heads for each one the hero managed to lop off:

. . . you may have to stick in some extra terms . . . before playing the renormalization game. If you can succeed with only finitely many extra terms, you say your theory is "renormalizable". If you need infinitely many terms, you throw up your hands in despair and say the theory is "nonrenormalizable". A nonrenormalizable [set of equations] is like a hydra-headed monster that keeps needing more extra terms to be added the more you add.

Professor Tipler, for one, is not daunted by this fact. As a physicist, he wants to work with the Feynman-Weinberg equations because he knows that the predictions that can be made with them (if one makes suitable limiting assumptions) are the same as those of general relativity, which have thus far all proved to match observations exactly. Even if the theory does not meet physicists' expectations of what such a theory should be like, he says the fact that it has held up so well under testing and observation to date is reason enough to keep it. He still has to deal with the singularities the equations imply, but he proposes to do so in a novel and highly intriguing way: rather than taking them as a confession of the breakdown of physical law, he reads them as evidence of a reality that lies outside of space and time, one which fixes the boundary conditions of the world we live in, and which determines the course it will take in the future.

And with that simple determination, Frank Tipler parts company with the majority of theoretical physicists today. At the same time, however, he takes the first small step toward understanding the singularities that these equations entail as mathematical proof that God indeed exists. Because of his refusal to shy away from such a conclusion, I have the impression that most physicists have relegated his formal paper to the "file and forget" category (even though it received the journal's highest accolade), while ignoring or ridiculing his book-length popularization of it. In doing so, I believe they are missing out on some wonderful physics.

We have covered a good deal of the preliminary territory, but it will take another post or so to trace the exposition of Prof. Tipler's bold ideas concerning the nature of our reality. Thus if you have borne with me this far, I hope you will follow on with me when I put up the next part of this series in another week or so.



3 comments:

  1. "Thus if you have borne with me this far, I hope you will follow on with me when I put up the next part of this series in another week or so."

    Yep. Tracking with you thus far. I'm fascinated by Tipler's thinking and your analysis of it.

  2. Hey, whatever happened to Part II ??

    I'm trying to wrap my head around Tipler right now, not sure if he's a genius or a madman. Just as I gave up on Christological theology and the Episcopal Church (burned out in the Diocese of Newark some years back; could relate to much of what Spong said and wrote but not to much of what he actually did; too much fashionable liberalism most everywhere you went), Tipler gave up on atheism and embraced Christ. Interesting.

    Anyway, thanks for a good post, it gave me some more puzzle pieces to Tipler's complex thinking. From what little I understand so far, Tipler is basically saying "damn the infinities!" Bring 'em on!! He seems to rely on some quirky infinity in the equations that would make for an eternal conscious instant in the instant before the universe collapses. And it collapses despite dark energy, because in conquering the universe, our distant digitalized progeny, living in a real "Matrix", will have to burn up all the protons and neutrons in the universe in their bullet-sized spaceships containing their digital Matricies. Thus messing up the Higgs field and thus allowing the crunch to happen after all.

    Wow. And I thought that the Flew conversion to Deism was breathtaking.

    Jim G.

  3. Jim G, Part II has at last been posted. I do not expect you will have to wait as long for Part III -- at least, that is my intent.
