XFiles: Constant superstition

(Book: On Guard, by William Lane Craig. Chapter 5: “Why is the universe fine-tuned for life?”)

Chapter 5 starts the “Intelligent Design” portion of Dr. Craig’s apologetic, though he does not call it that. In fact, he includes a sidebar that explicitly points out that “fine-tuned” does not mean designed. Fine tuning, he explains, “just means that the range of life-permitting values for the constants and quantities is extremely narrow.” But don’t let that mislead you: this is his ID argument, which he presents in the form of another syllogism.

  1. The fine-tuning of the universe is due to either physical necessity, chance, or design.
  2. It is not due to physical necessity or chance.
  3. Therefore it is due to design.

There are several significant flaws in this argument. First of all, as we saw last week, time itself has its origin, along with the rest of the universe, at the Big Bang. That means there has never been a time when the physical constants of the universe did not exist with their current values, which prevents any Designer from having the opportunity to design them. They’re already here, they already have the correct values, and the Designer himself could not have existed prior to the constants (since time itself does not go back that far), so we can eliminate design right off the bat.

Dr. Craig also falls for the creationist obfuscation of “chance,” as though it meant events that are completely unpredictable and chaotic. That definition of chance is actually closer in meaning to “magic” than it is to the laws of nature. In nature, things may be undirected, in the sense that there’s no entity pre-planning how he/she/it/they intend for things to work out. Because of the mechanistic way things work in real life, however, the results are anything but random (and in fact one of the hardest things to do scientifically is to produce genuine randomness).

What we have in the real world are a number of entities and processes that interact according to some relatively simple basic principles, with surprisingly complex results. At a certain point, the complexity becomes too great for us to follow all the variables and see where things are going, and so we introduce the concept of “random chance” as a placeholder for the combined effect of all the influences we can’t keep track of. This, however, is a manifestation of our own limits, not some actual god of chaos named “Chance.” The underlying phenomena (with the exception of quantum-level phenomena) are following the fundamental laws of nature. We just can’t keep track of them all.
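To make this concrete, here’s a toy sketch in Python (my own illustration, nothing from the book): a one-line, fully deterministic rule whose output nevertheless looks random, which is exactly the sense in which “chance” is a bookkeeping device rather than a force.

    # The logistic map: x -> r * x * (1 - x). Every value is completely
    # determined by the previous one, yet at r = 4.0 the sequence is
    # chaotic: two starting points differing in the ninth decimal place
    # soon bear no resemblance to each other, and the output passes for
    # "random" even though nothing random ever happens.
    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    x, y = 0.123456789, 0.123456790   # nearly identical starting values
    for _ in range(50):
        x, y = logistic(x), logistic(y)

    print(x, y)   # after 50 deterministic steps, the two have fully diverged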

Dr. Craig, in his first premise, fails to grasp this important feature of the natural world. By claiming that observed characteristics must fall into one of three categories—physical necessity, chance, or design—he and other ID creationists are overlooking (and perhaps deliberately obscuring) the way nature really works. Most natural phenomena operate by a combination of multiple independent agents or processes interacting in undirected ways according to a few basic physical laws, and when these interactions influence one another, the resulting feedback loop produces amazingly complex and apparently structured results, similar to what we think of as “design.” In fact, one of the reasons so much design is done by trial and error is that this process simulates the “random” processes that are so effective in undirected nature.
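Here’s a second toy sketch (again mine, purely illustrative): a one-dimensional cellular automaton in which every cell follows one trivial local rule, with no plan and no coordination, and the rows collectively draw an intricate, “designed-looking” Sierpinski triangle.

    # Rule 90: each cell's next state is simply (left neighbor XOR right
    # neighbor). No cell knows anything about the overall picture, yet
    # starting from a single live cell the rows trace out a Sierpinski
    # triangle: apparent structure from undirected local interactions.
    WIDTH, STEPS = 63, 32
    row = [0] * WIDTH
    row[WIDTH // 2] = 1   # one live cell in the middle

    for _ in range(STEPS):
        print("".join("#" if cell else " " for cell in row))
        row = [row[i - 1] ^ row[(i + 1) % WIDTH] for i in range(WIDTH)]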

Technically, I’m digressing a bit. Usually, ID creationists offer the “physical necessity, chance, or design” argument as a ploy to attack evolution, and evolution, in particular, operates by the complex mechanisms of independent processes interacting according to natural law. Dr. Craig, however, is attempting to adapt this anti-evolution argument and use it to lay out a trichotomy of possible causes for the physical constants of the universe. That’s obviously a red herring, since these constants have always existed, for all of time, with their current values, and therefore it is meaningless to speak of them having any “cause.”

This gets into another of Dr. Craig’s mistakes, though: a failure to understand the nature of a constant. Remember that old joke about “2+2=5, for sufficiently large values of 2”? It’s funny because it treats “2” like some kind of vague, variable quantity, when we know it isn’t. The number 2 is a constant; it doesn’t have values that are larger or smaller than 2. That is, by definition, what a constant is. We can’t change the value of a constant; we can only be mistaken about which value is correct. But Dr. Craig is treating constants as though you could just pick some random number (if you were an Intelligent Designer) and *poof*, the new value of π is now 42.

[C]onsider the following examples of fine-tuning. The so-called weak force, one of the four fundamental forces of nature, which operates inside the nucleus of an atom, is so finely tuned that an alteration in its value by even one part out of 10^100 would have prevented a life-permitting universe! Similarly, a change in the value of the so-called cosmological constant, which drives the acceleration of the universe’s expansion, by as little as one part in 10^120 would have rendered the universe life-prohibiting.

And if you changed π to be 314.15 instead of 3.1415, it would take the earth a whole century to revolve around the sun! Wow, amazing! Change 2 to 2.5, and the equation in the joke would not be funny any more, because the number 2 would no longer be a constant; it would be a variable, and you could put whatever value you want into it. Or whatever value you need to make your fine-tuning work.

Meanwhile, in the real world, constants are, well, constant. We discover them precisely because they can’t be any other value. We can take an equation in which a constant appears, substitute a different value in place of the constant, and see how the numbers come out, but that won’t tell us anything about the real world, and really it won’t tell us much about anything: if you say that “the circumference of a circle is 2πr, but you can change the value of π,” how can you be sure that the value of 2 won’t change also? They’re both constants; it’s no more unreasonable to treat the one as a variable than the other.
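To see just how empty that game is, here’s a quick sketch (mine, using rounded astronomical values): plug a “retuned” π into Kepler’s formula and it dutifully spits out the century-long year from the joke above, but the output describes no world at all.

    import math

    def orbital_period(a, GM, pi=math.pi):
        # Kepler: T = 2 * pi * sqrt(a^3 / GM)
        return 2 * pi * math.sqrt(a**3 / GM)

    AU = 1.496e11        # earth-sun distance, meters (rounded)
    GM_SUN = 1.327e20    # sun's gravitational parameter, m^3/s^2 (rounded)

    year = orbital_period(AU, GM_SUN)             # ~3.16e7 seconds: one year
    fake = orbital_period(AU, GM_SUN, pi=314.15)  # pi "retuned" 100x larger

    print(fake / year)   # ~100: the century-long year, right on cue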

Dr. Craig spends a lot of time trying to force various speculative modern theories (e.g. string theory) onto his Procrustean bed of “physical necessity or chance or design,” on the assumption that constants can be changed, and that it’s somehow meaningful to talk about causes for things that have always existed. Frankly, I’m not going to dig into them too much—it’s a wild goose chase, and there are others more qualified than I to discuss the problems that Dr. Craig is having with advanced physics (e.g. skydivephil’s video).

What I do want to go back to is the underlying assumption behind his reasoning. Dr. Craig is making the classic creationist assumption that if we do not understand why the constants and quantities of the universe work out the way they do, then we should automatically assume that some intelligent, self-aware, purposeful Being must have deliberately and intentionally created them. In short, he is advocating that our default preference should be for superstition in cases where scientific answers have not yet been found.

Obviously that’s an appallingly anti-scientific and anti-intellectual assumption to make, and I doubt that Dr. Craig would willingly concede that he is making it. If you look at his methodology, though, this is exactly what he is doing. He is not trying to describe a scientifically definable process by which a Designer would “tune” the values of the physical constants of the universe. He’s not giving us enough detail to calculate what real-world consequences would result from such a process, and how those consequences would differ from those that would result if these values arose by some other process. He’s not giving us anything that would allow us to verify, objectively and scientifically, whether the facts fit his hypothesis better than any non-design hypothesis. All he’s doing is attacking the scientific alternatives to his superstitious explanation. A few examples:

For example, the most promising candidate for a [Theory of Everything] to date, so-called M-theory or superstring theory, only works if there are eleven dimensions. But the theory itself can’t explain why just that particular number of dimensions should exist.

Moreover, M-theory doesn’t predict uniquely a life-permitting universe. It permits a vast range of around 10^500 different possible universes, all consistent with the same laws but varying in the values of the constants of nature. Almost all of these possible universes are life-prohibiting. So some explanation is needed why, out of all these possibilities, a life-permitting universe exists…

The proposed mechanisms for generating a world ensemble are so vague that it’s far from clear that the physics governing the multiverse will not involve any fine tuning…

We saw in chapter 4 that the Borde-Guth-Vilenkin theorem requires that even a multiverse of bubble universes must have a beginning. In that case, the mechanism that generates the bubble universes has been chugging away for only a finite amount of time. So by now, there may well be only a finite number of bubbles in the world ensemble, which may not be enough to guarantee the appearance of a finely tuned universe by chance alone…

With the failure of the many worlds hypothesis, the last ring of defense for the alternative of chance collapses. Neither physical necessity nor chance provides a good explanation of the fine-tuning of the universe.

Etc., etc., etc. Science may not have all the answers. Superstring theory can’t explain why eleven dimensions should happen to exist. M-theory permits too many universes, so we need more explanations. The many worlds hypothesis may not explain everything; therefore it has failed. In other words: don’t listen to the scientists. Ignore their answers. Look somewhere else.

Well, I’ll take it a step farther than Dr. Craig: science does not yet know how—or if—any sort of material multiverse exists and/or influences the characteristics of this universe. The scientific thing to do in the absence of such answers is to ask more questions, seek more data, and learn what we can. The superstitious thing to do is just give up and attribute things to some invisible magical Being. What Dr. Craig is doing is encouraging us to reject the former approach in favor of the latter.

This is the key failure of Intelligent Design creationism, even as a pseudo-scientific approach. In the end, it is an attempt to replace science with ordinary superstition, an attempt to throw away both scientific answers and even scientific investigation, in favor of a dogmatic animism. Dr. Craig does his homework better than most, and tries his hardest to dress it up in a lab coat and make it sound like he’s just being a good skeptic, but it’s marketing. A genuinely objective scientist would acknowledge that there are still gaps in the various theories, but would nevertheless agree that they’re the best answers we’ve found so far. He wouldn’t pounce on the gaps as a convenient place to try and stuff God in.

We’ve got more to cover in this chapter, but Dr. Craig’s next move is to attack Richard Dawkins, and that’s going to open up a whole ‘nuther can, as they say. So we’ll save that for next time. Tune in again next week!

12 Responses to “XFiles: Constant superstition”

  1. Rob Says:

    Why can’t it be due to chance? Craig is a puddle complaining about his hole.

  2. Andrew G. Says:

    Stenger’s book “The Fallacy of Fine-Tuning” is worth reading as an actual physicist’s response to all the fine-tuning arguments. (It does contain a bunch of equations, but it’s written so that you can skip those and just read the text.)

    Your handling of constants is a bit sloppy there. Some constants, like pi, have values fixed by definition (or mathematical necessity, depending on how you look at it) – there is no possible world in which they have different values. Other values often referred to as “constants”, like the speed of light, are not really constants but just definitions of measuring units; theoretical physicists like to set as many of these to 1 as they can, to simplify equations. (The ones that aren’t 1 tend to end up with values based on some expression containing pi.)

    The “constants” that can properly be subject to fine-tuning arguments are the couple of dozen dimensionless values that show up in physics for no obvious reason and have to be measured. Physicists are really unhappy about this: it’s messy to have so many apparently arbitrary numbers showing up, and they would really like to find some theoretical reason for their values. However, no such theory exists yet, though it’s an active area of work.

    The kinds of values in question are things like these: the fine-structure constant (electromagnetic field “strength”), the masses of the leptons and quarks, some of the elements of various matrices that define the weak and strong interactions, and so on.
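    For concreteness, here’s a quick check of my own (rounded CODATA values) that the first of those really is a pure number: all the units (coulombs, farads, joule-seconds, meters per second) cancel completely, leaving a value no current theory predicts.

        # alpha = e^2 / (4 * pi * eps0 * hbar * c), the fine-structure constant
        import math

        e    = 1.602176634e-19    # elementary charge, C
        eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
        hbar = 1.054571817e-34    # reduced Planck constant, J*s
        c    = 299792458          # speed of light, m/s

        alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
        print(alpha, 1 / alpha)   # ~0.0072974, i.e. ~1/137.036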

    One of Stenger’s main arguments is this: even if we assume each parameter is arbitrary and random, we can’t take the possible range of variation of each parameter alone as a measure of how likely the universe is to support life; instead, we have to consider all the parameters as independently variable and look at what proportion of the resulting parameter space is favourable. This turns out to negate almost all fine-tuning arguments.
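    A toy version of that idea (the “viable” test and the ranges below are invented placeholders, not Stenger’s actual model): sample all the parameters jointly and measure what fraction of the joint space works, instead of wiggling one knob at a time.

        # Schematic Monte Carlo over a joint parameter space. Multiplying
        # narrow one-at-a-time windows understates the truly viable
        # fraction whenever parameters can compensate for one another.
        import random

        def viable(g, m):
            # pretend universes "work" when two made-up parameters compensate
            return 0.5 < g * m < 2.0

        N = 100_000
        joint = sum(viable(random.uniform(0.01, 10), random.uniform(0.01, 10))
                    for _ in range(N)) / N
        single = sum(viable(random.uniform(0.01, 10), 1.0)
                     for _ in range(N)) / N    # vary g alone, hold m = 1

        print(joint, single**2)   # ~0.067 vs ~0.023 in this toy model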

  3. pboyfloyd Says:

    Creator, causer, maker, designer, fine-tuner. They’ve pretty much dug through the thesaurus looking to define that which they’re assuming.

    1) They see HIM in the surroundings, in the sky, in their hearts, in every favourable happening ever, as surely as I can see the word “Oooo” in my Cheerios every morning.

    Point is that everyone knows this, and apologists are being disingenuous in pretending that (1) is not the case. They’re pretending that the idea of God is up for debate, that agnostics ought to be bamboozled by bad probability and infinity arguments, while they themselves wouldn’t be fazed by any argument at all, because of (1).

  4. wmcbrine Says:

    I like Sagan’s phrase, “the Lithic Principle” — the idea (self-evident, really) that the Universe is fine-tuned for the production of rocks.

    Really though, if this Universe is fine-tuned for anything, it’s vacuum. Emptiness upon emptiness, at every scale. Most of the Universe is nothing, and if you take that away, most of the rest is nothing, and so on. And it’s getting more empty every day, at an ever-accelerating pace.

    • asdf Says:

      But is a vacuum really nothing? I’m not entirely informed on this point, but there are all these fields and stuff in place, right?

  5. Anonymous Says:

    Although your position is sound, your arguments are not. There really is true randomness in nature. Your argument that it only seems random because we don’t know enough to predict everything is wrong: that is the hidden-variable theory, and it has been demonstrated experimentally to be incorrect. Radioactive decay is the most obvious example of a purely random process in the world, but there are also mutation and spontaneous symmetry breaking, to name a few others.

    Be careful to base your arguments on correct interpretations of science.

    • Deacon Duncan Says:

      Hi, anonymous,

      My understanding is that genuine randomness happens at the quantum level, not at larger scales. Thus, an atom of U-235 will spontaneously decay after some random period of time, and this is truly random, whereas mutation (as in genetics) is a “random” change that has a physical cause related to the electrostatic properties of the molecules, the way they are positioned, the other molecules in the environment, the velocities of those molecules, etc. That’s the idea I was trying to express (and it’s why I made an exception for quantum-level phenomena), but if that idea is still mistaken I would be happy to learn more about it.

      Thanks.

    • josh Says:

      Anonymous,
      On a Many Worlds interpretation (the most consistent, in my opinion), there is no true randomness. We are limited in our predictive ability by lack of knowledge of what the rest of the universal wavefunction is doing, but that’s just like regular dice-rolling randomness: everything is still deterministic. (Note that Craig, sloppy thinker that he is, seems to conflate the Many Worlds interpretation of quantum mechanics with the multiverse/landscape idea of cosmology and string theory.)
      On the other hand, I don’t think one (Duncan) can say that “randomness,” be it real or apparent, only happens at the quantum level. It is just that at larger scales the probabilities become extremely lopsided, giving the appearance of classical determinism in almost all cases.

    • Alex SL Says:

      This is actually something that I would love to understand more about. A lot of people are deeply impressed by the fact that quantum processes happen “randomly” if you look at an individual particle or unstable isotope, and then conclude that determinism is false, that free will exists, that god uses quantum effects to influence the world, whatever.

      But is this really random, or is it merely spontaneous? And does it really have any impact on anything but the smallest scale? All those probability distributions seem to collapse if we look at a larger scale. The bloody cat is not actually half dead and half alive, but only one or the other at any given time. A bunch of uranium is exactly 34.5% decayed, and you could theoretically count the atoms. Etc. So can randomness ever blossom up to the scale where it influences anything, so as to break determinism? I wonder.
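      A back-of-the-envelope way to see why the per-atom randomness washes out (a toy model of mine, treating each atom as an independent coin flip with a made-up decay probability echoing the 34.5% above):

          # The relative fluctuation in the decayed fraction shrinks like
          # 1/sqrt(N): each decay is random, but the bulk value is pinned
          # down by sheer numbers.
          import random

          def decayed_fraction(n_atoms, p=0.345):
              return sum(random.random() < p for _ in range(n_atoms)) / n_atoms

          for n in (100, 10_000, 1_000_000):
              print(n, decayed_fraction(n))   # scatter around 0.345 tightens as n grows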

  6. asdf Says:

    There is no evidence at all for the many-worlds hypothesis, no experimental data. You can’t just say “trust that science will figure it out.” There are not just gaps in the current theories, there is no evidence to support them! Regarding the beginning of time, if God is eternal, it means that he transcends time, therefore your premise that design can be ruled out is incorrect. It is true that these constants have existed for all time, but again, time had a beginning, and therefore so did these constants. To say that “there has never been a time when the physical constants of the universe did not exist with their current values” is correct, but that does not mean that they necessarily had to have the values that they hold now. It may seem nonsensical to you, but it is profoundly unscientific, when it comes to the values of these constants, to not allow for the question of “Why?”
