  Exchange can make an entire society not just richer but nicer, because in an effective market it is cheaper to buy things than to steal them, and other people are more valuable to you alive than dead. (As the economist Ludwig von Mises put it centuries later, “If the tailor goes to war against the baker, he must henceforth bake his own bread.”) Many Enlightenment thinkers, including Montesquieu, Kant, Voltaire, Diderot, and the Abbé de Saint-Pierre, endorsed the ideal of doux commerce, gentle commerce.19 The American founders—George Washington, James Madison, and especially Alexander Hamilton—designed the institutions of the young nation to nurture it.

  This brings us to another Enlightenment ideal, peace. War was so common in history that it was natural to see it as a permanent part of the human condition and to think peace could come only in a messianic age. But now war was no longer thought of as a divine punishment to be endured and deplored, or a glorious contest to be won and celebrated, but as a practical problem to be mitigated and someday solved. In “Perpetual Peace,” Kant laid out measures that would discourage leaders from dragging their countries into war.20 Together with international commerce, he recommended representative republics (what we would call democracies), mutual transparency, norms against conquest and internal interference, freedom of travel and immigration, and a federation of states that would adjudicate disputes between them.

  For all the prescience of the founders, framers, and philosophes, this is not a book of Enlightenolatry. The Enlightenment thinkers were men and women of their age, the 18th century. Some were racists, sexists, anti-Semites, slaveholders, or duelists. Some of the questions they worried about are almost incomprehensible to us, and they came up with plenty of daffy ideas together with the brilliant ones. More to the point, they were born too soon to appreciate some of the keystones of our modern understanding of reality.

  They of all people would have been the first to concede this. If you extol reason, then what matters is the integrity of the thoughts, not the personalities of the thinkers. And if you’re committed to progress, you can’t very well claim to have it all figured out. It takes nothing away from the Enlightenment thinkers to identify some critical ideas about the human condition and the nature of progress that we know and they didn’t. Those ideas, I suggest, are entropy, evolution, and information.

  CHAPTER 2

  ENTRO, EVO, INFO

  The first keystone in understanding the human condition is the concept of entropy or disorder, which emerged from 19th-century physics and was defined in its current form by the physicist Ludwig Boltzmann.1 The Second Law of Thermodynamics states that in an isolated system (one that is not interacting with its environment), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Isolated systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there.

  In its original formulation the Second Law referred to the process in which usable energy in the form of a difference in temperature between two bodies is inevitably dissipated as heat flows from the warmer to the cooler body. (As the musical team Flanders & Swann explained, “You can’t pass heat from the cooler to the hotter; Try it if you like but you far better notter.”) A cup of coffee, unless it is placed on a plugged-in hot plate, will cool down. When the coal feeding a steam engine is used up, the cooled-off steam on one side of the piston can no longer budge it because the warmed-up steam and air on the other side are pushing back just as hard.

  Once it was appreciated that heat is not an invisible fluid but the energy in moving molecules, and that a difference in temperature between two bodies consists of a difference in the average speeds of those molecules, a more general, statistical version of the concept of entropy and the Second Law took shape. Now order could be characterized in terms of the set of all microscopically distinct states of a system (in the original example involving heat, the possible speeds and positions of all the molecules in the two bodies). Of all these states, the ones that we find useful from a bird’s-eye view (such as one body being hotter than the other, which translates into the average speed of the molecules in one body being higher than the average speed in the other) make up a tiny fraction of the possibilities, while all the disorderly or useless states (the ones without a temperature difference, in which the average speeds in the two bodies are the same) make up the vast majority. It follows that any perturbation of the system, whether it is a random jiggling of its parts or a whack from the outside, will, by the laws of probability, nudge the system toward disorder or uselessness—not because nature strives for disorder, but because there are so many more ways of being disorderly than of being orderly. If you walk away from a sandcastle, it won’t be there tomorrow, because as the wind, waves, seagulls, and small children push the grains of sand around, they’re more likely to arrange them into one of the vast number of configurations that don’t look like a castle than into the tiny few that do. I’ll often refer to the statistical version of the Second Law, which does not apply specifically to temperature differences evening out but to order dissipating, as the Law of Entropy.
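  A worked equation can make the counting argument concrete. Boltzmann's formula, the "current form" of entropy mentioned at the start of this chapter, ties the entropy of a macrostate to the number of microstates consistent with it; the sketch below is standard physics rather than a passage from the text:

```latex
% Boltzmann's entropy formula: S grows with the number W of
% microscopically distinct states consistent with a macrostate.
S = k_B \ln W
% The "useless" macrostate (no temperature difference) is consistent
% with vastly more microstates than the orderly one, so:
W_{\text{uniform}} \gg W_{\text{ordered}}
\quad\Longrightarrow\quad
S_{\text{uniform}} \gg S_{\text{ordered}}
```

  Random jiggling moves the system among microstates indifferently; because almost all of them belong to the high-entropy macrostate, that is where the system ends up.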

  How is entropy relevant to human affairs? Life and happiness depend on an infinitesimal sliver of orderly arrangements of matter amid the astronomical number of possibilities. Our bodies are improbable assemblies of molecules, and they maintain that order with the help of other improbabilities: the few substances that can nourish us, the few materials in the few shapes that can clothe us, shelter us, and move things around to our liking. Far more of the arrangements of matter found on Earth are of no worldly use to us, so when things change without a human agent directing the change, they are likely to change for the worse. The Law of Entropy is widely acknowledged in everyday life in sayings such as “Things fall apart,” “Rust never sleeps,” “Shit happens,” “Whatever can go wrong will go wrong,” and (from the Texas lawmaker Sam Rayburn) “Any jackass can kick down a barn, but it takes a carpenter to build one.”

  Scientists appreciate that the Second Law is far more than an explanation of everyday nuisances. It is a foundation of our understanding of the universe and our place in it. In 1928 the physicist Arthur Eddington wrote:

  The law that entropy always increases . . . holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.2

  In his famous 1959 Rede Lecture, published as The Two Cultures and the Scientific Revolution, the scientist and novelist C. P. Snow commented on the disdain for science among educated Britons in his day:

  A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: Have you read a work of Shakespeare’s?3

  The chemist Peter Atkins alludes to the Second Law in the title of his book Four Laws That Drive the Universe. And closer to home, the evolutionary psychologists John Tooby, Leda Cosmides, and Clark Barrett entitled a recent paper on the foundations of the science of mind “The Second Law of Thermodynamics Is the First Law of Psychology.”4

  Why the awe for the Second Law? From an Olympian vantage point, it defines the fate of the universe and the ultimate purpose of life, mind, and human striving: to deploy energy and knowledge to fight back the tide of entropy and carve out refuges of beneficial order. From a terrestrial vantage point we can get more specific, but before we get to familiar ground I need to lay out the other two foundational ideas.

  * * *

  At first glance the Law of Entropy would seem to allow for only a discouraging history and a depressing future. The universe began in a state of low entropy, the Big Bang, with its unfathomably dense concentration of energy. From there everything went downhill, with the universe dispersing—as it will continue to do—into a thin gruel of particles evenly and sparsely distributed through space. In reality, of course, the universe as we find it is not a featureless gruel. It is enlivened with galaxies, planets, mountains, clouds, snowflakes, and an efflorescence of flora and fauna, including us.

  One reason the cosmos is filled with so much interesting stuff is a set of processes called self-organization, which allow circumscribed zones of order to emerge.5 When energy is poured into a system, and the system dissipates that energy in its slide toward entropy, it can become poised in an orderly, indeed beautiful, configuration—a sphere, spiral, starburst, whirlpool, ripple, crystal, or fractal. The fact that we find these configurations beautiful, incidentally, suggests that beauty may not just be in the eye of the beholder. The brain’s aesthetic response may be a receptiveness to the counter-entropic patterns that can spring forth from nature.

  But there is another kind of orderliness in nature that also must be explained: not the elegant symmetries and rhythms in the physical world, but the functional design in the living world. Living things are made of organs that have heterogeneous parts which are uncannily shaped and arranged to do things that keep the organism alive (that is, continuing to absorb energy to resist entropy).6

  The customary illustration of biological design is the eye, but I will make the point with my second-favorite sense organ. The human ear contains an elastic drumhead that vibrates in response to the slightest puff of air, a bony lever that multiplies the vibration’s force, a piston that impresses the vibration into the fluid in a long tunnel (conveniently coiled to fit inside the wall of the skull), a tapering membrane that runs down the length of the tunnel and physically separates the waveform into its harmonics, and an array of cells with tiny hairs that are flexed back and forth by the vibrating membrane, sending a train of electrical impulses to the brain. It is impossible to explain why these membranes and bones and fluids and hairs are arranged in that improbable way without noting that this configuration allows the brain to register patterned sound. Even the fleshy outer ear—asymmetrical top to bottom and front to back, and crinkled with ridges and valleys—is shaped in a way that sculpts the incoming sound to inform the brain whether the soundmaker is above or below, in front or behind.

  Organisms are replete with improbable configurations of flesh like eyes, ears, hearts, and stomachs which cry out for an explanation. Before Charles Darwin and Alfred Russel Wallace provided one in 1859, it was reasonable to think they were the handiwork of a divine designer—one of the reasons, I suspect, that so many Enlightenment thinkers were deists rather than outright atheists. Darwin and Wallace made the designer unnecessary. Once self-organizing processes of physics and chemistry gave rise to a configuration of matter that could replicate itself, the copies would make copies, which would make copies of the copies, and so on, in an exponential explosion. The replicating systems would compete for the material to make their copies and the energy to power the replication. Since no copying process is perfect—the Law of Entropy sees to that—errors will crop up, and though most of these mutations will degrade the replicator (entropy again), occasionally dumb luck will throw one up that’s more effective at replicating, and its descendants will swamp the competition. As copying errors that enhance stability and replication accumulate over the generations, the replicating system—we call it an organism—will appear to have been engineered for survival and reproduction in the future, though it only preserved the copying errors that led to survival and reproduction in the past.
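  The logic of that paragraph is nearly algorithmic, so a toy simulation may help. In this minimal sketch, everything is invented for illustration rather than drawn from the text (the bit-string genomes, the arbitrary target, the mutation rate, the population size); imperfect copying plus differential replication is enough to make fitness climb without any designer:

```python
import random

TARGET = [1] * 20          # arbitrary stand-in for "good at replicating"
MUTATION_RATE = 0.01       # no copying process is perfect

def fitness(genome):
    # How well the genome matches the target: a proxy for how effectively
    # this replicator captures energy, grows, and copies itself.
    return sum(g == t for g, t in zip(genome, TARGET))

def copy_with_errors(genome):
    # Each bit may flip during copying (a mutation).
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

# Start from random replicators; better replicators leave more copies,
# and their descendants swamp the competition.
population = [[random.randint(0, 1) for _ in range(20)] for _ in range(100)]
for _ in range(200):
    parents = sorted(population, key=fitness, reverse=True)[:50]
    population = [copy_with_errors(random.choice(parents)) for _ in range(100)]

print(max(fitness(g) for g in population))  # climbs toward 20 over generations
```

  Note that the simulation only ever preserves copying errors that helped in past generations, yet the surviving genomes look engineered for the future, which is the paragraph's point.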

  Creationists commonly doctor the Second Law of Thermodynamics to claim that biological evolution, an increase in order over time, is physically impossible. The part of the law they omit is “in an isolated system.” Organisms are open systems: they capture energy from the sun, food, or ocean vents to carve out temporary pockets of order in their bodies and nests while they dump heat and waste into the environment, increasing disorder in the world as a whole. Organisms’ use of energy to maintain their integrity against the press of entropy is a modern explanation of the principle of conatus (effort or striving), which Spinoza defined as “the endeavor to persist and flourish in one’s own being,” and which was a foundation of several Enlightenment-era theories of life and mind.7

  The ironclad requirement to suck energy out of the environment leads to one of the tragedies of living things. While plants bask in solar energy, and a few creatures of the briny deep soak up the chemical broth spewing from cracks in the ocean floor, animals are born exploiters: they live off the hard-won energy stored in the bodies of plants and other animals by eating them. So do the viruses, bacteria, and other pathogens and parasites that gnaw at bodies from the inside. With the exception of fruit, everything we call “food” is the body part or energy store of some other organism, which would just as soon keep that treasure for itself. Nature is a war, and much of what captures our attention in the natural world is an arms race. Prey animals protect themselves with shells, spines, claws, horns, venom, camouflage, flight, or fighting back; plants have thorns, rinds, bark, and irritants and poisons saturating their tissues. Animals evolve weapons to penetrate these defenses: carnivores have speed, talons, and eagle-eyed vision, while herbivores have grinding teeth and livers that detoxify natural poisons.

  * * *

  And now we come to the third keystone, information.8 Information may be thought of as a reduction in entropy—as the ingredient that distinguishes an orderly, structured system from the vast set of random, useless ones.9 Imagine pages of random characters tapped out by a monkey at a typewriter, or a stretch of white noise from a radio tuned between channels, or a screenful of confetti from a corrupted computer file. Each of these objects can take trillions of different forms, each as boring as the next. But now suppose that the devices are controlled by a signal that arranges the characters or sound waves or pixels into a pattern that correlates with something in the world: the Declaration of Independence, the opening bars of “Hey Jude,” a cat wearing sunglasses. We say that the signal transmits information about the Declaration or the song or the cat.10

  The information contained in a pattern depends on how coarsely or finely grained our view of the world is. If we cared about the exact sequence of characters in the monkey’s output, or the precise difference between one burst of noise and another, or the particular pattern of pixels in just one of the haphazard displays, then we would have to say that each of the items contains the same amount of information as the others. Indeed, the interesting ones would contain less information, because when you look at one part (like the letter q) you can guess others (such as the following letter, u) without needing the signal. But more commonly we lump together the immense majority of random-looking configurations as equivalently boring, and distinguish them all from the tiny few that correlate with something else. From that vantage point the cat photo contains more information than the confetti of pixels, because it takes a garrulous message to pinpoint a rare orderly configuration out of the vast number of equivalently disorderly ones. To say that the universe is orderly rather than random is to say that it contains information in this sense. Some physicists enshrine information as one of the basic constituents of the universe, together with matter and energy.11
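  A rough numerical sketch of the graining point (my example, not the author's): if we treat a text as a stream of characters and measure how unpredictable each character is, a random string comes out near the maximum, while structured text, with redundancies like q followed by u, comes out lower. That is the fine-grained sense in which the "interesting" signal carries less information.

```python
import math
import random
import string
from collections import Counter

def bits_per_char(text):
    # Shannon entropy of the single-character distribution, in bits:
    # a crude proxy for how unpredictable each character is.
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random_text = "".join(random.choice(string.ascii_lowercase) for _ in range(10_000))
english_text = ("information is a reduction in entropy and a "
                "correlation with something in the world ") * 150

print(bits_per_char(random_text))   # near log2(26), about 4.7 bits
print(bits_per_char(english_text))  # lower: structure means redundancy
```

  The single-character count is only a crude stand-in; richer statistics (pairs like qu, whole words) would show the gap even more starkly.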

  Information is what gets accumulated in a genome in the course of evolution. The sequence of bases in a DNA molecule correlates with the sequence of amino acids in the proteins that make up the organism’s body, and they got that sequence by structuring the organism’s ancestors—reducing their entropy—into the improbable configurations that allowed them to capture energy and grow and reproduce.

  Information is also collected by an animal’s nervous system as it lives its life. When the ear transduces sound into neural firings, the two physical processes—vibrating air and diffusing ions—could not be more different. But thanks to the correlation between them, the pattern of neural activity in the animal’s brain carries information about the sound in the world. From there the information can switch from electrical to chemical and back as it crosses the synapses connecting one neuron to the next; through all these physical transformations, the information is preserved.

  A momentous discovery of 20th-century theoretical neuroscience is that networks of neurons not only can preserve information but can transform it in ways that allow us to explain how brains can be intelligent. Two input neurons can be connected to an output neuron in such a way that their firing patterns correspond to logical relations such as AND, OR, and NOT, or to a statistical decision that depends on the weight of the incoming evidence. That gives neural networks the power to engage in information processing or computation. Given a large enough network built out of these logical and statistical circuits (and with billions of neurons, the brain has room for plenty), a brain can compute complex functions, the prerequisite for intelligence. It can transform the information about the world that it receives from the sense organs in a way that mirrors the laws governing that world, which in turn allows it to make useful inferences and predictions.12 Internal representations that reliably correlate with states of the world, and that participate in inferences that tend to derive true implications from true premises, may be called knowledge.13 We say that someone knows what a robin is if she thinks the thought “robin” whenever she sees one, and if she can infer that it is a kind of bird which appears in the spring and pulls worms out of the ground.
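  A minimal sketch of that idea (mine, not the author's): a McCulloch-Pitts-style threshold unit sums weighted inputs and fires when the evidence crosses a threshold, and with suitable weights and thresholds the same mechanism yields AND, OR, and NOT:

```python
# A threshold unit: fire (1) when the weighted evidence reaches the
# threshold, stay silent (0) otherwise. The weights and thresholds
# below are illustrative choices, not taken from the text.
def neuron(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

def AND(a, b): return neuron([a, b], [1, 1], threshold=2)  # both must fire
def OR(a, b):  return neuron([a, b], [1, 1], threshold=1)  # either suffices
def NOT(a):    return neuron([a], [-1], threshold=0)       # inhibition

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))
```

  Chaining such units gives the logical and statistical circuits the paragraph describes; with billions of them, a network can compute the complex functions that intelligence requires.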