  Sherry Judd, the founder of the museum, a smiley woman with short curly hair who wore a western-style blue shirt, was serving spaghetti. Sherry worked at both Otis and Andro, and also at a paper mill in California, and her father was a mason at Otis Mill. She started raising money for the museum several years ago, she said. “I had a vision that someday there was not going to be papermaking in these towns,” she told me. “Somebody needs to tell the children about what their ancestors did, how hard they worked to develop this community and the communities around it.” For two years she raised money for the museum by towing around a caboose replica filled with papermaking artifacts and giving talks on the need to preserve the past. She had a video made, “Along the Androscoggin,” about the history of papermaking in the area, with good clips from mill workers, including Norm Paradis and his son. She wants people to walk into the museum and hear the sound of the papermaking machinery, and see how it worked. “I have a lot of ideas up here,” she said, tapping her head, “but we need a curator. And a grant writer.”

  We bought some tickets for the quilt raffle and a brick to go into the museum’s new front walkway, and then we drove home talking about Sherry, Norm, Walter, and the skiing hill next to the river.

  Don Carli, of the Institute for Sustainable Communication, told me that this year eighteen paper mills have closed in the United States, and more than thirty-four papermaking machines have been permanently put out of commission. Meanwhile, the power demand from the Internet is growing hugely. “If you do a simple extrapolation of the consumption of energy by data centers, we have a crisis,” Carli said. In 2006, the Energy Information Administration estimated that data centers consumed about 60 billion kilowatt hours of electricity—just the centers themselves, not the wireless or fiber-optic networks that connect them or the end-user computers that they serve—while paper mills consumed 75 billion kilowatt hours of electricity, of which more than half was green power from renewable sources. “And that was in 2006,” Carli said, “when print wasn’t kicked to the curb and declared all but dead and buried. It was still fighting the good fight.” Not only is there now a roughly comparable carbon footprint between server farms and paper mills, but the rate of growth in server and data center energy consumption is “metastasizing,” he said. “It doubled between 2000 and 2005, and it’s due to double again at current rates by 2010.” That’s one reason why gigantic data centers are now going up far away from cities, Carli added. “You can’t go to ConEd and get another ten megawatts of power. You can buy the computers, you can buy the servers. You just can’t get juice for them, because the grid is tapped out.”

  “That’s kind of amazing,” I said.

  “So when we start thinking about transforming more and more of our communication to digital media,” Carli said, “we really do have to be asking, Where will the electrons come from?”

  I nodded and looked out at the trees.

  (2009)

  Google’s Earth

  I’m fond of Google, I have to say. I like Larry Page, who seems, at least in the YouTube videos I’ve watched, shy and smart, with salt-and-pepper bangs; and Sergey Brin, who seems less shy and jokier and also smart. Ken Auletta, the author of an absorbing, shaggy, name-droppy book called Googled: The End of the World As We Know It, doesn’t seem to like either of them much—he says that Page has a “Kermit the Frog” voice, which isn’t nice, while Brin comes off as a swaggering, efficiency-obsessed overachiever who, at Stanford, aced tests, picked locks, “borrowed” computer equipment from the loading dock, and once renumbered all the rooms in the computer science building. “Google’s leaders are not cold businessmen; they are cold engineers,” Auletta writes—but “cold” seems oddly wrong. Auletta’s own chilliness may be traceable in part to Brin’s and Page’s reluctance to be interviewed. “After months of my kicking at the door, they opened it,” he writes in the acknowledgments. “Google’s founders and many of its executives share a zeal to digitize books,” he observes, “but don’t have much interest in reading them.”

  They’ll probably give more than a glance at Googled. I read the book in three huge gulps and learned a lot—about Google’s “cold war” with Facebook, about Google’s tussles with Viacom, about Google’s role in the “Yahoo-Microsoft melee,” and about Google’s gradual estrangement from its former ally Apple. Auletta is given to martial similes and parallels, from Prince Metternich in nineteenth-century Europe to Afghanistan now: “Privacy questions will continue to hover like a Predator drone,” he writes, “capable of firing a missile that can destroy the trust companies require to serve as trustees for personal data.” And he includes some revealing human moments: Larry Page, on the day of Google’s hugely successful stock offering, pulls out his cell phone and says, “I’m going to call my mom!”

  But what Auletta mainly does is talk shop with CEOs, and that is the great strength of the book. Auletta seems to have interviewed every media chief in North America, and most of them are unhappy, one way or another, with what Google has become. Google is voracious, they say, it has gargantuan ambitions, it’s too rich, it’s too smug, it makes big money off of O.P.C.—other people’s content. One unnamed “prominent media executive” leaned toward Auletta at the 2007 Google Zeitgeist Conference and whispered a rhetorical question in his ear: What real value, he wanted to know, was Google producing for society?

  Wait. What real value? Come now, my prominent executive friend. Have you not glanced at Street View in Google Maps? Have you not relied on the humble aid of the search-box calculator, or checked out Google’s movie showtimes, or marveled at the quick-and-dirtiness of Google Translate? Have you not made interesting recherché nineteenth-century discoveries in Google Books? Or played with the amazing expando-charts in Google Finance? Have you not designed a strange tall house in Google SketchUp, and did you not make a sudden cry of awed delight the first time you saw the planet begin to turn and loom closer in Google Earth? Are you not signed up for automatic Google News alerts on several topics? I would be very surprised if you are not signed up for a Google alert or two. Surely no other software company has built a cluster of products that are anywhere near as cleverly engineered, as quick-loading, and as fun to fiddle with as Google has, all for free. Have you not searched?

  Because, let me tell you, I remember the old days, the antegoogluvian era. It was okay—it wasn’t horrible by any means. There were cordless telephones, and people wore comfortable sweaters. There was AltaVista, and Ask Jeeves, and HotBot, and Excite, and Infoseek, and Northern Light—with its deep results and its elegant floating schooner logo—and if you wanted to drag through several oceans at once, there was MetaCrawler. But the haul was haphazard, and it came in slow. You chewed your peanut-butter cracker, waiting for the screen to fill.

  Then Google arrived in 1998, sponged clean, impossibly fast. Google was like a sunlit white Formica countertop with a single vine-ripened tomato on it. No ads in sight—Google was anti-ad back then. It was weirdly smart, too; you almost never had a false hit. You didn’t have to know anything about the two graduate students who had aligned and tuned their secret algorithms—the inseparable Page and Brin—to sense that they were brilliant young software dudes, with all the sneakered sure-footedness of innocence: the “I’m Feeling Lucky” button in that broad blank expanse of screen space made that clear. Google would make us all lucky; that was the promise. And, in fact, it did.

  So why are the prominent media executives unhappy? Because Google is making lots of ad money, and there’s only so much ad money to go around. Last year almost all of Google’s revenue came from the one truly annoying thing that the company is responsible for: tiny, cheesy, three-line text advertisements. These AdWords or AdSense ads load fast, and they’re supposedly “polite,” in that they don’t flicker or have pop-ups, and they’re almost everywhere now—on high-traffic destinations like the Washington Post or MySpace or Discovery.com, and on hundreds of thousands of little websites and blogs as well. “It’s all of our revenue,” Larry Page said in a meeting that Auletta attended in 2007.

  The headlines say things like “Laser Hair Removal,” “Christian Singles,” “Turn Traffic Into Money,” “Have You Been Injured?” “Belly Fat Diet Recipe,” “If U Can Blog U Can Earn,” “Are You Writing a Book?” and so on. Countless M.F.A., or Made for AdSense, websites have appeared; they use articles stolen or “scraped” or mashed together from sites like Wikipedia, and their edges are framed with Google’s text ads. The ads work on a cost-per-click scheme: the advertiser pays Google only if you actually click on the ad. If you do, he’s billed a quarter, or a dollar, or (for some sought-after keywords like “personal injury” or “mesothelioma lawyers”) ten dollars or more.

  But think—when was the last time you clicked on a three-line text ad? Almost never? Me neither. And yet, in 2008, Google had $21.8 billion in revenue, about 95 percent of which flowed from AdWords/AdSense. (A trickle came from banner and video ads sold by Google’s new subsidiary DoubleClick, and from other products and services.) These unartful, hard-sell irritants—which have none of the beauty or the humor of TV, magazine, radio, or newspaper advertising—are the foundation of Google’s financial empire, if you can believe it. It’s an empire built on tiny grains of keyword-searchable sand.

  The advertising revenue keeps Google’s stock high, and that allows the company to do whatever it feels like doing. In 2006, when Google’s stock was worth $132 billion, the company absorbed YouTube for $1.65 billion, almost with a shrug. “They can buy anything they want or lose money on anything they choose to,” Irwin Gotlieb, the chief of GroupM, one of Google’s biggest competitors in the media market, told Auletta. If Microsoft is courting DoubleClick, Google can swoop in and buy DoubleClick for $3.1 billion. If the business of cloud computing seems to hold great promise, Google can build twenty or fifty or seventy massive data centers in undisclosed locations around the world, each drawing enough power to light a small city. Earlier this month, Google announced it would pay $750 million in stock for a company called AdMob, to sell banner ads on cell phones. “Once you get to a certain size, you have to figure out new ways of growing,” Ivan Seidenberg, the chief executive of Verizon, said to Auletta. “And then you start leaking on everyone else’s industry.” That’s why Auletta’s CEOs are resentful.

  True, the miracles keep coming: Google Voice, which can e-mail you a transcript of your voice mail messages; and Chrome, a quick, clever Web browser; and Android, the new operating system for mobile devices. One of the latest is an agreement to print books on an ATM-style on-demand printer, the Espresso Book Machine. But perhaps there are too many miracles emanating from one campus now; perhaps brand fatigue is setting in. Google’s famous slogan, “Don’t be evil,” now sounds a little bell-tollingly dystopian. When they were at Stanford, Page and Brin criticized search engines that had become too “advertising oriented.” “These guys were opposed to advertising,” Auletta quotes Ram Shriram, one of Google’s first investors, as saying. “They had a purist view of the world.” They aren’t opposed now. Now they must be forever finding forage for a hungry, $180 billion ad-maddened beast. Auletta describes an unusual job-interview test that Sergey Brin once gave to a prospective in-house lawyer: “I need you to draw me a contract,” Brin said to her. “I need the contract to be for me to sell my soul to the Devil.” That was in 2002, the year Google began work internally on what would become AdSense.

  Now Page and Brin fly around in a customized Boeing 767 and talk sincerely about green computing, even as the free streamings of everyone’s home video clips on YouTube burn through mountaintops of coal. They haven’t figured out a way to “monetize”—that is, make a profit from—their money maelstrom YouTube, although I notice that Coffee-mate and Samsung banners appear nowadays in Philip DeFranco’s popular video monologues. “The benefit of free is that you get 100 percent of the market,” Eric Schmidt, Google’s chief executive, explained to Auletta. “Free is the right answer.” For a while, perhaps—but maybe free is unsustainable. For newspapers, Auletta writes, “free may be a death certificate.” Maybe in the end, even on the Internet, you get what you pay for.

  (2009)

  Steve Jobs

  The other day, I ordered a new machine from Apple, and just before bed I went to the Apple website to check when it was going to ship. There, looking at me, instead of the normal welcome page announcing the latest mojo miracle of euphoric minimalism, was a man with round John Lennon glasses and an intense gaze and a close-cropped beard, photographed in black-and-white. It was Steve Jobs from some years ago, before he got sick. He looked like he wanted to tell me something, but I didn’t know what it was. To the left of the photograph, on this simple white screen—not an ounce of color on it anywhere—I saw his birth date: 1955. Then there was a hyphen, and then: 2011.

  I was stricken. Everyone who cares about music and art and movies and heroic comebacks and rich rewards and being able to carry several kinds of infinity around in your shirt pocket is taken aback by this sudden huge vacuuming-out of a titanic presence from our lives. We’ve lost our techno-impresario and digital dream granter. Vladimir Nabokov once wrote, in a letter, that when he’d finished a novel he felt like a house after the movers had carried out the grand piano. That’s what it feels like to lose this world-historical personage. The grand piano is gone.

  The next morning, I picked up my latecomer’s MacBook Pro—I’d bought it only this year, after more than two decades of struggling with and cursing at software from outside Apple’s fruitful orchard—and opened the aluminum top. I went to the website again, and there he was, still Steve, still looking at us. His fingers were in a sort of delicate pinch at his chin, in a pose that photographers like, because they want to see your hands. And the pose made sense, since one of the really noble things that Apple has done is to apply the ancient prehensile precision of pinching, sliding, or tapping fingers to screens and touch pads. Other companies had touch screens. Only Apple made them not seem ridiculous.

  I saw Jobs just once, last year, at the first iPad unveiling, in San Francisco. A mass of tech journalists surged into the auditorium while, over the P.A. system, Bob Dylan sang “How does it feel?” The live-bloggers flipped open their laptops. Joshua Topolsky, who was then the head of Engadget, told me that this was bigger than the iPhone. “In a way, I would almost hate to be Apple right now,” he said.

  Jobs was talking to Al Gore in the front row—Gore appeared to be, amazingly, chewing gum. Then the show began, and Steve went onstage, looking thin but fit, like some kind of aging vegan long-distance runner. He told us that so many millions of iPods had been sold and so many millions of people had visited the retail stores, with their blue-shirted Geniuses waiting to help you. He said it was kind of incredible, and it was—I found myself applauding joyfully and unjournalistically. And then came the announcement: “And we call it—the iPad.”

  Immediately afterward, the carping began. Meh, the iPad wasn’t magical at all, it was just a big iPhone, the journalists said. One expert called it “D.O.A.”—disappointing on arrival. But it was a smash; people immediately began figuring out new ways to use this brilliant, slip-sliding rectangle of private joy.

  When he was young, Jobs looked remarkably like James Taylor. When he was older and sick, his blue jeans hung off his body. Even so, I thought that he, like a true marathoner, was going to make it—make it to the iPhone 5, to the iPad 3. Instead, he died, too weak at the end, according to the Times, to walk up the stairs of his house.

  But Jobs lived to see the Beatles on iTunes, to see Tim Cook, Apple’s new CEO, not muff the latest iPhone announcement, and then he left us on our own. He died absolutely the king of the world of talking to people who aren’t in the same room with you and of book reading when you don’t have a real book and of movie editing and of e-mail and of music distribution—the king of the world of making good things flow better. You have to love him.

  (2011)

  War

  Why I’m a Pacifist

  Six months after the Japanese attack on Pearl Harbor, Abraham Kaufman, the executive secretary of the War Resisters League, stood up in the auditorium of the Union Methodist Church in Manhattan and said something that was difficult to say. Kaufman, a man of thirty-three, who had put himself through City College at night and had worked Sundays selling magazines and candy in a subway station, insisted that we needed peace now—and that to get peace now, we needed to negotiate with Hitler. “This tremendous war can be ended by just one small spark of truth and sanity,” he said.

  To those who argued that you couldn’t negotiate with Hitler, Kaufman replied that the Allies were already negotiating with Hitler, and with Japan, too—over prisoners of war, for example, and the sending of food to Greece. It was important to confer right away, Kaufman believed, before either side had lost. Our aim should be what Woodrow Wilson had hoped for at the end of the First World War: a peace without victory. “We ask for peace now,” Kaufman said, “while there is still a world to discuss aims, not when it is too late.”

  What explained Kaufman’s urgency? It was simple: he didn’t want any more people to suffer and die. Civilian massacres and military horrors were reported daily, and Kaufman feared that the war would prove to be, as he’d written to the New York Times two years earlier, “so disastrous as to make the 1917 adventure seem quite mild.” He understood exactly what was at stake. In his view, a negotiated peace with Hitler was, paradoxically, the best chance the Allies had of protecting the world from Hitler’s last-ditch, exterminative frenzy.

  Kaufman was one of a surprisingly vocal group of World War II pacifists—absolute pacifists, who were opposed to any war service. They weren’t, all of them, against personal or familial self-defense, or against law enforcement. But they did hold that war was, in the words of the British pacifist and parliamentarian Arthur Ponsonby, “a monster born of hypocrisy, fed on falsehood, fattened on humbug, kept alive by superstition, directed to the death and torture of millions, succeeding in no high purpose, degrading to humanity, endangering civilization and bringing forth in its travail a hideous brood of strife, conflict and war, more war.” Along with Kaufman and Ponsonby—and thousands of conscientious objectors who spent time in jail, in rural work camps, in hospitals, or in controlled starvation studies—the ranks of wartime pacifists included Vera Brittain, Rabbi Abraham Cronbach, Dorothy Day, and Jessie Wallace Hughan.