  This is not something that most kids ponder, but both of Berners-Lee’s parents were computer scientists. They worked as programmers on the Ferranti Mark I, the commercial version of the Manchester University stored-program computer. One evening at home his father, who had been asked by his boss to draft a speech on how to make computers more intuitive, talked about some books on the human brain that he was reading. His son recalled, “The idea stayed with me that computers could become much more powerful if they could be programmed to link otherwise unconnected information.”1 They also talked about Alan Turing’s concept of a universal machine. “It made me realize that the limitations on what you could do with a computer were just the limitations of your imagination.”2

  Berners-Lee was born in 1955, the same year as Bill Gates and Steve Jobs, and he considered it a lucky time to be interested in electronics. Kids of that era found it easy to get hold of basic equipment and components that they could play with. “Things came along at the right time,” he explained. “Anytime we understood one technology, then industry produced something more powerful that we could afford with our pocket money.”3

  In primary school, Berners-Lee and a friend hung around hobby shops, where they used their allowance to buy electromagnets and make their own relays and switches. “You’d have an electromagnet banged into a bit of wood,” he recalled. “When you turned it on, it would attract a bit of tin and that would complete a circuit.” From that they developed a deep understanding of what a bit was, how it could be stored, and the things that could be done with a circuit. Just when they were outgrowing simple electromagnetic switches, transistors became common enough that he and his friends could buy a bag of a hundred pretty cheaply. “We learned how to test transistors and use them to replace the relays we had built.”4 In doing so, he could visualize clearly what each component was doing by comparing them to the old electromagnetic switches they superseded. He used them to make audio sounds for his train set and to create circuits that controlled when the train should slow down.

  “We began to imagine quite complicated logical circuits, but those became impractical because you would have to use too many transistors,” he said. But just as he ran into that problem, microchips became available at the local electronics store. “You buy these little bags of microchips with your pocket money and you’d realize you could make the core of a computer.”5 Not only that, but you could understand the core of the computer because you had progressed from simple switches to transistors to microchips and knew how each worked.

  One summer just before he went off to Oxford, Berners-Lee had a job in a lumber yard. When he was dumping a pile of sawdust into a Dumpster, he spied an old calculator, partly mechanical and partly electronic, with rows of buttons. He salvaged it, wired it up with some of his switches and transistors, and soon had it working as a rudimentary computer. At a repair shop he bought a broken television set and used its screen as a display, after figuring out how the circuit of vacuum tubes worked.6

  During his Oxford years, microprocessors became available. So, just as Wozniak and Jobs had done, he and his friends designed boards that they tried to sell. They were not as successful as the Steves, partly because, as Berners-Lee later said, “we didn’t have the same ripe community and cultural mix around us like there was at the Homebrew and in Silicon Valley.”7 Innovation emerges in places with the right primordial soup, which was true of the Bay Area but not of Oxfordshire in the 1970s.

  His step-by-step hands-on education, starting with electromagnetic switches and progressing to microprocessors, gave him a deep understanding of electronics. “Once you’ve made something with wire and nails, when someone says a chip or circuit has a relay you feel confident using it because you know you could make one,” he said. “Now kids get a MacBook and regard it as an appliance. They treat it like a refrigerator and expect it to be filled with good things, but they don’t know how it works. They don’t fully understand what I knew, and my parents knew, which was what you could do with a computer was limited only by your imagination.”8

  There was a second childhood memory that lingered: that of a Victorian-era almanac and advice book in his family home with the magical and musty title Enquire Within Upon Everything. The introduction proclaimed, “Whether You Wish to Model a Flower in Wax; to Study the Rules of Etiquette; to Serve a Relish for Breakfast or Supper; to Plan a Dinner for a Large Party or a Small One; to Cure a Headache; to Make a Will; to Get Married; to Bury a Relative; Whatever You May Wish to Do, Make, or to Enjoy, Provided Your Desire has Relation to the Necessities of Domestic Life, I Hope You will not Fail to ‘Enquire Within.’ ”9 It was, in some ways, the Whole Earth Catalog of the nineteenth century, and it was filled with random information and connections, all well indexed. “Enquirers are referred to the index at the end,” the title page instructed. By 1894 it had gone through eighty-nine editions and sold 1,188,000 copies. “The book served as a portal to a world of information, everything from how to remove clothing stains to tips on investing money,” Berners-Lee observed. “Not a perfect analogy for the Web, but a primitive starting point.”10

  Another concept that Berners-Lee had been chewing on since childhood was how the human brain makes random associations—the smell of coffee conjures up the dress a friend wore when you last had coffee with her—whereas a machine can make only the associations that it has been programmed to make. He was also interested in how people work together. “You got half the solution in your brain, and I got half in my brain,” he explained. “If we are sitting around a table, I’ll start a sentence and you might help finish it, and that’s the way we all brainstorm. Scribble stuff on whiteboard, and we edit each other’s stuff. How can we do that when we are separated?”11

  All of these elements, from Enquire Within to the brain’s ability to make random associations and to collaborate with others, were jangling around in Berners-Lee’s head when he graduated from Oxford. Later he would realize a truth about innovation: New ideas occur when a lot of random notions churn together until they coalesce. He described the process this way: “Half-formed ideas, they float around. They come from different places, and the mind has got this wonderful way of somehow just shoveling them around until one day they fit. They may fit not so well, and then we go for a bike ride or something, and it’s better.”12

  For Berners-Lee, his own innovative concepts began to coalesce when he took a consulting job at CERN, the mammoth supercollider and particle physics lab near Geneva. He needed a way to catalogue the connections among the ten thousand or so researchers, their projects, and their computer systems. Both the computers and the people spoke many different languages and tended to make ad hoc links to one another. Berners-Lee needed to keep track of them, so he wrote a program to help him do so. He noticed that when people explained to him the various relationships at CERN, they tended to scribble diagrams with a lot of arrows. So he devised a method to replicate these in his program. He would type in the name of a person or project and then create links that would show which were related. Thus it was that Berners-Lee created a computer program that he named, after the Victorian almanac of his childhood, Enquire.

  “I liked Enquire,” he wrote, “because it stored information without using structures like matrices or trees.”13 Such structures are hierarchical and rigid, whereas the human mind makes more random leaps. As he worked on Enquire, he developed a grander vision for what it could become. “Suppose all the information stored on computers everywhere were linked. There would be a single global information space. A web of information would form.”14 What he imagined, although he didn’t know it at the time, was Vannevar Bush’s memex machine—which could store documents, cross-reference them, retrieve them—writ global.
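  The structure described here, free-standing notes joined by two-way links rather than slotted into a tree, is easy to see in miniature. The sketch below is a hypothetical illustration in Python, not a reconstruction of Berners-Lee’s actual program; the node names and link labels are invented for the example.

```python
# A minimal sketch of an Enquire-style store (illustrative, not the original code):
# free-standing nodes joined by labeled, two-way links, with no fixed hierarchy.
from collections import defaultdict

class EnquireStore:
    def __init__(self):
        # node name -> list of (link label, other node)
        self.links = defaultdict(list)

    def add_link(self, a, label, b):
        """Record a relationship in both directions; Enquire's links were two-way."""
        self.links[a].append((label, b))
        self.links[b].append((label, a))

    def related(self, node):
        return self.links[node]

store = EnquireStore()
store.add_link("Tim Berners-Lee", "wrote", "Enquire")            # entries invented for illustration
store.add_link("Enquire", "documents", "data acquisition project")
store.add_link("Tim Berners-Lee", "works with", "controls group")

for label, other in store.related("Enquire"):
    print(f"Enquire --[{label}]--> {other}")
```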

  But before he got very far in creating Enquire, his consultancy at CERN came to an end. He left behind his computer and his eight-inch floppy disk containing all of the code, and it was promptly lost and forgotten. For a few years he worked in England for a company that made software for publishing documents. But he got bored and applied for a fellowship at CERN. In September 1984 he arrived back there to work with the group that was responsible for gathering the results of all of the experiments being done at the institute.

  CERN was a cauldron of diverse peoples and computer systems using dozens of languages, both verbal and digital. All had to share information. “In this connected diversity,” Berners-Lee recalled, “CERN was a microcosm of the rest of the world.”15 In such a setting, he found himself returning to his childhood ruminations about how people with different perspectives work together to turn each other’s half-formed notions into new ideas. “I’ve always been interested in how people work together. I was working with a lot of people at other institutes and universities, and they had to collaborate. If they had been in the same room, they would have written all over the blackboard. I was looking for a system that would allow people to brainstorm and to keep track of the institutional memory of a project.”16

  Such a system, he felt, would connect people from afar so that they could complete each other’s sentences and add useful ingredients to each other’s half-formed notions. “I wanted it to be something which would allow us to work together, design things together,” he said. “The really interesting part of the design is when we have lots of people all over the planet who have part of it in their heads. They have parts of the cure for AIDS, part of an understanding of cancer.”17 The goal was to facilitate team creativity—the brainstorming that occurs when people sit around fleshing out each other’s ideas—when the players are not in the same place.

  So Berners-Lee reconstructed his Enquire program and began thinking about ways to expand it. “I wanted to access different kinds of information, such as a researcher’s technical papers, the manual for different software modules, minutes of meetings, hastily scribbled notes, and so on.”18 Actually, he wanted to do much more than that. He had the placid exterior of a congenital coder, but lurking underneath he harbored the whimsical curiosity of a child who stayed up late reading Enquire Within Upon Everything. Rather than merely devising a data management system, he yearned to create a collaborative playground. “I wanted to build a creative space,” he later said, “something like a sandpit where everyone could play together.”19

  He hit upon a simple maneuver that would allow him to make the connections he wanted: hypertext. Now familiar to any Web surfer, hypertext is a word or phrase that is coded so that when clicked it sends the reader to another document or piece of content. Envisioned by Bush in his description of a memex machine, it was named in 1963 by the tech visionary Ted Nelson, who dreamed up a brilliantly ambitious project called Xanadu, never brought to fruition, in which all pieces of information would be published with two-way hypertext links to and from related information.

  Hypertext was a way to allow the connections that were at the core of Berners-Lee’s Enquire program to proliferate like rabbits; anyone could link to documents on other computers, even those with different operating systems, without asking permission. “An Enquire program capable of external hypertext links was the difference between imprisonment and freedom,” he exulted. “New webs could be made to bind different computers together.” There would be no central node, no command hub. If you knew the web address of a document, you could link to it. That way the system of links could spread and sprawl, “riding on top of the Internet,” as Berners-Lee put it.20 Once again, an innovation was created by weaving together two previous innovations: in this case, hypertext and the Internet.

  Using a NeXT computer, the handsome hybrid of a workstation and personal computer that Jobs created after being ousted from Apple, Berners-Lee adapted a protocol that he had been working on, called a Remote Procedure Call, that allowed a program running on one computer to call up a subroutine that was on another computer. Then he drew up a set of principles for naming each document. Initially he called these Universal Document Identifiers. The folks at the Internet Engineering Task Force in charge of approving standards balked at what they said was his “arrogance” in calling his scheme universal. So he agreed to change it to uniform. In fact he was pushed into changing all three words, turning it into Uniform Resource Locators—those URLs, such as http://www.cern.ch, that we now use every day.21 By the end of 1990 he had created a suite of tools that allowed his network to come to life: a Hypertext Transfer Protocol (HTTP) to allow hypertext to be exchanged online, a Hypertext Markup Language (HTML) for creating pages, a rudimentary browser to serve as the application software that retrieved and displayed information, and server software that could respond to requests from the network.
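  The division of labor among those pieces, a URL to name a document, HTTP to fetch it, and HTML to mark up the text and its hypertext links, can be shown in a few lines of modern Python. This is a present-day sketch using only the standard library, not the 1990 software itself; the URL is simply the example cited above.

```python
# A present-day sketch of the Web's basic round trip, using only Python's
# standard library: name a document with a URL, fetch it over HTTP, and pull
# the hypertext links out of the returned HTML. (Illustrative, not the 1990 code.)
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags, the hypertext links in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href" and value)

url = "http://www.cern.ch"              # the example URL cited in the text
with urlopen(url) as response:          # the browser's half of the HTTP exchange
    page = response.read().decode("utf-8", errors="replace")

parser = LinkExtractor()
parser.feed(page)
for link in parser.links:               # each link names another document that can be fetched in turn
    print(link)
```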

  In March 1989 Berners-Lee had his design in place and officially submitted a funding proposal to the top managers at CERN. “The hope would be to allow a pool of information to develop which could grow and evolve,” he wrote. “A ‘web’ of notes with links between them is far more useful than a fixed hierarchical system.”22 Unfortunately, his proposal elicited as much bafflement as enthusiasm. “Vague, but exciting,” his boss, Mike Sendall, wrote atop the memo. “When I read Tim’s proposal,” he later admitted, “I could not figure out what it was, but I thought it was great.”23 Once again, a brilliant inventor found himself in need of a collaborator to turn a concept into a reality.

  * * *

  More than most digital-age innovations, the conception of the Web was driven primarily by one person. But Berners-Lee did need a partner in bringing it to fruition. Fortunately, he was able to find one in Robert Cailliau, a Belgian engineer at CERN, who had been toying with similar ideas and was willing to join forces. “In the marriage of hypertext and the Internet,” said Berners-Lee, “Robert was best man.”

  With his personable demeanor and bureaucratic skills, Cailliau was the perfect person to be the evangelist for the project within CERN and the project manager who got things done. A fastidious dresser who methodically scheduled his haircuts, he was “the kind of engineer who can be driven mad by the incompatibility of power plugs in different countries,” according to Berners-Lee.24 They formed a partnership often seen in innovative teams: the visionary product designer paired with the diligent project manager. Cailliau, who loved planning and organizational work, cleared the way, he said, for Berners-Lee to “bury his head in the bits and develop his software.” One day Cailliau tried to go over a project plan with Berners-Lee and realized, “He just did not understand the concept!”25 Because of Cailliau, he didn’t have to.

  Cailliau’s first contribution was to sharpen the funding proposal that Berners-Lee had submitted to CERN administrators by making it less vague while keeping it exciting. He began with its title, “Information Management,” insisting that they figure out a catchier name for the project, which shouldn’t be too hard. Berners-Lee had a few ideas. The first was Mine of Information, but that abbreviated to MOI, French for me, which sounded a bit egocentric. The second idea was The Information Mine, but that abbreviated to TIM, which was even more so. Cailliau rejected the approach, often used at CERN, of plucking the name of some Greek god or Egyptian pharaoh. Then Berners-Lee came up with something that was direct and descriptive. “Let’s call it the World Wide Web,” he said. It was the metaphor he had used in his original proposal. Cailliau balked. “We can’t call it that, because the abbreviation WWW sounds longer than the full name!”26 The initials, when spoken, have three times as many syllables as the name itself. But Berners-Lee could be quietly stubborn. “It sounds good,” he declared. So the title of the proposal was changed to “WorldWideWeb: Proposal for a HyperText Project.” Thus the Web was named.

  Once the project was officially embraced, the CERN administrators wanted to patent it. When Cailliau raised the issue, Berners-Lee objected. He wanted the Web to spread and evolve as quickly as possible, and that meant it should be free and open. At one point he looked at Cailliau and asked accusingly, “Robert, do you want to be rich?” As Cailliau recalled, his initial reaction was “Well, it helps, no?”27 That was the incorrect response. “He apparently didn’t care about that,” Cailliau realized. “Tim’s not in it for the money. He accepts a much wider range of hotel-room facilities than a CEO would.”28

  Instead Berners-Lee insisted that the Web protocols should be made available freely, shared openly, and put forever in the public domain. After all, the whole point of the Web, and the essence of its design, was to promote sharing and collaboration. CERN issued a document declaring that it “relinquishes all intellectual property rights to this code, both source and binary form, and permission is granted for anyone to use, duplicate, modify, and redistribute it.”29 Eventually CERN joined forces with Richard Stallman and adopted his GNU General Public License. The result was one of the grandest free and open-source projects in history.

  That approach reflected Berners-Lee’s self-effacing style. He was averse to any hint of personal aggrandizement. Its wellsprings also came from someplace deeper within him: a moral outlook based on peer sharing and respect, something he found in the Unitarian Universalist Church that he adopted. As he said of his fellow Unitarians, “They meet in churches instead of wired hotels, and discuss justice, peace, conflict, and morality rather than protocols and data formats, but in other ways the peer respect is very similar to that of the Internet Engineering Task Force. . . . The design of the Internet and the Web is a search for a set of rules which will allow computers to work together in harmony, and our spiritual and social quest is for a set of rules which allow people to work together in harmony.”30

  * * *

  Despite the hoopla that accompanies many product announcements—think Bell Labs unveiling the transistor or Steve Jobs the Macintosh—some of the most momentous innovations tiptoe quietly onto history’s stage. On August 6, 1991, Berners-Lee was glancing through the Internet’s alt.hypertext newsgroup and ran across this question: “Is anyone aware of research or development efforts in . . . hypertext links enabling retrieval from multiple heterogeneous sources?” His answer, “from: [email protected] at 2:56 pm,” became the first public announcement of the Web. “The WorldWideWeb project aims to allow links to be made to any information anywhere,” he began. “If you’re interested in using the code, mail me.”31