Both the authors and the activists—“cyberpunks” and “cypherpunks”—worry that freedom may someday be lost to dark, conspiratorial forces. Yet the writers tend to veer somewhat left in their fear of oligarchs, while crypto supporters target their ire at nefarious state agencies, a distinction none of them seems to note.
Now at first this may seem an academic contradiction between writers of an obscure literary genre and their fans in a fringe techno-political movement. But that superficial reading misses several points. First, although both groups choose to call themselves “punks” in stylish rebellion against social structures they perceive as straitlaced or repressive, they are actually mainstream opinion shapers among the hundreds of thousands of technologically informed netizens who are designing and implementing the information age. Almost every high official or member of Congress has one or more vital staff members who care far more about William Gibson and Phil Zimmermann than about Milton Friedman or Paula Jones. The resident techies who maintain Web pages and crucial network access for every major politician, corporate head, academic department, and news outlet are fast becoming indispensable and influential out of all proportion to their numbers. To an ever-increasing degree, they govern what their employers read and see, helping sway the way they view the world.
Moreover, this example illustrates both yin and yang aspects of the social immune system we explored earlier. Exercised to a peak of fighting mettle, cypherpunks practice their well-trained suspicion-of-authority instincts in the most gut-satisfying way, by attacking a single, demonized concentration of power, losing sight of the need to apply accountability in all directions, especially the ignored shadows.
None of this disparages the warnings that they raise. With articulate passion, these social T-cells swarm around some of the right dangers. They diagnose perilous trends. But their chosen prescription should be examined before we swallow it whole.
One of the things that makes some people reluctant to negotiate is a combination of hard technical determinism with a belief in the efficacy of strong crypto. The coming of small, cheap, and mobile sensor/video camera technologies would seem to undermine this belief system, but so far, a large number of people still think 1s and 0s will save them from bad polities.
JEFF UBOIS
CLIPPER CHIP AND PASCAL’S CHOICE
In chapters 8 and 9 we will discuss many of the technical and social ramifications raised by innovations like the Clipper chip, harkening to the subtitle of this book: Will Technology Force Us to Choose Between Privacy and Freedom? For the remainder of chapter 7, however, we will concentrate on just one aspect—the modern obsession with secrecy that appears to have transfixed an alliance of pundits, intellectuals, and activists all across the political spectrum, leading to the widespread acceptance of a bizarre consensus: that liberty can best be protected with masks and secret codes.
First, a little background.
Encryption is the art of disguising meaning within a message, so that only the intended recipient will understand its significance. The origins of encryption go back almost to the start of writing itself. Codes, ciphers, and cryptograms have been used for diplomatic and military purposes since at least Babylonian times. Meanwhile, nations have desperately sought ways to unscramble the codes of their rivals. During World War I, a clumsy attempt by the German Foreign Ministry to stir up an anti-American, irredentist conspiracy in Mexico was uncovered with the deciphering of the infamous “Zimmermann telegram.” Exposure of this message had the unintended result of pushing U.S. public opinion closer toward siding with Britain and France. A quarter of a century later, ingenious skill at decryption proved even more decisive in human affairs when a team of eccentrics and émigrés succeeded in cracking the Nazi Enigma encryption system, giving the British a crucial advantage in the Battle of the Atlantic. Similar success by U.S. officers at breaking Japanese naval codes helped turn the tide of combat in that theater, perhaps a year earlier than might have happened otherwise.
Throughout the Cold War, an expensive, top-secret campaign of electronic espionage and surveillance was run out of the mysterious so-called Puzzle Palace of the NSA, where sophisticated spy satellites reported nearly every whisper carried over Soviet airwaves and telephone cables. America’s top cipher designers pursued mathematical games in deadly earnest. Techniques were developed to scramble data swiftly and efficiently. Meanwhile, down the hall, others worked just as hard developing methods to decipher the messages of foreign powers.
Even before the Cold War waned, there was also a small but vigorous research community studying cryptography beyond government circles. For instance, public key encryption emerged from the efforts of a few creative outsiders to come up with a system that would be simple, secure, and easy to use by a wide variety of groups or individuals. In a public key crypto system, all the receiving party requires for deciphering a message is a pair of prime numbers. The product of these two numbers is a public key that can be used for enciphering, but only knowledge of the secret prime factors allows the message to be decrypted. In order to prevent an opponent’s high-speed computers from penetrating the code by brute force, the keys have to be quite large, hundreds or even thousands of binary digits long. (We will describe some of the technical possibilities, and drawbacks, in chapter 9.)
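To make the arithmetic concrete, here is a toy sketch of RSA-style public key encryption, written in Python as an illustration of my own rather than a description of any particular product, and using absurdly small primes so the steps can be followed by hand; real systems use primes hundreds of digits long.

# A toy sketch of public key encryption in the RSA style; the primes are tiny
# for readability, so this offers no real security.
p, q = 61, 53                      # the secret prime pair held by the recipient
n = p * q                          # 3233: the public modulus, the product of the primes
e = 17                             # public exponent; (n, e) together form the public key
phi = (p - 1) * (q - 1)            # computable only by someone who knows p and q
d = pow(e, -1, phi)                # the private "deciphering" exponent (Python 3.8+)

message = 65                       # a message encoded as a number smaller than n
ciphertext = pow(message, e, n)    # anyone holding the public key can encipher
recovered = pow(ciphertext, d, n)  # only the holder of the prime factors can decipher
assert recovered == message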
Today many citizens encounter encryption when they try tuning television signals sent by satellite or cable. In most local systems, a paying subscriber receives a “key” electronically, via a cable box sitting atop the television, after which premium channels are supposed to come through with clarity. Until recently, these scrambling schemes used comparatively simple analog techniques that a gifted electronics engineer might bypass, resulting in a thriving “cable piracy” black market. But digital technologies may give the advantage back to mainstream cable and satellite companies—for a while.
Cable television is just the start of encryption entering our lives. Many telephone sets are now sold with simple, built-in scrambling systems. Impressive hardware/software packages promise both corporations and individuals ways to conceal their private data from competitors or other prying eyes.
This trend did not escape notice by the law enforcement community. As early as 1993, speaking to the Executives Club of Chicago, FBI director-designate Louis Freeh described how his agency perceived a looming danger and predicted, “The country will be unable to protect itself against terrorism, violent crime, foreign threats, drug trafficking, espionage, kidnapping, and other grave crimes.” Freeh conceded that the age of old-fashioned analog telephony, carried on easily tapped copper wires, is rapidly coming to an end. As ever more telephone and data traffic goes digital, especially via fiber optics and multi-branched switching systems, the technology of disguising communications under a static haze of encryption has Freeh and his colleagues worried. In the government’s view, technology promises to change the rules of daily life, making electronic privacy so secure that even the FBI won’t be able to listen in.
Still, the official response of the Clinton administration was not to try to prevent private or commercial encryption, but instead to propose a standardization of coding technology that would still (the FBI hoped) meet the needs of justice professionals. Like other methods for scrambling private messages, the Clipper chip would let users encrypt their voice or digital communications so that almost any outsider would have a hard time listening in. Unlike competing methods, however, there would be an exception built in, since the keys, or mathematical factors for decoding all “clipped” messages, would be held in escrow, deposited in a pair of separate, secure databases. Armed with a court order, the FBI might then retrieve both keys to a given scrambler and listen in, just as they do today on about one thousand unscrambled telephone lines per year. In Freeh’s words, “Advanced technology will make it possible for the FBI to carry out court-approved surveillance in the life-and-death cases.”
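To see the escrow idea in miniature, consider the following sketch, an illustration only and not a description of the actual Skipjack/LEAF protocol inside Clipper, though it captures the basic escrow arithmetic: a device key is split into two components, each worthless by itself, so that both escrow agencies must cooperate before anyone can listen in.

# A minimal sketch of two-part key escrow: split a device key so that neither
# escrow agent alone learns anything, and both parts are needed to rebuild it.
import secrets

def split_key(device_key: bytes):
    """Split a key into two escrow components; each half alone is random noise."""
    share1 = secrets.token_bytes(len(device_key))
    share2 = bytes(a ^ b for a, b in zip(device_key, share1))
    return share1, share2          # deposited with two separate escrow agencies

def recombine(share1: bytes, share2: bytes) -> bytes:
    """Only by obtaining both deposited components can the key be reassembled."""
    return bytes(a ^ b for a, b in zip(share1, share2))

device_key = secrets.token_bytes(10)   # an 80-bit unit key, the size Clipper used
s1, s2 = split_key(device_key)
assert recombine(s1, s2) == device_key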
In addition to the inevitable firestorm of political and ideological objections that followed Freeh’s initiative, the Clipper faced many kinks and problems of a technological nature. For instance, in June 1994 a computer scientist at AT&T Bell Laboratories, Matthew Blaze, announced a basic flaw in the proposed technology that would let clever users convert even the supposedly tame Clipper so that it provided relatively “unbreakable” encryption. No doubt later redesigns would have closed Blaze’s loophole, even as other ingenious minds applied themselves to finding new ones. (As Marc Rotenberg of the Electronic Privacy Information Center [EPIC] pointed out, any central bank of key codes would become the ultimate target of computer hackers.)
There were other drawbacks, such as standardization on a specific hardware format, nearly always a bad idea during times of rapidly changing technology. If the system were ever compromised at a later date, upgrading would be an expensive, nationwide headache.
No law would have actually banned methods of encoding and decoding data different from Clipper (though such legislation was being discussed as this book went to press). Such alternatives are widely available within the United States and abroad. So far, the federal government has promoted its favored technology with the power of the purse, by requiring its own suppliers to include the Clipper in new designs, and by using Cold War technology transfer laws to restrict the export of competing technologies. The government hoped thereby to create momentum for the Clipper’s adoption as an industrywide encryption standard. But anyone serious about evading a wiretap would still be able to use something else, such as the widely distributed public key coding protocol called PGP, or Pretty Good Privacy, written by Philip Zimmermann.
None of this softened the storm of outraged protests by those who perceived this as despotism on the march. Tom Maddox, a columnist for Locus magazine, put it this way: “The response from organizations such as EFF and the Computer Professionals for Social Responsibility and a number of corporations was immediate and almost uniformly negative. Seeing Big Brother embodied in the hardware, they balked.”
The objections presented by these groups had much to do with their members’ collective sense of threatened rights. As PGP author Philip Zimmermann observes, “I think I ought to be able to go up and whisper in your ear, even if your ear is 1,000 miles away. If we install Clipper, then we can’t do that, because the government will have a back door into our encrypted communications.”
In a world of black and white ideologies, Zimmermann is actually more pragmatic and open-minded than many of his staunch defenders. “I think that the government does have some reasonable points to make. Criminals can use this technology to hide their activities,” he says. “I think the debate on cryptography is not an open-and-shut case.” Nevertheless, Zimmermann sees his role as promoting the crypto side of the argument, and leaving the opposition to others. “I am a cryptographer,” he explains. “It’s what I do.”
Ironically, although the Clipper chip was often portrayed as an effort to increase the government’s power of surveillance over individuals, that is an overstatement. Government wiretapping is going to be hampered by encryption, whether or not something like Clipper finally becomes standard. Instead of the current situation, in which most wiretaps are placed by local cops, needing only a local judge’s hand-scrawled (and possibly postdated) “court order” plus a pair of rusty alligator clips, there would be straitlaced procedures required to access the keys for any single Clipper chip, with officials having to present formal documents at two separate escrow agencies. True, one can imagine scenarios under which a corrupt administrator, or invading hacker, or a worker at the Clipper factory might access and sell some keys, but that is possible under any encryption scheme, not just Clipper.
Might opponents in principle have negotiated concessions from the government in the form of rigorously worked out verification procedures? Certain measures could have made Clipper’s key escrow system arguably more resistant to tampering or abuse. For instance, key escrow need not stop at two keys. Any arbitrary number may be established, so that a particular message can be decrypted only if all five, ten, or perhaps even a hundred keys are used at the same time. A mathematical technique called “secret sharing” extends this principle to a degree that should satisfy all but the most paranoid. In a five-key system, the “back door” would be accessible to government agents only if they presented highly credible probable-cause evidence of criminality to separate oversight committees in five separate cities—committees that might be set up from the start to include substantial citizen membership, perhaps with a chair reserved for the Electronic Frontier Foundation. (Anything can be negotiated.) Refusal of permission by just one cache authority would thwart the proposed eavesdropping. This measure would also help defend against illicit key collection by invading hackers.
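As a rough illustration of how such “secret sharing” might work, the following Python sketch (a toy of my own devising, not a blueprint for any real escrow system) hides a key as the constant term of a random polynomial and gives each oversight committee one point on the curve; any agreed quorum of points reconstructs the key, while fewer reveal nothing.

# A sketch of threshold secret sharing: any k of n shares recover the key.
import secrets

PRIME = 2**127 - 1                 # a prime field large enough for a 16-byte key

def make_shares(secret, k, n):
    """Hide the secret as the constant term of a random degree-(k-1) polynomial;
    each share is one point on that polynomial."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 reconstructs the secret from k shares."""
    total = 0
    for xj, yj in shares:
        num, den = 1, 1
        for xm, _ in shares:
            if xm != xj:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        total = (total + yj * num * pow(den, -1, PRIME)) % PRIME
    return total

key = secrets.randbelow(PRIME)
shares = make_shares(key, k=3, n=5)    # any three of five committees suffice
assert recover(shares[:3]) == key
assert recover(shares[2:]) == key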
Another possible alternative to the FBI’s concept of twin government escrow sites would be a free enterprise solution. Under this plan, which was vaguely and tentatively floated under the unpromising name Clipper II, individuals or corporations would purchase (or deposit) their encryption keys from a trusted commercial agent of their own choice. These repository institutions would presumably have strong economic incentives to protect their customers’ keys, demanding triple verification of court orders and scrutinizing every step of the transaction, because a slipup, letting the government gain access too easily, might ruin the company’s reputation and lose it customers. (These repositories would be far easier to hold accountable than the faceless operators of “anonymous remailers,” which we will discuss a little later.)
Alas, consensus-oriented ideas such as these were hardly discussed in most public statements about the Clipper initiative, especially by advocates of strong privacy, who almost universally dismissed the possibility of compromise. Nor was this entirely because of their ideological myopia or fixation on a single threat. Indeed, federal officials did little to engender an atmosphere that could lead to negotiation and trust.
The Government’s Fault
On a technical level, bureaucrats contributed to the general wariness by refusing to publish openly the mathematical algorithm on which Clipper was based. Now at first sight this might seem a reasonable security precaution, to protect against future code breakers, but in fact there is a world of difference between the method used to generate and apply encryption keys and the keys themselves. A computational technique, like any other plan, can look excellent on paper yet in fact be riddled with hidden flaws that are exposed only after assault by relentless and varied criticism. Corporations and private individuals had a perfect right to examine (and hammer away at) the software underlying Clipper, in order to be sure in advance that only the cached keys would decipher any encrypted messages. Refusal to expose the algorithm to scrutiny struck many crypto advocates as suspicious. So did the fact that officials were so slow to suggest more than two cache sites, or the commercial repository alternative.
Even had these problems been overcome, a serious cultural gap yawned between the U.S. government and its critics. Hasty or witless abuses, such as the Steve Jackson Games episode, provided grist for a well-tuned mill of distrust toward bullying authority figures. Many cypherpunks and others were already primed to believe the worst about public scandals, like the calamity that occurred at the Branch Davidian compound in 1993, attributing such fatal episodes to deliberate malevolence rather than bad luck or official incompetence. Bureaucrats exacerbate this reflex by perpetuating high levels of Cold War secrecy and habitually thwarting document requests under the Freedom of Information Act. In this atmosphere, many activists give credence to the worst rumors, for example, that the FBI plans to tap one in every hundred phones, or that the NSA already routinely spies on U.S. domestic telephone traffic (using sophisticated word-sampling techniques to screen myriad conversations, seeking those worth further investigation).
Although I tend to think that the stupidity of such outrages would far exceed any conceivable benefit (the risk of exposure by whistleblowers could lead to towering scandals), I will not dismiss such concerns out of hand. The fact that a scheme is doomed from the start to become a disastrous embarrassment and put its instigators in prison does not mean that some isolated clique of egotists in power won’t convince themselves that it is, in the immortal words of Oliver North, “a neat idea.”
Dartmouth Professor Arthur Kantrowitz explains this self-destructive pattern, one that is followed by too many aloof officials. “In a classified project, the vested interests which grow around a decision can frequently prevent the questioning of authority necessary for the elimination of error. Peacetime classified projects have a very bad record of rejecting imaginative suggestions which frequently are very threatening to the existing political power structure.”
Whether conspiracy fans are right in their direst suspicions, or paranoid, or somewhere in between, the important point is that in the long run, transparency offers the best hope of preventing such behavior by government agencies. It will do this in three ways:
1. Creating an atmosphere in which whistleblowers are feted and protected.
2. Eliminating the bureaucrats’ rationalizations for such activities, by exposing terrorist and other threats in the normal course of events.
3. Distributing the expertise that will enable citizens and amateur or media sleuths to catch official power abusers “in the act.”
Although this scenario is hardly perfect, the transparency solution is assuredly less far-fetched than believing that a Web-advertised “underground” encryption package, bought by mail order from some unknown bunch of programmers in Delhi or St. Petersburg, will guarantee protection against both hackers and the finest tools the NSA can bring to bear.