from reviews of the first edition of

APPLIED CRYPTOGRAPHY

Protocols, Algorithms, and Source Code in C

“… the definitive text on the subject….”

—Software Development Magazine

“… good reading for anyone interested in cryptography.”

—BYTE

“This book should be on the shelf of any computer professional involved in the use or implementation of cryptography.”

—IEEE Software

“… dazzling … fascinating…. This book absolutely must be on your bookshelf …”

—PC Techniques

“… comprehensive … an encyclopedic work …”

—The Cryptogram

“… a fantastic book on cryptography today. It belongs in the library of anyone interested in cryptography or anyone who deals with information security and cryptographic systems.”

—Computers & Security

“An encyclopedic survey … could well have been subtitled ‘The Joy of Encrypting’ … a useful addition to the library of any active or would-be security practitioner.”

—Cryptologia

“… encyclopedic … readable … well-informed … picks up where Dorothy Denning’s classic Cryptography and Data Security left off a dozen years ago…. This book would be a bargain at twice the price.”

—;login:

“This is a marvelous resource—the best book on cryptography and its application available today.”

—Dorothy Denning
Georgetown University

“… Schneier’s book is an indispensable reference and resource…. I recommend it highly.”

—Martin Hellman
Stanford University

APPLIED CRYPTOGRAPHY, SECOND EDITION

PROTOCOLS, ALGORITHMS, AND SOURCE CODE IN C

BRUCE SCHNEIER

20th Anniversary Edition

Wiley Logo

Introduction

I first wrote Applied Cryptography in 1993. Two years later, I wrote the greatly expanded second edition. At this vantage point of two decades later, it can be hard to remember how heady cryptography’s promise was back then. These were the early days of the Internet. Most of my friends had e-mail, but that was because most of my friends were techies. Few of us used the World Wide Web. There was nothing yet called electronic commerce.

Cryptography was being used by the few who cared. We could encrypt our e-mail with PGP, but mostly we didn’t. We could encrypt sensitive files, but mostly we didn’t. I don’t remember having the option of a usable full-disk encryption product, at least one that I would trust to be reliable.

What we did have were ideas—research and engineering ideas—and that was the point of Applied Cryptography. My goal in writing the book was to collect all the good ideas of academic cryptography under one cover and in a form that non-mathematicians could read and use.

What we also had, more important than ideas, was the unshakable belief that technology trumped politics. You can see it in John Perry Barlow’s 1996 “Declaration of the Independence of Cyberspace,” where he told governments, “You have no moral right to rule us, nor do you possess any methods of enforcement that we have reason to fear.” You can see it three years earlier in cypherpunk John Gilmore’s famous quote: “The Net interprets censorship as damage and routes around it.” You can see it in the pages of Applied Cryptography. The first paragraph of the Preface, which I wrote in 1993, says, “There are two kinds of cryptography in this world: cryptography that will stop your kid sister from reading your files, and cryptography that will stop major governments from reading your files. This book is about the latter.”

This was the promise of cryptography. It was the promise behind everything—from file and e-mail encryption to digital signatures, digital certified mail, secure election protocols, and digital cash. The math would give us all power and security, because math trumps everything else. It would topple everything from government sovereignty to the music industry’s attempts at stopping file sharing.

The “natural law” of cryptography is that it’s much easier to use than it is to break. To take a hand-waving example, think about basic encryption. Adding a single bit to a key, say from a 64-bit key to a 65-bit key, adds at most a small amount of work to encrypt and decrypt. But it doubles the amount of work to break. Or, more mathematically, encryption and decryption work grows linearly with key length, but cryptanalysis work grows exponentially. It’s always easier for the communicators than the eavesdropper.

It turned out that this was all true, but less important than we had believed. A few years later, we realized that cryptography was just math, and that math has no agency. In order for cryptography to actually do anything, it has to be embedded in a protocol, written in a programming language, embedded in software, run on an operating system and computer attached to a network, and used by living people. All of those things add vulnerabilities and—more importantly—they’re more conventionally balanced. That is, there’s no inherent advantage for the defender over the attacker. Spending more effort on either results in linear improvements. Even worse, the attacker generally has an inherent advantage over the defender, at least today.

So when we learn about the NSA through the documents provided by Edward Snowden, we find that most of the time the NSA breaks cryptography by circumventing it. The NSA hacks the computers doing the encryption and decryption. It exploits bad implementations. It exploits weak or default keys. Or it “exfiltrates”—NSA-speak for steals—keys. Yes, it has some mathematics that we don’t know about, but that’s the exception. The most amazing thing about the NSA as revealed by Snowden is that it isn’t made of magic.

This doesn’t mean that cryptography is useless: far from it. What cryptography does is raise both the cost and risk of attack. Data zipping around the Internet unencrypted can be collected wholesale with minimal effort. Encrypted data has to be targeted individually. The NSA—or whoever is after your data—needs to target you individually and attack your computer and network specifically. That takes time and manpower, and is inherently risky. No organization has enough budget to do that to everyone; they have to pick and choose. While ubiquitous encryption won’t eliminate targeted collection, it does have the potential to make bulk collection infeasible. The goal is to leverage the economics, the physics, and the math.

There’s one more problem, though—one that the Snowden documents have illustrated well. Yes, technology can trump politics, but politics can also trump technology. Governments can use laws to subvert cryptography. They can sabotage the cryptographic standards in the communications and computer systems you use. They can deliberately insert backdoors into those same systems. They can do all of those, and then forbid the corporations implementing those systems to tell you about it. We know the NSA does this; we have to assume that other governments do the same thing.

Never forget, though, that while cryptography is still an essential tool for security, cryptography does not automatically mean security. The technical challenges of implementing cryptography are far more difficult than the mathematical challenges of making the cryptography secure. And remember that the political challenges of being able to implement strong cryptography are just as important as the technical challenges. Security is only as strong as the weakest link, and the further away you get from the mathematics, the weaker the links become.

The 1995 world of Applied Cryptography, Second Edition, was very different from today’s world. That was a singular time in academic cryptography, when I was able to survey the entire field of research and put everything under one cover. Today, there’s too much, and the task of compiling it all is just too great. For those who want a more current book, I recommend Cryptography Engineering, which I wrote in 2010 with Niels Ferguson and Tadayoshi Kohno. But for a review of those heady times of the mid-1990s, and an introduction to what has become an essential technology of the Internet, Applied Cryptography still holds up surprisingly well.

—Minneapolis, Minnesota, and Cambridge, Massachusetts, January 2015

Foreword By Whitfield Diffie

The literature of cryptography has a curious history. Secrecy, of course, has always played a central role, but until the First World War, important developments appeared in print in a more or less timely fashion and the field moved forward in much the same way as other specialized disciplines. As late as 1918, one of the most influential cryptanalytic papers of the twentieth century, William F. Friedman’s monograph The Index of Coincidence and Its Applications in Cryptography, appeared as a research report of the private Riverbank Laboratories [577]. And this, despite the fact that the work had been done as part of the war effort. In the same year Edward H. Hebern of Oakland, California, filed the first patent for a rotor machine [710], the device destined to be a mainstay of military cryptography for nearly 50 years.

After the First World War, however, things began to change. U.S. Army and Navy organizations, working entirely in secret, began to make fundamental advances in cryptography. During the thirties and forties a few basic papers did appear in the open literature and several treatises on the subject were published, but the latter were farther and farther behind the state of the art. By the end of the war the transition was complete. With one notable exception, the public literature had died. That exception was Claude Shannon’s paper “The Communication Theory of Secrecy Systems,” which appeared in the Bell System Technical Journal in 1949 [1432]. It was similar to Friedman’s 1918 paper, in that it grew out of wartime work of Shannon’s. After the Second World War ended it was declassified, possibly by mistake.

From 1949 until 1967 the cryptographic literature was barren. In that year a different sort of contribution appeared: David Kahn’s history, The Codebreakers [794]. It didn’t contain any new technical ideas, but it did contain a remarkably complete history of what had gone before, including mention of some things that the government still considered secret. The significance of The Codebreakers lay not just in its remarkable scope, but also in the fact that it enjoyed good sales and made tens of thousands of people, who had never given the matter a moment’s thought, aware of cryptography. A trickle of new cryptographic papers began to be written.

At about the same time, Horst Feistel, who had earlier worked on identification friend or foe devices for the Air Force, took his lifelong passion for cryptography to the IBM Watson Laboratory in Yorktown Heights, New York. There, he began development of what was to become the U.S. Data Encryption Standard; by the early 1970s several technical reports on this subject by Feistel and his colleagues had been made public by IBM [1482,1484,552].

This was the situation when I entered the field in late 1972. The cryptographic literature wasn’t abundant, but what there was included some very shiny nuggets.

Cryptology presents a difficulty not found in normal academic disciplines: the need for the proper interaction of cryptography and cryptanalysis. This arises out of the fact that in the absence of real communications requirements, it is easy to propose a system that appears unbreakable. Many academic designs are so complex that the would-be cryptanalyst doesn’t know where to start; exposing flaws in these designs is far harder than designing them in the first place. The result is that the competitive process, which is one strong motivation in academic research, cannot take hold.

When Martin Hellman and I proposed public-key cryptography in 1975 [496], one of the indirect aspects of our contribution was to introduce a problem that does not even appear easy to solve. Now an aspiring cryptosystem designer could produce something that would be recognized as clever—something that did more than just turn meaningful text into nonsense. The result has been a spectacular increase in the number of people working in cryptography, the number of meetings held, and the number of books and papers published.

In my acceptance speech for the Donald E. Fink award—given for the best expository paper to appear in an IEEE journal—which I received jointly with Hellman in 1980, I told the audience that in writing “Privacy and Authentication,” I had an experience that I suspected was rare even among the prominent scholars who populate the IEEE awards ceremony: I had written the paper I had wanted to study, but could not find, when I first became seriously interested in cryptography. Had I been able to go to the Stanford bookstore and pick up a modern cryptography text, I would probably have learned about the field years earlier. But the only things available in the fall of 1972 were a few classic papers and some obscure technical reports.

The contemporary researcher has no such problem. The problem now is choosing where to start among the thousands of papers and dozens of books. The contemporary researcher, yes, but what about the contemporary programmer or engineer who merely wants to use cryptography? Where does that person turn? Until now, it has been necessary to spend long hours hunting out and then studying the research literature before being able to design the sort of cryptographic utilities glibly described in popular articles.

This is the gap that Bruce Schneier’s Applied Cryptography has come to fill. Beginning with the objectives of communication security and elementary examples of programs used to achieve these objectives, Schneier gives us a panoramic view of the fruits of 20 years of public research. The title says it all; from the mundane objective of having a secure conversation the very first time you call someone to the possibilities of digital money and cryptographically secure elections, this is where you’ll find it.

Not satisfied that the book was about the real world merely because it went all the way down to the code, Schneier has included an account of the world in which cryptography is developed and applied, and discusses entities ranging from the International Association for Cryptologic Research to the NSA.

When public interest in cryptography was just emerging in the late seventies and early eighties, the National Security Agency (NSA), America’s official cryptographic organ, made several attempts to quash it. The first was a letter from a long-time NSA employee allegedly, avowedly, and apparently acting on his own. The letter was sent to the IEEE and warned that the publication of cryptographic material was a violation of the International Traffic in Arms Regulations (ITAR). This viewpoint turned out not even to be supported by the regulations themselves—which contained an explicit exemption for published material—but gave both the public practice of cryptography and the 1977 Information Theory Workshop lots of unexpected publicity.

A more serious attempt occurred in 1980, when the NSA funded the American Council on Education to examine the issue with a view to persuading Congress to give it legal control of publications in the field of cryptography. The results fell far short of NSA’s ambitions and resulted in a program of voluntary review of cryptographic papers; researchers were requested to ask the NSA’s opinion on whether disclosure of results would adversely affect the national interest before publication.

As the eighties progressed, pressure focused more on the practice than the study of cryptography. Existing laws gave the NSA the power, through the Department of State, to regulate the export of cryptographic equipment. As business became more and more international and the American fraction of the world market declined, the pressure to have a single product in both domestic and offshore markets increased. Such single products were subject to export control and thus the NSA acquired substantial influence not only over what was exported, but also over what was sold in the United States.

As this is written, a new challenge confronts the public practice of cryptography. The government has augmented the widely published and available Data Encryption Standard, with a secret algorithm implemented in tamper-resistant chips. These chips will incorporate a codified mechanism of government monitoring. The negative aspects of this “key-escrow” program range from a potentially disastrous impact on personal privacy to the high cost of having to add hardware to products that had previously encrypted in software. So far key escrow products are enjoying less than stellar sales and the scheme has attracted widespread negative comment, especially from the independent cryptographers. Some people, however, see more future in programming than politicking and have redoubled their efforts to provide the world with strong cryptography that is accessible to public scrutiny.

A sharp step back from the notion that export control law could supersede the First Amendment seemed to have been taken in 1980 when the Federal Register announcement of a revision to ITAR included the statement: “… provision has been added to make it clear that the regulation of the export of technical data does not purport to interfere with the First Amendment rights of individuals.” But the fact that tension between the First Amendment and the export control laws has not gone away should be evident from statements at a conference held by RSA Data Security. NSA’s representative from the export control office expressed the opinion that people who published cryptographic programs were “in a grey area” with respect to the law. If that is so, it is a grey area on which the first edition of this book has shed some light. Export applications for the book itself have been granted, with acknowledgement that published material lay beyond the authority of the Munitions Control Board. Applications to export the enclosed programs on disk, however, have been denied.

The shift in the NSA’s strategy, from attempting to control cryptographic research to tightening its grip on the development and deployment of cryptographic products, is presumably due to its realization that all the great cryptographic papers in the world do not protect a single bit of traffic. Sitting on the shelf, this volume may be able to do no better than the books and papers that preceded it, but sitting next to a workstation, where a programmer is writing cryptographic code, it just may.

Whitfield Diffie       

Mountain View, CA