Privacy and Encryption Export Controls: A Crypto Trilogy (Bernstein, Junger, & Karn)

Keith Aoki, Associate Professor of Law,
University of Oregon School of Law
NOTICE: This module has been revised to reflect changes in the law, effective 09/29/1999. The current version of this module is available at this link.

"In this increasingly electronic age, we are all required in our everyday lives to communicate with one another. This reliance on electronic communication however has brought with it a dramatic diminution in our ability to communicate privately. Cellular phones are subject to monitoring, email is easily intercepted, and transactions over the internet are often less than secure. Something as commonplace as furnishing our credit card number, social security number or bank account number puts each of us at risk. Moreover, when we employ electronic methods of communication, we often leave behind electronic "fingerprints" behind, fingerprints that can be traced back to us. Whether we are surveilled by our government, by criminals, or by our neighbors, it is fair to say that never has our ability to shield our affairs from prying eyes been at such a low ebb. The availability and use of strong encryption may offer an opportunity to reclaim some portion of the privacy we have lost. Government efforts to control encryption thus may well implicate not only the First Amendment rights of cryptographers intent on pushing the boundaries of their science, but also the constitutional rights of each of us as potential recipients of encryption's bounty. Viewed from this perspective, the government's efforts to retard progress in cryptography may implicate the Fourth Amendment as well as the right to speak anonymously, see McIntyre v. Ohio Elections Comm'n, 514 U.S. 334 (1995), the right against compelled speech, see Wooley v. Maynard, 430 U.S. 705 (1977), and the right to informational privacy, see Whalen v. Roe, 429 U.S. 589 (1977). While we leave it for another day the resolution of these difficult issues, it is important to point out that Bernstein's is a suit not merely concerning a small group of scientists laboring on an esoteric field, but also touches on the public interest broadly defined."

Judge Betty B. Fletcher, Bernstein v. U.S. Dept. of Justice, et al., 176 F. 3d 1132, 1145-1146 (1999)

Table of Contents

  1. Brief Introduction
  2. Roadmap to the Module
  3. A Brief Encryption Glossary of Terms
  4. A Short History of Privacy (and Privacy law) (Q & A)
  5. Secret Writing through History (Q & A)
  6. The Export Restriction Regulations: Legislation and Litigation
  7. A. Pre-1996 Regulatory Scheme: International Traffic in Arms Regulations

    B. Post-1996 Regulatory Scheme: Export Administration Regulations

    C. Brief Background on the Bernstein I, II, & III cases.

    1. Bernstein I (1996)

    2. Bernstein II (1996)

    3. Bernstein III (1997)

  8. A Crypto Trilogy: Bernstein, Junger & Karn
    1. Questions to Consider
    2. The Cases
  1. Bernstein v. Dept. of Justice (Bernstein IV) (1999)
  2. Junger v. Daley (1998)
  3. Karn v. U.S. Dept. of State (1996)

VIII. Selected Bibliography

 

I. Brief Introduction

Privacy is not synonymous with secrecy. To quote Eric Hughes of the Cypherpunks, "[a] private matter is something one doesn’t want the whole world to know, but a secret matter is something one doesn’t want anybody to know. Privacy is the power to reveal oneself selectively to the world." Strong encryption (such as "public key" cryptography) makes privacy possible in an increasingly digitized electronic world, thereby enabling some important preconditions for an open society to exist.

This module considers how digital electronic communications are both enabling but troublingly insecure. The use of strong encryption does three things: (1) ensures confidentiality/privacy of one's message; (2) ensures authenticity of a message; and (3) ensures the integrity of the contents of a message. Strong "public key" encryption may be thought of as a type of electronic "self-help" used to protect the content of one’s electronic communications from unwanted intrusion by public governmental entities as well as from private actors. For much of the post WW II era, the National Security Agency had an almost complete monopoly on strong encryption methods. However, this monopoly was broken by the development of "public-key" cryptography by Whitfield Diffie and Martin Hellman in 1975. "Public-key" cryptography eventually spurred major controversies in the 1990s, when the U.S. government tried implementing the "Clipper Chip," which would have allowed the government to have access to a key into virtually any and all encrypted digital communications. This module asks you to analyze a trilogy of cases (Karn v. U.S. Dept. of State [decided March 22, 1996 in the District Court for the District of Columbia]; Bernstein v. U.S. Department of State [decided May 6, 1999 in the 9th Circuit]; and Junger v. Daley [decided July 2, 1998 in the Northern District of Ohio]) that were decided beginning in the mid-1990s that problematically address the limits of the use of extremely strong "public key" cryptographic systems by private actors in the context of export licensing schemes that were administered by the State Department and the Commerce Department.

These cases have strong first amendment and administrative law components to them, but their implications go beyond those areas of the law to implicate fundamental questions of individual privacy in the digital environment -- who will have presumptive access to strong encryption tools.

II. ROADMAP TO THIS MODULE

This module is organized as a series of questions and answers revolving around the use of encryption to ensure privacy in one's communication. This module looks first at broad notions of privacy in general and the roots of a legal right of privacy within the United States, which has both common law and constitutional components. A brief history of codes and codebreaking is then discussed from the Caesar cipher to the German Enigma code machine as well as defining certain basic cryptographic terms and concepts. Next, the post -World War II governmental monopoly on strong cryptography by the National Security Agency (NSA) is described, as is the significant advent of public key cryptography, which was first developed by Whitfield Diffie and Martin Hellman in 1975, which made possible a challenge to the NSA's monopoly on strong encryption.

Simultaneous with the refinement of public key cryptographic techniques is the rise and spread of ubiquitous computer networked environments, which simultaneously enable extensive data communications and threaten the privacy or secrecy of the same transmitted information, some of which may be of a highly personal or confidential nature. In the late 1970s, the NSA in concert with the National Institute of Standards and Technology (NIST) released an 56-bit encryption algorithm called Digital Encryption Standard (DES), which would be used for electronic transfers of financial and other data. However, almost from the time of its release, DES's ability to protect information began to be outstripped by rapidly advancing decryption technology.

The US Government has repeatedly sought to curtail the spread of strong public key encryption technology in two ways: one, by a domestic initiative promoting "key escrow" deposit schemes, and secondly, by classifying strong public key encryption algorithms as "munitions" and therefore, subject to strict export controls, contingent on obtaining a rarely granted license from the State or Commerce Department. There is a relation between the domestic initiatives and export restrictions on strong cryptographic products. Export controls hinder the spread and development of strong cryptographic tools within the US, because software developers are reluctant to incorporate cryptographic tools that might prevent them from selling their products overseas. Government-backed "key escrow" arrangements such as the ill-fated Clipper chip initiative of the mid-1990s ran the risk of seriously harming overseas markets for US products using strong encryption.

This module focuses attention on legal developments occurring in the 1990s over attempts by the Clinton Administration to use State Department regulations, and then, Commerce Department regulations to stop the export of strong public key cryptographic tools. In particular, there have been three cases that have interpreted the relevant government export regulations, two upholding government-attempts to restrict the export of strong cryptography, and one, strongly questioning the ability of the government to restrict the use and export of public key cryptographic algorithms on first amendment grounds.

Finally, this module ends with an abbereviated selected bibliography of articles and books on encryption and the law.

To begin with, this module offers a brief encryption glossary of terms before moving on to discuss a brief history of privacy law and then to present a brief historical backdrop to the Crypto Trilogy of Bernstein, Junger & Karn.

III. A Brief Encryption Glossary of Terms

Algorithm: a mathematical function used to encrypt and decrypt a message.

Asymmetric Cryptography: a system of cryptography in which the key needed to encrypt a message differs from that needed to decrypt the same message. Each party possesses both a public key and a private key; neither key can be derived from the other. Invented by mathematicians Whitfield Diffie and Martin Hellman in the mid-1970s. The most widely known asymmetric cryptosystem is Pretty Good Privacy (PGP). This type of system is also know as "public key cryptography."

Authentication: the ability to accurately ascertain the identity of the sender of a message.

Bits: for cryptographic purposes, the unit by which cryptographic "keys" are measured. The longer the key, the more secure the cryptosystem using it. The relative security of key lengths is exponential: a 24-bit key system is over 65,000 times harder to break than a 16-bit key system.

Brute Force Attack: an attack on a cryptosystem that involves trying every possible key to decrypt a ciphertext until finding one that works. Usually the average time for a brute force attack is half the number of possible keys multiplied by the time required to test each key (by using it to decrypt the ciphertext and checking if the results are intelligible)

Cipher: a method of encrypting any text, regardless of content.

Ciphertext: a message that has been encrypted.

Code: system of communication relying on a pre-arranged set of meanings such as those found in a codebook.

Confidentiality: the ability to ensure that only an intended recipient can understand a communication.

Cryptanalysis: the practice of defeating attempts to hide communications. Cryptanalysts are also referred to as interlopers, eavesdroppers, enemies, opponents, and third parties.

Cryptography: the art or science of secret communication, or may be thought of as, the storage of information (for a shorter or longer period of time) in a form that allows it to be revealed to those you wish to read it yet hides it from everyone else.

Cryptology: includes both cryptanalysis and cryptography.

Cryptosystem: a method of encrypting information so that decryption can only occur under certain conditions, which generally means only by persons in possession of a decryption engine (like a computer) and a decryption key.

Decryption: the transformation of ciphertext back into plaintext.

DES: Digital Encryption Standard, a 56-bit single key cipher adopted for use within the U.S. in 1977. Developed in the early 1970s by the National Bureau of Standards (since renamed the National Institute of Standards and Technology, or NIST) to be a national interoperable cyrptographic algorithm.

Encryption: the process of disguising a plaintext message to conceal its substance.

Integrity: the assurance that a message has not been modified in transit.

Key: the number or alphanumeric sequence needed to encrypt or decrypt a message. The idea is roughly equivalent to the idea of a "password."

Key Escrow: a system where the government is allowed a "back door" to the encryption keys of private parties. Keys are registered in private data repositories and under certain conditions, (similar to those currently needed to obtain a subpoena or wiretap)law enforcement officials would be allowed to decrypt intercepted communications without their knowledge or consent.

Keyspace: the range of values for a given key (determines strength)

NIST: National Institute of Standards and Technology, the government agency charged with adopting a national standard cryptographic algorithm.

Non-Repudiation: the inability of an author to deny she sent a message

NSA: National Security Agency established in 1952 to be the U.S. government's chief signals intelligence and cryptographic department.

Object Code: computer program code that is directly executable by a computer. Humans generally cannot read object code

Plaintext: data that may be read and understood without any special measures. Also called "cleartext." Plaintext is converted into ciphertext by means of an encryption engine (like a computer) whose operation is fixed (cryptosystem) but functions in a way that is dependent on a piece of information (the encryption key)

Public Key Cryptography: a system of cryptography in which the key necessary to encrypt a message differs from that needed to decrypt the same message. Each party possesses both a public key and a private key; neither key may be derived from the other. Also known as "asymmetric cryptography."

Source Code: computer program code that is actually entered by a human programmer. Must be "compiled," or translated into object code, before a computer can execute the program.

Symmetric Cryptography: a cryptographic system in which the key needed to encrypt and decrypt a particular message is the same. This type of cryptosystem is much older than public key cryptography. The disadvantage of symmetric cryptography is that some secure method is needed to exchange the key or "password" in the first place. The most widely-know example of a symmetric cryptosystem is the widely used Data Encryption Standard (DES)

IV. A Short History of Privacy (and Privacy Law)

Q.: What do we mean when we discuss the term "privacy"?

The term privacy is a Rorschach-like term that expands and contracts and acquires a constellation of shifting meanings depending on context. Here are some common ways that the word "privacy" is used:

[T]he privacy of private property; privacy as a proprietary interest in name and image; privacy as the keeping of one's affairs to oneself; the privacy of internal affairs of a voluntary association or of a business; privacy as the physical absence of others who unqualified by kinship, affection, or other attributes to be present; respect for privacy as the respect for the desire of another person not to disclose or have disclosed information about what he is doing or has done; the privacy of sexual and familial affairs; the desire for privacy as the desire not to be observed by another person or persons; and the privacy of the private citizen as opposed to the public official. (US Congress, Office of Technology Assessment, OTA-TCT-606, Information Security and Privacy in Network Environments 82 (Sept. 1994))

Q: What does it mean to say, "I have a RIGHT to Privacy"?

Explicit legal protection for individual privacy has been a development occurring in the U.S. legal system over the past century. However, the foundation of the legal recognition of a right to privacy is ancient. For example, the idea that there is a boundary between the public and private realms of social lives at least dates back to Socrates and Aristotle.)

Until the 19th century, communications between persons were "private" as long as they were out of earshot of third parties. One could use codes to conceal written messages or simply in a world that was largely illiterate, simply putting something in writing meant that it was inaccessible to most people. Perhaps beginning in the Renaissance with the development of the printing press and definitely by the beginning of the industrial revolution, communication possibilities have been steadily expanding -- with a concomitant loss of an ability to keep communications and information private.

Explicit legal recognition of privacy rights did not occur within the US prior to the late nineteenth century. There were however, many constitutional and common law antecedents to privacy rights. The norms undergirding privacy rights overlapped with first amendment freedom of speech norms pertaining to associational freedom and freedom of the press. The fourth amendment right to be free of unreasonable search and seizures was another source for protection of privacy norms, as was the third amendment right to not have troops quartered in one's home. The fifth amendment right against self-incrimination also worked to protect an individual's privacy. In the common law areas of nuisance and trespass, there are implicit norms protecting one's privacy, although they may be articulated in the language of property ownership.

Professors Richard Turkington and Anita Allen finds the earliest judicial discussion of privacy in an 1881 Michigan appellate court opinion in which tort relief was granted to a plaintiff who sued a defendant who had observed her during childbirth without the plaintiff's permission. Turkington and Allen also notes that Judge Thomas Cooley discussed the "right to be let alone" in his treatise on torts.

Traditionally, the right to privacy (the violation of which gives rise to an action in tort) is considered to have received its clearest articulation in Samuel Warren and Louis Brandeis' 1890 Harvard Law Review article, "The Right to Privacy, " which they characterized as arising out of the "right to be let alone." However, state courts did not begin recognizing such a right of action in tort until the first decade of the twentieth century.

The widespread legal recognition of a right of right was relatively slow. In the final volume of the First Restatement of Torts, published in 1939, a tort for invasion of privacy was officially recognized, even though as of 1947 only nine jurisdictions recognized a common law right of privacy. By 1960, when William Prosser published an influential article on Privacy in the California Law Review, he was able to identify over three hundred appellate cases dealing with the common law right to privacy.

The common law right of privacy has been held at different times and in different courts to include interests as disparate as protection against commercial exploitation against one's likeness or image, portrayal of information about an individual in a false light, public disclosure of private facts about an individual and intrusion on seclusion. Legislatures have enacted assorted statutes protecting the confidentiality of certain types of information about individuals by both public and private institutions.

Beginning in the 1960s, in Griswold v. Connecticut, 381 US 479 (1965), the U.S. Supreme Court expressly recognized a general constitutional right of privacy independent of the fourth and fifth amendments prohibitions on unreasonable searches and seizures and forced self-incrimination as well as from the common law right of privacy. Griswold held that a constitutional right of privacy shielded the decision whether or not to use contraceptives from state interference.

A significant ambiguity about privacy rights is whether, as a definitional matter, privacy rights have to do with limiting/controlling access to data and information about ourselves or whether privacy right pertain to the ability to make fundamental decisions about oneself, such as reproductive choices, in the face of contrary opinions.

This module focuses on the former, that is, with the ability to limit and control access to data and information about ourselves. There are two types of informational privacy involved: (1) the ability to control access to transactional information, i.e., data footprints and (2) control over access to the substantive content of one's communications. The module written by professor Ann Bartow addresses the issue of transactional privacy and "data footprints." This module considers the issue of securing the privacy of the contents of one's communications and documents stored and transmitted electronically.

  1. Secret Writing Through History

Q.: What are some examples of the historical uses of codes?

Writing has been a medium of sharing thoughts and ideas for thousands of years, a method of enlightenment and understanding. However, for almost as long as humans have used writing for communication, writing has also been used to deliberately conceal messages and meanings from unwanted eyes through the use of codes and ciphers. For example, Julius Caesar used what came to be known as "Caesar cipher" to ensure that his military and political communications remained safe from prying ears and eyes.

Science -fiction writer Bruce Sterling has described how the ancient Assyrians used a form of "funerary cryptography " in which tombs would have odd sets of cryptographic cuneiform symbols written on them. These curious symbols would sometimes cause curious but mystified passerbys to utter the translation outloud. What they were in fact doing when they read the markings out loud was to inadvertently utter a blessing for the dead.

Egyptian hieroglyphics were made deliberately arcane and obscure, so that not many persons were literate in writing or in interpreting a cipher. This was done to make sure that the elite powers of priests and scribes remained solely with them. The relation between literacy, technology, communication and power has always been intimate, if complex. Literacy and the ability to communicate were and still are, central to the retention of power in a digital communication age.

Consider Paul Revere's famous warning: "one if by land, two if by sea" as a rudimentary, if effective use of a code. The key is the number of lanterns in the Old North Church and the message is effectively concealed from all who do not know the key.

In the 1940s, Alan Turing, British mathematician, cryptographer and namesake of the "Turing Test" for artificial intelligence, was part of a top-secret team of British code-breakers that used electronic machines that might be thought of as predecessors to computers to break Nazi messages that had been encrypted using the German Enigma Code machine. This top-secret breakthrough was significant and had much to do with the ultimate Allied victory, enhancing their ability to track and sink German U-Boats and learn of other strategic maneuvers. England's top-secret triumph meant that at the dawn of the cold war era, cryptography would continue remain a jealously guarded state secret, at least up until the mid 1970s.

In 1949, Claude Shannon, pioneer information theorist, described the "entropy" (degree of disorder or uncertainty in a system) of a message as well devising a formal measurement for the amount of information within a particular stream of digital bits. In the postwar era, following Shannon’s important theoretical work, early digital computers (in the name of National Security) were able to repeatedly chomp through dense streams of encrypted information, searching out repetitions, structures and variations from the random.

Q.: Why should I care about cloak-and-dagger stuff like codes and cryptography?

While law enforcement officials have argued for key escrow encryption systems incorporating back-door key access such as the Clipper Chip on the grounds that they need to be able to intercept the communications of terrorists, drug dealers and other criminals, there are many legitimate reasons and situations where individuals and groups of people may want to conceal information. For example, companies may possess confidential employee data (medical, salary and other records) that may be susceptible to access by data entry clerks -- encryption protects these records. Employees may share workspaces and equipment with others and may want to ensure the confidentiality of information about projects they are working on. Companies may need to transfer confidential information between branch offices and field agents -- encryption helps keep the information secure from interception. Companies may possess proprietary information and trade secrets, R & D results, product plans, manufacturing processes, legal and financial data, etc. that they want to keep secret from competitors. An individual or company may want to transmit sensitive information on a computer that they would like to keep private from persons who may examine the computer en route. Or two persons may correspond via e-mail and wish to keep the content of their electronic communications private. There are two general situations: one, there is a need to keep information in a particular location, invulnerable to unauthorized persons and two, there is a need to keep information that is being transmitted from point A to point B safe from interception. In the second situation there is a problem of secure key exchange because the person who will be receiving and decrypting the information will usually not be the person who sent and encrypted the information.

From ancient times to the twentieth century, cryptography has traditionally been the domain of armies, spies and diplomats. However, in the increasingly digital world of the Internet and networked electronic communications, personal and corporate privacy information are increasingly placed jeopardy. However the very technological tools that place more and more of our confidential communications and data in danger also provide the means to protect and secure that information. In a privacy sense, as with Dickens, it is the worst of times, but it is also the best of times.

Q.: What is the NSA?

During the Cold War Era, U.S. cryptography came under the jurisdiction of the newly created National Security Agency ("NSA"), an extremely secretive bureaucracy established by President Harry Truman in 1952. For the next 23 years, the NSA held a monopoly on the use of strong cryptographic tools within the U.S., while pursuing its primary purpose, which was to protect the communications of the U.S. government, and crack those of the U.S. government’s real, imagined, or potential adversaries. The NSA, with its rumored world's largest number of employed mathematicians, labored to make sure that only the NSA was in possession of every known cryptographic technique. Under the auspices of the Invention Secrecy Act of 1952, the NSA was successful for much of the early Cold War era in withholding the granting of patents on new discoveries in cryptography in the interest of national security.

While information about "public key" cryptography became more widely known in the late 1970s and 1980s, the NSA continued trying to prevent the demise of its influence over cryptographic technology. The NSA used its power to influence scientific research grants in cryptography and mathematics at the National Science Foundation. When scientists complained, the Public Cryptography Study Group was founded in 1978, but even that, the NSA had a hand in. This group established "voluntary control" measures, but the word "voluntary" was illusory, because the NSA had pre-publication access to all research papers on encryption by private parties, and therefore had the ability to censor any material it deemed inappropriate with regard to national security. This situation was tolerated for years because many felt that it was indeed important for the NSA to remain on top of cryptographic techniques because of national security concerns. However, by the mid-1990s the NSA attempts to control encryption tools in the name of national security had begun to draw very strong opposition in the cryptographic community, spurring the series of lawsuits that end this module.

Q.: How do Banks and Other Financial Institutions Keep Their Electronic Data Transmissions Confidential?

Up to the early 1990s, many banks and financial institutions used DES, or Data Encryption Standards, which is a 56-bit symmetric cryptosystem designated in the 1970 by the then-named National Bureau of Standards (since renamed NIST (National Institute for Standards and Technology). DES has an interesting, if somewhat mysterious history.

In the 1960’s, IBM developed an encryption program named LUCIFER, a variant of which became the NSA’s Date Encryption System (DES). DES was adopted as a federal standard on November 23, 1976 and has continued to be re-certified every five years, although by the mid-1990s, most cryptographers believe that DES has reached the end of its useful life as a cryptographic tool.

DES is the most widely used symmetric cryptosystem (the encryption and decryption keys are the same), and stands at a NSA regulated 56 bits (bits are the unit by which the strength of cryptographic keys are measured). However, from the beginning of its use as a system to encrypt business and financial data in the 1970s, DES was regarded with suspicion due to rumors that the NSA had forced IBM to intentionally weaken the system down to 56 bits, so that it was easier for the NSA, and no others, to break DES encoded messages when they saw a need for it.

The process for using a single-key algorithm such as DES involves: (1) agreeing on a security key with the intended receiver or sending her the key; (2) supplying plaintext and using the key to an algorithm to create the ciphertext; (3) sending the ciphertext to the receiver; and (4) having the receiver supply the ciphertext and the key to an algorithm to create the plaintext. One cryptography scholar says that, "the original design submitted by IBM permitted all 16 x 48 = 768 bits of key used in the 16 rounds to be selected independently. A U.S. Senate Select Committee ascertained in 1977 that the [NSA] was instrumental in reducing the DES secret key to 56 bits that are each used many times."

Q.: How Do You Tell How Strong an Encryption Algorithm Is?

Encryption algorithms are described in terms of relative strength. The stronger protection from a particular algorithm, the harder it is to crack some message that has been encrypted by that algorithm. The length of the key determines an algorithm's relative strength. Key length is described in terms of "bits" (how many numbers are in a particular key). The range of values for a given cryptographic key is called "keyspace". An important characteristic of complex cryptosystems is that the relative strength of different keys is exponential, rather than geometrical. This fact means that a 24 -bit algorithm is not three times as hard to break than an 8-bit algorithm, but is over 65,000 times harder to break (65,536, or 2 to the 16th power) than an 8-bit key.

With the rise of parallel processing (using many computers simultaneously), many encryption experts believe that DES ciphertext is crackable in a matter of hours. However, the NSA has never affirmed or denied their ability to decipher DES (some people have said that NSA stands for "Never say Anything" or even "No Such Agency"). However, if a method of cracking DES ciphertext was developed, such a developer would benefit by keeping that method secret, because if it were widely known, people would stop using that cryptosystem, and the ability to crack it would become worthless.

Q.: What is "Public Key" Cryptography?

The NSA’s cryptographic monopoly came to end in 1975, with the invention of "public-key" cryptography by mathematicians Whitfield Diffie and Martin Hellman. Diffie, then a mathematician at MIT's Artificial Intelligence Lab, along with Martin Hellman created public-key encryption in the mid-1970s. During the mid-1960s, individual user files in time-shared mainframe computers were protected by passwords. However, the system manager had complete access to all the passwords of all the users. If law enforcement officers served the system manager with a subpoena, the subpoenaed passwords and user files would go to law enforcement. Whitfield Diffie wanted to find a way to eliminate the need for trusted third parties like the system manager, and he realized that the way to do this was through a decentralized system.

Diffie focused on age-old problem of key management. As mentioned earlier, Julius Caesar had developed a simple cipher system in order to encode sensitive military messages. What Caesar did was take an original message ("plaintext") and then encrypt it into what appeared to be gibberish ("ciphertext"). The recipient of the gibberish message would use the same key as Caesar and thus would decrypt the message back into plaintext. The big problem in this cryptosystem was protecting the key. Anyone, who knew Caesar key would be able to understand the encrypted message, therefore to keep his communications secret, Caesar would have to change key often. But changing keys often created a related problem: if you change the key frequently, how do you inform your spies behind enemy lines what the new key is or when you're changing it. If you tell them what the new key is using the old code, which may have been intercepted and broken, then enemies will learn new code and your secret plans would be for naught.

Whitfield Diffie and Martin Hellman, through the aid of rapidly increasing computing technology, conceptually spilt the then-unitary cryptographic key in 1975. Diffie and Hellman envisaged a cryptosystem in which each user had two keys, a "public" key and a "private" key, each unique to owner. Any message that was encrypted with one key may be decrypted by the other, and neither key can be used to determine the other. If I send you a message, I first need to obtain your "public" key. It is possible to distribute one's public key without compromising security -- however, possessing someone's "public" key is absolutely no help in deciphering information about what one's "private" key is. If I use your "public" key to encrypt a message to you, my message will be pure gibberish to anyone who may intercept it and only one person in the world can decode it -- you, who hold the other key, your "private" key. If you want to respond to me with a secret message, you would then use my "public" key to encrypt your message and I would use my "private" key to decrypt.

Thus, with the advent of "public key" encryption, instead of using the same key to encrypt and decrypt a message (such as with DES), every user in the system would have both a public key and a private key. The public key can be published or be made available in a key repository and the private key would never be revealed.

With the advent of computer programs that implemented variants of the Diffie-Hellman scheme, the ancient limits on cryptography were suddenly vanquished. After Diffie and Hellman published their ideas in 1975, the de facto NSA monopoly on cryptographic tools was on its way out. In 1977, three MIT mathematicians -- Ronald L. Rivest, Adi Shamir and Leonard M. Adelman -- developed a cryptosystem that put Diffie-Hellman's findings into practice through the use of very large prime numbers. Needless to say, this was the NSA's worst possible nightmare, that is, the prospect that every company/citizen within or without the United States would have access to extraordinarily strong cryptographic tools and techniques of the sort that had formerly ranked alongside advanced missile guidance systems, biological warfare knowledge and nuclear power as jewels in the crown of the US's most jealously guarded national security secrets.

Q.: Why Use Public Key Cryptography?

Public key cryptography allows users to accomplish three important objectives with their electronic communications. The first was ensuring the confidentiality and privacy of a communication, that only the intended recipient could read the message. Public key cryptography also allowed for more efficient authentication as well, that the recipient of an encrypted communication could accurately ascertain and verify the identity of the sender of the message. A could send a message to B, encrypted with A’s private key, B then decrypts using A’s public key, thereby ensuring that the message really came from A. Along with ensuring confidentiality, Public key cryptography also ensures a message's integrity, i.e., that it hasn't been tampered with or altered en route. For example, public key encryption allows an electronic check to be confidential (no one but the recipient can read it), authenticated (that the check actually came from the person who "signed" it) and that no one tampered with or changed the check amount en route.

In a traditional symmetric cryptographic system, the same password/algorithm is used to encrypt and decrypt messages. In a public key cryptographic system each user has two keys: one, a "public" key that is widely published and easily available through some type of distribution infrastructure, and two, a "private" key that is never revealed. Any message encrypted with one key may be decrypted with the other, so that someone may use my "public" key to encrypt a message to me, and I can use my "private" key to decrypt the same message. Neither key can be used to determine the other, and public key cryptography allows people who never met or exchanged a key to communicate in a highly private manner.

Q.: What are the RSA algorithms?

The strength of public key algorithms rest in large part upon how large the key is (how many bits of information make up the key). The larger the key, the harder it is to break the code. In 1977, three MIT mathematicians, Ronald L. Rivest, Adi Shamir, and Leonard M. Adelman created a set of algorithms using the Diffie-Hellman model. Their algorithm (used in an encryption program such as RIPEM) creates encryption keys by multiplying two extremely large prime numbers together. In order to break this code, one needs to reverse the multiplication process, however do so can take a long time, even on a powerful computer. These algorithms were seen as a big improvement over the DES. Whereas DES was limited to 56 bits, RSA keys could be any size. Also, the relative strength of different keys are exponential rather than geometrical, meaning that a 24 bit algorithm is not three times as hard to break than an 8 bit, but 64,000 times harder to break. Public key algorithms like the RSA use prime numbers. This means that even though it may be easy to take prime numbers and come up with an end result, it is much harder to take a large number and find out what prime numbers were used to produce it (which is what would is needed to discover the private key).

Q.: What are the weaknesses or drawbacks of public key cryptosystems?

As strong as public-key systems may be in comparison to traditional symmetric cryptosystems, they are not invulnerable. However, the strength of public key cryptosystems may also be a weakness. A disadvantage is that with extremely large prime number keys used by "public key" encryption systems, it takes much longer to decrypt a message than with a single key system such as DES. However, rapidly advancing computing technology may mitigate this drawback somewhat.

For example, in 1977, Rivest, Shamir and Adelman, issued a challenge to the world in an article in Martin Gardner's column in Scientific American. A single phrase was encrypted using the RSA 129-bit algorithm and readers of Scientific American were challenged to find the plaintext. Using 1977 computing technology as a benchmark, Gardner estimated it would take millions of years to decrypt the RSA encoded ciphertext. However, an MIT student using almost 2,000 computers worldwide hooked together into a network to undertake the necessary calculations decrypted the message after 16 years and 8 months on April 16, 1994.

Another disadvantage of public key systems is that there may be a problem of key validation. When you wish to send encrypted data to someone, without the required key-validation protocol, how can you be sure that the public key that you have obtained for the receiver is indeed truly his/her public key? A knowledgeable encryption specialist could publish a public key in the receiver’s name, and then proceed to intercept any message you send using that key. Establishing an appropriate key-distribution and validation system is a significant cost to implementing an effective public key cryptosystem.

Another drawback isn't technical, but rather intrinsic to the use of strong cryptography. Cheap, easy-to-use extremely strong cryptography unfortunately shields both the law abiding and legitimate and lawless alike. The debates in this area have been heated and controversial, but they are not the focus of this module. However, one should note that throughout much of the 1990s, there has been an ongoing debate over whether the government and its law enforcement agencies such as the FBI should have access, under certain circumstances to an electronic "backdoor" via an Escrowed Encryption Standard such as the Clipper Chip or some other key escrow scheme.

For example, FBI agent Jim Kallstrom, was quoted in an article by Steven Levy in the New York Times Sunday Magazine: " From the standpoint of law enforcement, there's a super-big threat out there -- this guy is gonna build this domain in the Bronx now, because he's got a new steel door and none of the welding torches, none of the boomerangs, nothing we do is gonna blast our way in there. Sure, we want those new steel doors ourselves, to protect our banks, to protect the American corporation trade secrets, patents rights, technology. But people operating in legitimate business are not violating the laws -- it becomes a different ball of wax when we have probable cause and we have to get into that domain. Do we want a digital superhighway where not only the commerce of the nation can take place but where major criminals can operate impervious to the legal process?"

Q.: If the U.S. Government Does Not Want to See Virtually Unbreakable Cryptography From Becoming Routine, What Has it Been Doing to Prevent It?

There have been two interrelated but separate fronts. The first is focused on trying to adopt key escrow standards domestically and the second is to control export of strong cryptography via stringent export licensing schemes. As mentioned earlier, there have been several government initiatives to have either mandatory or voluntary key EES or key escrow schemes adopted by the computing and communications industry. This is not the focus of this paper, but University of Miami Law School Professor A. Michael Froomkin's seminal 1995 article, "The Metaphor is the Key: Cryptography, the Clipper Chip, and the Constitution," is a good starting place.

Basically, escrow key systems such as the Clipper Chip require the placement of private user keys in data repository under federal control.

Under conditions similar to those currently required for law enforcement officers to obtain warrant/wiretap, they could also could obtain escrowed encryption key. Escrow key systems are a step backward. Recall that Whitfield Diffie and Martin Hellman developed public key cryptography precisely to get around the problem of having a trusted third party that held keys in escrow. For current legislative developments such as the proposed "Electronic Date Storage Act" or the "Security and Freedom through Encryption (SAFE) Act", see the Center for Democracy & Technology's Encryption website at http://www.cdt.org/crypt/970312_admin.html and http://www.cdt.org/crypto/legis_106/SAFE/index.shtml#provisions

On the domestic front, in the late 1980s and early 1990s, the NSA and NIST proposed replacing the increasing vulnerable DES algorithm with a new algorithm they named Skipjack, which was purported to be over 16 million times stronger than DES. The Skipjack encryption algorithm was integrated into a system using what was called the Law Enforcement Access Field (LEAF) that added a signal to an encrypted message that directed a potential wiretapper (theoretically from a government law enforcement agency) to the appropriate key to decipher message. The Skipjack algorithm and the LEAF system were integrated into the Capstone Chip that could handle phone communications, computer data and digital signatures.

In 1993, AT & T approached NSA in 1993 with a Surity 3600 phone security system which was designed to use the nonexportable DES algorithm, seeking an export license. Knowing the high degree of market penetration AT & T possessed with regard to telecommunications devices, the NSA suggested that AT & T use the a stripped down version of the Capstone chip called the Clipper Chip (which incorporated both the Skipjack algorithm and a LEAF system that used an escrow key) in their products and export licenses would not be a problem. If AT & T incorporated the Clipper Chip, it would receive two valuable things: first, AT & T would receive a contract with the U.S. government to buy tens of thousands of phones (with no export ban) and, second, AT & T would also sell a lot more phones to private parties because companies would need to use Clipper Chip-quipped devices to communicate with government Clipper Chip phones.

In the early 1990s, the NSA had presented a Clipper Chip proposal to the Bush Administration, but nothing happened. However, within two months of taking office, the Clinton Administration picked up the Clipper Chip proposal and was aggressively pushing to implement it as soon as possible. In early 1993, NIST was ordered to consider using the Clipper Chip as the new encryption standard. The ensuing heated public debates have focused on how to reconcile two fundamentally opposed interests: individual privacy and public safety.

In 1993, NIST published the proposed Clipper Chip Key Escrow standard in the Federal Register, and allowed 60 days for public comment. NIST received 320 responses, only 2 positive. In 1994, Michael R. Nelson, a White House Technology Consultant said that the Clipper Chip proposal was "the Bosnia of Telecommunications."

By the beginning of the 1990s, RSA Data Security, Inc. (the company formed by Rivest, Shamir and Adelman) public key encryption technology was in the process of being adopted by companies such as Apple, AT & T, Lotus, Microsoft and Novell. However, the NSA believed that extremely strong cryptographic techniques should be treated like a munition and therefore needed an export license. Despite the argument from U.S. companies that they wouldn't be able to compete in a global market if they were forced to 'water-down" their computer products by incorporating less than 40-bit cryptography Congress was sympathetic to the NSA's national security arguments and began working on regulations that required export licensing of products incorporating strong cryptographic techniques.

This is the other front of the cryptographic policy debates. In addition to aggressively pushing for domestic use of the Clipper Chip, the Clinton Administration to implement stringent export restrictions and licenses on cryptographic products stronger than 40-bit lengths. This is the subject of the trilogy of crypto cases that conclude this module.

Q.: Who are the Cypherpunks?

The Cypherpunks are a loosely organized group of privacy advocates and computer programmers co-founded in September 1992 by Eric Hughes, a freelance cryptographer and Tim May, physicist from Intel. Steven Levy describes the common premises of the Cypherpunks as: (1) strong public key cryptography is a liberating tool, empowers individuals; (2) that should be used to protect communications from the government; (3) and the Cypherpunks should educate and distribute widely strong cryptographic tools to members of the public. The Cypherpunks Hyperarchive of threaded e-mail discussions is available at <http://www.inet-one.com/cypherpunks>

Q.: Where did the public key encryption program known as Pretty Good Privacy (PGP) originate?

In 1991, Philip R. Zimmerman, a software engineer and cryptographic consultant, put together a public key encryption program he called Pretty Good Privacy (PGP) for computer data and e-mail use. PGP at the time used a military-grade 128-bit key. Zimmerman literally gave a small number of programs away for free. Sensing that the government was getting very interested in cryptography, Zimmerman wanted to get free copies of PGP in circulation before a possible government ban on strong encryption tools. One of the people that Zimmerman had given a copy of PGP installed it on a computer attached to the Internet it on a computer attached to the Net the day after Zimmerman's free release and within days, thousands of people had copies of military-strength PGP. Partially to avoid violating the ITAR and being charged with munitions trafficking (see below), the second upgrade of PGP was made from New Zealand. While ITAR (and later EAR) controls exports passing out of the U.S., it is much more difficult restricting imports passing into the U.S. from another country. Since Zimmerman's release of PGP in 1991, it has been through several upgrades and revisions , and PGP 6.5.1 is available as freeware downloadable from the MIT distribution site at http://web.mit.edu/network/pgp.html.

In early 1993, Philip Zimmerman received a visit from US Customs Service Agents wanting to know how PGP found its way overseas without an export license from the State Department. Beginning in Fall 1993, Zimmerman was targeted by a federal grand jury investigation in San Jose, California. The investigation dragged on for more than three years but the case was eventually dropped and no charges were brought because of the difficulty in obtaining injunctive relief outside of the U.S.

While no charges were ultimate brought, Zimmerman's situation was a foreshadowing of the disputes that Philip Karn, Peter Junger and Daniel Bernstein would encounter when they challenged the ability of the U.S. Government to use export licensing regimes to prevent the placing of strong encryption programs on Internet-accessible computers.

VI. The Export Restriction Regulations: Legislation and Litigation

This module now shifts from question and answer mode into a brief chronology and outline of export restrictions on strong encryption in the 1990s.

On November 15, 1996 President Clinton transferred jurisdiction over regulated cryptographic tools from the State Department to the Commerce Department in Executive Order No. 13026. While the Clinton Administration never formally acknowledged that this was a response to the Bernstein II decision in the Northern District of California which was released on October 24, 1996. In the Bernstein II opinion Judge Marilyn Patel found the ITAR regulations an unconstitutional prior restraint as applied to Professor Bernstein's SNUFFLE encryption program. The jurisdiction shift laid the groundwork for the Bernstein III and Bernstein IV opinions. However, in order to understand the current Export Administration Regulations that are administered by the Commerce Department, you must also have an understanding of the pre-1996 regulatory scheme, which played a big part in bringing the controversies over strong encryption to where they are today.

  1. Pre-1996 Regulatory Scheme: International Traffic in Arms Regulations
  2. The Old (Pre-1996) Regulations: Glossary

    AECA: The Arms Export Control Act, which is the Congressional statutory authorization for the International Traffic in Arms Regulations (ITAR)

    BXA: Bureau of Export Administration, the federal agency that controls most exports from the U.S.

    U.S. Department of State: the Cabinet Department that initially held jurisdiction over the export of cryptographic items under the AECA and the ITAR

    ITAR: The International Traffic in Arms Regulations. Regulation that controlled the import, export and manufacture of items on the United States Munitions List (USML). To be found at 22 C.F.R. Sections 120-130.

    ODTC: Office of Defense Trade Controls. Prior to the end of 1996, the OTDC was the federal agency responsible for determining whether an item qualified as a controlled item, and therefore required an export license, under the USML and the ITAR.

    USML: The United States Munitions List was created under the AECA as the list of "defense articles and defense services," which were almost all weapons, explosives, and military vehicles. Items listed on the USML required a munition dealer license before they could be exported or imported. Until the end of 1996, cryptographic items were controlled under Category XIII(b)(1) of the list. To be found at 22 C.F.R. Section 121.1

    * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

    Prior to 1996, the International Traffic In Arms Regulations ("ITAR") and the Arms Export Control Act ("AECA") were used by the U.S. Department of State to regulate the export of products using encryption. ITAR concerned primarily explosives, weapons and military vehicles — and until November 15, 1996, also included "cryptographic (including key management systems, equipment, assemblies modules, integrated circuits, components or software with the capability if maintaining secrecy or confidentiality if information or information systems…" Generally, any product incorporating encryption stronger than a 40-bit algorithm required an export license, which were virtually never granted. Except for financial and banking institutions, licenses for export of products containing 56-bit DES were restricted.

    ITAR allowed the U.S. State department to directly control items listed on the USML as well as including "technical data" about a listed item. Thus, if 40-bit plus strength encryption software was listed on the USML, then "technical data" related to encryption software, i.e., documentation of encrypted algorithms was also subject to export license restrictions. The 40-bit line was drawn so that theoretically, the NSA would be able to crack any encryption shipped abroad. However, in reality during the 1990s, extremely strong cryptography without any "backdoor" escrowed key was widely available from numerous foreign sites in countries such as Finland, making the rationale of preserving the NSA's ability to crack foreign codes was implausible. For example, after a British national posted source code for an virtually identical encryption algorithm on an internet discussion list, U.S. Law Professor Peter Junger was unable to respond by posting his own computer source code in order to be in compliance with ITAR.

    ITAR was created by the ACEA, which was enacted in the early 1990s in response to Congressional concerns over military security threats to U.S. interests in the Persian Gulf region due to an overproliferation of munitions. Willful violation of the ITAR was a criminal offense (with fines of up to $1 million and imprisonment of up to 10 years), to be investigated and

    administered by the President, or the U.S. Department of State, to whom the President delegated his authority to act under the AECA. The State Department (through the Office of Defense Trade Controls, or ODTC) determined which defense "articles and services" were going to be subject to licensing requirements and placed on the United States Munitions List. Ostensibly, the USML was created and administered under the AECA to help advance "world peace and the security and foreign policy of the United States."

    In the event of unclear coverage, a party had the option to submit a "commodity jurisdiction" request to the OTDC, which would then make a determination of whether or not the product is covered by the USML. Once such a designation was made, it was not subject to judicial review, except on constitutional grounds. This is the type of application that Professor Bernstein made in 1992 under the ITAR so that he could post his strong encryption algorithm SNUFFLE on the Internet so his students could access and download it.

    An "export" for purposes of the ITAR included sending outside of the U.S. and "[d]isclosing (including oral or visual disclosure) or transferring of technical data to a foreign person, whether in the U.S. or abroad," and thus could easily include the posting of source or object code on an Internet USENET newsgroup that discussed cryptography such as "sci.crypt."

  3. Post-1996 Regulatory Scheme: Export Administration Regulations

The New (post-1996) Regulations: Glossary

CCL: The Commerce Control List. Analogous to the USML for non-military items. Anyone wishing to export items listed on the CCL must obtain a license from the BXA. To be found at 15 C.F.R. Section 774.

U.S. Department of Commerce: Cabinet agency given jurisdiction over export of cryptographic items after 1996 by the Clinton Administration Jurisdictional Transfer. The Commerce Department is responsible for issuing the EAR amendments.

EAR: Export Administration Regulations. The regulations controlling the import and export of items found on the CCL. Analogous to the ITAR under the old regulations. To be found at 15 C.F.R. Parts 730-774; the underlying statute, the Export Administration Act, is at 50 U.S.C. app. Sections 2401 et seq.

EAR Amendments: New rules promulgated by the Commerce Department to control cryptographic items per President Clinton's Jurisdictional Transfer order. To be found at 61 Fed. Reg. 68,572-587 (1996).

Jurisdictional Transfer: Executive Order 13026, to be found at 61 Fed. Reg. 58,767 (issued November 15, 1996), in which President Clinton transferred jurisdiction over most cryptographic items from the State Department to the Commerce Department.

* * * * * * * * * * * * * * * * * * * * * * * * * *

On November 15, 1996 President Clinton issued Executive Order 13026 transferring jurisdiction over regulated cryptographic tools from the U.S. State Department to the U.S. Commerce Department, citing concern for the increasing use of encryption tools in nonmilitary contexts.

What this meant was that all encryption tools formerly listed on the USML and regulated under ITAR via the AECA were now placed on the Commerce Control List (CCL), created under the Export Administration Act, pursuant to which the Bureau of Export Administration (BXA) issued the interim EAR amendments. Anyone seeking to export an item listed on the CCL needed a prior license from the BXA. In his Executive Order, President Clinton explicitly retained the provisions foreclosing judicial review of licensing and listing determinations. The Executive Order also made clear that, for encryption tools, "export" encompassed making them available over the Internet in any form.

The Commerce Department officially assumed jurisdiction on December 30, 1996, when it issued the rapidly drafted interim EAR amendments. Under the definition of "encryption items," three new subcategories were added to the CCL: (1) encryption commodities, (2) encryption software, and (3) technology containing encryption features.

One distinction between the ITAR and the EAR was that the EAR amendments allowed licensing exemptions, after BXA review, for software already incorporating key escrow and recovery features of the kind embodied in the Clipper Chip. The EAR amendments also allowed export licensing for 56-bit key encryption on the condition that an applicant "makes satisfactory commitments to build and/or market recoverable encryption items and to help build the supporting international infrastructure." The EAR amendments further provided an expedited 15-day review for "mass market software," so long as it incorporated government-approved algorithms and contained key lengths no longer than 40 bits.

Note, however, that the EAR amendments prohibit any person without an export license from "providing technical assistance (including training) to foreign persons with the intent to aid a foreign person in the DEVELOPMENT OR MANUFACTURE OUTSIDE THE UNITED STATES of encryption commodities and software that, if of United States origin, would be controlled." The ITAR did not have this language, which is probably why the criminal investigation of Philip Zimmermann was eventually dropped; had this language been in place at the time, Zimmermann would have been in direct violation when the second update/release of PGP was made over the Internet from a site in New Zealand.

This is the administrative and regulatory backdrop that was in place in the mid-1990s when the crypto trilogy of cases arose.

C. Brief Background on the Bernstein I, II, & III cases.

In 1992, Daniel Bernstein, then a graduate mathematics student at Berkeley, came up with a private-key stream encryption algorithm he named "Snuffle." He prepared the "Snuffle" materials in two formats: first, a paper for publication, and second, computer source code that he wanted to post to the UseNet newsgroup "sci.crypt," a group devoted to discussions of cryptographic techniques. However, Bernstein knew that he ran a risk of violating the ITAR if he made such a posting, and so in June 1992 he submitted a Commodity Jurisdiction Request to the Office of Defense Trade Controls, as required by ITAR. The ODTC responded that it considered both his paper and the source code for "Snuffle" to be defense items as defined by Category XIII of the USML and, as such, subject to licensing by the U.S. State Department. Bernstein corresponded inconclusively with the ODTC about the rationale for this classification for a year.
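
The Ninth Circuit later describes Snuffle as "a zero-delay private-key stream encryptor based upon a one-way hash function." The sketch below is an editorial illustration of that general construction in Python; it is emphatically not Bernstein's Snuffle algorithm, only a toy showing the shape of a hash-driven stream cipher, in which a shared secret key and a counter are fed through a one-way hash to produce a keystream that is XORed with the message.

    # A toy hash-based stream cipher, for illustration only.
    # This is NOT Bernstein's Snuffle algorithm, and it should not be
    # used for real security; it only shows the general idea of a
    # private-key stream encryptor driven by a one-way hash function.
    import hashlib

    def keystream(key: bytes, length: int) -> bytes:
        """Derive `length` pseudorandom bytes from `key` via SHA-256."""
        out = bytearray()
        counter = 0
        while len(out) < length:
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(out[:length])

    def crypt(key: bytes, data: bytes) -> bytes:
        """XOR data with the keystream; the same call encrypts and decrypts."""
        return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

    ciphertext = crypt(b"shared secret key", b"attack at dawn")
    assert crypt(b"shared secret key", ciphertext) == b"attack at dawn"

Because encryption and decryption are the same XOR operation against a key-derived stream, sender and receiver must already share the secret key; that symmetry is what makes such a cipher "private-key" rather than public-key.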

In 1993, Bernstein, attempting to clarify exactly what was and was not classified as a defense item subject to export licensing, submitted five Commodity Jurisdiction Requests to the ODTC, covering: (1) his paper, entitled "The Snuffle Encryption System"; (2) the computer source code for the encryption program, "Snuffle.c"; (3) the computer source code for the decryption program, "Unsnuffle.c"; (4) a set of plain-English instructions on how to use Snuffle; and (5) instructions on how to program a computer to use Snuffle. The ODTC determined that all five items were defense items and therefore subject to export licensing.

1. Bernstein v. U.S. Department of State, 922 F. Supp. 1426 (N.D. Cal. 1996) (Bernstein I)

In September 1993, Bernstein appealed the ODTC's Commodity Jurisdiction determinations within the agency; when he had received no response after more than a year, he sued for declaratory and injunctive relief barring enforcement of the AECA and ITAR against him. Bernstein's claims were limited to constitutional claims, since the AECA foreclosed judicial review of administrative classifications. Bernstein claimed (1) that the ITAR constituted an impermissible content-based speech restriction; (2) that the export restrictions constituted an invalid prior restraint scheme; (3) that the AECA/ITAR scheme was unconstitutionally vague and overbroad; and (4) that the way the ITAR had been administered constituted an "abuse of discretion" under the Administrative Procedure Act.

The State Department moved to dismiss Bernstein's claim, and also sent him a letter stating that his academic paper and the two sets of English-language explanations were not subject to export control; the two pieces of computer source code, however, remained defense articles.

Judge Marilyn Patel of the Northern District of California rejected the government's motion to dismiss and ruled that Bernstein indeed had a colorable constitutional claim. She also rejected the government's argument that computer source code was conduct, not speech, and as such was not protected by the First Amendment. Importantly, Judge Patel found that while source code may be functional (a set of instructions to a computer), it is also a language, and therefore expressive and protected under the First Amendment.

2. Bernstein v. U.S. Department of State, 945 F. Supp. 1279 (N.D. Cal. 1996) (Bernstein II)

Eight months after denying the government's motion to dismiss, Judge Patel issued the court's final judgment. Her opinion focused on whether the AECA/ITAR scheme of export licensing constituted an unconstitutional prior restraint with respect to Bernstein's encryption source code. Applying the three-part test from Freedman v. Maryland, Judge Patel held that the ITAR scheme failed because CJ requests (1) had no clear time limit; (2) were not subject to judicial review; and (3) did not place on the ODTC the censor's burden of justifying a license denial. While Judge Patel did not reach the question of whether the ITAR regulations were content-based or content-neutral, she did find that the ITAR definition of "export" was not impermissibly vague or overbroad.

3. Bernstein v. U.S. Department of State, 974 F. Supp. 1288 (N.D. Cal. 1997) (Bernstein III)

Three weeks after Bernstein II was released, the Clinton Administration shifted jurisdiction over encryption products from the U.S. State Department and the ITAR to the U.S. Commerce Department, with directions to draft interim EAR amendments. This sudden jurisdictional shift had the effect of rendering Bernstein II instantly moot: the source code for Snuffle, no longer enmeshed in the ITAR, was now subject to the EAR amendments. Judge Patel allowed Daniel Bernstein to amend his complaint to add the Commerce Department and challenge the EAR amendments.

On August 25, 1997, almost five years after Daniel Bernstein first began to consider posting "Snuffle" to UseNet, the District Court released Bernstein III.

Judge Patel began the Bernstein III opinion by noting that there were few differences between the ITAR and the EAR: both sets of regulations relied on national security and foreign policy interests to justify or bypass First Amendment considerations, and both acted as unconstitutional prior restraint/licensing schemes aimed at protected First Amendment speech without important procedural safeguards. Citing Lakewood v. Plain Dealer Publishing Co., Judge Patel was concerned with the standardless discretion over national security and foreign policy that the EAR amendments conferred on the BXA. She was also concerned that the EAR amendments' "teaching exception" and their explicit bar on providing technical assistance (including training) to foreign persons could reach activities such as teaching or discussing cryptography in an academic setting, so that "the most common expressive activities of scholars — teaching a class, publishing their ideas, speaking at conferences, or writing to colleagues over the Internet — are subject to a prior restraint . . . when they involve cryptographic source code or computer programs." Additionally, she found the printed matter exception in the EAR amendments (under which source code printed on paper could move without restriction, while the same source code on a computer disk or posted electronically would trigger the export restrictions) "so irrational and administratively unreliable that it may well serve to exacerbate the potential for self-censorship." Judge Patel found this distinction between print and electronic media untenable, particularly after the Supreme Court's opinion in Reno v. ACLU: "Thus, the dramatically different treatment of the same materials depending on the medium by which they are conveyed is not only irrational, it may be impermissible under traditional First Amendment analysis."

The U.S. Department of Justice appealed Bernstein III to the Ninth Circuit; arguments were heard in December 1997, and the Ninth Circuit released the Bernstein IV opinion, which brings us to the Crypto Trilogy.

VII. A Crypto Trilogy: Bernstein, Junger & Karn

Case Cites

9th Circuit: The Bernstein Cases

Bernstein I: Bernstein v. U.S. Department of State, 922 F. Supp. 1426 (N.D. Cal. 1996)(Judge Marilyn Patel refused to dismiss Daniel Bernstein's complaint and held that source code was speech for purposes of a First Amendment challenge to the ITAR) (decided February 1996)

Bernstein II: Bernstein v. U.S. Department of State, 945 F. Supp. 1279 (N.D. Cal. 1996) (Judge Patel declared that the ITAR and AECA were an unconstitutional prior restraint licensing scheme with regard to cryptographic items) (opinion issued on October 24, 1996)

Bernstein III: Bernstein v. U.S. Department of State, 974 F. Supp. 1288 (N.D. Cal. 1997) (Judge Patel reprised her prior ITAR ruling with respect to the CCL and the EAR amendments) (opinion issued on August 25, 1997)

Bernstein IV: Bernstein v. U.S. Department of Justice, 176 F. 3d 1132 (9th Cir. 1999) (opinion issued May 6, 1999) (Judge Betty Fletcher affirmed the District Court's finding that the EAR regulations were facially invalid as a prior restraint on speech; but note Judge T.G. Nelson's strong dissent arguing that encryption source code is not expression but a functional tool)

D.C. Circuit

Karn v. U.S. Department of State, 925 F. Supp. 1 (D.D.C. 1996) (Judge Charles Richey granting the U.S. Government's summary judgment motion on Karn's claims); Karn v. U.S. Department of State, 107 F.3d 923 (D.C. Cir. 1997) (declining to consider the merits or constitutional issues and remanding to the District Court to consider the reviewability of Karn's claims under the Administrative Procedure Act)

Northern District of Ohio

Junger v. Daley, 8 F. Supp. 2d 708 (N.D. Ohio 1998) (U.S. Government's summary judgment motion granted by Judge James Gwin) (note that as of March 10, 1999, Professor Junger had appealed this decision to the Sixth Circuit and the government had responded with a countermotion; the briefs are available at <http://samsara.LAW.CWRU.Edu/comp_law/jvd>)

* * * * * * * * * * * * * * * * * * * * * * * * * * *

A. Questions to consider as you read through the following opinions:

Q: Are attempts to control the spread of strong encryption by the U.S. Government attempts to control "speech" or "conduct"?

Q: What does the idea that in cyberspace the First Amendment is a local ordinance mean for regulating strong cryptography?

Q: What does the idea, floated by some cyberpundits, that the Internet interprets censorship as damage and routes around it likewise mean for regulating strong cryptography?

Q: Is object code (or should it be) considered to be outside First Amendment protection? Source code? (A short illustration of the source code/object code distinction follows these questions.)

Q: Are communication and functionality mutually exclusive?

Q: The EAR amendments make distinctions between printed source code (such as you would find in a book) and source code in electronic media -- should the choice of medium affect the scope of First Amendment protection?

Q: If the problem of geographic containment on the Internet can be solved only by forbidding the posting of strong encryption on any computer connected to the Internet, are "ample alternatives" to the foreclosed UseNet outlets available, and are they really adequate substitutes?

Q: What does the notion of strong cryptography as a form of privacy "self-help" mean when we talk about copyright management schemes that may be used to encrypt uncopyrightable materials, as well as legal regimes that make it a crime to tamper with such copyright management systems?

Q: What metaphors, if any, are helpful (or harmful) when discussing the Internet?
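
On the source code/object code question above, it may help to see the same logic at both levels. The sketch below is an editorial illustration in Python, with Python's compiled bytecode standing in for object code: the function is written in a notation meant for human readers, while the instruction stream the compiler derives from it is meant only for mechanical execution.

    # Editorial illustration: one function as human-readable source code,
    # then as the compiled, machine-oriented instructions derived from it.
    # Python bytecode stands in here for "object code."
    import dis

    def caesar(plaintext: str, shift: int) -> str:
        """A toy cipher: rotate each capital letter `shift` places."""
        return "".join(
            chr((ord(c) - 65 + shift) % 26 + 65) if c.isupper() else c
            for c in plaintext
        )

    print(caesar("ATTACK AT DAWN", 3))  # prints "DWWDFN DW GDZQ"
    dis.dis(caesar)                     # dumps the compiled instruction stream

Both forms specify the identical computation; only the first is written for human readers, and that contrast is the nub of the disagreement between Judge Fletcher's majority opinion and Judge Nelson's dissent in Bernstein IV.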

Cases Reproduced (as of September 1999)

A. Bernstein v. Department of Justice (Bernstein IV) (opinion filed May 6, 1999, 9th Circuit)

B. Junger v. Daley (opinion filed July 2, 1998, Northern District of Ohio)

C. Karn v. U.S. Department of State (opinion filed March 22, 1996, District Ct. for the District of Columbia)

B. The Cases

  1. Bernstein IV (opinion filed May 6, 1999)

    DANIEL J. BERNSTEIN, Plaintiff-Appellee, v. UNITED STATES DEPARTMENT OF JUSTICE, ET AL., 176 F.3d 1132 (9th Cir., May 6, 1999)

    JUDGES: Before: Myron H. Bright (Senior United States Circuit Judge for the Eighth Circuit, sitting by designation), Betty B. Fletcher, and Thomas G. Nelson, Circuit Judges. Opinion by Judge B. Fletcher; Concurrence by Judge Bright; Dissent by Judge T.G. Nelson.

    B. FLETCHER, Circuit Judge:

    The government defendants appeal the grant of summary judgment to the plaintiff, Professor Daniel J. Bernstein ("Bernstein"), enjoining the enforcement of certain Export Administration Regulations ("EAR") that limit Bernstein's ability to distribute encryption software. We find that the EAR regulations (1) operate as a prepublication licensing scheme that burdens scientific expression, (2) vest boundless discretion in government officials, and (3) lack adequate procedural safeguards. Consequently, we hold that the challenged regulations constitute a prior restraint on speech that offends the First Amendment. Although we employ a somewhat narrower rationale than did the district court, its judgment is accordingly affirmed.

    BACKGROUND

    A. Facts and Procedural History

    Bernstein is currently a professor in the Department of Mathematics, Statistics, and Computer Science at the University of Illinois at Chicago. As a doctoral candidate at the University of California, Berkeley, he developed an encryption method - "a zero-delay private-key stream encryptor based upon a one-way hash function" that he dubbed "Snuffle." Bernstein described his method in two ways: in a paper containing analysis and mathematical equations (the "Paper") and in two computer programs written in "C," a high-level computer programming language ("Source Code"). Bernstein later wrote a set of instructions in English (the "Instructions") explaining how to program a computer to encrypt and decrypt data utilizing a one-way hash function, essentially translating verbatim his Source Code into prose form.

    Seeking to present his work on Snuffle within the academic and scientific communities, Bernstein asked the State Department whether he needed a license to publish Snuffle in any of its various forms. The State Department responded that Snuffle was a munition under the International Traffic in Arms Regulations ("ITAR"), and that Bernstein would need a license to "export" the Paper, the Source Code, or the Instructions. There followed a protracted and unproductive series of letter communications between Bernstein and the government, wherein Bernstein unsuccessfully attempted to determine the scope and application of the export regulations to Snuffle.

    Bernstein ultimately filed this action, challenging the constitutionality of the ITAR regulations. The district court found that the Source Code was speech protected by the First Amendment, see Bernstein v. Department of State, 922 F. Supp. 1426 (N.D. Cal. 1996) ("Bernstein I"), and subsequently granted summary judgment to Bernstein on his First Amendment claims, holding the challenged ITAR regulations facially invalid as a prior restraint on speech, see Bernstein v. Department of State, 945 F. Supp. 1279 (N.D. Cal. 1996) ("Bernstein II").

    In December 1996, President Clinton shifted licensing authority for nonmilitary encryption commodities and technologies from the State Department to the Department of Commerce. See Exec. Order No. 13,026, 61 Fed. Reg. 58,767 (1996). The Department of Commerce then promulgated regulations under the EAR to govern the export of encryption technology, regulations administered by the Bureau of Export Administration ("BXA"). See 61 Fed. Reg. 68,572 (1996) (codified at 15 C.F.R. Pts. 730-74). Bernstein subsequently amended his complaint to add the Department of Commerce as a defendant, advancing the same constitutional objections as he had against the State Department. The district court, following the rationale of its earlier Bernstein opinions, once again granted summary judgment in favor of Bernstein, finding the new EAR regulations facially invalid as a prior restraint on speech. See Bernstein v. Department of State, 974 F. Supp. 1288 (N.D. Cal. 1997) ("Bernstein III"). The district court enjoined the Commerce Department from future enforcement of the invalidated provisions, an injunction that has been stayed pending this appeal.

    B. Overview of Cryptography

    Cryptography is the science of secret writing, a science that has roots stretching back hundreds, and perhaps thousands, of years. See generally David Kahn, The Codebreakers (2d ed. 1996). For much of its history, cryptography has been the jealously guarded province of governments and militaries. In the past twenty years, however, the science has blossomed in the civilian sphere, driven on the one hand by dramatic theoretical innovations within the field, and on the other by the needs of modern communication and information technologies. As a result, cryptography has become a dynamic academic discipline within applied mathematics. It is the cryptographer's primary task to find secure methods to encrypt messages, making them unintelligible to all except the intended recipients:

    Encryption basically involves running a readable message known as "plaintext" through a computer program that translates the message according to an equation or algorithm into unreadable "ciphertext." Decryption is the translation back to plaintext when the message is received by someone with an appropriate "key." Bernstein III, 974 F. Supp. at 1292. The applications of encryption, however, are not limited to ensuring secrecy; encryption can also be employed to ensure data integrity, authenticate users, and facilitate nonrepudiation (e.g., linking a specific message to a specific sender). See id.

    It is, of course, encryption's secrecy applications that concern the government. The interception and deciphering of foreign communications has long played an important part in our nation's national security efforts. In the words of a high-ranking State Department official:

    Policies concerning the export control of cryptographic products are based on the fact that the proliferation of such products will make it easier for foreign intelligence targets to deny the United States Government access to information vital to national security interests. Cryptographic products and software have military and intelligence applications. As demonstrated throughout history, encryption has been used to conceal foreign military communications, on the battlefield, aboard ships and submarines, or in other military settings. Encryption is also used to conceal other foreign communications that have foreign policy and national security significance for the United States. For example, encryption can be used to conceal communications of terrorists, drug smugglers, or others intent on taking hostile action against U.S. facilities, personnel, or security interests. Lowell Decl. at 4 (reproduced in Appellant's Excerpts of Record at 97).

    As increasingly sophisticated and secure encryption methods are developed, the government's interest in halting or slowing the proliferation of such methods has grown keen. The EAR regulations at issue in this appeal evidence this interest.

    C. The EAR regulations

    The EAR contain specific regulations to control the export of encryption software, expressly including computer source code. Encryption software is treated differently from other software in a number of significant ways. First, the term "export" is specifically broadened with respect to encryption software to preclude the use of the internet and other global mediums if such publication would allow passive or active access by a foreign national within the United States or anyone outside the United States. 15 C.F.R. § 734.2(b)(9)(B)(ii). Second, the regulations governing the export of nonencryption software provide for several exceptions that are not applicable to encryption software. In addition, although printed materials containing encryption source code are not subject to EAR regulation, the same materials made available on machine-readable media, such as floppy disk or CD-ROM, are covered. 15 C.F.R. § 734.3(b), Note to Paragraphs (b)(2) & (b)(3). The government, moreover, has reserved the right to restrict source code in printed form that may be easily "scanned," thus creating some ambiguity as to whether printed publications are necessarily exempt from licensing. See 61 Fed. Reg. 68,575 (1996).

    If encryption software falls within the ambit of the relevant EAR provisions, the "export" of such software requires a prepublication license. When a prepublication license is requested, the relevant agencies undertake a "case-by-case" analysis to determine if the export is "consistent with U.S. national security and foreign policy interests." 15 C.F.R. § 742.15(b). All applications must be "resolved or referred to the President no later than 90 days" from the date an application is entered into the BXA's electronic license processing system. 15 C.F.R. § 750.4(a). There is no time limit, however, that applies once an application is referred to the President. Although the regulations do provide for an internal administrative appeal procedure, such appeals are governed only by the exhortation that they be completed "within a reasonable time." 15 C.F.R. § 756.2(c)(1). Final administrative decisions are not subject to judicial review. 15 C.F.R. § 756.2(c)(2).

    DISCUSSION

    I. Prior Restraint

    The parties and amici urge a number of theories on us. We limit our attention here, for the most part, to only one: whether the EAR restrictions on the export of encryption software in source code form constitute a prior restraint in violation of the First Amendment. We review de novo the district court's affirmative answer to this question. See Roulette v. Seattle, 97 F.3d 300, 302 (9th Cir. 1996).

    It is axiomatic that "prior restraints on speech and publication are the most serious and least tolerable infringement on First Amendment rights." Nebraska Press Ass'n v. Stuart, 427 U.S. 539, 559, 49 L. Ed. 2d 683, 96 S. Ct. 2791 (1976). Indeed, the Supreme Court has opined that "it is the chief purpose of the [First Amendment] guaranty to prevent previous restraints upon publication." Near v. Minnesota, 283 U.S. 697, 713, 75 L. Ed. 1357, 51 S. Ct. 625 (1931). Accordingly, "any prior restraint on expression comes ... with a 'heavy presumption' against its constitutional validity." Organization for a Better Austin v. Keefe, 402 U.S. 415, 419, 29 L. Ed. 2d 1, 91 S. Ct. 1575 (1971). At the same time, the Supreme Court has cautioned that "the phrase 'prior restraint' is not a self-wielding sword. Nor can it serve as a talismanic test." Kingsley Books, Inc. v. Brown, 354 U.S. 436, 441, 1 L. Ed. 2d 1469, 77 S. Ct. 1325 (1957). We accordingly turn from "the generalization that prior restraint is particularly obnoxious" to a "more particularistic analysis." Id. at 442. The Supreme Court has treated licensing schemes that act as prior restraints on speech with suspicion because such restraints run the twin risks of encouraging self-censorship and concealing illegitimate abuses of censorial power. See Lakewood v. Plain Dealer Publishing Co., 486 U.S. 750, 759, 100 L. Ed. 2d 771, 108 S. Ct. 2138 (1988). As a result, "even if the government may constitutionally impose content-neutral prohibitions on a particular manner of speech, it may not condition that speech on obtaining a license or permit from a government official in that official's boundless discretion." Id. at 764 (emphasis in original). We follow the lead of the Supreme Court and divide the appropriate analysis into two parts. The threshold question is whether Bernstein is entitled to bring a facial challenge against the EAR regulations. See 486 U.S. at 755. If he is so entitled, we proceed to the second question: whether the regulations constitute an impermissible prior restraint on speech. See id. at 769.

    A. Is Bernstein entitled to bring a facial attack?

    A licensing regime is always subject to facial challenge as a prior restraint where it "gives a government official or agency substantial power to discriminate based on the content or viewpoint of speech by suppressing disfavored speech or disliked speakers," and has "a close enough nexus to expression, or to conduct commonly associated with expression, to pose a real and substantial threat of ... censorship risks." Id. at 759.

    The EAR regulations at issue plainly satisfy the first requirement - "the determination of who may speak and who may not is left to the unbridled discretion of a government official." 486 U.S. at 763. BXA administrators are empowered to deny licenses whenever export might be inconsistent with "U.S. national security and foreign policy interests." 15 C.F.R. § 742.15(b). No more specific guidance is provided. Obviously, this constraint on official discretion is little better than no constraint at all. See Lakewood, 486 U.S. at 769-70 (a standard requiring that license denial be in the "public interest" is an "illusory" standard that "renders the guarantee against censorship little more than a high-sounding ideal."). The government's assurances that BXA administrators will not, in fact, discriminate on the basis of content are beside the point. See id. at 770 (presumption that official will act in good faith "is the very presumption that the doctrine forbidding unbridled discretion disallows."). After all, "the mere existence of the licensor's unfettered discretion, coupled with the power of prior restraint, intimidates parties into censoring their own speech, even if the discretion and power are never actually abused." Id. at 757.

    The more difficult issue arises in relation to the second requirement - that the challenged regulations exhibit "a close enough nexus to expression." We are called on to determine whether encryption source code is expression for First Amendment purposes.

    We begin by explaining what source code is. "Source code," at least as currently understood by computer programmers, refers to the text of a program written in a "high-level" programming language, such as "PASCAL" or "C." The distinguishing feature of source code is that it is meant to be read and understood by humans and that it can be used to express an idea or a method. A computer, in fact, can make no direct use of source code until it has been translated ("compiled") into a "low-level" or "machine" language, resulting in computer-executable "object code." That source code is meant for human eyes and understanding, however, does not mean that an untutored layperson can understand it. Because source code is destined for the maw of an automated, ruthlessly literal translator - the compiler - a programmer must follow stringent grammatical, syntactical, formatting, and punctuation conventions. As a result, only those trained in programming can easily understand source code.

    Also important for our purposes is an understanding of how source code is used in the field of cryptography. Bernstein has submitted numerous declarations from cryptographers and computer programmers explaining that cryptographic ideas and algorithms are conveniently expressed in source code. That this should be so is, on reflection, not surprising. As noted earlier, the chief task for cryptographers is the development of secure methods of encryption. While the articulation of such a system in layman's English or in general mathematical terms may be useful, the devil is, at least for cryptographers, often in the algorithmic details. By utilizing source code, a cryptographer can express algorithmic ideas with precision and methodological rigor that is otherwise difficult to achieve. This has the added benefit of facilitating peer review - by compiling the source code, a cryptographer can create a working model subject to rigorous security tests. The need for precisely articulated hypotheses and formal empirical testing, of course, is not unique to the science of cryptography; it appears, however, that in this field, source code is the preferred means to these ends.

    Thus, cryptographers use source code to express their scientific ideas in much the same way that mathematicians use equations or economists use graphs. Of course, both mathematical equations and graphs are used in other fields for many purposes, not all of which are expressive. But mathematicians and economists have adopted these modes of expression in order to facilitate the precise and rigorous expression of complex scientific ideas. Similarly, the undisputed record here makes it clear that cryptographers utilize source code in the same fashion.

    In light of these considerations, we conclude that encryption software, in its source code form and as employed by those in the field of cryptography, must be viewed as expressive for First Amendment purposes, and thus is entitled to the protections of the prior restraint doctrine. If the government required that mathematicians obtain a prepublication license prior to publishing material that included mathematical equations, we have no doubt that such a regime would be subject to scrutiny as a prior restraint. The availability of alternate means of expression, moreover, does not diminish the censorial power of such a restraint - that Adam Smith wrote Wealth of Nations without resorting to equations or graphs surely would not justify governmental prepublication review of economics literature that contain these modes of expression.

    The government, in fact, does not seriously dispute that source code is used by cryptographers for expressive purposes. Rather, the government maintains that source code is different from other forms of expression (such as blueprints, recipes, and "how-to" manuals) because it can be used to control directly the operation of a computer without conveying information to the user. In the government's view, by targeting this unique functional aspect of source code, rather than the content of the ideas that may be expressed therein, the export regulations manage to skirt entirely the concerns of the First Amendment. This argument is flawed for at least two reasons.

    First, it is not at all obvious that the government's view reflects a proper understanding of source code. As noted earlier, the distinguishing feature of source code is that it is meant to be read and understood by humans, and that it cannot be used to control directly the functioning of a computer. While source code, when properly prepared, can be easily compiled into object code by a user, ignoring the distinction between source and object code obscures the important fact that source code is not meant solely for the computer, but is rather written in a language intended also for human analysis and understanding.

    Second, and more importantly, the government's argument, distilled to its essence, suggests that even one drop of "direct functionality" overwhelms any constitutional protections that expression might otherwise enjoy. This cannot be so. The distinction urged on us by the government would prove too much in this era of rapidly evolving computer capabilities. The fact that computers will soon be able to respond directly to spoken commands, for example, should not confer on the government the unfettered power to impose prior restraints on speech in an effort to control its "functional" aspects. The First Amendment is concerned with expression, and we reject the notion that the admixture of functionality necessarily puts expression beyond the protections of the Constitution.

    The government also contends that the challenged regulations are immune from prior restraint analysis because they are "laws of general application" rather than being "directed narrowly and specifically at expression." Lakewood, 486 U.S. at 760-61. We cannot agree. Because we conclude that source code is utilized by those in the cryptography field as a means of expression, and because the regulations apply to encryption source code, it necessarily follows that the regulations burden a particular form of expression directly.

    The Supreme Court in Lakewood explored what it means to be a "law of general application" for prior restraint purposes. In that case, the Court cited a law requiring building permits as a "law of general application" that would not be subject to a facial attack as a prior restraint, reasoning that such a law carried "little danger of censorship," even if it could be used to retaliate against a disfavored newspaper seeking to build a printing plant. Id. at 761. In the Court's view, "such laws provide too blunt a censorship instrument to warrant judicial intervention prior to an allegation of actual misuse." Id. Unlike a building permit ordinance, which would afford government officials only intermittent and unpredictable opportunities to exercise unrestrained discretion over expression, the challenged EAR regulations explicitly apply to expression and place scientific expression under the censor's eye on a regular basis. In fact, there is ample evidence in the record establishing that some in the cryptography field have already begun censoring themselves, for fear that their statements might influence the disposition of future licensing applications. See, e.g., National Research Council, Cryptography's Role in Securing the Information Society 158 (1996) ("Vendors contended that since they are effectively at the mercy of the export control regulators, they have considerable incentive to suppress any public expression of dissatisfaction with the current process."). In these circumstances, we cannot conclude that the export control regime at issue is a "law of general application" immune from prior restraint analysis.

    Because the prepublication licensing scheme challenged here vests unbridled discretion in government officials, and because it directly jeopardizes scientific expression, we are satisfied that Bernstein may properly bring a facial challenge against the regulations. We accordingly turn to the merits.

    B. Are the regulations an impermissible prior restraint?

    "The protection even as to previous restraint is not absolutely unlimited." Near, 283 U.S. at 716. The Supreme Court has suggested that the "heavy presumption" against prior restraints may be overcome where official discretion is bounded by stringent procedural safeguards. See FW/PBS, 493 U.S. at 227 (plurality opinion of O'Connor, J.); Freedman v. Maryland, 380 U.S. 51, 58-59, 13 L. Ed. 2d 649, 85 S. Ct. 734 (1965); Kingsley Books, 354 U.S. at 442-43; 11126 Baltimore Blvd. v. Prince George's County, 58 F.3d 988, 995 (4th Cir. 1995) (en banc). As our analysis above suggests, the challenged regulations do not qualify for this First Amendment safe harbor. In Freedman v. Maryland, the Supreme Court set out three factors for determining the validity of licensing schemes that impose a prior restraint on speech: (1) any restraint must be for a specified brief period of time; (2) there must be expeditious judicial review; and (3) the censor must bear the burden of going to court to suppress the speech in question and must bear the burden of proof. See 380 U.S. at 58-60. The district court found that the procedural protections provided by the EAR regulations are "woefully inadequate" when measured against these requirements. Bernstein III, 974 F. Supp. at 1308. We agree.

    Although the regulations require that license applications be resolved or referred to the President within 90 days, see 15 C.F.R. § 750.4(a), there is no time limit once an application is referred to the President. Thus, the 90-day limit can be rendered meaningless by referral. Moreover, if the license application is denied, no firm time limit governs the internal appeals process. See 15 C.F.R. § 756.2(c)(1) (Under Secretary "shall decide an appeal within a reasonable time after receipt of the appeal."). Accordingly, the EAR regulations do not satisfy the first Freedman requirement that a licensing decision be made within a reasonably short, specified period of time. See FW/PBS, 493 U.S. at 226 (finding that "a prior restraint that fails to place time limits on the time within which the decisionmaker must issue the license is impermissible"); Riley v. National Fed. of the Blind, 487 U.S. 781, 802, 101 L. Ed. 2d 669, 108 S. Ct. 2667 (1988) (licensing scheme that permits "delay without limit" is impermissible); Vance v. Universal Amusement Co., 445 U.S. 308, 315-17, 63 L. Ed. 2d 413, 100 S. Ct. 1156 (1980) (prior restraint of indefinite duration is impermissible). The EAR regulatory regime further offends Freedman's procedural requirements insofar as it denies a disappointed applicant the opportunity for judicial review. See 15 C.F.R. § 756.2(c)(2); FW/PBS, 493 U.S. at 229 (plurality opinion of O'Connor, J.) (finding failure to provide "prompt" judicial review violates Freedman); Freedman, 380 U.S. at 59 (licensing procedure must assure a prompt final judicial decision).

    We conclude that the challenged regulations allow the government to restrain speech indefinitely with no clear criteria for review. As a result, Bernstein and other scientists have been effectively chilled from engaging in valuable scientific expression. Bernstein's experience itself demonstrates the enormous uncertainty that exists over the scope of the regulations and the potential for the chilling of scientific expression. In short, because the challenged regulations grant boundless discretion to government officials, and because they lack the required procedural protections set forth in Freedman, we find that they operate as an unconstitutional prior restraint on speech. See Lakewood, 486 U.S. at 769-772 (holding that newsrack licensing ordinance was an impermissible prior restraint because it conferred unbounded discretion and lacked adequate procedural safeguards).

    C. Concluding comments.

    We emphasize the narrowness of our First Amendment holding. We do not hold that all software is expressive. Much of it surely is not. Nor need we resolve whether the challenged regulations constitute content-based restrictions, subject to the strictest constitutional scrutiny, or whether they are, instead, content-neutral restrictions meriting less exacting scrutiny. We hold merely that because the prepublication licensing regime challenged here applies directly to scientific expression, vests boundless discretion in government officials, and lacks adequate procedural safeguards, it constitutes an impermissible prior restraint on speech.

    We will, however, comment on two issues that are entwined with the underlying merits of Bernstein's constitutional claims. First, we note that insofar as the EAR regulations on encryption software were intended to slow the spread of secure encryption methods to foreign nations, the government is intentionally retarding the progress of the flourishing science of cryptography. To the extent the government's efforts are aimed at interdicting the flow of scientific ideas (whether expressed in source code or otherwise), as distinguished from encryption products, these efforts would appear to strike deep into the heartland of the First Amendment. In this regard, the EAR regulations are very different from content-neutral time, place and manner restrictions that may have an incidental effect on expression while aiming at secondary effects.

    Second, we note that the government's efforts to regulate and control the spread of knowledge relating to encryption may implicate more than the First Amendment rights of cryptographers. In this increasingly electronic age, we are all required in our everyday lives to rely on modern technology to communicate with one another. This reliance on electronic communication, however, has brought with it a dramatic diminution in our ability to communicate privately. Cellular phones are subject to monitoring, email is easily intercepted, and transactions over the internet are often less than secure. Something as commonplace as furnishing our credit card number, social security number, or bank account number puts each of us at risk. Moreover, when we employ electronic methods of communication, we often leave electronic "fingerprints" behind, fingerprints that can be traced back to us. Whether we are surveilled by our government, by criminals, or by our neighbors, it is fair to say that never has our ability to shield our affairs from prying eyes been at such a low ebb. The availability and use of secure encryption may offer an opportunity to reclaim some portion of the privacy we have lost. Government efforts to control encryption thus may well implicate not only the First Amendment rights of cryptographers intent on pushing the boundaries of their science, but also the constitutional rights of each of us as potential recipients of encryption's bounty. Viewed from this perspective, the government's efforts to retard progress in cryptography may implicate the Fourth Amendment, as well as the right to speak anonymously, see McIntyre v. Ohio Elections Comm'n, 514 U.S. 334, 115 S. Ct. 1511, 1524, 131 L. Ed. 2d 426 (1995), the right against compelled speech, see Wooley v. Maynard, 430 U.S. 705, 714, 51 L. Ed. 2d 752, 97 S. Ct. 1428 (1977), and the right to informational privacy, see Whalen v. Roe, 429 U.S. 589, 599-600, 51 L. Ed. 2d 64, 97 S. Ct. 869 (1977). While we leave for another day the resolution of these difficult issues, it is important to point out that Bernstein's is a suit not merely concerning a small group of scientists laboring in an esoteric field, but also touches on the public interest broadly defined.

    II. Scope of Declaratory Relief

    The government also challenges the scope of the declaratory relief granted by the district court. The government argues that the relief provided is invalid in two respects: (1) that the relief extends to encryption object code and encryption commodities; and (2) that the relief extends to encryption technology. The district court held that

    the Export Administration Regulations, 15 C.F.R. pt. 730 et seq. (1997) and all rules, policies and practices promulgated or pursued thereunder insofar as they apply to or require licensing for encryption and decryption software and related devices and technology are in violation of the First Amendment on the grounds of prior restraint and are, therefore, unconstitutional as discussed above, and shall not be applied to plaintiff's publishing of such items, including scientific papers, algorithms or computer programs. Bernstein III, 974 F. Supp. at 1310.

    We review the district court's grant of declaratory relief de novo. See Crawford v. Lungren, 96 F.3d 380, 384 (9th Cir. 1996); Ablang v. Reno, 52 F.3d 801, 803 (9th Cir. 1995).

    This inquiry leads us into the uncertain jurisprudence of "severability." See generally John Copeland Nagle, Severability, 72 N.C. L. Rev. 203 (1993). The general principle is clear: "[A] court should refrain from invalidating more of [a] statute than is necessary .... 'Whenever an act of Congress contains unobjectionable provisions separable from those found to be unconstitutional, it is the duty of this court to so declare, and to maintain the act in so far as it is valid.'" Alaska Airlines, Inc. v. Brock, 480 U.S. 678, 684, 94 L. Ed. 2d 661, 107 S. Ct. 1476 (1987) (quoting Regan v. Time, Inc., 468 U.S. 641, 652, 82 L. Ed. 2d 487, 104 S. Ct. 3262 (1984)); see also National Collegiate Athletic Ass'n v. Miller, 10 F.3d 633, 640 (9th Cir. 1993). The applicable legal standard has also been oft repeated: "unless it is evident that the Legislature would not have enacted those provisions which are within its power, independently of that which is not, the invalid part may be dropped if what is left is fully operative as a law." Buckley v. Valeo, 424 U.S. 1, 108, 46 L. Ed. 2d 659, 96 S. Ct. 612 (1976) (per curiam); accord NCAA v. Miller, 10 F.3d at 640. Thus, in the general case, severability analysis properly focuses on legislative intent. See Alaska Airlines, Inc., 480 U.S. at 685.

    This case, however, is not the general case. First, the challenged enactment here is a regulation, rather than a statute. As a result, we cannot look to the usual public sources to determine the intentions of the drafters. Nevertheless, we agree with the government that the EAR regulations can be conceptually severed into component parts governing commodities, software, and technology. We also assume that the Department of Commerce, even if barred from imposing prepublication licensing on encryption source code, would have enacted regulations controlling the export of encryption commodities, object code, and technology.

    But while the district court may have erred in treating software and commodities as the same item, the integrated structure of the regulations does not permit us to sever the various provisions in the manner requested by the government. To sever the unconstitutional portion of the regulations, we would have to line edit individual sections, deleting or modifying the definition of "software" while retaining "commodities" and "technology." We would then have to redefine general terms such as "items" which refer collectively to commodities, software, and technology. We have neither the power nor the capacity to engage in line by line revisions of the challenged regulations or to redefine terms within the regulations. See Hill v. Wallace, 259 U.S. 44, 70-71, 66 L. Ed. 822, 42 S. Ct. 453 (1922); American Booksellers Ass'n v. Hudnut, 771 F.2d 323, 332-33 (7th Cir. 1985). To do so would be to improperly invade the province reserved to the Executive. Accordingly, we affirm the district court's grant of declaratory relief.

    CONCLUSION

    Because the prepublication licensing regime challenged by Bernstein applies directly to scientific expression, vests boundless discretion in government officials, and lacks adequate procedural safeguards, we hold that it constitutes an impermissible prior restraint on speech. We decline the invitation to line edit the regulations in an attempt to rescue them from constitutional infirmity, and thus endorse the declaratory relief granted by the district court.

    AFFIRMED.

    BRIGHT, Circuit Judge, separately concurring.

    I join Judge Fletcher's opinion. I do so because the speech aspects of encryption source code represent communication between computer programmers. I do, however, recognize the validity of Judge Nelson's view that encryption source code also has the functional purpose of controlling computers and in that regard does not command protection under the First Amendment. The importance of this case suggests that it may be appropriate for review by the United States Supreme Court.

    T.G. NELSON, Circuit Judge, Dissenting:

    Bernstein was not entitled to bring a facial First Amendment challenge to the EAR, and the district court improperly granted an injunction on the basis of a facial challenge. I therefore respectfully dissent.

    The basic error which sets the majority and the district court adrift is the failure to fully recognize that the basic function of encryption source code is to act as a method of controlling computers. As defined in the EAR regulations, encryption source code is "[a] precise set of operating instructions to a computer, that when compiled, allows for the execution of an encryption function on a computer." 15 C.F.R. pt. 722. Software engineers generally do not create software in object code - the series of binary digits (1's and 0's) - which tells a computer what to do because it would be enormously difficult, cumbersome and time-consuming. Instead, software engineers use high-level computer programming languages such as "C" or "Basic" to create source code as a shorthand method for telling the computer to perform a desired function. In this respect, lines of source code are the building blocks or the tools used to create an encryption machine. See e.g., Patrick Ian Ross, Bernstein v. United States Department of State, 13 Berkeley Tech. L.J. 405, 410-11 (1998) ("Electronic source code that is ready to compile merely needs a few keystrokes to generate object code - the equivalent of flipping an 'on' switch. Code used for this purpose can fairly easily be characterized as 'essentially functional.'"); Pamela Samuelson et al., A Manifesto Concerning Legal Protection of Computer Programs, 94 Colum. L. Rev. 2308, 2315-30 (1994) ("Programs are, in fact, machines (entities that bring about useful results, i.e., behavior) that have been constructed in the medium of text (source code and object code)."). Encryption source code, once compiled, works to make computer communication and transactions secret; it creates a lockbox of sorts around a message that can only be unlocked by someone with a key. It is the function or task that encryption source code performs which creates its value in most cases. This functional aspect of encryption source code contains no expression; it is merely the tool used to build the encryption machine.

    This is not to say that this very same source code is not used expressively in some cases. Academics, such as Bernstein, seek to convey and discuss their ideas concerning computer encryption. As noted by the majority, Bernstein must actually use his source code textually in order to discuss or teach cryptology. In such circumstances, source code serves to express Bernstein's scientific methods and ideas.

    While it is conceptually difficult to categorize encryption source code under our First Amendment framework, I am still inevitably led to conclude that encryption source code is more like conduct than speech. Encryption source code is a building tool. Academics and computer programmers can convey this source code to each other in order to reveal the encryption machine they have built. But, the ultimate purpose of encryption code is, as its name suggests, to perform the function of encrypting messages. Thus, while encryption source code may occasionally be used in an expressive manner, it is inherently a functional device.

    We are not the first to examine the nature of encryption source code in terms of First Amendment protection. Judge Gwin of the United States District Court for the Northern District of Ohio also explored the function versus expression conundrum of encryption source code at some length in Junger v. Daley, 8 F. Supp. 2d 708 (N.D. Ohio 1998). Junger, like Bernstein, is a professor, albeit a law professor, who wished to publish in various forms his work on computers, including a textbook, Computers and the Law. The book was determined by the Government to be subject to export without a license, but his software programs were determined to come within the licensing provisions of the EAR. In the course of rejecting Junger's claims, the court said:

    Like much computer software, encryption source code is inherently functional; it is designed to enable a computer to do a designated task. Encryption source code does not merely explain a cryptographic theory or describe how the software functions. More than describing encryption, the software carries out the function of encryption. The software is essential to carry out the function of encryption. In doing this function, the encryption software is indistinguishable from dedicated computer hardware that does encryption.

    In the overwhelming majority of circumstances, encryption source code is exported to transfer functions, not to communicate ideas. In exporting functioning capability, encryption source code is like other encryption devices. For the broad majority of persons receiving such source code, the value comes from the function the source code does. Id. at 716.

    The Junger decision thus adds considerable support for the propositions that encryption source code cannot be categorized as pure speech and that the functional aspects of encryption source code cannot be easily ignored or put aside.

    Both the district court and the majority hold that because source code can be used expressively in some circumstances, Bernstein was entitled to bring a facial challenge to the EAR. Such an approach ignores the basic tenet that facial challenges are inappropriate "unless, at a minimum, the challenged statute 'is directed narrowly and specifically at expression or conduct commonly associated with expression.'" Roulette v. City of Seattle, 97 F.3d 300, 305 (9th Cir. 1996) (quoting City of Lakewood v. Plain Dealer Publishing Co., 486 U.S. 750, 760, 100 L. Ed. 2d 771, 108 S. Ct. 2138 (1988)). That encryption source code may on occasion be used expressively does not mean that its export is "conduct commonly associated with expression" or that the EAR regulations are directed at expressive conduct. See 97 F.3d at 303 ("The fact that sitting can possibly be expressive, however, isn't enough to sustain plaintiffs' facial challenge."); see also Junger, 8 F. Supp. 2d at 718 ("The prior restraint doctrine is not implicated simply because an activity may on occasion be expressive.").

    The activity or conduct at issue here is the export of encryption source code. As I noted above, the basic nature of encryption source code lies in its functional capacity as a method to build an encryption device. Export of encryption source code is not conduct commonly associated with expression. Rather, it is conduct that is normally associated with providing other persons with the means to make their computer messages secret. The overwhelming majority of people do not want to talk about the source code and are not interested in any recondite message that may be contained in encryption source code. Only a few people can actually understand what a line of source code would direct a computer to do. Most people simply want to use the encryption source code to protect their computer communications. Export of encryption source code simply does not fall within the bounds of conduct commonly associated with expression such as picketing or handbilling. See Roulette, 97 F.3d at 303-04.

    Further, the EAR regulates the export of encryption technology generally, whether it is software or hardware. See 15 C.F.R. § 742.15; Junger, 8 F. Supp. 2d at 718 ("The Export Regulations do not single out encryption software."). These regulations are directed at preventing the functional capacity of any encryption device, including its source code, from being exported without a government license. The EAR is not specifically directed towards stifling the expressive nature of source code or Bernstein's academic discussions about cryptography. This is demonstrated by the fact that the regulations do not object to publication in printed form of learned articles containing source code. See 15 C.F.R. § 734.3. Thus, the EAR is generally directed at non-expressive conduct - the export of source code as a tool to make messages secret and impervious to government eavesdropping capabilities.

    Because this is a law of general application focused at conduct, Bernstein is not entitled to bring a facial challenge. The district court's injunction based upon the finding of a facial prior restraint is thus impermissible. This is not to say that Bernstein's activities would not be entitled to First Amendment protection, but that the legal path chosen to get that protection must be the correct one. We should be careful to "entertain[ ] facial freedom-of-expression challenges only against statutes that, 'by their terms,' sought to regulate 'spoken words,' or patently 'expressive or communicative conduct.'" Roulette, 97 F.3d at 303 (citing Broadrick v. Oklahoma, 413 U.S. 601, 612-13, 37 L. Ed. 2d 830, 93 S.Ct. 2908 (1973)). Bernstein may very well have a claim under an as-applied First Amendment analysis; however, such a claim must be left to the district court's determination in the first instance. Here, the district court did not rule on Bernstein's as-applied claims. I would therefore vacate the district court's injunction and remand for consideration of Bernstein's as-applied challenges to the EAR. Accordingly, I respectfully dissent.

  2. Junger v. Daley (opinion filed July 2, 1998)

PETER JUNGER, Plaintiff, v. WILLIAM M. DALEY, United States Secretary of Commerce, et al., Defendants, 8 F. Supp. 2d 708 (N.D. Ohio July 2, 1998)

James S. Gwin, J.

In October and November 1997, Plaintiff Peter Junger ("Junger") and Defendants United States Secretary of Commerce, et al. ("the government") filed cross-motions for summary judgment in this First Amendment case [Doc. 58, 62]. In his motion for judgment, Plaintiff Junger seeks injunctive and declaratory relief from the government's enforcement of export controls on encryption software. In support of his motion for injunctive relief, Junger claims the Export Administration Regulations ("Export Regulations"), 15 C.F.R. pt. 730 et seq., violate rights protected by the First Amendment.

The government denies that the Export Regulations implicate First Amendment rights. The government says its licensing requirement seeks only to restrict the distribution of encryption software itself, not ideas on encryption. Stated otherwise, the government says it seeks to control only the engine for encrypting data. The government says it controls the distribution of sophisticated encryption software for valid national security purposes.

For the reasons that follow, the Court denies Plaintiff Junger's motion for summary judgment, and grants the government's motion for summary judgment.

I. Background

A. Description of claims made

Plaintiff Junger claims the Export Regulations violate rights protected by the First Amendment. In Count One of his five-count complaint, Plaintiff Junger says licensing requirements for exporting encryption software work a prior restraint, violating the First Amendment's free speech clause. In Count Two, Junger argues that the Export Regulations are unconstitutionally overbroad and vague. In Count Three, he argues that the Export Regulations engage in unconstitutional content discrimination by subjecting certain types of encryption software to more stringent export regulations than other items. In Count Four, Junger claims that the Export Regulations restrict his ability to exchange software, thereby infringing his First Amendment rights to academic freedom and freedom of association. In Count Five, Junger alleges that executive regulation of encryption software under the International Emergency Economic Powers Act, 50 U.S.C. § 1701 et seq., is a violation of the separation of powers doctrine.

In addressing these claims, the Court decides whether encryption source code is sufficiently expressive to merit heightened First Amendment protection. The Court then examines whether the Export Regulations are a prior restraint on speech subject to greater First Amendment scrutiny. If the regulatory scheme does not warrant increased scrutiny, the Court decides if the scheme survives intermediate scrutiny.

The Court finds that the Export Regulations are constitutional because encryption source code is inherently functional, because the Export Regulations are not directed at source code's expressive elements, and because the Export Regulations do not reach academic discussions of software, or software in print form. For these reasons, the Court grants the government's motion for summary judgment and denies Junger's motion for summary judgment.

B. Cryptography

Once almost the exclusive province of military and governmental bodies, cryptography is now increasingly available to businesses and private individuals wishing to keep their communications confidential. See Bernstein v. United States Dep't of State, 974 F. Supp. 1288, 1292 (N.D. Cal. 1997) ("Bernstein III"). To keep their communications confidential, users encrypt and decrypt n2 communications, records and other data. Through encryption, users seek to prevent the unauthorized interception, viewing, tampering, and forging of such data. Without encryption, information sent by a computer is unsecured. Without encryption those other than the intended recipient may view sensitive information.

Encryption has been used for decades, although the methods of encryption have changed. Until the end of World War II, encryption was commonly done by mechanical devices, such as Nazi Germany's Enigma machines. Today, computers and electronic devices have largely replaced mechanical encryption. With electronic devices, encryption can be done with dedicated hardware (such as a telephone scrambler's electronic circuitry) or with computer software. Encryption software carries out a cryptographic "algorithm," which is a set of instructions that directs computer hardware to encrypt plaintext into an encoded ciphertext. Mathematical functions or equations usually make up the instructions.

Like all software, encryption programs can take two general forms: source code and object code. Source code is a series of instructions to a computer in a programming language such as BASIC, PERL, or FORTRAN. Object code is the same set of instructions translated into binary digits (1's and 0's). While source code is not directly executable by a computer, the computer can easily convert it into executable object code with "compiler" or "interpreter" software. Thus, source code and object code are essentially interchangeable.
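
To make the functional-versus-expressive distinction concrete, here is a minimal sketch of our own for this module, written in Python rather than the languages the court names, and not any program at issue in the litigation: the same few lines that a programmer can read as an explanation of a deliberately weak cipher will, the moment an interpreter executes them, actually perform encryption, which is the "functional capacity" the Export Regulations target.

    # A toy repeating-key XOR cipher, written solely to illustrate the point
    # above: read as text, it describes a cipher; run by a computer, it is one.
    # It is a hypothetical classroom example, far too weak for real secrecy.
    def xor_cipher(data: bytes, key: bytes) -> bytes:
        # XOR each byte with the key; applying the same call twice restores the input.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    ciphertext = xor_cipher(b"attack at dawn", b"secret")
    assert xor_cipher(ciphertext, b"secret") == b"attack at dawn"

Printed on a page, the function is an exposition of an idea; executed, it is the encryption "device" the courts describe.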

C. Regulatory background

On November 15, 1996, President Clinton issued Executive Order 13026. With that order, he transferred jurisdiction over export controls on nonmilitary encryption products and related technology from the State Department to the Commerce Department. The order specified that encryption products formerly designated as defense articles on the United States Munitions List would thereafter be subject to Commerce Department regulations (the "Export Regulations"). In his order, the President found that "the export of encryption software, like the export of other encryption products described in this section, must be controlled because of such software's functional capacity, rather than because of any possible informational value of such software. ..." Exec. Order No. 13026, 1996 WL 666563. The Export Regulations remain in effect.

The Export Regulations control the "export" of certain software. The Export Regulations define "export" of controlled encryption source code and object code software as "downloading, or causing the downloading of, such software to locations ... outside the United States ... unless the person making the software available takes precautions adequate to prevent unauthorized transfer of such code outside the United States." 15 C.F.R. § 734.2(b)(9).

The Export Regulations forbid the transfer of certain encryption software outside the United States. Unless very difficult precautions are taken, posting software on the Internet is an export. See 15 C.F.R. § 734.2(b)(9)(ii)(B). However, it is nearly impossible for most Internet users to carry out or verify the precautions. n6 Because of the difficulty of the precautions, almost any posting of software on the Internet is an export.

The Export Regulations set up procedures to obtain approval for exporting items on the Control List. To export any item listed on the Commerce Control List, one must first submit a commodity classification request to the Bureau of Export Administration. See 15 C.F.R. Pts. 740-44. All items on the Commerce Control List are given an Export Control Classification Number, and Bureau of Export Administration regulations specify three categories of controlled Encryption Items.

Export Classification Number 5A002 covers encryption commodities (such as circuitry and hardware products), Export Classification Number 5D002 covers encryption software, n7 and Export Classification Number 5E002 covers encryption technology. See 15 C.F.R. § 774 supp. I. Although the Export Administration Act defines "technology" to include software, 50 U.S.C. App. § 2415(4), Bureau of Export Administration regulations treat encryption software the same as encryption commodities. 15 C.F.R. Part 774, Note following 5D002.

For software falling under Export Classification Numbers 5A002, 5D002 and 5E002, the Export Regulations require licenses for export to all destinations except Canada. See 15 C.F.R. § 742.15(a). As later described, Plaintiff Junger's application involves software classified under Classification Number 5D002. As to this classification number, licensing is required except for encryption source code in a book or other printed material. See 15 C.F.R. § 734.3, Notes to Paragraphs (b)(2) and (b)(3). Encryption source code in printed form is not subject to the Export Regulations and, thus, is outside the scope of the licensing requirement.

D. Junger's commodity classification requests

Plaintiff Junger is a law professor. He teaches a course titled "Computers and the Law" at Case Western Reserve University Law School in Cleveland, Ohio. Junger maintains sites on the World Wide Web that include information about courses that he teaches, including a computers and law course. His web sites also set out documents involved with this litigation. n8 Plaintiff Junger uses his web site to describe the process of this litigation through press releases and filed materials. n9 Besides descriptions of this lawsuit, the web site has information from Junger's courses and other topics of interest to him.

Plaintiff Junger wishes to post to his web site various encryption programs that he has written to show how computers work. Such a posting is an export under the Export Regulations. See 15 C.F.R. § 734.2(b)(9).

On June 12, 1997, Plaintiff Junger submitted three applications to the Commerce Department requesting commodity classification determinations for encryption software programs and other items. With these applications, Plaintiff Junger sought a Commerce Department determination whether the Export Regulations restricted the materials from export. On July 4, 1997, the Bureau of Export Administration told Junger that Export Classification Number 5D002 covered four of the five software programs he had submitted, and that those programs were therefore subject to the Export Regulations. Although it found that the four programs were subject to the Export Regulations, the Commerce Department found that the first chapter of Junger's textbook, Computers and the Law, could be exported without a license. While deciding that the printed book chapter containing encryption code could be exported, the Commerce Department said that export of a software program itself would need a license. Since receiving the classification determination, Junger has not applied for a license to export his classified encryption software.

II. Legal standards

In reviewing the parties' motions for summary judgment, the Court first examines the standard for judgment under Fed. R. Civ. P. 56. After a brief review of the standard for summary judgment, the Court examines the First Amendment protection afforded computer software. In deciding what level of protection is afforded and what level of scrutiny should be applied, the Court looks to whether the software is expressive or functional. Then, the Court considers whether the Commerce Department licensing scheme is a prior restraint of speech, whether Plaintiff Junger has standing to claim the Export Regulations are overbroad or vague, and then whether the Export Regulations are content-based discrimination.

A. Summary Judgment

Pursuant to Fed. R. Civ. P. 56, summary judgment will be granted if the evidence presented in the record shows that there is no genuine issue as to any material fact and that the moving party is entitled to judgment as a matter of law. In assessing the merits of the motion, the Court shall draw all justifiable inferences from the evidence presented in the record in the light most favorable to the non-moving party. Woythal v. Tex-Tenn Corp., 112 F.3d 243, 245 (6th Cir.), cert. denied, 139 L. Ed. 2d 317, 118 S. Ct. 414 (1997). However, an opponent to a motion for summary judgment may not rest upon the mere allegations or denials of his pleadings, but must set forth through competent and material evidence specific facts showing that there is a genuine issue for trial. "The mere existence of some alleged factual dispute between the parties will not defeat an otherwise properly supported motion for summary judgment; the requirement is that there be no genuine issue of material fact." Anderson v. Liberty Lobby, Inc., 477 U.S. 242, 247-48, 91 L. Ed. 2d 202, 106 S. Ct. 2505 (1986); Miller v. Lorain County Bd. of Elections, 141 F.3d 252, slip op. at 6-7 (6th Cir. 1998).

Summary judgment is particularly appropriate where, as here, the parties contest only legal issues and there are no issues of material fact to be resolved by a trial. See Oscar W. Larson Co. v. United Capitol Ins. Co., 64 F.3d 1010, 1012 (6th Cir. 1995).

B. First Amendment Scrutiny

The scrutiny the Court will apply to the Export Regulations depends upon whether the export of encryption source code is expressive, and whether the Export Regulations are directed at the content of ideas. Prior restraints on expressive materials bear a heavy presumption against their constitutional validity, and are subject to the strictest judicial scrutiny. See New York Times Co. v. United States, 403 U.S. 713, 714, 29 L. Ed. 2d 822, 91 S. Ct. 2140 (1971) (per curiam).

If a law distinguishes among types of speech based on their content of ideas, the Court reviews it under strict scrutiny. See Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622, 642, 129 L. Ed. 2d 497, 114 S. Ct. 2445 (1994). To survive strict scrutiny, the government must employ narrowly tailored means that are necessary to advance a compelling government interest. See id.

If a law does not distinguish among types of speech based upon the content of the speech, the law will not be subject to strict scrutiny. Turner, 512 U.S. at 658 (laws favoring broadcast programs over cable programs are not subject to strict scrutiny unless the laws reflect government preference for the content of one speaker). As described in Turner: "It would be error to conclude, however, that the First Amendment mandates strict scrutiny for any speech regulation that applies to one medium (or a subset thereof) but not others." Id. at 660.

If the export of encryption source code is not expressive and if the Export Regulations are not aimed at the content of ideas, then the Court reviews the regulations under an intermediate scrutiny standard. See id. at 662. Under intermediate scrutiny, a law is constitutional if it furthers a substantial governmental interest, if the interest is unrelated to the suppression of free expression, and if the restriction is no greater than is essential to the furtherance of that interest. See id. (citing United States v. O'Brien, 391 U.S. 367, 377, 20 L. Ed. 2d 672, 88 S. Ct. 1673 (1968)).

III. Does the First Amendment protect export of software?

The most important issue in the instant case is whether the export of encryption software source code is sufficiently expressive to merit First Amendment protection. This is a matter of first impression in the Sixth Circuit. Indeed, the Court is aware of only two other courts in the United States that have addressed this question, and they reached opposite results. This Court finds that although encryption source code may occasionally be expressive, its export is not protected conduct under the First Amendment.

As the Supreme Court observed in Roth v. United States, 354 U.S. 476, 1 L. Ed. 2d 1498, 77 S. Ct. 1304 (1957), the First Amendment was adopted to foster the spread of ideas: "The protection given speech and press was fashioned to assure unfettered interchange of ideas for the bringing about of political and social changes desired by the people." Id. at 484 (upholding a federal statute that prohibited mailing obscene materials). Conversely, speech that is "so far removed from any exposition of ideas, and from truth, science, morality, and arts in general, in its diffusion of liberal sentiments on the administration of Government" lacks First Amendment protection. Virginia State Bd. of Pharmacy v. Virginia Citizens Consumer Council, Inc., 425 U.S. 748, 762, 48 L. Ed. 2d 346, 96 S. Ct. 1817 (1976) (ruling that commercial speech is not wholly without First Amendment protection).

In reviewing governmental regulation of computer software, the Court must examine the software involved. Certain software is inherently expressive. Such expressive software contains an "exposition of ideas," Chaplinsky v. State of New Hampshire, 315 U.S. 568, 572, 86 L. Ed. 1031, 62 S. Ct. 766 (1942). In contrast, other software is inherently functional. With such software, users look to the performance of tasks with scant concern for the methods employed or the software language used to control such methods.

Among computer software programs, encryption software is especially functional rather than expressive. Like much computer software, encryption source code is inherently functional; it is designed to enable a computer to do a designated task. Encryption source code does not merely explain a cryptographic theory or describe how the software functions. More than describing encryption, the software carries out the function of encryption. The software is essential to carry out the function of encryption. In doing this function, the encryption software is indistinguishable from dedicated computer hardware that does encryption.

In the overwhelming majority of circumstances, encryption source code is exported to transfer functions, not to communicate ideas. In exporting functioning capability, encryption source code is like other encryption devices. For the broad majority of persons receiving such source code, the value comes from the function the source code does.

The Court now examines the relationship between source code's inherent functionality and First Amendment protection. In Bernstein v. United States Dep't of State, 922 F. Supp. 1426 (N.D. Cal. 1996) ("Bernstein I"), the district court held that the inherent functionality of software does not vitiate its status as protected speech: instructions, do-it-yourself manuals, and recipes "are often purely functional," but they are also protected as speech because they are written in a language. Bernstein I, 922 F. Supp. at 1435.

That court's ruling rested on its conclusion that anything written in a language necessarily is protected speech: "language is by definition speech, and the regulation of any language is the regulation of speech." Id. at 1435 (quoting Yniguez v. Arizonans for Official English, 69 F.3d 920, 935 (9th Cir. 1995), vacated on other grounds, 520 U.S. 43, 117 S. Ct. 1055, 137 L. Ed. 2d 170 (1997)). Whether the alleged "speech" is actually expressive is immaterial if it is communicated through language. A court "need only assess the expressiveness of conduct in the absence of 'the spoken or written word.'" Id. 922 F. Supp. at 1434 (construing Texas v. Johnson, 491 U.S. 397, 404, 105 L. Ed. 2d 342, 109 S. Ct. 2533 (1989)).

The Bernstein court's assertion that "language equals protected speech" is unsound. "Speech" is not protected simply because we write it in a language. Instead, what determines whether the First Amendment protects something is whether it expresses ideas. See Roth v. United States, 354 U.S. at 484; Virginia Citizens Consumer Council, 425 U.S. at 762.

"Fighting words" are written or spoken in a language. While spoken or written in language, they are excluded from First Amendment protection. See, e.g., Sandul v. Larion, 119 F.3d 1250, 1255 (6th Cir.), cert. dismissed, 118 S. Ct. 439 (1997) (observing that words "which by their very utterance inflict injury or tend to incite an immediate breach of the peace" are not protected because they "are no essential part of any exposition of ideas ....") (quoting Chaplinsky, 315 U.S. at 572). Similarly, commercial advertisements are written in a language, but are afforded a lesser level of protection under the First Amendment. See Central Hudson Gas & Elec. Corp. v. Public Serv. Comm'n of New York, 447 U.S. 557, 566, 65 L. Ed. 2d 341, 100 S. Ct. 2343 (1980) (acknowledging that the government may ban forms of communication more likely to deceive the public than to inform).

Furthermore, the court in Bernstein I misunderstood the significance of source code's functionality. Source code is "purely functional," 922 F. Supp. at 1435, in a way that the Bernstein Court's examples of instructions, manuals, and recipes are not. Unlike instructions, a manual, or a recipe, source code actually performs the function it describes. While a recipe provides instructions to a cook, source code is a device, like embedded circuitry in a telephone, that actually does the function of encryption.

While encryption source code is rarely expressive, in limited circumstances it may communicate ideas. Although it is all but unintelligible to most people, trained computer programmers can read and write in source code. Moreover, people such as Plaintiff Junger can reveal source code to exchange information and ideas about cryptography.

Therefore, the Court finds that exporting source code is conduct that can occasionally have communicative elements. Nevertheless, the fact that conduct is occasionally expressive does not necessarily extend First Amendment protection to it. As the Supreme Court has observed, "it is possible to find some kernel of expression in almost every activity--for example, walking down the street or meeting one's friends at the shopping mall--but such a kernel is not sufficient to bring the activity within the protection of the First Amendment." City of Dallas v. Stanglin, 490 U.S. 19, 25, 104 L. Ed. 2d 18, 109 S. Ct. 1591 (1989).

In Spence v. State of Washington, 418 U.S. 405, 41 L. Ed. 2d 842, 94 S. Ct. 2727 (1974) (per curiam), the Supreme Court established guidelines for determining whether occasionally expressive conduct is "sufficiently imbued with the elements of communication to fall within the scope of the First ... Amendment." Id. at 409-10. "An intent to convey a particularized message [must be] present, and in the surrounding circumstances the likelihood [must be] great that the message would be understood by those who viewed it." Id. at 411. For example, in Johnson, an individual desecrated an American flag during the Republican National Convention, and the "overtly political nature of this conduct was both intentional and overwhelmingly apparent." Johnson, 491 U.S. at 406. Similarly, in Tinker v. Des Moines Independent Community School Dist., 393 U.S. 503, 21 L. Ed. 2d 731, 89 S. Ct. 733 (1969), a student's black arm band "conveyed an unmistakable message" about his stance on the Vietnam war, a "contemporaneous issue of intense political concern." Id. at 505-06.

Applying this standard, it is evident that exporting encryption source code is not sufficiently communicative. In both Johnson and Tinker, the expressive nature of the conduct was clear. Unlike Tinker, encryption source code does not convey "an unmistakable message." Unlike Johnson, the communicative nature of encryption source code is not "overwhelmingly apparent." Instead, source code is by design functional: it is created and, if allowed, exported to do a specified task, not to communicate ideas. Because the expressive elements of encryption source code are neither "unmistakable" nor "overwhelmingly apparent," its export is not protected conduct under the First Amendment.

IV. Prior Restraint

Plaintiff Junger urges that the Export Regulations are invalid on their face as an unconstitutional prior restraint on the export of encryption source code. Specifically, he alleges that the Export Regulations function as a prior restraint by requiring prepublication review and licensing of inherently expressive encryption software. Junger further argues that the Export Regulations lack adequate procedural safeguards to prevent the licensing officials' abuse of discretion. The Court finds that a facial challenge is inappropriate, and holds that the Export Regulations do not serve as a prior restraint on expressive conduct.

Prior restraints on publication of expressive materials are anathema to American constitutionalism. As the Supreme Court has recognized, "it has been generally, if not universally, considered that it is the chief purpose of the [First Amendment's free press] guaranty to prevent previous restraints upon publication." Near v. State of Minnesota ex rel. Olson, 283 U.S. 697, 713, 75 L. Ed. 1357, 51 S. Ct. 625 (1931). It is for this reason that "any prior restraint on expression comes to this Court with a 'heavy presumption' against its constitutional validity." Organization for a Better Austin v. Keefe, 402 U.S. 415, 419, 29 L. Ed. 2d 1, 91 S. Ct. 1575 (1971) (citations omitted).

In order for a licensing law to be invalidated by a prior restraint facial challenge, it "must have a close enough nexus to expression, or to conduct commonly associated with expression, to pose a real and substantial threat" of censorship. City of Lakewood v. Plain Dealer Publ'g Co., 486 U.S. 750, 759, 100 L. Ed. 2d 771, 108 S. Ct. 2138 (1988). The mere fact that regulated conduct possibly can be expressive is not enough to invalidate a law on its face on prior restraint grounds. See Roulette v. City of Seattle, 97 F.3d 300, 303 (9th Cir. 1996) (although sitting on city sidewalks may occasionally be expressive, city ordinance prohibiting sitting is not subject to facial challenge). As described above, the Court has found that exporting encryption software has little expressive nature. A facial attack upon legislation on First Amendment grounds is appropriate only where the challenged statute "is directed narrowly and specifically at expression or conduct commonly associated with expression." See Lakewood, 486 U.S. at 760.

Exporting encryption source code is not an activity that is "commonly associated with expression." Source code is a set of instructions to a computer that is commonly distributed for the wholly non-expressive purpose of controlling a computer's operation. It may, as the Court has noted, occasionally be exported for expressive reasons. Nevertheless, the prior restraint doctrine is not implicated simply because an activity may on occasion be expressive.

In Roulette, the Ninth Circuit recognized that Seattle's anti-sitting ordinance impaired the unquestionably expressive acts of a registrar of voters, a street musician, the Freedom Socialist Party, and the National Organization for Women. 97 F.3d at 302. Nevertheless, the law was not an unconstitutional prior restraint because neither sitting nor lying on the sidewalk are "integral to, or commonly associated with, expression." Id. at 304.

As in Roulette, exporting encryption software is not integral to expression. Because encryption software is not typically expression, a facial challenge does not succeed. Even if the Export Regulations have impaired the isolated expressive acts of academics like Plaintiff Junger, exporting software is typically non-expressive.

Neither are the Export Regulations "directed narrowly and specifically" at the expressive export of encryption source code. Lakewood, 486 U.S. at 760. The Export Regulations do not single out encryption software. Instead, all types of devices that have the capacity to encrypt data, whether software or hardware, are subject to licensing. See 15 C.F.R. § 742.15. The Export Regulations are not "directed quite specifically" to "an entire field of scientific research and discourse." Bernstein III, 974 F. Supp. at 1305. Instead, the Export Regulations allow academic discussion and descriptions of software in print media while restricting the export of software that can actually encrypt data.

The Court, therefore, finds that Plaintiff Junger's facial challenge to the Export Regulations' licensing scheme fails. Because the Court finds that the Export Regulations are not narrowly directed at expressive conduct, and therefore not a prior restraint, considering Junger's claim that the Export Regulations lack adequate procedural safeguards is unnecessary. See, e.g., Lakewood, 486 U.S. at 772.

V. Overbreadth and vagueness

Plaintiff Junger argues that he is entitled to bring a facial challenge to the export regulatory scheme as unconstitutionally overbroad and vague. The Court finds that a facial challenge on overbreadth grounds is inappropriate because Junger fails to show that the Export Regulations injure third parties in a manner different from the way they affect the plaintiff. Also, the Court finds that the Export Regulations are not vague.

Overbreadth challenges are an exception to the usual requirement that a plaintiff "must assert his own legal rights and interests." Warth v. Seldin, 422 U.S. 490, 499, 45 L. Ed. 2d 343, 95 S. Ct. 2197 (1975). An overbreadth challenge allows a plaintiff to attack laws alleged to be unconstitutional under any circumstances, not merely as applied to the plaintiff's own circumstances. See New York State Club Ass'n, Inc. v. City of New York, 487 U.S. 1, 14, 101 L. Ed. 2d 1, 108 S. Ct. 2225 (1988). Overbreadth challenges are "strong medicine" that should be used "sparingly and only as a last resort." Id. (quoting Broadrick v. Oklahoma, 413 U.S. 601, 613, 37 L. Ed. 2d 830, 93 S. Ct. 2908 (1973)).

The overbreadth rule arises from the purpose of the doctrine. The overbreadth doctrine allows a challenge to laws having the potential to repeatedly chill the exercise of expressive activity by many individuals. To make the overbreadth challenge, there must be a realistic danger that the statute will significantly compromise recognized First Amendment protections of parties not before the Court. Members of City Council of City of Los Angeles v. Taxpayers for Vincent, 466 U.S. 789, 800, 80 L. Ed. 2d 772, 104 S. Ct. 2118 (1984). Under Vincent, to prevail on a facial overbreadth challenge, the plaintiff must show that the challenged law is "substantially overbroad." Id. at 801. To establish substantial overbreadth, a plaintiff must show that the law will have a significant and different impact on third parties' free speech interests than it has on his own. See id. A challenge on overbreadth grounds requires a showing that the governmental action impairs third parties' free speech rights in a manner different from the law's effect on the plaintiff. The overbreadth doctrine does not apply where the law affects the plaintiff and third parties in the same manner. See id. at 802-803 (observing that "appellees' attack on the ordinance is basically a challenge to the ordinance as applied to their activities").

Junger does not show any difference between his professed injuries and those of parties not before the Court. The heart of his overbreadth argument is that the Export Regulations control the distribution of encryption software among fellow academics, and that such distribution does not pose a threat to United States security interests. But the resulting injury to other academics is the very same injury that Junger allegedly suffers. Because the Export Regulations potentially injure other academics in the same manner as Junger, their injury cannot be the basis of an overbreadth challenge.

Plaintiff Junger's overbreadth challenge fails because he does not show that the Export Regulations injure parties not before the Court in a manner different from the way they affect Junger. Even if Junger could bring the overbreadth challenge, he does not show the Export Regulations significantly compromise recognized First Amendment protections through a challenged law that is "substantially overbroad."

Junger also alleges that the Export Regulations' controls are vague because they do not give fair notice of what items are subject to the licensing requirement. The Court finds that the Export Regulations are not vague. The Export Regulations provide adequate notice. The regulations are quite detailed in describing which encryption software programs are subject to export licensing and which are not. Indeed, the Export Regulations even specify the key length in "bits" for regulated programs. See 15 C.F.R. § 742.15(b)(3)(i)-(ii).

VI. Content discrimination

A. Appropriate level of scrutiny

Plaintiff Junger urges this Court to review the Export Regulations under a strict scrutiny standard. He argues that strict scrutiny is appropriate because where the government seeks to "suppress, disadvantage, or impose differential burdens on speech because of its content," such regulations must be subject to the most searching judicial review. Turner Broadcasting System, Inc., 512 U.S. at 642. While laws that disadvantage speech because of its content are subject to penetrating review, the Court finds the subject regulations are content neutral. Because the regulations are content neutral, they are subject to intermediate scrutiny.

In deciding whether a law discriminates based on content, the government's purpose in adopting the challenged restriction is "the controlling consideration." Ward v. Rock Against Racism, 491 U.S. 781, 791, 105 L. Ed. 2d 661, 109 S. Ct. 2746 (1989). The test is whether the government adopted the restriction "because of disagreement with the message [the speech] conveys." Id. Generally, laws that distinguish favored speech from unfavored speech because of the views expressed are content based. By contrast, laws that benefit or burden speech without reference to the expressed views are usually deemed content neutral. See Turner, 512 U.S. at 643. See also Boos v. Barry, 485 U.S. 312, 318-19, 99 L. Ed. 2d 333, 108 S. Ct. 1157 (1988) (plurality opinion) (invalidating ordinance because whether people are permitted to picket in front of a foreign embassy depends "entirely upon whether their picket signs are critical of the foreign government or not"); Taxpayers for Vincent, 466 U.S. at 804 (upholding ordinance prohibiting the posting of signs on public property because it "is neutral--indeed it is silent--concerning any speaker's point of view").

Junger first alleges that the Export Regulations discriminate based on content because they treat other types of software more favorably than encryption software. Plaintiff Junger is correct that the government subjects encryption software to heightened licensing regulations that do not apply to other types of software. Under the Export Administration Act, all types of software are regulated as "technology." 50 U.S.C. App. § 2415(4). However, encryption software is categorized under the stricter "commodity" standard. 15 C.F.R. Part 774, note following 5D002.

The Export Regulations are not content based, however, because the regulations burden encryption software without reference to any views it may express. As the President has made clear, encryption software is regulated because it has the technical capacity to encrypt data and thereby jeopardize American security interests, not because of its expressive content. Exec. Order No. 13026, 1996 WL 666563. The regulatory distinction between encryption software and other types of software does not turn on the content of ideas. Instead, it turns on the ability of encryption software to actually do the function of encrypting data.

That the Export Regulations are not directed at the content of ideas is further suggested by the fact that they do not attempt to restrict the free flow of public information and ideas about cryptography. Publicly available information that can be used to design or operate encryption products is not subject to the Export Regulations and may be freely exported. 15 C.F.R. § 734.3(b)(3). More important, the Export Regulations exclude books, magazines, and other printed materials, thereby imposing no controls on the export of publications on cryptography. 15 C.F.R. § 734.3(b)(2).

The plaintiff also argues that the Export Regulations are content based because they discriminate based upon media: export of encryption software in print form is not subject to the Export Regulations' licensing requirement, whereas software exported electronically is subject to licensing. The plaintiff argues that Reno v. ACLU, 138 L. Ed. 2d 874, 117 S. Ct. 2329 (1997), forecloses any distinction between Internet and print publications. There, the Supreme Court held that "our cases provide no basis for qualifying the level of First Amendment scrutiny that should be applied to [the Internet]." Id. 117 S. Ct. at 2344.

Plaintiff's argument is misguided for two reasons. First, as discussed above, the media distinction is not content-based discrimination because it is not directed at the content of ideas. Second, Reno is distinguishable.

In Reno, parties challenged the constitutionality of the Communications Decency Act. That act limited the transmittal of indecent (but not obscene) information on the Internet. In finding the Communications Decency Act unconstitutional, the Court observed that "any person with a phone line can become a town crier with a voice that resonates farther than it could from any soapbox." Id. at 2344.

Reno is distinguishable from the instant case. In Reno, the Court found the government could not restrict the transmission of indecent (but not obscene) communication. In so finding, the Court held the Decency Act was "a content-based blanket restriction on [indecent] speech." Id. at 2343. But indecent communication does not have the functional ability that encryption software does. In other words, the function of a given lascivious photograph is the same whether on a computer screen or in a magazine. Software, by contrast, is functionally different in electronic form than in print. When in print, encryption source code is simply a description of instructions. When in electronic form, encryption source code is a functional device that directs a computer to perform specified tasks. Unlike in Reno, the regulated item here is fundamentally and functionally different when in electronic form than when in print form.

Finally, Junger contends that the Export Regulations discriminate based on content by excepting certain mass market and key-recovery software from export regulations. See 15 C.F.R. § 742.15(b)(1)-(2). This argument does not persuade. The government distinguishes among software based upon its functional ability. These distinctions are not directed at the content of ideas. 40-bit mass market and key-recovery software pose a lesser threat to American security interests than more complex types of encryption software. n15 Plaintiff's argument only proves that the government tailors the licensing requirements to the risks presented, with less restrictive requirements for exports that pose lesser risks.

B. Application of intermediate scrutiny standard

Because the Export Regulations are content neutral, the Court must evaluate the licensing scheme under intermediate scrutiny. Turner, 512 U.S. at 662. A content neutral government regulation passes constitutional muster if '"it furthers an important or substantial governmental interest; if the governmental interest is unrelated to the suppression of free expression; and if the incidental restriction of alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest."' Id. (quoting United States v. O'Brien, 391 U.S. 367, 377, 20 L. Ed. 2d 672, 88 S. Ct. 1673 (1968)).

The "important interest" prong is satisfied because the government is properly concerned with controlling the export of encryption software to potentially hostile countries or individuals to protect national security. The use of encryption products by foreign intelligence targets can have "a debilitating effect" on the National Security Agency's "ability to collect and report ... critical foreign intelligence." Without the Export Regulations' licensing requirements, domestic producers of encryption software could export their products, without restriction, to any person abroad for any reason, regardless of a particular encryption product's strength or its usefulness to hostile interests abroad.

The government's important interest in controlling the spread of encryption software is not diminished even if certain forms of encryption software are already available abroad. Whatever the present foreign availability of encryption software, the government has a substantial interest in limiting future distribution. The government also has an interest in ensuring that the most complex and effective encryption programs, such as 128-bit key length software, are not widely released abroad.

The Export Regulations, furthermore, are "unrelated to the suppression of free expression." O'Brien, 391 U.S. at 377. Plaintiff Junger argues that the Export Regulations are related to the suppression of expression because they limit the publication of software. This argument is off the mark. A regulation is not "related" to the suppression of free expression simply because it may have the effect of suppressing expression. Such an interpretation of the O'Brien test would render it a nullity, for any law that had the incidental effect of burdening expression would violate the First Amendment. Instead, a law violates the "unrelated" prong if it is "directed at the communicative nature of conduct." Johnson, 491 U.S. at 406. In other words, the government cannot prohibit particular conduct to reach its expressive elements. Id.

The Export Regulations are "unrelated to the suppression of free expression," O'Brien, 391 U.S. at 377, for the same reasons that they are content neutral. The Export Regulations are not designed to limit the free exchange of ideas about cryptography. Instead, the government regulates encryption software because it does the function of actually encrypting data.

Besides meeting the "important interest" and "unrelated" prongs, the Export Regulations also satisfy the "narrow tailoring" requirement. The narrow tailoring prong does not require that the government employ the least speech-restrictive means to achieve its purposes. Instead, narrow tailoring requires that the law not "burden substantially more speech than is necessary to further the government's legitimate interests." Ward, 491 U.S. at 799. Accordingly, the requirement is satisfied if the government's interests "would be achieved less effectively absent the regulation." Id.

The government's interest in controlling the spread of encryption software and gathering foreign intelligence surely would be "achieved less effectively" absent the export controls. Encryption software posted on the Internet or on computer diskette can be converted from source code into workable object code with a single keystroke. Elimination of export controls would permit the unrestricted export of encryption software to any person, organization, or country, without regard to the strength of the software, the identity of the recipients, or the uses to which it might be put.
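
The court's "single keystroke" premise can likewise be illustrated with a hedged sketch of our own, not evidence from the record. Assuming the toy cipher shown earlier in this module were saved in a file named cipher.py (a hypothetical name), Python's standard library turns that readable source into executable bytecode with one command:

    # One command compiles human-readable source into machine-executable bytecode.
    # "cipher.py" is a hypothetical file name used only for this illustration.
    import py_compile

    py_compile.compile("cipher.py")  # writes the compiled form under __pycache__/

The negligible effort separating the readable form from the executable form is what leads the court to treat exported source code as functionally equivalent to an encryption device.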

The export controls at issue do not "burden substantially more speech than is necessary to further the government's legitimate interests," Ward, 491 U.S. at 799, for the same reason they are not overbroad. Export controls are targeted at precisely the activity that threatens the government's legitimate interests. First, the Export Regulations do not prohibit exporting encryption products altogether, but only those inconsistent with American national security and foreign policy interests. See 15 C.F.R. § 742.15(b). The licensing requirements are tailored to the risks presented, with less restrictive requirements for exports that pose lesser risks, such as 40-bit mass market and key-recovery software. See 15 C.F.R. § 742.15(b)(1)-(2). Finally, the Export Regulations do not reach print publications. Thus, they "leave open ample alternative channels of communication," Ward, 491 U.S. at 802, for the exchange of information and ideas regarding cryptography.

Because the content neutral export regulations at issue enable the government to collect vital foreign intelligence, are not directed at a source code's ideas, and do not burden more speech than necessary, they satisfy intermediate scrutiny.

VII. Academic freedom and freedom of association

Plaintiff Junger alleges that the Export Regulations violate his First Amendment rights of academic freedom and freedom of association by restricting his ability to teach, publish, and distribute encryption software. Neither Junger nor the defendants address this issue in the briefs submitted to the Court. The Court therefore considers that Junger has waived the academic freedom and freedom of association claims.

VIII. International Emergency Economic Powers Act and the separation of powers

Plaintiff Junger claims that executive regulation of encryption exports under the International Emergency Economic Powers Act is an impermissibly broad delegation of authority and, therefore, a violation of the separation of powers. Specifically, he alleges that the President does not have statutory authority under the International Emergency Economic Powers Act to extend regulatory control to encryption software; instead, he contends, encryption software is exempt from regulation under the Act because it is "informational material." The plaintiff has failed to address the merits of this claim in his briefs, and argues only that summary judgment is inappropriate because material facts remain in dispute.

The Court lacks jurisdiction to review Junger's claim that the President exceeded his authority under the International Emergency Economic Powers Act when he directed that encryption products be controlled for export. "Longstanding authority holds that such review is not available when the statute in question commits the decision to the discretion of the President." Dalton v. Specter, 511 U.S. 462, 474, 128 L. Ed. 2d 497, 114 S. Ct. 1719 (1994).

The President clearly has statutory authority under the International Emergency Economic Powers Act to extend export controls in general. See United States v. Spawr Optical Research, Inc., 685 F.2d 1076, 1079-1082 (9th Cir. 1982), cert. denied, 461 U.S. 905, 76 L. Ed. 2d 807, 103 S. Ct. 1875 (1983). Congress has recognized and approved of this practice, see id. 685 F.2d at 1081, and courts have consistently held that the President's decision to invoke the International Emergency Economic Powers Act to regulate international trade is unreviewable. See Regan v. Wald, 468 U.S. 222, 242, 82 L. Ed. 2d 171, 104 S. Ct. 3026 (1984); Beacon Prods. Corp. v. Reagan, 633 F. Supp. 1191, 1194-95 (D. Mass. 1986), aff'd, 814 F.2d 1 (1st Cir. 1987); Spawr Optical, 685 F.2d at 1080. "Where there is no contrary indication of legislative intent and where, as here, there is a history of congressional acquiescence in conduct of the sort engaged in by the President," the President has the greatest discretion to act. Dames & Moore v. Regan, 453 U.S. 654, 678-79, 69 L. Ed. 2d 918, 101 S. Ct. 2972 (1981).

IX. Conclusion

For these reasons, plaintiff's motion for summary judgment is denied, and defendants' motion for summary judgment is granted.

3. Karn v. U.S. Department of State (opinion filed March 22, 1996); see also Karn v. U.S. Dept. of State, 107 F.3d 21 (Jan. 21, 1997) (remanding the appeal to the District Court to re-evaluate the claims in light of the jurisdictional transfer from the State Dept. to the Commerce Dept. made by Exec. Order 13026 (issued November 1996))

PHILIP R. KARN, JR., Plaintiff, v. U.S. DEPARTMENT OF STATE, and THOMAS B. MCNAMARA, Defendants, 925 F. Supp. 1 (D.D.C. March 22, 1996)

CHARLES R. RICHEY, J.


INTRODUCTION

This case presents a classic example of how the courts today, particularly the federal courts, can become needlessly involved, whether in the national interest or not, in litigation involving policy decisions made within the power of the President or another branch of the government. The plaintiff, in an effort to export a computer diskette for profit, raises administrative law and meritless constitutional claims because he and others have not been able to persuade the Congress and the Executive Branch that the technology at issue does not endanger the national security. This is a "political question" for the two elected branches under Articles I and II of the Constitution.

The case arises out of the defendants' designation of the plaintiff's computer diskette as a "defense article" pursuant to the Arms Export Control Act (AECA), 22 U.S.C. §§ 2751-2796d, and the International Traffic in Arms Regulations (ITAR), 22 C.F.R. §§ 120-130. The plaintiff alleges that the defendants' designation of a diskette containing source codes for cryptographic algorithms n1 as a defense article subject to the export controls set forth in the ITAR, when the defendants deemed a book containing the same source codes not subject to said export controls, is arbitrary and capricious and an abuse of discretion in violation of the Administrative Procedure Act (APA), 5 U.S.C. § 706(2)(A). The plaintiff also raises a number of constitutional claims. Specifically, the plaintiff alleges that the defendants' regulation of the diskette violates the plaintiff's First Amendment right to freedom of speech and arbitrarily treats the diskette differently than the book in violation of the plaintiff's Fifth Amendment right to substantive due process.

The defendants move to dismiss the plaintiff's APA challenge based on a provision in the AECA precluding judicial review of the designation of items as defense articles subject to the AECA. The defendants also move for summary judgment on the ground that the plaintiff's First Amendment and Fifth Amendment rights have not been violated by the defendants' regulation of his computer diskette under the AECA and the ITAR.

Upon consideration of the filings by the parties, the entire record herein, the applicable law thereto, and for the reasons set forth below, the Court shall grant the defendants' Motion to Dismiss the plaintiff's APA claim as nonjusticiable, and the Court shall grant the defendants' Motion for Summary Judgment with respect to the plaintiff's First and Fifth Amendment claims.

BACKGROUND

On February 12, 1994, the plaintiff submitted to the Department of State a commodity jurisdiction request for the book Applied Cryptography, by Bruce Schneier. The book Applied Cryptography provides, among other things, information on cryptographic protocols, cryptographic techniques, cryptographic algorithms, the history of cryptography, and the politics of cryptography. Part Five of Applied Cryptography contains source code for a number of cryptographic algorithms. This first commodity jurisdiction submission did not include "machine-readable media" such as a computer diskette or CD-ROM. Lowell Decl., Tab 4.

On March 2, 1994, in response to the plaintiff's commodity jurisdiction request, the Department of State's Office of Defense Trade Controls (ODTC) determined that the book is not subject to the jurisdiction of the Department of State pursuant to the ITAR. Joint St. ¶ 4. The ODTC's response explicitly stated, however, that this determination did not extend to the two diskettes referenced in the book and available from the author -- said disks apparently containing the encryption source code printed in Part Five of Applied Cryptography. Joint St. ¶ 5.

On March 9, 1994, the plaintiff submitted a commodity jurisdiction request for a diskette containing the source code printed in Part Five of Applied Cryptography (the "Karn diskette"). The request stated that "the diskette contains source code for encryption software that provides data confidentiality" and that "the software on this diskette is provided for those who wish to incorporate encryption into their applications." Joint St. ¶ 7. The ODTC responded, stating that the Karn diskette is subject to the jurisdiction of the Department of State pursuant to the ITAR and the AECA because the diskette "is designated as a defense article under category XIII(b)(1) of the United States Munitions List." Joint St. ¶ 8; Lowell Decl. ¶ 15, Tab 9.

Pursuant to procedures set forth in 22 C.F.R. § 120.4(g), the plaintiff appealed the commodity jurisdiction determination concerning the source code diskette to the Deputy Assistant Secretary of State by letter dated June 10, 1994. Joint St. ¶ 9. The Deputy Assistant denied the plaintiff's appeal by letter dated October 7, 1994. Joint St. ¶ 10. The plaintiff appealed this denial to the Assistant Secretary of State for Political-Military Affairs, defendant Thomas McNamara, by letter dated December 5, 1994. Defendant McNamara denied the plaintiff's appeal on June 13, 1995, reaffirming the earlier determinations that the Karn diskette "contains cryptographic software [and] is designated as a defense article under Category XIII(b)(1)." Joint St. ¶ 12; Lowell Decl. ¶ 22, Tab 14.

The plaintiff filed his Complaint in this Court on September 21, 1995, and the defendants filed their Motion to Dismiss or, in the Alternative, for Summary Judgment on November 15, 1995, containing the declarations of William P. Crowell, Deputy Director of the National Security Agency, and William J. Lowell, Director of the Office of Defense Trade Controls, Bureau of Political-Military Affairs, Department of State. The plaintiff filed an Opposition thereto on December 11, 1995, including the declarations of Barbara Tuthill, a Secretary at Venable, Baetjer & Howard, Philip R. Zimmerman, a software developer, and plaintiff Philip R. Karn, Jr. The defendants filed a Reply on December 18, 1995. The plaintiff filed a Supplemental Memorandum on December 22, 1995, containing the supplemental declaration of plaintiff Karn, in order to correct alleged misstatements made by defendants. The defendants filed a Response on January 16, 1996.

I. THE COURT SHALL DISMISS THE PLAINTIFF'S APA CHALLENGE TO THE DEFENDANTS' DESIGNATION OF THE PLAINTIFF'S DISK AS A "DEFENSE ARTICLE" BECAUSE THE ARMS EXPORT CONTROL ACT PRECLUDES JUDICIAL REVIEW.

A. Pursuant to AECA, The President Designated The Karn Diskette As A "Defense Article" Subject To Export Regulation Under The ITAR.

The AECA authorizes the President to control the export of "defense articles":

The President is authorized to designate those items which shall be considered as defense articles and defense services for the purposes of this section and to promulgate regulations for the import and export of such articles and services. The items so designated shall constitute the United States Munitions List.

22 U.S.C. § 2778(a)(1). The President delegated this authority to the Secretary of State, who then promulgated the ITAR. See 22 C.F.R. § 120.1(a).

The ITAR contains 10 subparts, including "Purpose and definitions," contained in Part 120, and "The United States munitions list," contained in Part 121. Part 121, containing the munitions list, describes those items designated "defense articles" by the Secretary of State. The descriptions of such defense articles do not include specific manufacturer names or highly-detailed descriptions, but instead contain relatively general descriptions, such as the following: "Military tanks, combat engineer vehicles, bridge launching vehicles, half tracks and gun carriers"; "Military training equipment including but not limited to attack trainers, radar target trainers ... and simulation devices related to defense articles"; "Body armor specifically designed, modified or equipped for military use ..."; n6 and "Radar systems, with capabilities such as (i) Search, (ii) Acquisition, (iii) Tracking ...." Likewise, the munitions list specifically addresses cryptographic systems and components and includes the following description:

(b) Information Security Systems and equipment, cryptographic devices, software, and components specifically designed or modified therefor, including: (1) Cryptographic ... systems, equipment, assemblies, modules, integrated circuits, components or software with the capability of maintaining secrecy or confidentiality of information or information systems .... 22 C.F.R. § 121.1, category XIII.

Part 120 of the ITAR provides assistance for interpreting the terms used in the munitions list. The "commodity jurisdiction procedure" is explained and provided for in this definitional section, and the ITAR directs that the ODTC use the procedure "if doubt exists as to whether an article or service is covered by the U.S. Munitions List." 22 C.F.R. § 120.4. As set forth previously, the plaintiff requested the use of this procedure for the book and the diskette. The ODTC determined that the diskette was covered by the munitions list but that the book was not.

The plaintiff argues that the diskette, like the book, is not a "defense article" covered by the munitions list. The plaintiff contends that pursuant to sections 125.1 and 120.11, the diskette is in the "public domain" and therefore is not subject to the ITAR. See Plaint's Opp. 19-21. However, the defendants contend that the diskette does not fall within the "public domain" exemption because said exemption only applies to "technical data" which, according to the defendants, does not include cryptographic software. See Defs' Reply 21-22 (citing 22 C.F.R. §§ 120.10(a)(4), 120.11 and 121.8(f)).

B. Section 2778(h) Of The Arms Export Control Act Precludes Judicial Review Of The Designation Of The Karn Diskette As A "Defense Article."

The AECA explicitly bars judicial review of the President's designation of an item as a defense article:

The designation by the President (or by an official to whom the President's functions under subsection (a) of this section have been duly delegated), in regulations issued under this section, of items as defense articles or defense services for purposes of this section shall not be subject to judicial review.

22 U.S.C. § 2778(h). The plaintiff argues, however, that the Court should construe this provision so narrowly as to cover only the act of listing items on the munitions list contained in Part 121 of the ITAR and not the determination whether an item, in this case the plaintiff's diskette, is actually covered by the language of the munitions list pursuant to the definitional provisions contained in Part 120 of the ITAR. The plaintiff bases this argument upon the presumption in favor of judicial review and the language of § 2778(h) when read in conjunction with § 2778(a) of the AECA.

It is often stated that there is a presumption in favor of judicial review of agency action absent "clear and convincing evidence" of legislative intent to preclude it. Bowen v. Michigan Acad. of Family Phys., 476 U.S. 667, 671, 90 L. Ed. 2d 623, 106 S. Ct. 2133 (1986). However, the Supreme Court has cautioned that this standard "is not a rigid evidentiary test but a useful reminder to the courts that, where substantial doubt about the congressional intent exists, the general presumption favoring judicial review of administrative action is controlling." Block v. Community Nutrition Inst., 467 U.S. 340, 351, 81 L. Ed. 2d 270, 104 S. Ct. 2450 (1984). The presumption may be overcome: whether "a particular statute precludes judicial review is determined not only from its express language, but also from the structure of the statutory scheme, its objectives, its legislative history, and the nature of the administrative action involved." Id. at 345.

Section 2778(h) expressly bars judicial review of the President's (or his designee's -- in this case, the Secretary of State's and/or ODTC's) designation of items as defense articles. In the case at bar, the Department of State designated the Karn diskette as a defense article by listing "cryptographic software" in Part 121 of the ITAR and then confirming that the diskette was covered by this description pursuant to the definitional provisions of the ITAR contained in § 120.4. The plaintiff argues, however, that because § 2778(a)(1) authorizes the designation of "defense articles and .... the items so designated shall constitute the United States Munitions List," and because the bar to judicial review appearing in § 2778(h) only applies to "the designation ... in regulations ... of items as defense articles," judicial review is precluded only for the act of listing items as defense articles on the munitions list published in Part 121 of the ITAR. Therefore, according to the plaintiff, any interpretation and application of the descriptions on the munitions list by the agency are reviewable. See Plaint's Opp. at 36-37.

The Court finds the plaintiff's reading strained and unreasonable. It is far more reasonable to read § 2778(a)(1) and (h) to preclude judicial review for the designation of items as defense articles pursuant to the language of the munitions list and the procedures provided for interpreting the list, all set forth in the ITAR -- in other words, if the defendants follow the procedures set forth in the ITAR and authorized by the AECA for designating an item as a defense article, such item is a part of the munitions list. The defendants did precisely that. Furthermore, even if the plaintiff's reading of the statute were plausible, the Court finds that any ambiguity would be dispelled by the objective of the AECA, the structure of the United States export scheme, and the nature of the plaintiff's challenge.

To parse the statute as the plaintiff suggests makes little sense in light of the objectives of the AECA. The AECA was enacted to permit the Executive Branch to control the export and import of certain items in order to further "world peace and the security and foreign policy" of the United States. 22 U.S.C. § 2778(a)(1). Designating an export such that it is subject to the AECA and the ITAR requires first describing the type of item in the regulations, and second, if asked by a potential exporter, confirming that the item in question is or is not covered by such description. The commodity jurisdiction procedure provides the latter function, as provided for explicitly both in the definitional section of the ITAR and in the munitions list with respect to cryptographic software. See 22 C.F.R. §§ 120.4 and 121.1, category XIII(b)(1), Note. Determining whether an item is covered by the munitions list is critical to the President's ability to designate and control the export of those items the Executive Branch considers to be defense articles. Simply put, the Court discerns from the legislative scheme that Congress has precluded judicial review of the commodity jurisdiction procedure.

Judicial non-reviewability of the defendants' commodity control decision is also consistent with the structure of the United States' export control scheme. Items not regulated by the ITAR but which have both commercial and potential military application -- "dual use" items -- are regulated by the Secretary of Commerce pursuant to the Export Administration Regulations (EAR), 15 C.F.R. §§ 768-99, which were promulgated pursuant to the Export Administration Act (EAA), 50 U.S.C. app. §§ 2401-20. Like the munitions list contained in Part 121 of the ITAR, the EAR contains a description of items, the commodity control list (CCL), subject to the licensing requirements of the EAR. See 15 C.F.R. part 799. The EAA likewise contains a judicial review prohibition: section 2412(a) of the EAA states that "the functions exercised under this Act are excluded from the operation of [the APA]." In a criminal appeal brought by a defendant who exported laser mirrors without obtaining a license from the Secretary of Commerce pursuant to the EAA and EAR, the Ninth Circuit held that section 2412(a) of the EAA precluded judicial review of the Secretary of Commerce's determination that the type of laser mirrors exported by the defendant were in fact covered by the language of the CCL. See United States v. Spawr Optical Research, Inc., 864 F.2d 1467 (9th Cir. 1988), cert. denied, 493 U.S. 809, 107 L. Ed. 2d 20, 110 S. Ct. 51 (1989). In other words, under the EAA and EAR, judicial review is precluded both for the act of describing items on the CCL and for the act of determining whether an item is covered by the CCL descriptions.

In addition to the obvious analogy between the schemes of the EAA and the AECA, these statutes are in fact part of a singular export scheme, in that the commodity jurisdiction procedure set forth in § 120.4 of the ITAR not only determines whether an item is a defense article covered by the munitions list, it also determines whether an item is covered by the EAR. See Lowell Decl., Tab 2. In fact, Part 121 of the ITAR, containing the published munitions list which the plaintiff concedes is not subject to judicial review, states the following:

NOTE: A procedure has been established to facilitate the expeditious transfer to the Commodity Control List of mass market software products with encryption that meet specified criteria regarding encryption for the privacy of data ... Requests to transfer commodity jurisdiction of mass market software products designed to meet the specified criteria may be submitted in accordance with the commodity jurisdiction provisions of § 120.4.

22 C.F.R. § 121.1, Category XIII(b)(1). To achieve consistency between the EAA and the AECA, the Court concludes that decisions made pursuant to the commodity jurisdiction procedure should not be reviewable. Furthermore, considering the deference afforded the President in matters of foreign policy, it would be strange indeed if Congress precluded judicial review of the determination that an item has merely a potential for military use (i.e., subject to the commodity control list), but permitted review of the determination that an item was in fact a defense article (i.e., subject to the munitions list).

The Court has reviewed House Report No. 101-296, Senate Report No. 101-173, and the Congressional Record for legislative history regarding the Anti-Terrorism and Arms Export Amendments Act of 1989, which added § 2778(h) to the AECA. The legislative history regarding § 2778(h) is scant. However, one of the stated purposes of the Act -- to eliminate "overlapping standards that lead to confusion and misinterpretation," including "identifying which arms are subject to restrictions," see S. Rep. No. 173, 101st Cong., 1st Sess. 2 (1989); see also H.R. Rep. No. 296, 101st Cong., 1st Sess. 3 (1989) -- further supports finding consistency between the AECA and EAA.

Finally, the plaintiff's challenge is not of a nature that commands a heightened presumption in favor of judicial review. Courts have held that "the presumption of judicial review is particularly strong" where the plaintiff alleges that the agency facially violated its authority delegated under the statute. Dart v. United States, 270 U.S. App. D.C. 160, 848 F.2d 217, 223 (D.C. Cir. 1988). One rationale for this general rule is that such "facial" challenges typically raise "a discrete issue, unrelated to the facts of the case, that only needs to be resolved once," and therefore, entertaining the challenge does not "open the floodgates to litigation." Id.; see also Bowen v. Michigan Acad. of Family Phys., 476 U.S. 667, 677, 680 n.11, 90 L. Ed. 2d 623, 106 S. Ct. 2133 (1986); Johnson v. Robison, 415 U.S. 361, 370, 39 L. Ed. 2d 389, 94 S. Ct. 1160 (1974). Another rationale for presuming the reviewability of such facial challenges is that, "when Congress limits its delegation of power, courts infer (unless the statute clearly directs otherwise) that Congress expects this limitation to be judicially enforced." Dart, 848 F.2d at 223.

In the case at bar, the plaintiff's APA claim does not raise a facial challenge to the agency's action in the context of the agency's statutory authority, but instead disputes whether the Karn diskette constitutes a defense article subject to the licensing restrictions of the ITAR. Permitting judicial review of the plaintiff's APA claim would in fact open the floodgates to litigation; every time a potential exporter is informed through the commodity jurisdiction procedure that the item he wishes to export (or has already exported) is designated in the munitions list, the exporter could seek judicial review of that decision.

Based on the authorities cited in his Opposition, the plaintiff maintains that it is incumbent upon courts to essentially torture the language of finality provisions in order to permit judicial review whenever possible. See Plaint's Opp. 34-38, 41-42. Each of the cases cited by the plaintiff, however, involved either legislative history or a statutory scheme that raised substantial doubt as to whether Congress intended to preclude judicial review. Moreover, several of the cases involved a facial challenge to a statute or to the agency's action. Thus, the Court finds that the authorities relied upon by the plaintiff are inapposite to the circumstances of this case. For the reasons discussed above (i.e., the express language of the AECA, the arms export control scheme, and the nature of the plaintiff's challenge), the Court holds that Congress has barred judicial review.

II. THE DEFENDANTS ARE ENTITLED TO SUMMARY JUDGMENT AS A MATTER OF LAW ON THE PLAINTIFF'S FIRST AND FIFTH AMENDMENT CLAIMS.

Although the Court holds that the AECA precludes judicial review of the Department of State's determination that the Karn diskette is a designated defense article subject to the ITAR, the plaintiff also asserts that the regulation of the diskette violates the plaintiff's First and Fifth Amendment rights. Both parties agree that such constitutional challenges are not barred by § 2778(h). See Webster v. Doe, 486 U.S. 592, 602-05, 100 L. Ed. 2d 632, 108 S. Ct. 2047 (1988). Accordingly, the Court will now proceed to address the defendants' Motion for Summary Judgment with respect to these claims.

A party is entitled to summary judgment when there are no material facts in dispute and its position is correct as a matter of law. Anderson v. Liberty Lobby, Inc., 477 U.S. 242, 247-48, 91 L. Ed. 2d 202, 106 S. Ct. 2505 (1986); Celotex v. Catrett, 477 U.S. 317, 321-22, 91 L. Ed. 2d 265, 106 S. Ct. 2548 (1986). Material facts are those "facts that might affect the outcome of the suit under the governing law ...." Anderson, 477 U.S. at 248. The Court finds that there are no material facts in dispute with respect to the plaintiff's First and Fifth Amendment claims, and for the reasons set forth below, the defendants are entitled to summary judgment as a matter of law.

A. The Court Shall Grant The Defendants' Motion For Summary Judgment On The Plaintiff's First Amendment Claim Because The Regulation Is Content-Neutral And Meets The O'Brien Test.

1. Regulation Of The Diskette Is Subject To The O'Brien Test Because The Governmental Interest At Stake Is Unrelated To The Content Of Any Protected Speech Contained On The Diskette.

The plaintiff contends that the defendants' regulation of the Karn diskette constitutes a restraint on free speech in violation of the plaintiff's First Amendment rights. The plaintiff argues that the diskette should be considered "speech" for the purpose of First Amendment analysis because the computer language source code contained on the diskette is comprehensible to human beings when viewed on a personal computer, because the diskette contains "comments" interspersed throughout the source code which are useful only to a human and are ignored by the computer, and because the source code and comments taken together teach humans how to speak in code.
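
The distinction the plaintiff draws between functional instructions and human-directed commentary is easy to illustrate. The source code at issue was written in C (the diskette reproduced the listings from Part Five of Applied Cryptography), and in a C source file everything between the comment markers /* and */ is discarded by the compiler: it informs the human reader but never reaches the machine. The fragment below is a minimal, hypothetical sketch (a toy XOR routine invented here for illustration, not one of the diskette's actual algorithms) showing the two elements side by side:

#include <stddef.h>
#include <stdio.h>

/* Toy XOR "cipher" (illustrative only; it offers no real secrecy).
 * Everything inside these comment markers is stripped by the compiler:
 * it teaches the human reader but contributes nothing to the program. */
static void xor_crypt(unsigned char *buf, size_t len, unsigned char key)
{
    for (size_t i = 0; i < len; i++)
        buf[i] ^= key;         /* XOR is self-inverse: apply twice to decrypt */
}

int main(void)
{
    unsigned char msg[] = "attack at dawn";
    size_t n = sizeof msg - 1;     /* exclude the trailing NUL terminator */

    xor_crypt(msg, n, 0x5A);       /* first pass: plaintext -> ciphertext */
    xor_crypt(msg, n, 0x5A);       /* second pass restores the plaintext */
    printf("%s\n", msg);           /* prints "attack at dawn" */
    return 0;
}

Only after such a file is compiled (and, as the opinion notes below, perhaps supplemented with further programming) can it actually perform a cryptographic function; until then it is inert text readable by anyone.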

As a threshold matter, for the purpose of addressing the dispositive issue whether the regulation is justified and permissible, the Court will assume that the protection of the First Amendment extends to the source code and the comments on the plaintiff's diskette. The Supreme Court has described the First Amendment right to free speech as that which "generally prevents the government from proscribing speech because of disapproval of the ideas expressed." R.A.V. v. City of St. Paul, 505 U.S. 377, 112 S. Ct. 2538, 2542, 120 L. Ed. 2d 305 (1992) (citations omitted). Assuming the source code and comments are within the arena of protected speech, the Court must then determine the basis for the regulation at issue in this case.

The rationale for a regulation determines the level of scrutiny to be applied to it; if the regulation is content-based, the regulation will be "presumptively invalid," whereas if the regulation is content-neutral, the government may justify the regulation if certain other criteria are met. 112 S. Ct. at 2542-54. These additional criteria -- whether the regulation (1) is within the constitutional power of the government, (2) "furthers an important or substantial governmental interest," and (3) is narrowly tailored to that interest -- have been referred to as the O'Brien test after the Supreme Court upheld the government's prohibition against burning draft cards based on these criteria in United States v. O'Brien, 391 U.S. 367, 20 L. Ed. 2d 672, 88 S. Ct. 1673 (1968).

The plaintiff disputes this characterization of the law, arguing that the nature of the matter regulated (e.g., whether "conduct" or "pure speech"), as opposed to the rationale for the regulation, actually dictates the level of scrutiny to be applied. The plaintiff submits that the O'Brien criteria are inapplicable because they apply only to the regulation of "conduct," and that the Karn diskette is "pure speech," the regulation of which should require strict scrutiny review. The Court disagrees, as the plaintiff's argument places form over substance. Pursuant to extensive First Amendment jurisprudence, the government's rationale for the regulation controls, regardless of the form of the speech or expression regulated. See Ward v. Rock Against Racism, 491 U.S. 781, 791, 105 L. Ed. 2d 661, 109 S. Ct. 2746 (1989) ("time, place, and manner" restriction on music permitted where, among other things, regulation was content-neutral); Clark v. Community for Creative Non-Violence, 468 U.S. 288, 298, 82 L. Ed. 2d 221, 104 S. Ct. 3065 (1984) (standard for evaluating expressive conduct, including requirement that regulation be content-neutral, "is little, if any, different from standard applied to time, place, or manner restrictions"); O'Brien, 391 U.S. at 377 (government prohibition against burning of draft cards sufficiently justified if, among other things, "the governmental interest is unrelated to the suppression of free expression"). Accordingly, it is unnecessary for the Court to make any finding regarding the nature of the matter contained on the Karn diskette.

The government regulation at issue here is clearly content-neutral. The defendants' rationale for regulating the export of the diskette is that "the proliferation of [cryptographic hardware and software] will make it easier for foreign intelligence targets to deny the United States Government access to information vital to national security interests." Crowell Decl. at 3. The defendants are not regulating the export of the diskette because of the expressive content of the comments and/or source code, but because of the belief that the combination of encryption source code and machine-readable media will make it easier for foreign intelligence sources to encode their communications. The government considers cryptographic source code contained on machine-readable media to be cryptographic software and has chosen to regulate it accordingly.

The plaintiff does not dispute this motive for regulating the export of the diskette, but instead questions the logic of such a motive in light of the plaintiff's allegations that, without compiling the source code and without further programming, the Karn diskette does not perform a cryptographic function, and that there is no actual danger to national security because the source codes can be obtained abroad through the book or on the Internet. Such issues are not material to the determination of content neutrality. In this case, the plaintiff has not presented any evidence to suggest the bad faith of the government, or that the government's expressed motive is a pretense. Accordingly, the Court finds that the rationale expressed by the government is content-neutral and the regulation is subject to the standards set forth by the Supreme Court in O'Brien.

2. The Regulation Of The Diskette Meets The O'Brien Test Because It Is Within The Power Of The Government To Control The Export Of Defense Articles, It Furthers The Significant Governmental Interest Of Preventing The Proliferation Of Cryptographic Products, And It Is Narrowly Tailored To Meet That Interest.

As stated previously, a content-neutral regulation is justified under the O'Brien test if it is within the constitutional power of the government, it "furthers an important or substantial governmental interest," and "the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest." O'Brien, 391 U.S. at 377. The plaintiff does not dispute that regulating the export of cryptographic software is within the constitutional power of the government. Nor does the plaintiff expressly dispute the second requirement, that the government has an important interest at stake. The defendants have submitted evidence, which the plaintiff does not dispute, stating that the interception of communications made by foreign intelligence targets is "essential to the national defense, national security, and the conduct of the foreign affairs of the United States." Crowell Decl. at 3. Against this factual backdrop, the defendants have expressed the following government interest justifying the regulation of the plaintiff's diskette: "the proliferation of cryptographic products will make it easier for foreign intelligence targets to deny the United States Government access to information vital to national security interests." Crowell Decl. at 3.

The plaintiff argues instead that the third prong of the O'Brien test is not satisfied because the cryptographic algorithms contained on the Karn diskette are "already widely available in other countries [through the Internet and other sources] or are so 'weak' that they can be broken by the [National Security Agency]." Plaint's Opp. 15-16. Although the plaintiff has labeled his argument as one concerning the "narrowly tailored restriction" requirement of the O'Brien test, the plaintiff's argument implicates the second O'Brien requirement by questioning whether the government has a legitimate interest at stake. Indeed, the plaintiff contends that his argument constitutes a factual dispute with the defendants, making the plaintiff's First Amendment claim inappropriate for summary judgment as a matter of law.

The Court does not agree. The plaintiff attempts to disguise a disagreement with the foreign policy judgment of the President as a factual dispute. By placing cryptographic products on the munitions list, the President has determined that the proliferation of cryptographic products will harm the United States. This policy judgment exists despite the availability of cryptographic software through the Internet and the National Security Agency's alleged ability to break certain codes. Even if this were a factual dispute, it is not one into which this Court can or will delve. The Court will not scrutinize the President's foreign policy decision. As the Supreme Court stated in Chicago & Southern Air Lines v. Waterman SS. Corp., 333 U.S. 103, 92 L. Ed. 568, 68 S. Ct. 431 (1948), such decisions:

are delicate, complex, and involve large elements of prophecy. They are and should be undertaken only by those directly responsible to the people whose welfare they advance or imperil. They are decisions of a kind for which the Judiciary has neither aptitude, facilities nor responsibility and which has long been held to belong in the domain of political power not subject to judicial intrusion or inquiry.

333 U.S. at 111. The plaintiff also suggests that the Court balance any First Amendment harms created through regulation of the diskette against the injury caused to national security if the export of the diskette were not regulated. See Plaint's Opp. 16. However, unlike Tinker v. Des Moines Independent Community School District, 393 U.S. 503, 508, 21 L. Ed. 2d 731, 89 S. Ct. 733 (1969), where the Supreme Court applied an ad hoc balancing test, such a test in the case at bar would require the Court to scrutinize the actual injury to national security. Again, the Court declines to do so. See United States v. Mandel, 914 F.2d 1215, 1223 (9th Cir. 1990) ("Whether the export of a given commodity would make a significant contribution to the military potential of other countries ... is a political question not subject to review to determine whether [it] had a basis in fact"); United States v. Martinez, 904 F.2d 601, 602 (11th Cir. 1990) ("The question whether a particular item should have been placed on the Munitions List possesses nearly every trait that the Supreme Court has enumerated traditionally renders a question 'political'"). Furthermore, the plaintiff cannot genuinely dispute that, absent the restriction on the export of cryptographic products and the plaintiff's diskette, the actual number of cryptographic products available to foreign intelligence sources will be greater.

Finally, the plaintiff has not advanced any argument that the regulation is "substantially broader than necessary" to prevent the proliferation of cryptographic products. City Council of Los Angeles v. Taxpayers for Vincent, 466 U.S. 789, 808, 80 L. Ed. 2d 772, 104 S. Ct. 2118 (1984). Nor has the plaintiff articulated any present barrier to the spreading of information on cryptography "by any other means" other than those containing encryption source code on machine-readable media. Clark, 468 U.S. at 295. Therefore, the Court holds that the regulation of the plaintiff's diskette is narrowly tailored to the goal of limiting the proliferation of cryptographic products and that the regulation is justified.

3. The Court Shall Deny The Plaintiff's First Amendment Claim Regarding The "Technical Data" Provisions Of The ITAR Because The Plaintiff Does Not Have Standing And The Defendants Have Limited Their Application Of Said Provisions.

As a last-ditch argument, the plaintiff contends that the Court should invalidate the ITAR under the First Amendment because certain provisions of the ITAR regulating the export of "technical data," as defined in 22 C.F.R. § 120.10, constitute "an unconstitutional system of vague prior restraints." The plaintiff bases his argument on internal government memoranda that express concern regarding the possible overbreadth and vagueness of said provisions.

The plaintiff has no standing to bring this claim regarding the unconstitutionality of the "technical data" provisions. The defendants have not applied the "technical data" restrictions to the plaintiff -- in fact, the crux of the plaintiff's APA claim is that although the plaintiff believes the diskette is exempted from the ITAR because it is in the public domain, the defendants have arbitrarily concluded that the public domain exemption does not apply because the diskette is not "technical data." The Court will not accept the plaintiff's invitation to address the constitutionality of the "technical data" provisions of the ITAR where the plaintiff's injury is not causally connected to said provisions. See Lujan v. Defenders of Wildlife, 504 U.S. 555, 112 S. Ct. 2130, 2136, 119 L. Ed. 2d 351 (1992) (holding that case-or-controversy standing requires, among other things, that the plaintiff have suffered an injury causally connected to the challenged action of the defendant). While courts have departed from traditional rules of standing with respect to certain First Amendment claims, claims of facial overbreadth and vagueness are rarely entertained with respect to content-neutral regulations. See Broadrick v. Oklahoma, 413 U.S. 601, 613, 37 L. Ed. 2d 830, 93 S. Ct. 2908 (1973).

Furthermore, the plaintiff's overbreadth concerns are not genuine. In United States v. Edler Indus., 579 F.2d 516 (9th Cir. 1978), the Ninth Circuit addressed the argument that the definition of "technical data" in the ITAR is "susceptible to an overbroad interpretation." Id. at 520. The court chose to read the technical data provision narrowly to avoid finding a constitutional violation. Id. at 521. The Department of State has since limited its application of the provision in practice to the interpretation expressed by the court in Edler Indus. See Preamble to Revisions of International Traffic in Arms Regulations (Final Rule), 49 Fed. Reg. 47682, 47683 (Dec. 6, 1984). In evaluating the plaintiff's overbreadth claim, the Court "'must ... consider any limiting construction that a ... court or enforcement agency has proffered.'" Ward, 491 U.S. at 796 (quoting Hoffman Estates v. The Flipside, Hoffman Estates, Inc., 455 U.S. 489, 494 n.5, 71 L. Ed. 2d 362, 102 S. Ct. 1186 (1982)). Accordingly, based on the plaintiff's lack of standing, and in light of the limitations to the "technical data" provisions adopted by the Department of State, the plaintiff cannot prevail on his First Amendment claim that the ITAR is vague and overbroad.

B. REGULATING THE EXPORT OF THE PLAINTIFF'S DISKETTE IS RATIONAL AND, ACCORDINGLY, DOES NOT VIOLATE THE SUBSTANTIVE DUE PROCESS RIGHTS GUARANTEED THE PLAINTIFF UNDER THE FIFTH AMENDMENT OF THE CONSTITUTION.

As stated previously, § 2778(h) of the AECA precludes the APA claim asserted by the plaintiff, but it cannot bar a constitutional attack. Recognizing the significant possibility that this Court might hold the plaintiff's "arbitrary and capricious" challenge under the APA nonjusticiable, see Part I of this Memorandum Opinion, the plaintiff asserts the same "arbitrary and capricious" challenge under the legal theory that the defendants' actions violated his right to substantive due process as guaranteed by the Fifth Amendment. However, the plaintiff may not backdoor his APA claim through the Fifth Amendment as that would render the judicial review preclusion in § 2778(h) absolutely meaningless. See Sylvia Develop. Corp. v. Calvert County, 48 F.3d 810, 829 n.7 (4th Cir. 1995) ("As the courts have consistently recognized, the inquiry into 'arbitrariness' under the Due Process Clause is completely distinct from and far narrower than the inquiry into arbitrariness under state or federal administrative law").

The substantive due process provided in the Fifth Amendment, absent the assertion of a fundamental right, merely requires a reasonable fit between governmental purpose and the means chosen to advance that purpose. See Usery v. Turner Elkhorn Mining Co., 428 U.S. 1, 19, 49 L. Ed. 2d 752, 96 S. Ct. 2882 (1976) ("Under the deferential standard of review applied in substantive due process challenges to economic legislation there is no need for mathematical precision in the fit between justification and means"). Given this "extremely limited scope of permissible judicial inquiry," Association of Accredited Cosmetic Schools v. Alexander, 298 U.S. App. D.C. 310, 979 F.2d 859, 866 (D.C. Cir. 1992), the plaintiff's due process claim lacks any merit. The government clearly has an interest in preventing the proliferation of cryptographic software to foreign powers, and the regulation of the export of the cryptographic software is a rational means of achieving that goal. The Court will not substitute its policy judgments for that of the President, see Bowen v. Gilliard, 483 U.S. 587, 597, 97 L. Ed. 2d 485, 107 S. Ct. 3008 (1987), especially in the area of national security. See Martinez, 904 F.2d at 602.

Likewise, the regulation of the plaintiff's diskette as cryptographic software is rational, even when considered in conjunction with the defendants' decision not to subject the book Applied Cryptography to the ITAR. As stated by the plaintiff in his commodity jurisdiction application for Applied Cryptography, the book "contains no machine-readable media," while the diskette is precisely that. See Lowell Decl., Tab 4. Although Part Five of the book could be placed on machine-readable media through the use of optical character recognition technology or through direct typing, the plaintiff concedes that using the source code in Part Five of Applied Cryptography to encode material takes greater effort and time than using the Karn diskette. Karn Decl. ¶¶ 10-12. Accordingly, treating the book and diskette differently does not violate the plaintiff's substantive due process rights. Finally, to the extent that the plaintiff's substantive due process rights require the Court to review the defendants' interpretation of the public domain exemption to the ITAR, the Court finds the defendants' interpretation reasonable as a matter of law.

CONCLUSION

For the reasons discussed above, the Court shall dismiss the plaintiff's APA claim, and the defendants are entitled to summary judgment on the plaintiff's First and Fifth Amendment claims. The Court shall issue an Order of even date herewith consistent with the foregoing Opinion.


VIII. SELECTED BIBLIOGRAPHY

Hal Abelson et al., Questions and Answers About MIT's Release of PGP, available at <http://web.mit.edu/afs/net/mit/jis/www/pgpfaq.htm>

James Bamford, THE PUZZLE PALACE: A REPORT ON AMERICA'S MOST SECRET AGENCY (1982)

Herbert Burkert, Privacy-Enhancing Technologies: Typology, Critique, Vision in TECHNOLOGY AND PRIVACY: THE NEW LANDSCAPE at 125 (Philip E. Agre and Marc Rotenberg eds., 1997)

James J. Carter, The Devil and Daniel Bernstein: Constitutional Flaws and Practical Fallacies in the Encryption Export Controls, 76 Oregon L. Rev. 981 (1997)

David Chaum, Achieving Electronic Privacy, Scientific American, August 1992

John P. Collins, Note, Speaking in Code, 106 Yale L. J. 2961 (1997)

Whitfield Diffie & Martin E. Hellman, New Directions in Cryptography, IT-22 IEEE Transactions Info. Theory 644 (1976)

Whitfield Diffie, The First Ten Years of Public-Key Cryptography, 76 Proc. IEEE 560 (1988)

Charles L. Evans, U.S. Export Control of Encryption Software: Efforts to Protect National Security Threaten the U.S. Software Industry's Ability to Compete in Foreign Markets, 19 N.C. J. Int'l L. & Com. Reg. 469 (1994)

A. Michael Froomkin, The Metaphor is the Key: Cryptography, the Clipper Chip, and the Constitution, 143 U. Pa. L. Rev. 709 (1995)

A. Michael Froomkin, Flood Control on the Information Ocean: Living With Anonymity, Cash, and Distributed Databases, 15 J. L. & Comm. 395 (Spring 1995)

Martin Gardner, Mathematical Games, Scientific American, August 1977 (describing the RSA cryptosystem of Ronald L. Rivest, Adi Shamir, and Leonard Adleman)

Simson Garfinkel, PRETTY GOOD PRIVACY (1995)

Mark B. Hartzler, National Security Export Controls on Data Encryption -- How They Limit U.S. Competitiveness, 29 Tex. Int'l L. J. 438 (1994)

BUILDING BIG BROTHER: THE CRYPTOGRAPHY POLICY DEBATE (Lance Hoffman ed., 1994)

David Kahn, THE CODEBREAKERS: THE STORY OF SECRET WRITING (rev. ed. 1996)

Charles Kaufman, Radia Perlman, and Mike Speciner, NETWORK SECURITY: PRIVATE COMMUNICATION IN A PUBLIC WORLD (1996)

Henry R. King, Note, Big Brother, The Holding Company: A Review of Key-Escrow Encryption Technology, 21 Rutgers Computer & Tech. L.J. 224 (1995)

Donald E. Knuth, THE ART OF COMPUTER PROGRAMMING (2d ed. 1974)

Timothy B. Lennon, Comment, The Fourth Amendment's Prohibitions on Encryption Limitation: Will 1995 Be Like 1984?, 58 Alb. L. Rev. 467 (1994)

Steven Levy, Crypto Rebels, WIRED, May/June 1993

Ralph C. Merkle, Secure Communications Over Insecure Channels, Comm. ACM, April 1978, at 294

National Research Council, Committee to Study National Cryptography Policy, CRYPTOGRAPHY'S ROLE IN SECURING THE INFORMATION SOCIETY (1996)

Yvonne C. Ocrant, A Constitutional Challenge to Encryption Export Regulations: Software Is Speechless, 48 DePaul L. Rev. 503 (1998)

Kenneth J. Pierce, Public Cryptography, Arms Export Controls and the First Amendment: A Need for Legislation, 17 Cornell Int'l L. J. 197 (1994)

Laura M. Pilkington, First and Fifth Amendment Challenges to Export Controls on Encryption: Bernstein and Karn, 37 Santa Clara L. Rev. 159 (1996)

Daniel R. Rua, Cryptobabble: How Encryption Export Disputes Are Shaping Free Speech for the New Millennium, 24 N.C. J. Int'l L. & Com. Reg. 125, 136 (Fall 1998)

Jill M. Ryan, Note, Freedom to Speak Unintelligibly: The First Amendment Implications of Government-Controlled Encryption, 4 Wm. & Mary Bill of Rts. J. 1165 (1996)

Bruce Schneier, APPLIED CRYPTOGRAPHY: PROTOCOLS, ALGORITHMS, AND SOURCE CODE IN C (2d ed. 1996)

Lee Tien, Who's Afraid of Anonymous Speech? McIntyre and the Internet, 75 Oregon L. Rev. 117 (1996)

Philip R. Zimmermann, Cryptography for the Internet, Scientific American, October 1998
