PRIVACY AND ENCRYPTION EXPORT CONTROLS: A CRYPTO TRILOGY (Bernstein, Junger & Karn)

[NOTE: This module has been revised to reflect the state of affairs as of August 24, 2000. The earlier 1999 version of this module is superseded.]

By Keith Aoki <kaoki@law.uoregon.edu>

Associate Professor, University of Oregon School of Law

 

I. Brief Introduction

II. Roadmap to the Module

III. A Brief Encryption Glossary of Terms

IV. A Short History of Privacy (and Privacy law)

V. Secret Writing through History

VI. The Export Restriction Regulations: Legislation and Litigation

        A. Pre-1996 Regulatory Scheme: International Traffic in Arms Regulations

        B. Post-1996 Regulatory Scheme: Export Administration Regulations

VII. A Crypto Trilogy: Bernstein, Junger & Karn

        A. Questions to Consider

        B. Brief Background on the Bernstein I, II, & III cases.

1. Bernstein I (1996)

2. Bernstein II (1996)

3. Bernstein III (1997)

        C. The Cases

1. Bernstein v. Dept. of Justice (Bernstein IV) (1999)

2. Junger v. Daley (2000)

3. Karn v. U.S. Dept. of State (1996)

VIII. Summary of Subsequent Events

IX. Selected Bibliography

This module opens with a quote from Judge Betty B. Fletcher of the U.S. 9th Circuit Federal Court of Appeals:

"In this increasingly electronic age, we are all required in our everyday lives to communicate with one another. This reliance on electronic communication however has brought with it a dramatic diminution in our ability to communicate privately. Cellular phones are subject to monitoring, email is easily intercepted, and transactions over the internet are often less than secure. Something as commonplace as furnishing our credit card number, social security number or bank account number puts each of us at risk. Moreover, when we employ electronic methods of communication, we often leave behind electronic "fingerprints" behind, fingerprints that can be traced back to us. Whether we are surveilled by our government, by criminals, or by our neighbors, it is fair to say that never has our ability to shield our affairs from prying eyes been at such a low ebb. The availability and use of strong encryption may offer an opportunity to reclaim some portion of the privacy we have lost. Government efforts to control encryption thus may well implicate not only the First Amendment rights of cryptographers intent on pushing the boundaries of their science, but also the constitutional rights of each of us as potential recipients of encryption's bounty. Viewed from this perspective, the government's efforts to retard progress in cryptography may implicate the Fourth Amendment as well as the right to speak anonymously, see McIntyre v. Ohio Elections Comm'n, 514 U.S. 334 (1995), the right against compelled speech, see Wooley v. Maynard, 430 U.S. 705 (1977), and the right to informational privacy, see Whalen v. Roe, 429 U.S. 589 (1977). While we leave it for another day the resolution of these difficult issues, it is important to point out that Bernstein's is a suit not merely concerning a small group of scientists laboring on an esoteric field, but also touches on the public interest broadly defined." Judge Betty B. Fletcher, Bernstein v. U.S. Dept. of Justice, et al., 176 F. 3d 1132, 1145-1146 (1999)

I. Brief Introduction

Privacy is not synonymous with secrecy. To quote Eric Hughes of the Cypherpunks, "[a] private matter is something one doesn’t want the whole world to know, but a secret matter is something one doesn’t want anybody to know. Privacy is the power to reveal oneself selectively to the world." Strong encryption (such as "public key" cryptography) makes privacy possible in an increasingly digitized electronic world, thereby enabling important preconditions for the existence of an open society.

This module considers how digital electronic communication is both enabling and yet problematically insecure. The use of strong encryption does three things: (1) it ensures the confidentiality/privacy of a message; (2) it ensures the authenticity of a message; and (3) it ensures the integrity of the contents of a message. Strong "public key" encryption may be thought of as a type of electronic "self-help" used to protect the content of electronic communications from unwanted intrusion by governmental entities and private actors alike. For much of the post-World War II era, the National Security Agency had an almost complete monopoly on strong encryption methods. However, this monopoly was broken by the development of "public-key" cryptography by Whitfield Diffie and Martin Hellman in 1975. "Public-key" cryptography eventually spurred major controversies in the 1990s, when the U.S. government tried to implement the "Clipper Chip," which would have allowed government access to the key for virtually any encrypted digital communication.

This module asks you to analyze a trilogy of cases decided beginning in the mid-1990s: Karn v. U.S. Dept. of State (decided March 22, 1996 in the District Court for the District of Columbia); Bernstein v. U.S. Department of Justice (decided May 6, 1999 in the 9th Circuit); and Junger v. Daley (decided April 4, 2000 in the 6th Circuit). These cases address the problematic limits on the use of extremely strong "public key" cryptographic systems by private actors in the context of the export licensing schemes administered by the State Department and the Commerce Department. These cases have strong first amendment and administrative law components, but their implications go beyond those areas of the law to implicate a fundamental question of individual privacy in the digital environment: who will have presumptive access to strong encryption tools?

II. Roadmap to This Module

This module is organized as a series of questions and answers revolving around the use of encryption to ensure privacy in one's communications. The module looks first at broad notions of privacy, and the common law and constitutional roots of a legal right of privacy within the United States. A brief history of codes and codebreaking follows, covering everything from the Caesar cipher to the German Enigma machine, as well as defining certain basic cryptographic terms and concepts. Next, the post-World War II governmental monopoly on strong cryptography by the National Security Agency (NSA) is described. Finally, this module focuses on the significant advent of public key cryptography, first developed by Whitfield Diffie and Martin Hellman in 1975, which made possible a challenge to the NSA's monopoly on strong encryption.

Simultaneous with the refinement of public key cryptographic techniques was the rise and spread of ubiquitous computer network environments. These networks began to enable extensive data communications and thereby threaten the privacy or secrecy of the transmitted information, especially information of a highly personal or confidential nature. In the late 1970s, the NSA, in concert with the National Bureau of Standards (since renamed the National Institute of Standards and Technology, or NIST), released a 56-bit encryption algorithm called the Data Encryption Standard (DES) to be used for electronic transfers of financial and other data. However, almost from the time of its release, DES's ability to protect information began to be outstripped by rapidly advancing decryption technology.

The US Government has repeatedly sought to curtail the spread of strong public key encryption technology in two ways. First, it promoted domestic "key escrow" deposit schemes. Second, it classified strong public key encryption algorithms as "munitions," making them subject to strict export controls contingent on obtaining a rarely granted license from the State or Commerce Department. The domestic initiatives and the export restrictions are related. Export controls hinder the spread and development of strong cryptographic tools within the US because software developers are reluctant to incorporate cryptographic tools that might prevent them from selling their products overseas. Government-backed "key escrow" arrangements such as the ill-fated Clipper Chip initiative of the mid-1990s ran the risk of seriously harming overseas markets for US products using strong encryption.

This module focuses attention on legal developments occurring in the 1990s over attempts by the Clinton Administration to use State Department regulations, and then Commerce Department regulations, to stop the export of strong public key cryptographic tools. In particular, three cases have interpreted the relevant government export regulations: one upholding government attempts to restrict the export of strong cryptography, and two strongly questioning, on first amendment grounds, the government's ability to restrict the use and export of public key cryptographic algorithms.

The module opens with a glossary of encryption terms, before moving on to a brief history of privacy law and then to a brief historical backdrop for the Trilogy of Bernstein, Junger & Karn. The module ends with an abbreviated selected bibliography of articles and books on encryption and the law.

III. A Brief Encryption Glossary of Terms

Algorithm: a mathematical function used to encrypt and decrypt a message.

Asymmetric Cryptography: a system of cryptography in which the key needed to encrypt a message differs from that needed to decrypt the same message. Each party possesses both a public key and a private key; neither key can be derived from the other. Invented by mathematicians Whitfield Diffie and Martin Hellman in the mid-1970s, the most widely known asymmetric cryptosystem is Pretty Good Privacy (PGP). This type of system is also known as "public key cryptography."

Authentication: the ability to accurately ascertain the identity of the sender of a message.

Bits: the unit by which cryptographic "keys" are measured. The longer the key, the more secure the cryptosystem using it. The relative security of key lengths is exponential: a 24-bit key system is over 65,000 times harder to break than an 8-bit key system.

Brute Force Attack: an attack on a cryptosystem that involves trying every possible key to decrypt a ciphertext until finding one that works. The average time for a brute force attack is half the number of possible keys multiplied by the time required to test each key (by using it to decrypt the ciphertext and checking whether the result is intelligible).

Cipher: a method of encrypting any text, regardless of content.

Ciphertext: a message that has been encrypted.

Code: system of communication relying on a pre-arranged set of meanings such as those found in a codebook.

Confidentiality: the ability to ensure that only an intended recipient can understand a communication.

Cryptanalysis: the practice of defeating attempts to hide communications. Cryptanalysts are also referred to as interlopers, eavesdroppers, enemies, opponents and third parties.

Cryptography: the art or science of secret communication; may be thought of as the storage of information (for a shorter or longer period of time) in a form that allows it to be revealed to whom you choose while remaining hidden from everyone else.

Cryptology: includes both cryptanalysis and cryptography.

Cryptosystem: a method of encrypting information so that decryption can only occur under certain conditions, which generally means only by persons in possession of a decryption engine (like a computer) and a decryption key.

Decryption: the transformation of ciphertext back into plaintext.

DES: Data Encryption Standard, a 56-bit single-key cipher adopted for use within the U.S. in 1977. Developed by IBM in the early 1970s and adopted by the National Bureau of Standards (since renamed the National Institute of Standards and Technology, or NIST) as a national interoperable cryptographic algorithm.

Encryption: the process of disguising a plaintext message to conceal its substance.

Integrity: the assurance that a message has not been modified in transit.

Key: the number or alphanumeric sequence needed to encrypt or decrypt a message. The idea is roughly equivalent to the idea of a "password."

Key Escrow: a system where the government is allowed a "back door" to the encryption keys of private parties. Keys are registered in private data repositories and, under conditions similar to those currently needed to obtain a subpoena or wiretap, law enforcement officials are allowed to decrypt intercepted communications without the knowledge or consent of the sender.

Keyspace: the range of values for a given key (determines strength).

NIST: National Institute of Standards and Technology, the government agency charged with adopting a national cryptographic algorithm standard.

Non-Repudiation: the inability of an author to deny she sent a message.

NSA: National Security Agency; established in 1952 to be the U.S. government's chief signals intelligence and cryptographic department.

Object Code: computer program code that is directly executable by a computer. Humans generally cannot read object code.

Plaintext: data that may be read and understood without any special measures. Also called "cleartext." Plaintext is converted into ciphertext by means of an encryption engine (like a computer) whose operation is fixed (cryptosystem) but functions in a way that is dependent on a piece of information (the encryption key).

Public Key Cryptography: a system of cryptography in which the key necessary to encrypt a message differs from that needed to decrypt the same message. Each party possesses both a public key and a private key; neither key may be derived from the other. Also known as "asymmetric cryptography."

Source Code: computer program code that is actually entered by a human programmer. Must be "compiled," or translated into object code, before a computer can execute the program.

Symmetric Cryptography: a cryptographic system in which the key needed to encrypt and decrypt a particular message is the same. This type of system is much older than public key cryptography. The disadvantage of symmetric cryptography is that a secure, prearranged method is needed to exchange the key or "password." The most widely known example of a symmetric encryption system is the Data Encryption Standard (DES).

IV. A Short History of Privacy (and Privacy Law)

Q.: What do we mean when we discuss the term "privacy"?

The term "privacy" is a Rorschach-like term that expands and contracts, acquiring a constellation of shifting meanings depending on context. Here are some common ways the word is used:

[T]he privacy of private property; privacy as a proprietary interest in name and image; privacy as the keeping of one's affairs to oneself; the privacy of internal affairs of a voluntary association or of a business; privacy as the physical absence of others who are unqualified by kinship, affection, or other attributes to be present; respect for privacy as the respect for the desire of another person not to disclose or have disclosed information about what he is doing or has done; the privacy of sexual and familial affairs; the desire for privacy as the desire not to be observed by another person or persons; and the privacy of the private citizen as opposed to the public official. (US Congress, Office of Technology Assessment, OTA-TCT-606, Information Security and Privacy in Network Environments 82 (Sept. 1994))

Q: What does it mean to say "I have a RIGHT to privacy"?

While the foundations of the legal recognition of a right to privacy are ancient - the idea that there is a boundary between the public and private realms of social life dates back at least to Socrates and Aristotle (1) - explicit legal protections for individual privacy did not begin to develop within the US until the late nineteenth century. There were, however, many constitutional and common law antecedents to privacy rights; the norms undergirding privacy rights overlapped with the first amendment freedom of speech. Other sources that protected privacy included the fourth amendment right to be free of unreasonable searches and seizures, the third amendment right not to have troops quartered in one's home and the fifth amendment right against self-incrimination. The common law areas of nuisance and trespass also include implicit norms protecting individual privacy, although they are articulated in the language of property ownership. (2)

Professors Richard Turkington and Anita Allen found the earliest judicial discussion of privacy in the U.S. to be in an 1881 Michigan appellate court opinion in which tort relief was granted to a plaintiff who had been observed by the defendant during childbirth without the plaintiff's permission. (3) Turkington and Allen also note that Judge Thomas Cooley discussed the "right to be let alone" in his treatise on torts. (4) This concept was picked up in Samuel Warren and Louis Brandeis' 1890 Harvard Law Review article, "The Right to Privacy," which has traditionally been considered the clearest articulation of the right to privacy. (5)

Despite these early discussions, the widespread legal recognition of a right to privacy was relatively slow; state courts did not begin recognizing a right of action in tort for a violation of privacy until the first decade of the twentieth century. While the final volume of the First Restatement of Torts, published in 1939, officially recognized a tort for invasion of privacy, as of 1947 only nine jurisdictions recognized a common law right of privacy. (6) Furthermore, the common law right of privacy has variously been found to mean protection against commercial exploitation of one's likeness or image, portrayal of information about an individual in a false light, public disclosure of private facts about an individual and intrusion on seclusion. Legislatures have enacted assorted statutes protecting the confidentiality of certain types of information about individuals by both public and private institutions. What has emerged is a crucial ambiguity about the definition of privacy rights; when William Prosser published the influential article "Privacy" in the 1960 California Law Review, he was able to identify over three hundred appellate cases dealing with the common law right to privacy. (7)

Finally, in Griswold v. Connecticut, 381 US 479 (1965), the U.S. Supreme Court held that a constitutional right of privacy shielded decisions about the use of contraceptives from state interference. In his opinion, Justice William O. Douglas expressly recognized a general constitutional right of privacy independent of the fourth and fifth amendments as well as the common law right of privacy. The question remains, however: do privacy rights deal with limiting and controlling access to data and information about ourselves, or do they pertain to the ability to make fundamental decisions about oneself in the face of contrary opinions, as exemplified by the rights recognized in Griswold?

This module focuses on the former, that is, how privacy rights grant us the ability to limit and control access to data and information about ourselves. This definition of privacy rights involves two types of private information: (1) transactional information, i.e., data footprints; and (2) the substantive content of one's communications. The module written by Professor Ann Bartow addresses the issue of transactional privacy and "data footprints." This module considers the issue of securing the contents of communications and documents stored and transmitted electronically.

Endnotes for Section IV.


(1) Milton R. Konvitz, Privacy and the Law: A Philosophical Prelude, 31 Law & Contemp. Probs. 272 (1966) ("Once a civilization has made a distinction between the 'outer' and 'inner' man, between the life of the soul and the life of the body, between the spiritual and the material, between the sacred and the profane, between the realm of God and the realm of Caesar, between church and state, between rights inherent and inalienable and rights that are in the power of the government to give and take away, between public and private, between society and solitude, it becomes impossible to avoid the idea of privacy by whatever name it may be called -- the idea of a 'private space in which man may become and remain himself.'"); Jurgen Habermas, The Structural Transformation of the Public Sphere 3-4 (1962) ("We are dealing here with categories of Greek origin transmitted to us bearing a Roman stamp. In the fully developed Greek city-state the sphere of the polis, which was common (koine) to the free citizens, was strictly separated from the sphere of the oikos; in the sphere of the oikos, each individual is in his own realm (idia). . . . Status in the polis was . . . based on status as the unlimited master of an oikos.")

(2) Alan Westin, Privacy and Freedom (1967); David H. Flaherty, Privacy in Colonial New England (1972).

(3) DeMay v. Roberts, 46 Mich. 160, 9 NW 146 (1881).

(4) Richard C. Turkington and Anita L. Allen, Privacy Law: Cases and Materials 23 (1999).

(5) Pavesich v. New England Life Ins. Co., 122 Ga. 190 , 50 SE 68 (1905).

(6) Feinberg, Recent Developments in the Law of Privacy, 48 Colum. L. Rev. 713 (1948).

(7) William Prosser, Privacy, 48 Calif. L. Rev. 383 (1960).

V. Secret Writing Through History

Q.: What are some historical examples of the uses of codes?

Writing has been a medium of sharing thoughts and ideas, of enlightenment and understanding, for thousands of years. However, for almost as long as humans have used writing to proliferate ideas, writing has also been used to deliberately conceal messages and meanings from unwanted eyes. This concealment has been achieved through the use of codes and ciphers, and examples abound throughout history.

Egyptian hieroglyphics were made deliberately arcane and obscure, and literacy in writing or interpreting a cipher was restricted to ensure that the elite powers of priests and scribes remained secure. Julius Caesar used what came to be known as the "Caesar cipher" to ensure that his military and political communications remained safe from prying ears and eyes. Science-fiction writer Bruce Sterling describes how the ancient Assyrians used a form of "funerary cryptography" in which tombs would have odd sets of cryptographic cuneiform symbols written on them. These curious symbols would sometimes cause mystified passersby to utter the translation aloud, thereby inadvertently uttering a blessing for the dead. Consider Paul Revere's famous warning, "one if by land, two if by sea," as a rudimentary, if effective, use of a code. The key is the number of lanterns in the Old North Church, and the message is effectively concealed from all who do not know the key.
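The mechanics of the Caesar cipher are simple enough to sketch in a few lines of Python (a minimal illustration; the three-position shift and the sample message are assumptions chosen for the example, not historical details):

    import string

    ALPHABET = string.ascii_uppercase  # A-Z

    def caesar(text, key, decrypt=False):
        """Shift each letter by 'key' positions; the shift amount is the shared secret."""
        shift = -key if decrypt else key
        out = []
        for ch in text.upper():
            if ch in ALPHABET:
                out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
            else:
                out.append(ch)  # leave spaces and punctuation alone
        return "".join(out)

    ciphertext = caesar("ATTACK AT DAWN", key=3)          # "DWWDFN DW GDZQ"
    plaintext = caesar(ciphertext, key=3, decrypt=True)   # "ATTACK AT DAWN"

Note that anyone who learns the single number 3 can read every message, which is precisely the key management problem discussed later in this section.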

In the 1940s, Alan Turing, the British mathematician, cryptographer and namesake of the "Turing Test" for artificial intelligence, was part of a top-secret team of British code-breakers that used electronic machines (which might be thought of as predecessors to computers) to break Nazi messages that had been encrypted using the German Enigma machine. This top-secret breakthrough was significant and had much to do with the ultimate Allied victory, enhancing the Allies' ability to track and sink German U-Boats and learn of other strategic maneuvers. England's top-secret triumph meant that at the dawn of the cold war era, cryptography would continue to be a jealously guarded state secret, at least until the mid-1970s.

In 1949, Claude Shannon, the pioneering information theorist, described the "entropy" (degree of disorder or uncertainty in a system) of a message and devised a formal measurement for the amount of information within a particular stream of digital bits. In the postwar era, following Shannon's important theoretical work, early digital computers (in the name of national security) were able to chomp repeatedly through dense streams of encrypted information, searching out repetitions, structures and variations from the random.

Q.: Why should I care about cloak-and-dagger stuff like codes and cryptography?

The relation between literacy, technology, communication and power has always been intimate and complex, and literacy and the ability to communicate are central to the retention of power in today's digital communication age. Until the twentieth century, cryptography was traditionally the domain of armies, spies and diplomats, but in the increasingly digital world of the Internet and networked communications, both personal and corporate privacy have been placed in jeopardy. Ironically, however, the very technological tools that place more and more of our confidential communications and data in danger also provide the means to protect and secure that information. In a privacy sense, as with Dickens, it is the worst of times, but it is also the best of times.

An example of this situation is law enforcement's argument for key escrow encryption systems (such as the Clipper Chip) on the grounds that they need to be able to intercept the communications of terrorists, drug dealers and other criminals. Such systems, however, allow for potential invasions of privacy that would compromise the many legitimate reasons and situations where individuals and groups of people may want to conceal information. Companies may possess confidential employee data (medical, salary and other records) that may be susceptible to access by data entry clerks -- encryption protects these records. Employees may share workspaces and equipment with others and may want to ensure the confidentiality of information about projects they are working on. Companies may need to transfer confidential information between branch offices and field agents -- encryption helps keep the information secure from interception. Companies may possess proprietary information and trade secrets, R & D results, product plans, manufacturing processes, legal and financial data, etc. that they want to keep secret from competitors. An individual or company may want to transmit sensitive information on a computer that they would like to keep private from persons who may examine the computer en route. Or two persons may correspond via e-mail and wish to keep the content of their electronic communications private.

Q.: What is the NSA?

During the Cold War Era, U.S. cryptography came under the jurisdiction of the newly created National Security Agency ("NSA"), an extremely secretive bureaucracy established by President Harry Truman in 1952. For the next 23 years, the NSA held a monopoly on the use of strong cryptographic tools within the U.S., while pursuing its primary purpose, which was to protect the communications of the U.S. government and crack those of the U.S. government’s real, imagined, or potential adversaries. The NSA, rumored to employ the world's largest number of mathematicians, labored to make sure that the NSA, and only the NSA, possessed every known cryptographic technique. Under the auspices of the Invention Secrecy Act of 1952 and in the interest of national security, the NSA succeeded in withholding patent grants on new cryptography discoveries for much of the early Cold War era.

When information about "public key" cryptography became more widely known in the late 1970s and 1980s, the NSA tried to preserve its influence by using its power over scientific research grants in cryptography and mathematics at the National Science Foundation. Even the Public Cryptography Study Group, founded in 1978 as a response to scientists' complaints, bore the hand of the NSA. This group established "voluntary control" measures, but the word "voluntary" was illusory because the NSA had pre-publication access to all research papers on encryption by private parties and, therefore, had the ability to censor any material it deemed incompatible with national security. This situation was tolerated for years because many felt that concern for national security was indeed a sufficiently important reason for the NSA to remain on top of cryptographic techniques. However, by the mid-1990s the NSA's attempts to control encryption tools in the name of national security had begun to draw very strong opposition in the cryptographic community, spurring the series of lawsuits that end this module.

Q.: How do you tell how strong an encryption algorithm is?

Encryption algorithms are described in terms of relative strength. The stronger the protection from a particular algorithm, the harder it is to crack a message that has been encrypted by that algorithm. The length of the key determines an algorithm's relative strength. Key length is described in terms of "bits" (how many binary digits make up a particular key). The range of values for a given cryptographic key is called "keyspace". An important characteristic of complex cryptosystems is that the relative strength of different keys grows exponentially, rather than linearly, with key length. This means that a 24-bit key is not three times as hard to break as an 8-bit key, but over 65,000 times harder (65,536, or 2 to the 16th power, times harder).
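The arithmetic is easy to verify; here is a quick sketch in Python, using the interpreter purely as a calculator:

    # An n-bit key has 2**n possible values, so each added bit doubles the keyspace.
    for bits in (8, 24, 40, 56, 128):
        print(f"{bits:3d}-bit key: {2 ** bits:,} possible keys")

    # Ratio between the 24-bit and 8-bit keyspaces:
    print(2 ** 24 // 2 ** 8)   # 65536, i.e. 2**16: "over 65,000 times harder"

The same doubling is what separates exportable 40-bit products from 56-bit DES: a 56-bit key offers 2 to the 16th power, or 65,536, times as many possibilities as a 40-bit key.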

Q.: What is DES?

In the 1960s, IBM developed an encryption program named LUCIFER, a variant of which became the Data Encryption Standard (DES). DES is a 56-bit symmetric cryptosystem designated in the 1970s by the then-named National Bureau of Standards (since renamed the National Institute of Standards and Technology, or NIST). DES was adopted as a federal standard on November 23, 1976 and has continued to be re-certified every five years.

DES is the most widely used symmetric cryptosystem (the encryption and decryption keys are the same), and stands at an NSA-regulated 56 bits (bits being the unit by which the strength of cryptographic keys is measured). The process for using a single-key algorithm such as DES involves four steps: (1) agreeing on a secret key with the intended receiver or sending her the key, (2) supplying the plaintext and the key to an algorithm to create the ciphertext, (3) sending the ciphertext to the receiver and (4) having the receiver supply the ciphertext and the key to an algorithm to recover the plaintext. One cryptography scholar says that "the original design submitted by IBM permitted all 16 x 48 = 768 bits of key used in the 16 rounds to be selected independently. A U.S. Senate Select Committee ascertained in 1977 that the [NSA] was instrumental in reducing the DES secret key to 56 bits that are each used many times."
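The four-step shared-key workflow can be sketched with a toy cipher in Python. (This is a sketch only: the XOR cipher below stands in for DES purely to show the workflow, since XOR with a repeating key is not remotely secure, and the key value is an arbitrary assumption.)

    def toy_cipher(data: bytes, key: bytes) -> bytes:
        """Toy symmetric cipher: XOR each byte with the key; the same call
        both encrypts and decrypts, because XOR undoes itself."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    shared_key = b"prearranged secret"                    # step 1: agree on a key
    ciphertext = toy_cipher(b"MEET AT NOON", shared_key)  # step 2: encrypt
    # step 3: send the ciphertext to the receiver
    plaintext = toy_cipher(ciphertext, shared_key)        # step 4: decrypt with the same key
    assert plaintext == b"MEET AT NOON"

The weak point is step 1: both parties must already share the key over some secure channel, which is exactly the problem public key cryptography was invented to solve.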

Rumors persisted that the NSA had forced IBM to intentionally weaken the system to 56 bits so that it would be easier for the NSA, and no one else, to break. Despite this ongoing suspicion of DES, many banks and financial institutions used it into the early 1990s. However, with the rise of parallel processing (using many computers simultaneously) in the mid-1990s, many encryption experts came to believe that DES ciphertext is crackable in a matter of hours and that DES has therefore reached the end of its useful life as a cryptographic tool.

The NSA has never affirmed nor denied its ability to decipher DES (some people have said that NSA stands for "never say anything" or even "No Such Agency"). Of course, if a method of cracking DES ciphertext were developed, the developer would benefit by keeping that method secret: if it were widely known, people would stop using that cryptosystem and the ability to crack it would become worthless.

Q.: What is "public key" cryptography?

The NSA's cryptographic monopoly came to an end in 1975, with the invention of "public-key" cryptography by mathematicians Whitfield Diffie, formerly of MIT's Artificial Intelligence Lab, and Martin Hellman. During the mid-1960s, individual user files in time-shared mainframe computers, such as those in MIT's AI Lab, were protected by passwords. However, the system manager had complete access to the passwords of all the users. Diffie was concerned that if law enforcement officers served the system manager with a subpoena, the subpoenaed passwords and user files would go to law enforcement; the system manager had absolutely no incentive to risk a contempt citation for noncompliance with a subpoena. Diffie wanted to find a way to eliminate the need for trusted third parties like the system manager, and he realized that the only way to do this was through a decentralized system of password control.

Diffie focused on the age-old problem of key management. As mentioned earlier, Julius Caesar had developed a simple cipher system in order to encode sensitive military messages. Caesar would take an original message ("plaintext") and encrypt it into what appeared to be gibberish ("ciphertext"). The recipient of the gibberish message would use the same key as Caesar and thus would decrypt the message back into plaintext. The big problem in this cryptosystem was protecting the key. Anyone who knew Caesar's key would be able to understand the encrypted message. Therefore, to keep his communications secret Caesar would have to change the key often. Changing keys, however, created a related problem: if you changed the key frequently, how did you inform your spies behind enemy lines what the new key was, or even when you were changing it? If you told them the new key using the old code (which might have been intercepted and broken), then your enemies would learn your new code and your secret plans would be for naught.

Whitfield Diffie and Martin Hellman, through the aid of rapidly advancing computing technology, conceptually split the then-unitary cryptographic key in 1975. They envisaged a cryptosystem in which each user had two keys, a "public" key and a "private" key, each unique to its owner. Any message encrypted with one key may be decrypted by the other, and neither key can be used to determine the other. If I send you a message, I first need to obtain your "public" key. It is possible to distribute one's public key without compromising security; possessing someone's "public" key is absolutely no help in determining the corresponding "private" key. If I use your "public" key to encrypt a message to you, my message will be pure gibberish to anyone who may intercept it, and only one person in the world can decode it: you, who hold the other key, your "private" key. If you want to respond to me with a secret message, you would then use my "public" key to encrypt your message and I would use my "private" key to decrypt it.

Thus, with the advent of "public key" encryption, instead of using the same key to encrypt and decrypt a message (as with a symmetric cryptosystem like DES), every user in the system has both a public key and a private key. The public key can be published or made available in a key repository; the private key is never revealed.

With the advent of computer programs that implemented variants of the Diffie-Hellman scheme, the age-old limits on cryptography suddenly fell away. Once Diffie and Hellman published their ideas in 1975, the de facto NSA monopoly on cryptographic tools was on its way out. In 1977, three MIT mathematicians -- Ronald L. Rivest, Adi Shamir and Leonard M. Adleman -- developed a cryptosystem that put Diffie and Hellman's findings into practice through the use of very large prime numbers. Their RSA algorithm (used in an encryption program such as RIPEM) created encryption keys by multiplying two extremely large prime numbers together. While breaking this code involves only reversing the multiplication process (factoring the product to discover the private key), doing so can take an extremely long time even on a powerful computer. These algorithms were seen as a big improvement over DES. The strength of public key algorithms rests in large part upon how large the key is, or in other words how many bits of information make up the key. The larger the key, the harder it is to break the code. Whereas DES was limited to 56 bits, RSA keys could be any size. Furthermore, as noted above, the relative strength of different keys grows exponentially rather than linearly, meaning that a 24-bit key is not three times harder to break than an 8-bit key, but 65,536 (2 to the 16th power) times harder.
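A toy version of the RSA scheme, with deliberately tiny primes, shows how the two keys are related (a sketch for illustration only; real RSA keys use primes hundreds of digits long, and the values below are textbook-style assumptions):

    # Toy RSA key generation -- insecure, for illustration only (Python 3.8+).
    p, q = 61, 53                      # two secret primes
    n = p * q                          # 3233: the public modulus
    phi = (p - 1) * (q - 1)            # 3120
    e = 17                             # public exponent; (e, n) is the PUBLIC key
    d = pow(e, -1, phi)                # 2753: private exponent; (d, n) is the PRIVATE key

    message = 65                       # a message encoded as a number smaller than n
    ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
    recovered = pow(ciphertext, d, n)  # only the private key holder can decrypt
    assert recovered == message

    # Recovering d from the public pair (e, n) requires factoring n back into
    # p and q: easy for 3233, infeasible for a product of two huge primes.

Publishing (e, n) compromises nothing, which is why the "public" key can sit in an open repository while the "private" exponent d never leaves its owner.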

Needless to say, the prospect that every company/citizen within or without the United States would have access to extraordinarily strong cryptographic tools and techniques of the sort that had formerly ranked alongside advanced missile guidance systems, biological warfare knowledge and nuclear power was the NSA's worst possible nightmare.

Q.: Why use public key cryptography?

There are two general situations in which using strong encryption is desirable. First, there is a need to store information in a location secure from unauthorized persons. Second, there is a need to keep information that is being transmitted from point A to point B safe from interception. In the second situation there is a problem of secure key exchange; the person who will be receiving and decrypting the information will usually not be the person who sent and encrypted the information.

In a traditional symmetric cryptographic system, the same password/algorithm is used to encrypt and decrypt messages. In a public key cryptographic system each user has two keys: first, a "public" key that is widely published and easily available through some type of distribution infrastructure, and second, a "private" key that is never revealed. Any message encrypted with one key may be decrypted with the other, so that someone may use my "public" key to encrypt a message to me, and I can use my "private" key to decrypt the same message. Neither key can be used to determine the other, and public key cryptography allows people who have never met or exchanged a key to communicate in a highly private manner.

So, public key cryptography allows users to accomplish three important objectives with their electronic communications. First, a user can ensure the confidentiality and privacy of a communication such that only the intended recipient can read the message. Public key cryptography also allows for more efficient authentication, ensuring that the recipient of an encrypted communication can accurately ascertain and verify the identity of the sender of the message. Finally, public key cryptography ensures a message's integrity, reassuring the user that no one tampered with or changed the message en route.

Q.: What are the weaknesses or drawbacks of public key cryptosystems?

On one hand, the strength of public key cryptosystems may also be a weakness: the extremely large prime number keys used by "public key" encryption systems take much longer to decrypt than a single key system such as DES. On the other hand, as strong (and as slow to decrypt) as public-key systems may be in comparison to traditional symmetric cryptosystems, they are not invulnerable. For example, in 1977, Rivest, Shamir and Adleman issued a challenge to the world in Martin Gardner's column in Scientific American. A single phrase was encrypted using a 129-digit RSA key (the famous "RSA-129" challenge) and readers of Scientific American were challenged to find the plaintext. Using 1977 computing technology as a benchmark, Gardner estimated it would take millions of years to decrypt the RSA-encoded ciphertext. However, on April 16, 1994, after 16 years and 8 months, a team coordinated by an MIT student decrypted the RSA-129 message, using almost 2,000 computers worldwide hooked together into a network to undertake the necessary calculations. Therefore, as computing technology advances, the first drawback of public key cryptosystems (slowness) will likely be reduced, while the second (vulnerability to massed computing power) will increase.
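To see why such challenges stand for so long, consider the naive approach to breaking an RSA key: factoring the public modulus by trial division. (A sketch only; the actual RSA-129 attack used the far more sophisticated quadratic sieve, so this illustrates the scale of the problem, not the real method.)

    def trial_division(n: int):
        """Naive factoring: try every candidate divisor up to the square root of n."""
        f = 2
        while f * f <= n:
            if n % f == 0:
                return f, n // f   # found the two secret primes
            f += 1
        return n, 1                # n itself is prime

    print(trial_division(3233))    # (53, 61): instant for a toy modulus

    # The work grows roughly with the square root of n, so every two extra
    # decimal digits in the modulus multiply the running time by about ten.
    # A 129-digit modulus is hopeless for trial division; even the smarter
    # algorithms used in 1994 needed months on thousands of machines.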

Another disadvantage of public key systems is the key validation problem. When you wish to send encrypted data to someone, how can you be sure, absent a key-validation protocol, that the public key you have obtained for the receiver is indeed truly his or her public key? A knowledgeable encryption specialist could publish a public key in the receiver's name, and then proceed to intercept any message you send using that key. Establishing an appropriate key-distribution and validation system is a significant cost of implementing an effective public key cryptosystem.

Another drawback of public key encryption isn't technical, but rather intrinsic to the use of strong cryptography. Cheap, easy-to-use, extremely strong cryptography unfortunately shields the law-abiding and the lawless alike. Take the following remark from FBI agent Jim Kallstrom as quoted in an article by Steven Levy in the New York Times Sunday Magazine: "Sure, we want those new steel doors ourselves, to protect our banks, to protect the American corporation trade secrets, patent rights, technology. But people operating in legitimate business are not violating the laws -- it becomes a different ball of wax when we have probable cause and we have to get into that domain. Do we want a digital superhighway where not only the commerce of the nation can take place but where major criminals can operate impervious to the legal process?" (1)

While the debates in this area have been heated and controversial, they are not the primary focus of this module. However, one should note that throughout much of the 1990s there has been an ongoing debate over whether the government, and its law enforcement agencies such as the FBI, should have access under certain circumstances to an electronic "backdoor" (via an Escrowed Encryption Standard such as the Clipper Chip or some other key escrow scheme).

Q.: If the U.S. Government does not want virtually unbreakable cryptography to become routine, what has it been doing to prevent it?

The U.S. government has been trying to prevent the proliferation of strong cryptography domestically and, similarly, to prevent its export. Attempted domestic control has focused on the adoption of key escrow standards, while the export of strong cryptography has been controlled via stringent licensing schemes.

On the domestic front there have been several government initiatives to have either mandatory or voluntary key escrow schemes (such as the Escrowed Encryption Standard, or EES) adopted by the computing and communications industry. In the late 1980s and early 1990s the NSA and NIST proposed replacing the increasingly vulnerable DES algorithm with a new algorithm they named Skipjack, which was purported to be over 16 million times stronger than DES. The Skipjack encryption algorithm was integrated into a system using what was called the Law Enforcement Access Field (LEAF), which added a signal to an encrypted message directing a potential wiretapper (theoretically from a government law enforcement agency) to the appropriate key to decipher the message. The Skipjack algorithm and the LEAF system were integrated into the Capstone Chip, which could handle phone communications, computer data and digital signatures.

In the early 1990s the NSA proposed the "Clipper Chip" to the Bush Administration. The Clipper Chip was a stripped-down version of the Capstone chip incorporating both the Skipjack algorithm and the LEAF system's escrow key. While the Bush Administration failed to act on this proposal, the Clinton Administration picked it up within two months of taking office and aggressively pushed to implement it as soon as possible. In early 1993, NIST was ordered to consider using the Clipper Chip as the new encryption standard. Later that year NIST published the proposed Clipper Chip key escrow standard in the Federal Register and allowed 60 days for public comment. NIST received 320 responses, only 2 of them positive. The ensuing heated public debates have focused on how to reconcile two fundamentally opposed interests: individual privacy and public safety. In 1994, Michael R. Nelson, a White House technology consultant, called the Clipper Chip proposal "the Bosnia of Telecommunications." (2)

Basically, key escrow systems require the placement of private user keys in a data repository under federal control. Under conditions similar to those currently required for law enforcement officers to obtain a warrant or wiretap, they could also obtain an escrowed encryption key. Recalling that Whitfield Diffie and Martin Hellman developed public key cryptography precisely to get around the problem of having a trusted third party holding keys in escrow, it becomes apparent that escrow systems are a step backward for strong cryptography. While this is not the focus of this module, University of Miami Law School Professor A. Michael Froomkin's seminal 1995 article, "The Metaphor is the Key: Cryptography, the Clipper Chip, and the Constitution," is a good starting place for further research. For legislative developments such as the proposed "Electronic Data Storage Act," the "Cyberspace Electronic Security Act," and the "Security and Freedom through Encryption (SAFE) Act," see the Center for Democracy & Technology's encryption website at http://www.cdt.org/crypt/970312_admin.html and http://www.cdt.org/crypto/legis_106/SAFE/index.shtml#provisions.

In addition to aggressively pushing for domestic use of the Clipper Chip, the Clinton Administration aimed to implement stringent export restrictions and licenses on cryptographic products with algorithms stronger than 40 bits. This is the other front of the cryptographic policy debates and the subject of the trilogy of crypto cases that conclude this module.

By the beginning of the 1990s, public key encryption technology from RSA Data Security, Inc. (the company formed by Rivest, Shamir and Adleman) was being adopted by companies such as Apple, AT & T, Lotus, Microsoft and Novell. However, the NSA believed that extremely strong cryptographic techniques incorporating cryptosystems like PGP should be treated like munitions and therefore require an export license. Despite arguments from U.S. companies that they would not be able to compete in a global market if they were forced to "water down" their computer products (by incorporating less than 40-bit cryptography), Congress was sympathetic to the NSA's national security arguments and began working on regulations that required export licensing of products incorporating strong cryptography.

In 1993, AT & T approached the NSA seeking an export license for the Surity 3600 phone security system, which was designed to use the nonexportable DES algorithm. Aware of the high degree of market penetration AT & T possessed with regard to telecommunications devices, the NSA suggested that an export license would not be a problem provided AT & T used the Clipper Chip in its products. If AT & T incorporated the Clipper Chip, it would receive two valuable things. First, AT & T would receive a contract with the U.S. government to buy tens of thousands of phones (with no export ban). Second, AT & T would also sell a lot more phones to private parties, because companies would need to use Clipper Chip-equipped devices to communicate with government Clipper Chip phones. Both of these "gains," however, furthered the Clinton Administration's goal of including mandatory key escrow encryption systems in all computers manufactured within, and thus exported from, the U.S.

Q.: Where did the public key encryption program known as Pretty Good Privacy (PGP) originate?

In 1991, Philip R. Zimmermann, a software engineer and cryptographic consultant, put together a public key encryption program he called Pretty Good Privacy (PGP) for computer data and e-mail use. PGP at the time used a military-grade 128-bit key. Zimmermann literally gave a small number of copies of the program away for free. Sensing that the government was getting very interested in cryptography, Zimmermann wanted to get free copies of PGP in circulation before a possible government ban on strong encryption tools. One of the people to whom Zimmermann had given a copy installed it on a computer attached to the Internet the day after Zimmermann's free release, and within days thousands of people had copies of military-strength PGP. Partially to avoid violating the ITAR and being charged with munitions trafficking (see below), the second upgrade of PGP was made from New Zealand. While ITAR (and later the EAR) controlled exports passing out of the U.S., it is much more difficult to restrict imports passing into the U.S. from another country. Since Zimmermann's release of PGP in 1991, it has been through several upgrades and revisions, and PGP 6.5.1 is available as freeware downloadable from the MIT distribution site at http://web.mit.edu/network/pgp.html.

In early 1993, Philip Zimmermann received a visit from US Customs Service agents wanting to know how PGP found its way overseas without an export license from the State Department. Beginning in fall 1993, Zimmermann was targeted by a grand jury investigation in San Jose, California. The investigation dragged on for more than two years; the government eventually dropped the case, in part because of the difficulty of obtaining injunctive relief outside of the U.S.

While no charges were ultimately brought, Zimmermann's situation was a foreshadowing of the disputes that Philip Karn, Peter Junger and Daniel Bernstein would encounter when they challenged the ability of the U.S. Government to use export-licensing regimes to prevent the placing of strong encryption programs on Internet-accessible computers.

Q.: Who are the Cypherpunks?

The Cypherpunks are a loosely organized group of privacy advocates and computer programmers co-founded in September 1992 by Eric Hughes, a freelance cryptographer, and Tim May, a physicist from Intel; the group met originally at the offices of John Gilmore's Cygnus Corp. Steven Levy describes the common premises of the Cypherpunks as: (1) strong public key cryptography is a liberating tool which empowers individuals; (2) cryptography should be used to protect communications from the government; and (3) the Cypherpunks should educate, and widely distribute strong cryptographic tools to, members of the public. The Cypherpunks Hyperarchive of threaded e-mail discussions is available at http://www.inet-one.com/cypherpunks

Endnotes to Section V.

(1) Steven Levy, The Cypherpunks vs. Uncle Sam: Battle of the Clipper Chip, New York Times Sunday Magazine, June 12, 1994, at 48

(2) Steven Levy, The Cypherpunks vs. Uncle Sam: Battle of the Clipper Chip, New York Times Sunday Magazine, June 12, 1994, at 51

VI. The Export Restriction Regulations: Legislation and Litigation

This module now shifts from question-and-answer mode into a brief chronology and outline of export restrictions on strong encryption in the 1990s.

On November 15, 1996, President Clinton transferred jurisdiction over regulated cryptographic tools from the State Department to the Commerce Department in Executive Order No. 13026. The Clinton Administration never formally acknowledged that this shift was in response to the Bernstein II decision in the Northern District of California, released October 24, 1996. However, in the Bernstein II opinion Judge Marilyn Patel found the ITAR regulations to be an unconstitutional prior restraint as applied to Professor Bernstein's SNUFFLE encryption program, and the jurisdictional shift laid the groundwork for the Bernstein III and Bernstein IV opinions.

In order to understand the current Export Administration Regulations that are administered by the Commerce Department, you must also have an understanding of the pre-1996 regulatory scheme, which played a big part in bringing the controversies over strong encryption to where they are today.

A. Pre-1996 Regulatory Scheme: International Traffic in Arms Regulations

The Old (Pre-1996) Regulations: Glossary

AECA: The Arms Export Control Act, which is the Congressional statutory authorization for the International Traffic in Arms Regulations (ITAR).

BXA: Bureau of Export Administration, the federal agency that controls most exports from the U.S.

U.S. Department of State: the Cabinet Department that initially held jurisdiction over the export of cryptographic items under the AECA and the ITAR.

ITAR: The International Traffic in Arms Regulations, which controlled the import, export and manufacture of items on the United States Munitions List (USML). To be found at 22 C.F.R. Sections 120-130.

ODTC: Office of Defense Trade Controls. Prior to the end of 1996, the ODTC was the federal agency responsible for determining whether an item qualified as a controlled item, and therefore required an export license, under the USML and the ITAR.

USML: The United States Munitions List was created under the AECA as the list of "defense articles and defense services," which were almost all weapons, explosives, and military vehicles. Items listed on the USML required a munition dealer license before they could be exported or imported. Until the end of 1996, cryptographic items were controlled under Category XIII (b)(1) of the list. To be found at 22 C.F.R. Section 121.1.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Prior to 1996, the International Traffic In Arms Regulations ("ITAR") and the Arms Export Control Act ("AECA") were used by the U.S. Department of State to regulate the export of products using encryption. ITAR concerned primarily explosives, weapons and military vehicles. Until November 15, 1996, ITAR also covered cryptographic systems "including key management systems, equipment, assemblies, modules, integrated circuits, components or software with the capability of maintaining secrecy or confidentiality of information or information systems…". (1) Generally, any product incorporating an encryption algorithm stronger than 40 bits required an export license. Except for financial and banking institutions, licenses for export of products containing 56-bit DES were restricted and virtually never granted.

ITAR allowed the U.S. State Department to directly control items listed on the USML as well as "technical data" about a listed item. Thus, if 40-bit-plus encryption software was listed on the USML, then "technical data" related to that software, i.e., documentation of encryption algorithms, was also subject to export license restrictions. The 40-bit line was drawn so that, theoretically, the NSA would be able to crack any encryption shipped abroad. In reality, however, extremely strong cryptography without any "backdoor" escrowed key was widely available during the 1990s from numerous foreign sites in countries such as Finland, making the rationale of preserving the NSA's ability to crack foreign codes implausible. For example, U.S. law professor Peter Junger was unable to remain in compliance with ITAR after posting his computer source code to an Internet discussion list in response to a virtually identical encryption algorithm already posted by a British national.

ITAR was created under the AECA, which Congress enacted in response to concerns over military security threats to U.S. interests in the Persian Gulf region due to an overproliferation of munitions. Willful violation of the ITAR was a criminal offense (with fines of up to $1 million and imprisonment of up to 10 years), to be investigated and administered by the President, or the U.S. Department of State, to whom the President delegated his authority to act under the AECA. The State Department (through the Office of Defense Trade Controls, or ODTC) determined which defense "articles and services" were going to be subject to licensing requirements and placed on the United States Munitions List. Ostensibly, the USML was created and administered under the AECA to help advance "world peace and the security and foreign policy of the United States." (2)

In the event of unclear coverage, a party had the option to submit a "commodity jurisdiction" request to the ODTC, which would then make a determination of whether or not the product was covered by the USML. Once such a designation was made, it was not subject to judicial review except on constitutional grounds. (3) This is the type of application that Professor Bernstein made in 1992, under the ITAR, so that he could post his strong encryption algorithm SNUFFLE on the Internet for his students to access and download.

An "export" for purposes of the ITAR included sending outside of the U.S. and "[d] isclosing (including oral or visual disclosure) or transferring of technical data to a foreign person, whether in the U.S. or abroad, (4) " and thus could easily include the posting of source or object code on an Internet USENET newsgroup that discussed cryptography such as "sci.crypt."

B. Post-1996 Regulatory Scheme: Export Administration Regulations

The New (post-1996) Regulations: Glossary

BXA: The Bureau of Export Administration, the agency within the U.S. Department of Commerce responsible for administering the EAR and issuing export licenses.

CCL: The Commerce Control List. Analogous to the USML for non-military items. Anyone wishing to export items listed on the CCL must obtain a license from the BXA. To be found at 15 C.F.R. Section 774.

U.S. Department of Commerce: Cabinet agency given jurisdiction over export of cryptographic items after 1996 by the Clinton Administration Jurisdictional Transfer. The Commerce Department is responsible for issuing the EAR amendments.

EAR: Export Administration Regulations. The regulations controlling the export of items found on the CCL. Analogous to the ITAR under the old regulatory scheme. To be found at 15 C.F.R. Sections 730 et seq.; the underlying statute, the Export Administration Act, is codified at 50 U.S.C. app. Sections 2401 et seq.

EAR Amendments: New rules promulgated by the Commerce Department to control cryptographic items per President Clinton's Jurisdictional Transfer order. To be found at 61 Fed. Reg. 68,572-587 (1996).

Jurisdictional Transfer: Executive Order 13026, to be found at 61 Fed. Reg. 58768 (issued November 15, 1996), in which President Clinton transferred jurisdiction over most cryptographic items from the State Department to the Commerce Department.

* * * * * * * * * * * * * * * * * * * * * * * * * *

On November 15, 1996 President Clinton issued Executive Order 13026 transferring jurisdiction over regulated cryptographic tools from the U.S. State Department to the U.S. Commerce Department, citing concern for the increasing use of encryption tools in nonmilitary contexts.

This meant that all encryption tools formerly listed on the USML and regulated under the ITAR via the AECA were now placed on the Commerce Control List (CCL), created under the Export Administration Act. It was this act which allowed the Bureau of Export Administration (BXA) to issue the 1997 Export Administration Regulations (the "EAR" amendments). Anyone seeking to export an item listed on the CCL needed a prior license from the BXA. In his Executive Order, President Clinton explicitly retained the provisions precluding judicial review of licensing and listing determinations. The Executive Order also provided that, for encryption tools, "export" encompassed making them available over the Internet in any form.

The Commerce Department officially assumed jurisdiction on December 30, 1996, when it issued the rapidly drafted interim EAR amendments. Under the new definition of "encryption items," the three subcategories added to the CCL were: (1) encryption commodities, (2) encryption software, and (3) encryption technology.

One distinction between the ITAR and the EAR was that the EAR amendments allowed licensing exemptions, after BXA review, for software already possessing key escrow and recovery features (the approach embodied in the Clipper Chip). The EAR amendments also allowed export licensing for 56-bit encryption on the condition that an applicant "makes satisfactory commitments to build and/or market recoverable encryption items and to help build the supporting international infrastructure." (5) The EAR amendments also provided an expedited 15-day review for "mass market software," so long as it incorporated government-approved algorithms and contained key lengths no longer than 40 bits.
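The tiers just described can be condensed into a short sketch. This is the module editor's simplification, using only the variables named in the text above (key length, escrow, key-recovery commitments, approved algorithms for mass-market software); an actual BXA determination turned on much more:

# Hypothetical restatement of the EAR-amendment tiers described above.
def ear_treatment(key_bits, escrowed=False, recovery_commitment=False,
                  mass_market=False, approved_algorithm=False):
    if escrowed:
        return "licensing exemption possible after BXA review"
    if key_bits <= 40 and mass_market and approved_algorithm:
        return "expedited 15-day mass-market review"
    if key_bits <= 56 and recovery_commitment:
        return "license available, conditioned on recoverability commitments"
    return "individual export license required; historically, rarely granted"

print(ear_treatment(56, recovery_commitment=True))

Note that the operative variables are key length and key recoverability, not the content of any communication.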

Note, however, that the EAR amendments prohibited any person without an export license from "providing technical assistance (including training) to foreign persons with the intent to aid a foreign person in the DEVELOPMENT OR MANUFACTURE OUTSIDE THE UNITED STATES of encryption commodities and software that, if of United States origin, would be controlled." (6) The ITAR did not have this language, which probably explains why the investigation of Philip Zimmerman was dropped; had this language been in place at the time, however, Zimmerman would have been in direct violation when the second update/release of PGP was made over the Internet from a site in New Zealand.

This is the administrative and regulatory backdrop that was in place in the mid-1990s when the trilogy of crypto cases arose.

Endnotes to Section VI.

(1) 22 C.F.R. § 121.1 (1994), Category XIII(b)(1).

(2) 22 U.S.C. § 2778(a)(1).

(3) 22 U.S.C. § 2778(h).

(4) 22 C.F.R. § 120.17.

(5) 61 Fed. Reg. 68,572 (1996).

(6) 15 C.F.R. § 744.9(a) (1997).

 

VII. A CRYPTO TRILOGY: BERNSTEIN, JUNGER & KARN

Case Cites

9th Circuit: The Bernstein Cases

Bernstein I: Bernstein v. U.S. Department of State, 922 F. Supp. 1426 (N.D. Cal. 1996) (Judge Marilyn Patel refused to dismiss Daniel Bernstein's complaint and held that source code was speech for purposes of a First Amendment challenge to the ITAR) (decided April 15, 1996)

Bernstein II: Bernstein v. U.S. Department of State, 945 F. Supp. 1279 (N.D. Cal. 1996) (Judge Patel declared that the ITAR and AECA were an unconstitutional prior restraint licensing scheme with regard to cryptographic items) (opinion issued on December 6, 1996)

Bernstein III: Bernstein v. U.S. Department of Commerce, 974 F. Supp. 1288 (N.D. Cal. 1997) (Judge Patel reprised her prior ITAR ruling with respect to the CCL and the EAR amendments) (opinion issued on August 25, 1997)

Bernstein IV: Bernstein v. U.S. Department of Justice, 176 F.3d 1132 (9th Cir. 1999) (opinion issued May 6, 1999) (Judge Betty Fletcher affirmed the District Court's finding that the EAR regulations were facially invalid as a prior restraint on speech; but note Judge T.G. Nelson's strong dissent arguing that encryption source code is not expression but a functional tool)

D.C. Circuit

Karn v. U.S. Department of State, 925 F. Supp. 1 (D.D.C. 1996) (Judge Charles Richey granted the U.S. Government's summary judgment motion on Karn's claims); Karn v. U.S. Department of State, 107 F.3d 923 (D.C. Cir. 1997) (declining to reach the merits or constitutional issues and remanding to the District Court to consider the reviewability of Karn's claims under the Administrative Procedure Act)

Northern District of Ohio & 6th Circuit

Junger v. Daley, 8 F. Supp. 2d 708 (N.D. Ohio 1998) (Judge James Gwin granted the U.S. Government's summary judgment motion; Professor Junger appealed this decision to the Sixth Circuit, and the briefs are available at: http://samsara.LAW.CWRU.Edu/comp_law/jvd); Junger v. Daley, 209 F.3d 481 (6th Cir. 2000) (opinion filed April 4, 2000, reversing the district court and remanding)

* * * * * * * * * * * * * * * * * * * * * * * * * * *

A. Questions to consider as you read through the following opinions:

Q: Are attempts to control the spread of strong encryption by the U.S. Government attempts to control "speech" or "conduct"?

Q: What does the idea that, in cyberspace, "the First Amendment is a local ordinance" mean for regulating strong cryptography?

Q: Some cyberpundits have floated the idea that the Internet "interprets censorship as damage and routes around it." What does that idea likewise mean for regulating strong cryptography?

Q: Is object code outside First Amendment protection? Should it be? What about source code?

Q: Are communication and functionality mutually exclusive?

Q: The EAR amendments make distinctions between printed source code (such as you would find in a book) and source code within electronic media -- should the choice of medium affect the scope of First Amendment protection?

Q: If the problem of geographic containment of strong encryption technology on the Internet can only be solved by forbidding the posting of such encryption on any computer connected to the Internet, are "ample alternatives" to the foreclosed UseNet outlets available, and are they really adequate substitutes?

Q: What does the notion of strong cryptography as a form of privacy "self-help" mean when copyright management schemes may be used to encrypt uncopyrightable materials, and when legal regimes make it a crime to tamper with such copyright management systems?

Q: What, if any, metaphors are helpful (or harmful) when discussing the Internet?

B. Brief Background on the Bernstein I, II, & III cases.

In 1992, Daniel Bernstein, then a graduate mathematics student at Berkeley, came up with an encryption algorithm he named "Snuffle." He prepared the "Snuffle" materials in two formats: first, a paper for publication, and second, computer source code that he wanted to post to the UseNet newsgroup "sci.crypt," a group devoted to discussions of cryptographic techniques. Bernstein knew, however, that he ran a risk of violating the ITAR if he made such a posting, and so in June 1992 he submitted a Commodity Jurisdiction Request to the Office of Defense Trade Controls, as required by the ITAR. The ODTC responded that it considered both his paper and the source code for "Snuffle" to be defense items as defined by Category XIII of the USML and, as such, subject to licensing by the U.S. State Department. Bernstein corresponded inconclusively with the ODTC about the rationale for this classification for a year.

In 1993, Bernstein, in an attempt to clarify exactly what was and was not classified as a defense item and therefore subject to export licensing, submitted five Commodity Jurisdiction Requests to the ODTC, covering: (1) his paper, entitled "The Snuffle Encryption System"; (2) the computer source code for the encryption program "Snuffle.c"; (3) the computer source code for the decryption program "Unsnuffle.c"; (4) a set of plain-English instructions on how to use "Snuffle"; and (5) instructions on how to program a computer to use "Snuffle." The ODTC determined that all five items were defense items and therefore subject to export licensing.

1. Bernstein v. U.S. Department of State, 922 F. Supp. 1426 (N.D. Cal. 1996) (Bernstein I)

In September 1993, Bernstein appealed the ODTC's Commodity Jurisdiction determinations within the agency, but after going more than a year without a response, he sued for declaratory and injunctive relief against ODTC enforcement of the AECA and the ITAR. Bernstein's claims were limited to constitutional claims, since the AECA foreclosed judicial review of administrative classifications. Bernstein claimed (1) that the ITAR constituted an impermissible content-based speech restriction; (2) that the export restrictions constituted an invalid prior restraint scheme; (3) that the AECA/ITAR scheme was unconstitutionally vague and overbroad; and (4) that the way the ITAR had been administered constituted an "abuse of discretion" under the Administrative Procedure Act.

The State Department moved to dismiss Bernstein's claim and also sent him a letter stating that his academic paper and the two sets of plain-English explanations were not subject to export control. The two pieces of computer source code, however, remained defense articles.

Judge Marilyn Patel of the Northern District of California denied the government's motion to dismiss and ruled that Bernstein indeed had a colorable constitutional claim. She also rejected the government's argument that computer source code was conduct, not speech, and as such not protected by the first amendment. Importantly, Judge Patel found that while source code may be functional (a set of instructions to a computer), it is also a language, and therefore expressive and protected under the first amendment.

2. Bernstein v. U.S. Department of State, 945 F. Supp. 1279 (N.D. Cal. 1996) (Bernstein II)

Eight months after denying the government's motion to dismiss, Judge Patel issued the court's final judgment. Her opinion focused on whether the AECA/ITAR scheme of export licensing constituted an unconstitutional prior restraint with respect to Bernstein's encryption source code. Applying the three-part test from Freedman v. Maryland, (1) Judge Patel held that the ITAR failed all three prongs, because Commodity Jurisdiction requests (1) had no clear time limit; (2) were not subject to judicial review; and (3) placed no burden on the ODTC to justify a license denial. While Judge Patel did not reach the question of whether the ITAR regulations were content-based or content-neutral, she did find that the ITAR definition of "export" was not impermissibly vague or overbroad.

3. Bernstein v. U.S. Department of Justice, 974 F. Supp. 1288 (N.D. Cal. 1997)(Bernstein III)

Three weeks after Bernstein II was released, the Clinton Administration shifted jurisdiction over encryption products from the U.S. State Department and the ITAR to the U.S. Commerce Department, with directions to draft interim EAR amendments. This sudden jurisdictional shift had the effect of rendering Bernstein II instantly moot: the source code for Snuffle, no longer enmeshed in the ITAR, was now subject to the EAR amendments. Judge Patel allowed Daniel Bernstein to amend his complaint to include the Commerce Department and the EAR amendments.

On August 25, 1997, almost five years after Daniel Bernstein first began to consider posting "Snuffle" to UseNet, the District Court released Bernstein III.

Judge Patel began the Bernstein III opinion by noting that there were few differences between the ITAR and the EAR: both sets of regulations relied on national security and foreign policy interests to justify or bypass first amendment considerations, and both acted as unconstitutional prior restraint/licensing schemes aimed at protected first amendment speech without important procedural safeguards. Citing Lakewood v. Plain Dealer Publishing Co., (2) Judge Patel was concerned with the standardless discretion over national security and foreign policy that the EAR amendments conferred on the BXA. (3) She was also concerned that, the "teaching exception" notwithstanding, the EAR amendments' explicit bar on providing technical assistance (including training) to foreign persons could reach activities such as teaching or discussing cryptography in an academic setting, thereby making "the most common expressive activities of scholars — teaching a class, publishing their ideas, speaking at conferences, or writing to colleagues over the Internet — . . . subject to a prior restraint . . . when they involve cryptographic source code or computer programs." (4) She found the printed matter exception (under which source code printed on paper could move without restriction, while the same source code on a computer disk or posted electronically would trigger the export restrictions) "so irrational and administratively unreliable that it may well serve to exacerbate the potential for self-censorship." (5) Finally, Judge Patel found the EAR amendments' distinction between print and electronic media untenable, particularly after the Supreme Court's opinion in Reno v. ACLU: (6) "Thus, the dramatically different treatment of the same materials depending on the medium by which they are conveyed is not only irrational, it may be impermissible under traditional First Amendment analysis." (7)

The U.S. Department of Justice appealed Bernstein III to the 9th Circuit; arguments were heard in December 1997, and on May 6, 1999 the 9th Circuit released the Bernstein IV opinion.

Endnotes to Section B.

(1) 380 U.S. 51, 58 (1965) (The Freedman test asks whether a licensing scheme such as the ITAR provides three procedural safeguards: (1) the licensing agency has to make its decision within a specified and reasonable period of time; (2) prompt judicial review must be available; and (3) the licensing agency/censor must bear the burden of justifying the denial of a license.)

(2) 486 U.S. 750 (1988) (striking down a city ordinance that granted the mayor the power to grant or deny permits for newspaper racks located on public property without any criteria to guide the decisionmaking process)

(3) Bernstein III, at 1308.

(4) Bernstein III, at 1305.

(5) Bernstein III, at 1306. But note the seemingly opposite holding in the Karn v. U.S. Department of State case, supra.

(6) Reno v. ACLU, 117 S. Ct. 2329, 2344 (1997) ("our cases provide no basis for qualifying the level of First Amendment scrutiny that should be applied to this medium")

(7) Bernstein III, at 1307.

C. The Cases

1. Bernstein v. Department of Justice (Bernstein IV) (opinion filed May 6, 1999, 9th Circuit)

2. Junger v. Daley (opinion filed April 4, 2000, 6th Circuit)

3. Karn v. U.S. Department of State (opinion filed March 22, 1996, District Ct. for the District of Columbia)

1. Bernstein IV (opinion filed May 6, 1999)

DANIEL J. BERNSTEIN, Plaintiff-Appellee, v. UNITED STATES DEPARTMENT OF JUSTICE, ET AL, 176 F.3d 1132 (9th Cir., May 6, 1999)

JUDGES: Before: Myron H. Bright (Senior United States Circuit Judge for the Eighth Circuit, sitting by designation), Betty B. Fletcher, and Thomas G. Nelson, Circuit Judges. Opinion by Judge B. Fletcher; Concurrence by Judge Bright; Dissent by Judge T.G. Nelson.

B. FLETCHER, Circuit Judge:

The government defendants appeal the grant of summary judgment to the plaintiff, Professor Daniel J. Bernstein ("Bernstein"), enjoining the enforcement of certain Export Administration Regulations ("EAR") that limit Bernstein's ability to distribute encryption software. We find that the EAR regulations (1) operate as a prepublication licensing scheme that burdens scientific expression, (2) vest boundless discretion in government officials, and (3) lack adequate procedural safeguards. Consequently, we hold that the challenged regulations constitute a prior restraint on speech that offends the First Amendment. Although we employ a somewhat narrower rationale than did the district court, its judgment is accordingly affirmed.

DISCUSSION

I. Prior Restraint

The parties and amici urge a number of theories on us. We limit our attention here, for the most part, to only one: whether the EAR restrictions on the export of encryption software in source code form constitute a prior restraint in violation of the First Amendment. We review de novo the district court's affirmative answer to this question. See Roulette v. Seattle, 97 F.3d 300, 302 (9th Cir. 1996).

It is axiomatic that "prior restraints on speech and publication are the most serious and least tolerable infringement on First Amendment rights." Nebraska Press Ass'n v. Stuart, 427 U.S. 539, 559, 49 L. Ed. 2d 683, 96 S. Ct. 2791 (1976). Indeed, the Supreme Court has opined that "it is the chief purpose of the [First Amendment] guaranty to prevent previous restraints upon publication." Near v. Minnesota, 283 U.S. 697, 713, 75 L. Ed. 1357, 51 S. Ct. 625 (1931). Accordingly, "any prior restraint on expression comes ... with a 'heavy presumption' against its constitutional validity." Organization for a Better Austin v. Keefe, 402 U.S. 415, 419, 29 L. Ed. 2d 1, 91 S. Ct. 1575 (1971). At the same time, the Supreme Court has cautioned that "the phrase 'prior restraint' is not a self-wielding sword. Nor can it serve as a talismanic test." Kingsley Books, Inc. v. Brown, 354 U.S. 436, 441, 1 L. Ed. 2d 1469, 77 S. Ct. 1325 (1957). We accordingly turn from "the generalization that prior restraint is particularly obnoxious" to a "more particularistic analysis." Id. at 442. The Supreme Court has treated licensing schemes that act as prior restraints on speech with suspicion because such restraints run the twin risks of encouraging self-censorship and concealing illegitimate abuses of censorial power. See Lakewood v. Plain Dealer Publishing Co., 486 U.S. 750, 759, 100 L. Ed. 2d 771, 108 S. Ct. 2138 (1988). As a result, "even if the government may constitutionally impose content-neutral prohibitions on a particular manner of speech, it may not condition that speech on obtaining a license or permit from a government official in that official's boundless discretion." Id. at 764 (emphasis in original). We follow the lead of the Supreme Court and divide the appropriate analysis into two parts. The threshold question is whether Bernstein is entitled to bring a facial challenge against the EAR regulations. See 486 U.S. at 755. If he is so entitled, we proceed to the second question: whether the regulations constitute an impermissible prior restraint on speech. See id. at 769.

A. Is Bernstein entitled to bring a facial attack?

A licensing regime is always subject to facial challenge (1) as a prior restraint where it "gives a government official or agency substantial power to discriminate based on the content or viewpoint of speech by suppressing disfavored speech or disliked speakers," and has "a close enough nexus to expression, or to conduct commonly associated with expression, to pose a real and substantial threat of ... censorship risks." Id. at 759.

The EAR regulations at issue plainly satisfy the first requirement - "the determination of who may speak and who may not is left to the unbridled discretion of a government official." 486 U.S. at 763. BXA administrators are empowered to deny licenses whenever export might be inconsistent with "U.S. national security and foreign policy interests." 15 C.F.R. § 742.15(b). No more specific guidance is provided. Obviously, this constraint on official discretion is little better than no constraint at all. See Lakewood, 486 U.S. at 769-70 (a standard requiring that license denial be in the "public interest" is an "illusory" standard that "renders the guarantee against censorship little more than a high-sounding ideal."). The government's assurances that BXA administrators will not, in fact, discriminate on the basis of content are beside the point. See id. at 770 (presumption that official will act in good faith "is the very presumption that the doctrine forbidding unbridled discretion disallows."). After all, "the mere existence of the licensor's unfettered discretion, coupled with the power of prior restraint, intimidates parties into censoring their own speech, even if the discretion and power are never actually abused." Id. at 757.

The more difficult issue arises in relation to the second requirement - that the challenged regulations exhibit "a close enough nexus to expression." We are called on to determine whether encryption source code is expression for First Amendment purposes. (2)

We begin by explaining what source code is. (3) "Source code," at least as currently understood by computer programmers, refers to the text of a program written in a "high-level" programming language, such as "PASCAL" or "C." The distinguishing feature of source code is that it is meant to be read and understood by humans and that it can be used to express an idea or a method. A computer, in fact, can make no direct use of source code until it has been translated ("compiled") into a "low-level" or "machine" language, resulting in computer-executable "object code." That source code is meant for human eyes and understanding, however, does not mean that an untutored layperson can understand it. Because source code is destined for the maw of an automated, ruthlessly literal translator - the compiler - a programmer must follow stringent grammatical, syntactical, formatting, and punctuation conventions. As a result, only those trained in programming can easily understand source code. (4)
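[Module note: the court's distinction between human-readable source code and machine-oriented compiled code can be loosely illustrated in Python, whose source text is compiled into bytecode before execution. The analogy and the example are the module editor's, not the court's:

import dis

source = "def double(x):\n    return 2 * x\n"  # text legible to a programmer
code = compile(source, "<example>", "exec")    # the translation step
dis.dis(code)                                  # prints the low-level bytecode

The source line is readable by anyone trained in the language; the disassembled bytecode it produces is addressed to the machine. The same text thus has both an expressive face and a functional one, which is precisely the difficulty the panel confronts.]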

Also important for our purposes is an understanding of how source code is used in the field of cryptography. Bernstein has submitted numerous declarations from cryptographers and computer programmers explaining that cryptographic ideas and algorithms are conveniently expressed in source code. (5) That this should be so is, on reflection, not surprising. As noted earlier, the chief task for cryptographers is the development of secure methods of encryption. While the articulation of such a system in layman's English or in general mathematical terms may be useful, the devil is, at least for cryptographers, often in the algorithmic details. By utilizing source code, a cryptographer can express algorithmic ideas with precision and methodological rigor that is otherwise difficult to achieve. This has the added benefit of facilitating peer review - by compiling the source code, a cryptographer can create a working model subject to rigorous security tests. The need for precisely articulated hypotheses and formal empirical testing, of course, is not unique to the science of cryptography; it appears, however, that in this field, source code is the preferred means to these ends.

Thus, cryptographers use source code to express their scientific ideas in much the same way that mathematicians use equations or economists use graphs. Of course, both mathematical equations and graphs are used in other fields for many purposes, not all of which are expressive. But mathematicians and economists have adopted these modes of expression in order to facilitate the precise and rigorous expression of complex scientific ideas. (6) Similarly, the undisputed record here makes it clear that cryptographers utilize source code in the same fashion. (7)

In light of these considerations, we conclude that encryption software, in its source code form (8) and as employed by those in the field of cryptography, must be viewed as expressive for First Amendment purposes, and thus is entitled to the protections of the prior restraint doctrine. If the government required that mathematicians obtain a prepublication license prior to publishing material that included mathematical equations, we have no doubt that such a regime would be subject to scrutiny as a prior restraint. The availability of alternate means of expression, moreover, does not diminish the censorial power of such a restraint - that Adam Smith wrote Wealth of Nations without resorting to equations or graphs surely would not justify governmental prepublication review of economics literature that contain these modes of expression.

The government, in fact, does not seriously dispute that source code is used by cryptographers for expressive purposes. Rather, the government maintains that source code is different from other forms of expression (such as blueprints, recipes, and "how-to" manuals) because it can be used to control directly the operation of a computer without conveying information to the user. In the government's view, by targeting this unique functional aspect of source code, rather than the content of the ideas that may be expressed therein, the export regulations manage to skirt entirely the concerns of the First Amendment. This argument is flawed for at least two reasons.

First, it is not at all obvious that the government's view reflects a proper understanding of source code. As noted earlier, the distinguishing feature of source code is that it is meant to be read and understood by humans, and that it cannot be used to control directly the functioning of a computer. While source code, when properly prepared, can be easily compiled into object code by a user, ignoring the distinction between source and object code obscures the important fact that source code is not meant solely for the computer, but is rather written in a language intended also for human analysis and understanding.

Second, and more importantly, the government's argument, distilled to its essence, suggests that even one drop of "direct functionality" overwhelms any constitutional protections that expression might otherwise enjoy. This cannot be so. (9) The distinction urged on us by the government would prove too much in this era of rapidly evolving computer capabilities. The fact that computers will soon be able to respond directly to spoken commands, for example, should not confer on the government the unfettered power to impose prior restraints on speech in an effort to control its "functional" aspects. The First Amendment is concerned with expression, and we reject the notion that the admixture of functionality necessarily puts expression beyond the protections of the Constitution.

The government also contends that the challenged regulations are immune from prior restraint analysis because they are "laws of general application" rather than being "directed narrowly and specifically at expression." Lakewood, 486 U.S. at 760-61. We cannot agree. Because we conclude that source code is utilized by those in the cryptography field as a means of expression, and because the regulations apply to encryption source code, it necessarily follows that the regulations burden a particular form of expression directly.

The Supreme Court in Lakewood explored what it means to be a "law of general application" for prior restraint purposes. In that case, the Court cited a law requiring building permits as a "law of general application" that would not be subject to a facial attack as a prior restraint, reasoning that such a law carried "little danger of censorship," even if it could be used to retaliate against a disfavored newspaper seeking to build a printing plant. Id. at 761. In the Court's view, "such laws provide too blunt a censorship instrument to warrant judicial intervention prior to an allegation of actual misuse." Id. Unlike a building permit ordinance, which would afford government officials only intermittent and unpredictable opportunities to exercise unrestrained discretion over expression, the challenged EAR regulations explicitly apply to expression and place scientific expression under the censor's eye on a regular basis. In fact, there is ample evidence in the record establishing that some in the cryptography field have already begun censoring themselves, for fear that their statements might influence the disposition of future licensing applications. See, e.g., National Research Council, Cryptography's Role in Securing the Information Society 158 (1996) ("Vendors contended that since they are effectively at the mercy of the export control regulators, they have considerable incentive to suppress any public expression of dissatisfaction with the current process."). In these circumstances, we cannot conclude that the export control regime at issue is a "law of general application" immune from prior restraint analysis. (10)

Because the prepublication licensing scheme challenged here vests unbridled discretion in government officials, and because it directly jeopardizes scientific expression, we are satisfied that Bernstein may properly bring a facial challenge against the regulations. (11) We accordingly turn to the merits.

B. Are the regulations an impermissible prior restraint?

"The protection even as to previous restraint is not absolutely unlimited." Near, 283 U.S. at 716. The Supreme Court has suggested that the "heavy presumption" against prior restraints may be overcome where official discretion is bounded by stringent procedural safeguards. See FW/PBS, 493 U.S. at 227 (plurality opinion of O'Connor, J.); Freedman v. Maryland, 380 U.S. 51, 58-59, 13 L. Ed. 2d 649, 85 S. Ct. 734 (1965); Kingsley Books, 354 U.S. at 442-43; 11126 Baltimore Blvd. v. Prince George's County, 58 F.3d 988, 995 (4th Cir. 1995) (en banc). As our analysis above suggests, the challenged regulations do not qualify for this First Amendment safe harbor. (12) In Freedman v. Maryland, the Supreme Court set out three factors for determining the validity of licensing schemes that impose a prior restraint on speech: (1) any restraint must be for a specified brief period of time; (2) there must be expeditious judicial review; and (3) the censor must bear the burden of going to court to suppress the speech in question and must bear the burden of proof. (13) See 380 U.S. at 58-60. The district court found that the procedural protections provided by the EAR regulations are "woefully inadequate" when measured against these requirements. Bernstein III, 974 F. Supp. at 1308. We agree.

Although the regulations require that license applications be resolved or referred to the President within 90 days, see 15 C.F.R. § 750.4(a), there is no time limit once an application is referred to the President. Thus, the 90-day limit can be rendered meaningless by referral. Moreover, if the license application is denied, no firm time limit governs the internal appeals process. See 15 C.F.R. § 756.2(c)(1) (Under Secretary "shall decide an appeal within a reasonable time after receipt of the appeal."). Accordingly, the EAR regulations do not satisfy the first Freedman requirement that a licensing decision be made within a reasonably short, specified period of time. See FW/PBS, 493 U.S. at 226 (finding that "a prior restraint that fails to place time limits on the time within which the decisionmaker must issue the license is impermissible"); Riley v. National Fed. of the Blind, 487 U.S. 781, 802, 101 L. Ed. 2d 669, 108 S. Ct. 2667 (1988) (licensing scheme that permits "delay without limit" is impermissible); Vance v. Universal Amusement Co., 445 U.S. 308, 315-17, 63 L. Ed. 2d 413, 100 S. Ct. 1156 (1980) (prior restraint of indefinite duration is impermissible). The EAR regulatory regime further offends Freedman's procedural requirements insofar as it denies a disappointed applicant the opportunity for judicial review. (14) See 15 C.F.R. § 756.2(c)(2); FW/PBS, 493 U.S. at 229 (plurality opinion of O'Connor, J.) (finding failure to provide "prompt" judicial review violates Freedman); Freedman, 380 U.S. at 59 (licensing procedure must assure a prompt final judicial decision).

We conclude that the challenged regulations allow the government to restrain speech indefinitely with no clear criteria for review. As a result, Bernstein and other scientists have been effectively chilled from engaging in valuable scientific expression. Bernstein's experience itself demonstrates the enormous uncertainty that exists over the scope of the regulations and the potential for the chilling of scientific expression. In short, because the challenged regulations grant boundless discretion to government officials, and because they lack the required procedural protections set forth in Freedman, we find that they operate as an unconstitutional prior restraint on speech. (15) See Lakewood, 486 U.S. at 769-772 (holding that newsrack licensing ordinance was an impermissible prior restraint because it conferred unbounded discretion and lacked adequate procedural safeguards).

C. Concluding comments.

We emphasize the narrowness of our First Amendment holding. We do not hold that all software is expressive. Much of it surely is not. Nor need we resolve whether the challenged regulations constitute content-based restrictions, subject to the strictest constitutional scrutiny, or whether they are, instead, content-neutral restrictions meriting less exacting scrutiny. We hold merely that because the prepublication licensing regime challenged here applies directly to scientific expression, vests boundless discretion in government officials, and lacks adequate procedural safeguards, it constitutes an impermissible prior restraint on speech.

We will, however, comment on two issues that are entwined with the underlying merits of Bernstein's constitutional claims. First, we note that insofar as the EAR regulations on encryption software were intended to slow the spread of secure encryption methods to foreign nations, the government is intentionally retarding the progress of the flourishing science of cryptography. To the extent the government's efforts are aimed at interdicting the flow of scientific ideas (whether expressed in source code or otherwise), as distinguished from encryption products, these efforts would appear to strike deep into the heartland of the First Amendment. In this regard, the EAR regulations are very different from content-neutral time, place and manner restrictions that may have an incidental effect on expression while aiming at secondary effects.

Second, we note that the government's efforts to regulate and control the spread of knowledge relating to encryption may implicate more than the First Amendment rights of cryptographers. In this increasingly electronic age, we are all required in our everyday lives to rely on modern technology to communicate with one another. This reliance on electronic communication, however, has brought with it a dramatic diminution in our ability to communicate privately. Cellular phones are subject to monitoring, email is easily intercepted, and transactions over the internet are often less than secure. Something as commonplace as furnishing our credit card number, social security number, or bank account number puts each of us at risk. Moreover, when we employ electronic methods of communication, we often leave electronic "fingerprints" behind, fingerprints that can be traced back to us. Whether we are surveilled by our government, by criminals, or by our neighbors, it is fair to say that never has our ability to shield our affairs from prying eyes been at such a low ebb. The availability and use of secure encryption may offer an opportunity to reclaim some portion of the privacy we have lost. Government efforts to control encryption thus may well implicate not only the First Amendment rights of cryptographers intent on pushing the boundaries of their science, but also the constitutional rights of each of us as potential recipients of encryption's bounty. Viewed from this perspective, the government's efforts to retard progress in cryptography may implicate the Fourth Amendment, as well as the right to speak anonymously, see McIntyre v. Ohio Elections Comm'n, 514 U.S. 334, 115 S. Ct. 1511, 1524, 131 L. Ed. 2d 426 (1995), the right against compelled speech, see Wooley v. Maynard, 430 U.S. 705, 714, 51 L. Ed. 2d 752, 97 S. Ct. 1428 (1977), and the right to informational privacy, see Whalen v. Roe, 429 U.S. 589, 599-600, 51 L. Ed. 2d 64, 97 S. Ct. 869 (1977). While we leave for another day the resolution of these difficult issues, it is important to point out that Bernstein's is a suit not merely concerning a small group of scientists laboring in an esoteric field, but also touches on the public interest broadly defined.

CONCLUSION

Because the prepublication licensing regime challenged by Bernstein applies directly to scientific expression, vests boundless discretion in government officials, and lacks adequate procedural safeguards, we hold that it constitutes an impermissible prior restraint on speech. We decline the invitation to line edit the regulations in an attempt to rescue them from constitutional infirmity, and thus endorse the declaratory relief granted by the district court.

AFFIRMED.

BRIGHT, Circuit Judge, separately concurring.

I join Judge Fletcher's opinion. I do so because the speech aspects of encryption source code represent communication between computer programmers. I do, however, recognize the validity of Judge Nelson's view that encryption source code also has the functional purpose of controlling computers and in that regard does not command protection under the First Amendment. The importance of this case suggests that it may be appropriate for review by the United States Supreme Court.

T.G. NELSON, Circuit Judge, Dissenting:

Bernstein was not entitled to bring a facial First Amendment challenge to the EAR, and the district court improperly granted an injunction on the basis of a facial challenge. I therefore respectfully dissent.

The basic error which sets the majority and the district court adrift is the failure to fully recognize that the basic function of encryption source code is to act as a method of controlling computers. As defined in the EAR regulations, encryption source code is "[a] precise set of operating instructions to a computer, that when compiled, allows for the execution of an encryption function on a computer." 15 C.F.R. pt. 722. Software engineers generally do not create software in object code - the series of binary digits (1's and 0's) - which tells a computer what to do because it would be enormously difficult, cumbersome and time-consuming. Instead, software engineers use high-level computer programming languages such as "C" or "Basic" to create source code as a shorthand method for telling the computer to perform a desired function. In this respect, lines of source code are the building blocks or the tools used to create an encryption machine. See e.g., Patrick Ian Ross, Bernstein v. United States Department of State, 13 Berkeley Tech. L.J. 405, 410-11 (1998) ("Electronic source code that is ready to compile merely needs a few keystrokes to generate object code - the equivalent of flipping an 'on' switch. Code used for this purpose can fairly easily be characterized as 'essentially functional.'"); Pamela Samuelson et al., A Manifesto Concerning Legal Protection of Computer Programs, 94 Colum. L. Rev. 2308, 2315-30 (1994) ("Programs are, in fact, machines (entities that bring about useful results, i.e., behavior) that have been constructed in the medium of text (source code and object code)."). Encryption source code, once compiled, works to make computer communication and transactions secret; it creates a lockbox of sorts around a message that can only be unlocked by someone with a key. It is the function or task that encryption source code performs which creates its value in most cases. This functional aspect of encryption source code contains no expression; it is merely the tool used to build the encryption machine.

This is not to say that this very same source code is not used expressively in some cases. Academics, such as Bernstein, seek to convey and discuss their ideas concerning computer encryption. As noted by the majority, Bernstein must actually use his source code textually in order to discuss or teach cryptology. In such circumstances, source code serves to express Bernstein's scientific methods and ideas.

While it is conceptually difficult to categorize encryption source code under our First Amendment framework, I am still inevitably led to conclude that encryption source code is more like conduct than speech. Encryption source code is a building tool. Academics and computer programmers can convey this source code to each other in order to reveal the encryption machine they have built. But, the ultimate purpose of encryption code is, as its name suggests, to perform the function of encrypting messages. Thus, while encryption source code may occasionally be used in an expressive manner, it is inherently a functional device.

We are not the first to examine the nature of encryption source code in terms of First Amendment protection. Judge Gwin of the United States District Court for the Northern District of Ohio also explored the function versus expression conundrum of encryption source code at some length in Junger v. Daley, 8 F. Supp. 2d 708 (N.D. Ohio 1998). Junger, like Bernstein, is a professor, albeit a law professor, who wished to publish in various forms his work on computers, including a textbook, Computers and the Law. The book was determined by the Government to be subject to export without a license, but his software programs were determined to come within the licensing provisions of the EAR. In the course of rejecting Junger's claims, the court said:

Like much computer software, encryption source code is inherently functional; it is designed to enable a computer to do a designated task. Encryption source code does not merely explain a cryptographic theory or describe how the software functions. More than describing encryption, the software carries out the function of encryption. The software is essential to carry out the function of encryption. In doing this function, the encryption software is indistinguishable from dedicated computer hardware that does encryption.

In the overwhelming majority of circumstances, encryption source code is exported to transfer functions, not to communicate ideas. In exporting functioning capability, encryption source code is like other encryption devices. For the broad majority of persons receiving such source code, the value comes from the function the source code does. Id. at 716.

The Junger decision [reversed by the Sixth Circuit in April 2000] thus adds considerable support for the propositions that encryption source code cannot be categorized as pure speech and that the functional aspects of encryption source code cannot be easily ignored or put aside.

Both the district court and the majority hold that because source code can be used expressively in some circumstances, Bernstein was entitled to bring a facial challenge to the EAR. Such an approach ignores the basic tenet that facial challenges are inappropriate "unless, at a minimum, the challenged statute 'is directed narrowly and specifically at expression or conduct commonly associated with expression.'" Roulette v. City of Seattle, 97 F.3d 300, 305 (9th Cir. 1996) (quoting City of Lakewood v. Plain Dealer Publishing Co., 486 U.S. 750, 760, 100 L. Ed. 2d 771, 108 S. Ct. 2138 (1988)). That encryption source code may on occasion be used expressively does not mean that its export is "conduct commonly associated with expression" or that the EAR regulations are directed at expressive conduct. See 97 F.3d at 303 ("The fact that sitting can possibly be expressive, however, isn't enough to sustain plaintiffs' facial challenge."); see also Junger, 8 F. Supp. 2d at 718 ("The prior restraint doctrine is not implicated simply because an activity may on occasion be expressive.").

The activity or conduct at issue here is the export of encryption source code. As I noted above, the basic nature of encryption source code lies in its functional capacity as a method to build an encryption device. Export of encryption source code is not conduct commonly associated with expression. Rather, it is conduct that is normally associated with providing other persons with the means to make their computer messages secret. The overwhelming majority of people do not want to talk about the source code and are not interested in any recondite message that may be contained in encryption source code. Only a few people can actually understand what a line of source code would direct a computer to do. Most people simply want to use the encryption source code to protect their computer communications. Export of encryption source code simply does not fall within the bounds of conduct commonly associated with expression such as picketing or handbilling. See Roulette, 97 F.3d at 303-04.

Further, the EAR regulates the export of encryption technology generally, whether it is software or hardware. See 15 C.F.R. § 742.15; Junger, 8 F. Supp. 2d at 718 ("The Export Regulations do not single out encryption software."). These regulations are directed at preventing the functional capacity of any encryption device, including its source code, from being exported without a government license. The EAR is not specifically directed towards stifling the expressive nature of source code or Bernstein's academic discussions about cryptography. This is demonstrated by the fact that the regulations do not object to publication in printed form of learned articles containing source code. See 15 C.F.R. § 734.3. Thus, the EAR is generally directed at non-expressive conduct - the export of source code as a tool to make messages secret and impervious to government eavesdropping capabilities.

Because this is a law of general application focused at conduct, Bernstein is not entitled to bring a facial challenge. The district court's injunction based upon the finding of a facial prior restraint is thus impermissible. This is not to say that Bernstein's activities would not be entitled to First Amendment protection, but that the legal path chosen to get that protection must be the correct one. We should be careful to "entertain[ ] facial freedom-of-expression challenges only against statutes that, 'by their terms,' sought to regulate 'spoken words,' or patently 'expressive or communicative conduct.'" Roulette, 97 F.3d at 303 (citing Broadrick v. Oklahoma, 413 U.S. 601, 612-13, 37 L. Ed. 2d 830, 93 S.Ct. 2908 (1973)). Bernstein may very well have a claim under an as-applied First Amendment analysis; however, such a claim must be left to the district court's determination in the first instance. Here, the district court did not rule on Bernstein's as-applied claims. I would therefore vacate the district court's injunction and remand for consideration of Bernstein's as-applied challenges to the EAR. Accordingly, I respectfully dissent.

Endnotes to Bernstein IV.

(1) In using the term "facial challenge" in the prior restraint context, the Supreme Court has meant two distinct things. First, if entitled to bring a facial challenge, a plaintiff need not apply for a license before challenging the licensing regime. See Lakewood, 486 U.S. at 755-56. This is a question of standing. Second, a litigant challenging an enactment on its face champions the rights of those not before the court and thus may attack the statute "whether or not his conduct could be proscribed by a properly drawn statute." Freedman v. Maryland, 380 U.S. 51, 56, 13 L. Ed. 2d 649, 85 S. Ct. 734 (1965); see also Secretary of State of Md. v. J. H. Munson Co., 467 U.S. 947, 957, 81 L. Ed. 2d 786, 104 S. Ct. 2839 (1984); Roulette, 97 F.3d at 303 n.3. This goes to the scope of the constitutional challenge.

(2) As an initial matter, we note that the fact that the regulations reach only "exports" does not reduce the burden on Bernstein's First Amendment rights. It is Bernstein's right to speak, not the rights of foreign listeners to hear, that we are concerned with here. The government does not argue, nor could it, that being cut off from a foreign audience, as distinguished from a domestic one, does not implicate First Amendment concerns. See Bullfrog Films, Inc. v. Wick, 847 F.2d 502, 509 n.9 (9th Cir. 1988). In addition, because the regulations define "export" to include the use of internet fora that may be accessible by foreign nationals, as well as domestic communications with foreign nationals, we think it plain that the regulations potentially limit Bernstein's freedom of speech in a variety of both domestic and foreign contexts. See Reno v. American Civ. Lib. Union, 521 U.S. 844, 117 S. Ct. 2329, 2348-49, 138 L. Ed. 2d 874 (1997) (rejecting government argument that restriction of expression on the internet is justified because ample alternative channels of communication exist).

(3) In undertaking this task, we are mindful that computer technology, and the lexicon of terms that accompanies it, is changing rapidly. Nevertheless, because the regulations speak in terms of "source code," we premise our discussion on the meaning commonly ascribed to this term by the programming community.

(4) It must be emphasized, however, that source code is merely text, albeit text that conforms to stringent formatting and punctuation requirements. For example, the following is an excerpt from Bernstein's Snuffle source code:

for (;;)
  {
    uch = gtchr();
    if (!(n & 31))
      {
        for (i = 0; i < 64; i++)
          ctr[i] = k[i] + h[n - 64 + i];
        Hash512(wm, wl, level, 8);
      }

As source code goes, Snuffle is quite compact; the entirety of the Snuffle source code occupies fewer than four printed pages.

(5) Source code's power to convey algorithmic information is illustrated by the declaration of MIT Professor Harold Abelson:

The square root of a number X is the number Y such that Y times Y equals X. This is declarative knowledge. It tells us something about square roots. But it doesn't tell us how to find a square root.

In contrast, consider the following ancient algorithm, attributed to Heron of Alexandria, for approximating square roots:

To approximate the square root of a positive number X,

- Make a guess for the square root of X.

- Compute an improved guess as the average of the guess and X divided by the guess.

- Keep improving the guess until it is good enough.

Heron's method doesn't say anything about what square roots are, but it does say how to approximate them. This is a piece of imperative "how to" knowledge.

Computer science is in the business of formalizing imperative knowledge - developing formal notations and ways to reason and talk about methodology. Here is Heron's method formalized as a procedure in the notation of the Lisp computer language:

(define (sqrt x)
  (define (good-enough? guess)
    (< (abs (- (square guess) x)) tolerance))
  (define (improve guess)
    (average guess (/ x guess)))
  (define (try guess)
    (if (good-enough? guess)
        guess
        (try (improve guess))))
  (try 1))
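[Module note: for readers who do not read Lisp, the same imperative knowledge can be rendered in Python. The rendering is the module editor's, with "tolerance" playing the same role as in Professor Abelson's example:

def heron_sqrt(x, tolerance=1e-9):
    # Approximate the square root of positive x by Heron's method.
    guess = 1.0
    while abs(guess * guess - x) >= tolerance:  # not yet good enough
        guess = (guess + x / guess) / 2         # average guess and x/guess
    return guess

print(heron_sqrt(2))  # approximately 1.414213562

Like Heron's recipe, the procedure says nothing about what a square root is; it only says how to approximate one.]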

(6) We are reminded of at least one occasion in which a judicial thinker resorted to a mathematical equation to express a legal principle. See United States v. Carroll Towing Co., 159 F.2d 169, 173 (2d Cir. 1947) (Judge Hand's famous B < PL formula to determine "when the absence of a bargee or other attendant will make the owner of the barge liable for injuries to other vessels if she breaks away from her moorings.").

(7) Bernstein's Snuffle, in fact, provides an illustration of this point. By developing Snuffle, Bernstein was attempting to demonstrate that a one-way hash function could be employed as the heart of an encryption method. The Snuffle source code, as submitted by Bernstein to the State Department, was meant as an expression of how this might be accomplished. The Source Code was plainly not intended as a completed encryption product, as demonstrated by the fact that it was incomplete and not in a form suitable for final compiling. The Source Code, in fact, omits the hash function entirely - until combined with such a function and compiled, Snuffle is incapable of performing encryption functions at all.

Snuffle was also intended, in part, as political expression. Bernstein discovered that the ITAR regulations controlled encryption exports, but not one-way hash functions. Because he believed that an encryption system could easily be fashioned from any of a number of publicly-available one-way hash functions, he viewed the distinction made by the ITAR regulations as absurd. To illustrate his point, Bernstein developed Snuffle, which is an encryption system built around a one-way hash function.
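[Module note: Bernstein's point that an encryption system could easily be fashioned from a publicly available one-way hash function can be made concrete with a toy construction. The sketch below is the module editor's, built on SHA-256 in a counter mode; it is emphatically not Snuffle itself, and it is not a vetted cipher:

import hashlib

def hash_stream_cipher(key, nonce, message):
    # Hash (key, nonce, counter) to derive keystream blocks, then XOR the
    # keystream with the message. Applying the function twice with the same
    # key and nonce decrypts. Illustration only; not for real use.
    keystream = bytearray()
    counter = 0
    while len(keystream) < len(message):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        keystream.extend(block)
        counter += 1
    return bytes(m ^ k for m, k in zip(message, keystream))

ciphertext = hash_stream_cipher(b"secret key", b"nonce-01", b"attack at dawn")
print(hash_stream_cipher(b"secret key", b"nonce-01", ciphertext))  # b'attack at dawn'

The cipher's entire "heart" is the hash function, which is exactly the regulatory distinction Snuffle was built to expose.]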

(8) We express no opinion regarding whether object code manifests a "close enough nexus to expression" to warrant application of the prior restraint doctrine. Bernstein's Snuffle did not involve object code, nor does the record contain any information regarding expressive uses of object code in the field of cryptography.

(9) If it were, we would have expected the Supreme Court to start and end its analysis of David Paul O'Brien's burning of his draft card with an inquiry into whether he was kept warm by the ensuing flames. See United States v. O'Brien, 391 U.S. 367, 20 L. Ed. 2d 672, 88 S. Ct. 1673 (1968).

(10) The government also argues that the EAR regulations are "laws of general application" because they are not purposefully aimed at suppressing any particular ideas that may be expressed in source code. With respect to this contention, the panel (including the dissenter) agree that the purpose of the regulations is irrelevant to prior restraint analysis. It is clear that a prior restraint analysis applies equally to content-neutral or content-based enactments. See FW/PBS, Inc. v. Dallas, 493 U.S. 215, 223, 107 L. Ed. 2d 603, 110 S. Ct. 596 (1990) (plurality opinion of O'Connor, J.) ("Because we conclude that the city's licensing scheme lacks adequate procedural safeguards, we do not reach ... whether the ordinance is properly viewed as a content-neutral time, place, and manner restriction. ..."); Lakewood, 486 U.S. at 764 ("Even if the government may constitutionally impose content-neutral prohibitions on a particular manner of speech, it may not condition that speech on obtaining a license or permit from a government official in that official's boundless discretion.") (emphasis in original). Indeed, where unbridled discretion is vested in a governmental official, it is difficult to know whether a licensing regime is content-based or content-neutral. Accordingly, the government's purpose in censoring encryption source code is, at this stage of our First Amendment inquiry, beside the point. In other words, a prepublication licensing regime that has a chilling and censorial effect on expression is properly subject to facial attack as a prior restraint, whatever the purpose behind its enactment. See Lakewood, 486 U.S. at 759 (upholding facial attack against newsrack ordinance because of censorial effects, without discussing governmental purpose for enacting the ordinance).

(11) It is at this juncture that we part ways with the dissent. The dissent concedes that source code can be expressive. Nevertheless, the dissent contends that Bernstein is not entitled to bring a facial attack against the EAR regulation. This argument, it seems to us, is based on two foundations.

First, the dissent conceives of the exchange of source code among scientists as "conduct." We disagree. The source code at issue here is text intended for human understanding, albeit in a specialized language. To say that the "export" of this text is "conduct" for First Amendment purposes, rather than straightforward scientific "expression," is to call into question all distribution and circulation of scientific texts that communicate ideas by using specialized languages. Of course, source code may be functional as well as expressive. We are not persuaded, however, that that fact transmogrifies the distribution of scientific texts from "expression" into "conduct" deserving of diminished First Amendment protection.

Having cast the question as one relating to "conduct," the dissent then takes a second step. Drawing from Lakewood, the dissent asks whether the "conduct" - the exchange of cryptographic source code - is "commonly associated with expression." This question the dissent answers in the negative; in other words, the dissent concludes that source code is not used expressively often enough. We find this conclusion somewhat perplexing, as there is nothing in the record to support it. Bernstein has introduced extensive expert evidence to support his contention that source code is frequently used for expressive purposes. The government, however, has failed to introduce anything into the record to rebut this evidence. In fact, the government has made it clear that it means to control the export of source code no matter how commonly associated it may be with expression: "Whatever ideas may be reflected in the software, or the intent of the exporter to convey ideas, the NSA recommends that encryption software be controlled for export solely on the basis of what it does. ..." Second Lowell Decl., Appellant's Excerpts of Record at 104.

(12) The Supreme Court has also suggested that the presumption against prior restraints may be overcome where publication would directly and imminently imperil national security. See New York Times Co. v. United States, 403 U.S. 713, 730, 29 L. Ed. 2d 822, 91 S. Ct. 2140 (1971) (Stewart, J., joined by White, J., concurring); Near, 283 U.S. at 716; see also United States v. The Progressive, Inc., 467 F. Supp. 990, 992 (W.D. Wisc. 1979). In order to justify a prior restraint on national security grounds, the government must prove the publication would "surely result in direct, immediate, and irreparable damage to our Nation or its people." New York Times, 403 U.S. at 730 (Stewart, J., joined by White, J., concurring); see also id. at 726-27 (Brennan, J., concurring) (finding that national security is a sufficient interest only where there is "governmental allegation and proof that publication must inevitably, directly, and immediately cause the occurrence of an event kindred to imperiling the safety of a transport already at sea"); Burch v. Barker, 861 F.2d 1149, 1155 (9th Cir. 1988) ("Prior restraints are permissible in only the rarest of circumstances, such as imminent threat to national security.").

The government does not argue that the prior restraint at issue here falls within the extremely narrow class of cases where publication would directly and immediately imperil national security.

(13) Whether all three Freedman factors apply to all prior restraints is the subject of dispute. Compare FW/PBS, 493 U.S. at 229-30 (plurality opinion of O'Connor, J.) (finding the government does not bear the burden of going to court to defend its licensing requirement where restrained speakers are likely to challenge the restraint in court) with id. at 239 (Brennan, J., concurring in judgment) ("We have never suggested that our insistence on Freedman procedures might vary with the particular facts of the prior restraint before us."). Because we conclude that the EAR regulations fail Freedman's first two procedural requirements, we need not reach the issue of whether the third Freedman factor applies in this case.

(14) As noted earlier, the BXA enjoys essentially unbounded discretion under the EAR regulations in administering the license process. Accordingly, even if the challenged regulations provided for judicial review, the lack of explicit limits on the decisionmaker's discretion would likely make such review meaningless. In this sense, the presence of unbounded discretion itself may be considered fatal for purposes of prior restraint review. See Lakewood, 486 U.S. at 769-70 (striking down a licensing scheme where the mayor could merely claim that the license "'is not in the public interest' when denying a permit application").

(15) Our conclusion relating to the Source Code also resolves the status of the regulations as applied to the Instructions. Because the Instructions are essentially a translation of the Source Code into English, they are, if anything, nearer the heartland of the First Amendment. Consequently, to the extent the challenged regulations are unconstitutional as applied to the Source Code, they necessarily are unconstitutional as applied to the Instructions.

  2. Junger v. Daley (opinion filed April 4, 2000, 6th Circuit)


PETER D. JUNGER, Plaintiff-Appellant, v. WILLIAM DALEY, United States Secretary of Commerce, et al., Defendants-Appellees.

JUDGES: Before: Martin, Chief Judge; Clay, Circuit Judge; and Weber, District Judge.

BOYCE F. MARTIN, JR., Chief Judge.

This is a constitutional challenge to the provisions of the Export Administration Regulations, 15 C.F.R. Parts 730-774, that regulate the export of encryption software. Peter D. Junger appeals the district court's grant of summary judgment in favor of Secretary Daley and the other defendants.

The district court found that encryption source code is not sufficiently expressive to be protected by the First Amendment, that the Export Administration Regulations are permissible content-neutral restrictions, and that the Regulations are not subject to a facial challenge as a prior restraint on speech. Subsequent to the district court's holding and the oral arguments before this Court, the Bureau of Export Administration issued an interim final rule amending the regulations at issue. See Revisions to Encryption Items, 65 Fed. Reg. 2492 (2000) (to be codified at 15 C.F.R. Parts 734, 740, 742, 770, 772, 774). Having concluded that the First Amendment protects computer source code, we reverse the district court and remand this case for further consideration of Junger's constitutional claims in light of the amended regulations.

FACTUAL BACKGROUND

Peter Junger is a professor at the Case Western Reserve University School of Law. Junger maintains sites on the World Wide Web that include information about courses that he teaches, including a course on computers and the law. Junger wishes to post on his web site encryption source code that he has written to demonstrate how computers work. Such a posting is defined as an export under the Regulations. On June 12, 1997, Junger submitted three applications to the Commerce Department, requesting determinations of commodity classifications for encryption software programs and other items. On July 4, the Export Administration told Junger that Classification Number 5D002 covered four of the five software programs he had submitted. Although it found that four programs were subject to the Regulations, the Export Administration found that the first chapter of Junger's textbook, Computers and the Law, was an allowable unlicensed export. Though deciding that the printed book chapter containing encryption code could be exported, the Export Administration stated that export of the book in electronic form would require a license if the text contained 5D002 software. Since receiving the classification determination, Junger has not applied for a license to export his classified encryption source code. Junger filed this action to make a facial challenge to the Regulations on First Amendment grounds, seeking declaratory and injunctive relief that would permit him to engage in the unrestricted distribution of encryption software through his web site. Junger claims that encryption source code is protected speech. The district court granted summary judgment in favor of the defendants, holding that encryption source code is not protected under the First Amendment, that the Regulations are permissible content-neutral regulations, and that the Regulations are not subject to facial challenge on prior restraint grounds.

We review the grant of summary judgment de novo. See Smith v. Wal-Mart Stores, Inc., 167 F.3d 286, 289 (6th Cir. 1999).

The issue of whether or not the First Amendment protects encryption source code is a difficult one because source code has both an expressive feature and a functional feature. The United States does not dispute that it is possible to use encryption source code to represent and convey information and ideas about cryptography and that encryption source code can be used by programmers and scholars for such informational purposes. Much like a mathematical or scientific formula, one can describe the function and design of encryption software by a prose explanation; however, for individuals fluent in a computer programming language, source code is the most efficient and precise means by which to communicate ideas about cryptography. The district court concluded that the functional characteristics of source code overshadow its simultaneously expressive nature. The fact that a medium of expression has a functional capacity should not preclude constitutional protection. Rather, the appropriate consideration of the medium’s functional capacity is in the analysis of permitted government regulation.
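
The court's point about precision can be made concrete with a trivial example of our own (nothing like it appears in the record): in English, "replace each letter with the letter thirteen places further along in the alphabet" leaves open how capitals, wrap-around, and punctuation are handled, while the C rendering settles each question unambiguously:

    #include <stdio.h>

    /* The idea in English: "replace each letter with the letter
     * thirteen places further along, wrapping around the alphabet."
     * The C below resolves what the prose leaves open: case handling,
     * wrap-around, and what happens to non-letters. */
    static char rot13(char c)
    {
        if (c >= 'a' && c <= 'z') return (char)('a' + (c - 'a' + 13) % 26);
        if (c >= 'A' && c <= 'Z') return (char)('A' + (c - 'A' + 13) % 26);
        return c;                    /* non-letters pass through */
    }

    int main(void)
    {
        const char *msg = "Hello";
        for (const char *p = msg; *p; p++)
            putchar(rot13(*p));      /* prints: Uryyb */
        putchar('\n');
        return 0;
    }

For a reader fluent in C, every design decision is stated exactly once and exactly where it operates, which is the sense in which the court describes source code as the most efficient and precise means of communicating ideas about cryptography.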

The Supreme Court has explained that "all ideas having even the slightest redeeming social importance," including those concerning "the advancement of truth, science, morality, and arts," have the full protection of the First Amendment. Roth v. United States, 354 U.S. 476, 484 (1957) (quoting 1 Journals of the Continental Congress 108 (1774)). This protection is not reserved for purely expressive communication. The Supreme Court has recognized First Amendment protection for symbolic conduct, such as draft-card burning, that has both functional and expressive features. See United States v. O'Brien, 391 U.S. 367 (1968). The Supreme Court has expressed the versatile scope of the First Amendment by labeling as "unquestionably shielded" the artwork of Jackson Pollock, the music of Arnold Schoenberg, or the Jabberwocky verse of Lewis Carroll. Hurley v. Irish-American Gay, Lesbian and Bisexual Group, 515 U.S. 557, 569 (1995). Though unquestionably expressive, these things identified by the Court are not traditional speech. Particularly, a musical score cannot be read by the majority of the public but can be used as a means of communication among musicians. Likewise, computer source code, though unintelligible to many, is the preferred method of communication among computer programmers. Because computer source code is an expressive means for the exchange of information and ideas about computer programming, we hold that it is protected by the First Amendment.

The functional capabilities of source code, and particularly those of encryption source code, should be considered when analyzing the governmental interest in regulating the exchange of this form of speech. Under intermediate scrutiny, the regulation of speech is valid, in part, if "it furthers an important or substantial governmental interest." O'Brien, 391 U.S. at 377. In Turner Broadcasting System v. FCC, 512 U.S. 622, 664 (1994), the Supreme Court noted that although an asserted governmental interest may be important, when the government defends restrictions on speech "it must do more than simply 'posit the existence of the disease sought to be cured.'" Id. (quoting Quincy Cable TV, Inc. v. FCC, 768 F.2d 1434, 1455 (D.C. Cir. 1985)). The government "must demonstrate that the recited harms are real, not merely conjectural, and that the regulation will in fact alleviate these harms in a direct and material way." Id. We recognize that national security interests can outweigh the interests of protected speech and require the regulation of speech. In the present case, the record does not resolve whether the exercise of presidential power in furtherance of national security interests should overrule the interests in allowing the free exchange of encryption source code.

Before any level of judicial scrutiny can be applied to the Regulations, Junger must be in a position to bring a facial challenge to these regulations.

In light of the recent amendments to the Export Administration Regulations, the district court should examine the new regulations to determine if Junger can bring a facial challenge.

For the foregoing reasons, we REVERSE the district court and REMAND the case to the district court for consideration of Junger’s constitutional challenge to the amended regulations.

3. Karn v. U.S. Department of State (opinion filed March 22, 1996); see also Karn v. U.S. Dept. of State, 107 F.3d 21 (Jan. 21, 1997) (remanding the appeal to the District Court to re-evaluate the claims in light of the jurisdictional transfer from the State Dept. to the Commerce Dept. made by Exec. Order 13026 (issued November 1996))

PHILIP R. KARN, JR., Plaintiff, v. U.S. DEPARTMENT OF STATE, and THOMAS B. MCNAMARA, Defendants, 925 F. Supp. 1 (D.D.C. March 22, 1996)

CHARLES R. RICHEY, J.

INTRODUCTION

This case presents a classic example of how the courts today, particularly the federal courts, can become needlessly invoked, whether in the national interest or not, in litigation involving policy decisions made within the power of the President or another branch of the government. The plaintiff, in an effort to export a computer diskette for profit, raises administrative law and meritless constitutional claims because he and others have not been able to persuade the Congress and the Executive Branch that the technology at issue does not endanger the national security. This is a "political question" for the two elected branches under Articles I and II of the Constitution.

The case arises out of the defendants' designation of the plaintiff's computer diskette as a "defense article" pursuant to the Arms Export Control Act (AECA), 22 U.S.C. §§ 2751-2796d, and the International Traffic in Arms Regulations (ITAR), 22 C.F.R. §§ 120-130. The plaintiff alleges that the defendants' designation of a diskette containing source codes for cryptographic algorithms (1) as a defense article subject to the export controls set forth in the ITAR, when the defendant deemed a book containing the same source codes not subject to said export controls, is arbitrary and capricious and an abuse of discretion in violation of the Administrative Procedure Act (APA), 5 U.S.C. § 706(2)(A). The plaintiff also raises a number of constitutional claims. Specifically, the plaintiff alleges that the defendants' regulation of the diskette violates the plaintiff's First Amendment right to freedom of speech and arbitrarily treats the diskette differently than the book in violation of the plaintiff's Fifth Amendment right to substantive due process.

The defendants move to dismiss the plaintiff's APA challenge based on a provision in the AECA precluding the judicial review of the designation of items as defense articles subject to the AECA. The defendants move for summary judgment on the ground that the plaintiff's First Amendment and Fifth Amendment rights have not been violated by the defendants' regulation of his computer diskette under the AECA and the ITAR.

Upon consideration of the filings by the parties, the entire record herein, the applicable law thereto, and for the reasons set forth below, the Court shall grant the defendants' Motion to Dismiss the plaintiff's APA claim as nonjusticiable, and the Court shall grant the defendants' Motion for Summary Judgment with respect to the plaintiff's First and Fifth Amendment claims.

BACKGROUND

On February 12, 1994, the plaintiff submitted to the Department of State a commodity jurisdiction request (2) for the book Applied Cryptography, by Bruce Schneier. The book Applied Cryptography provides, among other things, information on cryptographic protocols, cryptographic techniques, cryptographic algorithms, the history of cryptography, and the politics of cryptography. Part Five of Applied Cryptography contains source code for a number of cryptographic algorithms. This first commodity jurisdiction submission did not include "machine-readable media" such as a computer diskette or CD-ROM. Lowell Decl., Tab 4.

On March 2, 1994, in response to the plaintiff's commodity jurisdiction request, the Department of State's Office of Defense Trade Controls (ODTC) determined that the book is not subject to the jurisdiction of the Department of State pursuant to the ITAR. Joint St. ¶ 4. The ODTC's response explicitly stated, however, that this determination did not extend to the two diskettes referenced in the book and available from the author -- said disks apparently containing the encryption source code printed in Part Five of Applied Cryptography. Joint St. ¶ 5.

On March 9, 1994, the plaintiff submitted a commodity jurisdiction request for a diskette containing the source code printed in Part Five of Applied Cryptography (the "Karn diskette"). The request stated that "the diskette contains source code for encryption software that provides data confidentiality" and that "the software on this diskette is provided for those who wish to incorporate encryption into their applications." Joint St. ¶ 7. The ODTC responded, stating that the Karn diskette is subject to the jurisdiction of the Department of State pursuant to the ITAR and the AECA because the diskette "is designated as a defense article under category XIII(b)(1) of the United States Munitions List." Joint St. ¶ 8; Lowell Decl. ¶ 15, Tab 9.

Pursuant to procedures set forth in 22 C.F.R. § 120.4(g), the plaintiff appealed the commodity jurisdiction determination concerning the source code diskette to the Deputy Assistant Secretary of State by letter dated June 10, 1994. Joint St. ¶ 9. The Deputy Assistant Secretary denied the plaintiff's appeal by letter dated October 7, 1994. Joint St. ¶ 10. The plaintiff appealed this denial to the Assistant Secretary of State for Political-Military Affairs, defendant Thomas McNamara, by letter dated December 5, 1994. Defendant McNamara denied the plaintiff's appeal on June 13, 1995, reaffirming the earlier determinations that the Karn diskette "contains cryptographic software [and] is designated as a defense article under Category XIII(b)(1)." Joint St. ¶ 12; Lowell Decl. ¶ 22, Tab 14.

The plaintiff filed his Complaint in this Court on September 21, 1995, and the defendants filed their Motion to Dismiss or, in the Alternative, for Summary Judgment on November 15, 1995, containing the declarations of William P. Crowell, Deputy Director of the National Security Agency, and William J. Lowell, Director of the Office of Defense Trade Controls, Bureau of Political-Military Affairs, Department of State. The plaintiff filed an Opposition thereto on December 11, 1995, including the declarations of Barbara Tuthill, a secretary at Venable, Baetjer & Howard, Philip R. Zimmermann, a software developer, and plaintiff Philip R. Karn, Jr. The defendants filed a Reply on December 18, 1995. The plaintiff filed a Supplemental Memorandum on December 22, 1995, containing the supplemental declaration of plaintiff Karn, in order to correct alleged misstatements made by defendants. The defendants filed a Response on January 16, 1996.

I. THE COURT SHALL DISMISS THE PLAINTIFF'S APA CHALLENGE TO THE DEFENDANTS' DESIGNATION OF THE PLAINTIFF'S DISK AS A "DEFENSE ARTICLE" BECAUSE THE ARMS EXPORT CONTROL ACT PRECLUDES JUDICIAL REVIEW.

A. Pursuant to AECA, The President Designated The Karn Diskette As A "Defense Article" Subject To Export Regulation Under The ITAR.

The AECA authorizes the President to control the export of "defense articles":

The President is authorized to designate those items which shall be considered as defense articles and defense services for the purposes of this section and to promulgate regulations for the import and export of such articles and services. The items so designated shall constitute the United States Munitions List.

22 U.S.C. § 2778(a)(1). The President delegated this authority to the Secretary of State, who then promulgated the ITAR. See 22 C.F.R. § 120.1(a).

The ITAR contains 10 subparts, including "Purpose and definitions," contained in Part 120, and "The United States munitions list," contained in Part 121. (3) Part 121, containing the munitions list, describes those items designated "defense articles" by the Secretary of State. The descriptions of such defense articles do not include specific manufacturer names or highly detailed descriptions, but instead contain relatively general descriptions, such as the following: "Military tanks, combat engineer vehicles, bridge launching vehicles, half tracks and gun carriers"; (4) "Military training equipment including but not limited to attack trainers, radar target trainers ... and simulation devices related to defense articles"; (5) "Body armor specifically designed, modified or equipped for military use ..."; (6) and "Radar systems, with capabilities such as (i) Search, (ii) Acquisition, (iii) Tracking ...."(7) Likewise, the munitions list specifically addresses cryptographic systems and components and includes the following description:

(b) Information Security Systems and equipment, cryptographic devices, software, and components specifically designed or modified therefor, including: (1) Cryptographic ... systems, equipment, assemblies, modules, integrated circuits, components or software with the capability of maintaining secrecy or confidentiality of information or information systems .... 22 C.F.R. § 121.1, category XIII.

Part 120 of the ITAR provides assistance for interpreting the terms used in the munitions list. The "commodity jurisdiction procedure" is explained and provided for in this definitional section, and the ITAR directs that the ODTC use the procedure "if doubt exists as to whether an article or service is covered by the U.S. Munitions List." 22 C.F.R. § 120.4. As set forth previously, the plaintiff requested the use of this procedure for the book and the diskette. The ODTC determined that the diskette was covered by the munitions list but that the book was not.

The plaintiff argues that the diskette, like the book, is not a "defense article" covered by the munitions list. The plaintiff contends that pursuant to sections 125.1 (8) and 120.11, (9) the diskette is in the "public domain" and therefore is not subject to the ITAR. See Plaint's Opp. 19-21. However, the defendants contend that the diskette does not fall within the "public domain" exemption because said exemption only applies to "technical data" which, according to the defendants, does not include cryptographic software. See Defs' Reply 21-22 (citing 22 C.F.R. §§ 120.10(a)(4), (10) 120.11 and 121.8(f) (11)).

B. Section 2778(h) Of The Arms Export Control Act Precludes Judicial Review Of The Designation Of The Karn Diskette As A "Defense Article."

The AECA explicitly bars judicial review of the President's designation of an item as a defense article:

The designation by the President (or by an official to whom the President's functions under subsection (a) of this section have been duly delegated), in regulations issued under this section, of items as defense articles or defense services for purposes of this section shall not be subject to judicial review. 22 U.S.C. § 2778(h).

The plaintiff argues, however, that the Court should construe this provision so narrowly as to cover only the act of listing items on the munitions list contained in Part 121 of the ITAR and not the determination whether an item, in this case the plaintiff's diskette, is actually covered by the language of the munitions list pursuant to the definitional provisions contained in Part 120 of the ITAR. The plaintiff bases this argument upon the presumption in favor of judicial review and the language of § 2778(h) when read in conjunction with § 2778(a) of the AECA.

It is often stated that there is a presumption in favor of judicial review of agency action absent "clear and convincing evidence" of legislative intent to preclude it. Bowen v. Michigan Acad. of Family Phys., 476 U.S. 667, 671, 90 L. Ed. 2d 623, 106 S. Ct. 2133 (1986). However, the Supreme Court has cautioned that this standard "is not a rigid evidentiary test but a useful reminder to the courts that, where substantial doubt about the congressional intent exists, the general presumption favoring judicial review of administrative action is controlling." Block v. Community Nutrition Inst., 467 U.S. 340, 351, 81 L. Ed. 2d 270, 104 S. Ct. 2450 (1984). The presumption may be overcome, and the appropriate standard for determining whether "a particular statute precludes judicial review is determined not only from its express language, but also from the structure of the statutory scheme, its objectives, its legislative history, and nature of the administrative action involved." Id. at 345.

Section 2778(h) expressly bars judicial review of the President's (or his designee's -- in this case, the Secretary of State's and/or ODTC's) designation of items as defense articles. In the case at bar, the Department of State designated the Karn diskette as a defense article by listing "cryptographic software" in Part 121 of the ITAR and then confirming that the diskette was covered by this description pursuant to the definitional provisions of the ITAR contained in § 120.4. The plaintiff argues, however, that because § 2778(a)(1) authorizes the designation of "defense articles and .... the items so designated shall constitute the United States Munitions List," and because the bar to judicial review appearing in § 2778(h) only applies to "the designation ... in regulations ... of items as defense articles," judicial review is precluded only for the act of listing items as defense articles on the munitions list published in Part 121 of the ITAR. Therefore, according to the plaintiff, any interpretation and application of the descriptions on the munitions list by the agency are reviewable. See Plaint's Opp. at 36-37.

The Court finds the plaintiff's reading strained and unreasonable. It is far more reasonable to read � 2778(a)(1) and (h) to preclude judicial review for the designation of items as defense articles pursuant to the language of the munitions list and the procedures provided for interpreting the list, all set forth in the ITAR -- in other words, if the defendants follow the procedures set forth in the ITAR and authorized by the AECA for designating an item as a defense article, such item is a part of the munitions list. The defendants did precisely that. Furthermore, even if the plaintiff's reading of the statute were plausible, the Court finds that any ambiguity would be dispelled by the objective of the AECA, the structure of the United States export scheme, and the nature of the plaintiff's challenge.

To parse the statute as the plaintiff suggests makes little sense in light of the objectives of the AECA. The AECA was enacted to permit the Executive Branch to control the export and import of certain items in order to further "world peace and the security and foreign policy" of the United States. 22 U.S.C. § 2778(a)(1). Designating an export such that it is subject to the AECA and the ITAR requires first describing the type of item in the regulations, and second, if asked by a potential exporter, confirming that the item in question is or is not covered by such description. The commodity jurisdiction procedure provides the latter function, as provided for explicitly both in the definitional section of the ITAR and in the munitions list with respect to cryptographic software. See 22 C.F.R. §§ 120.4 and 121.1, category XIII(b)(1), Note. Determining whether an item is covered by the munitions list is critical to the President's ability to designate and control the export of those items the Executive Branch considers to be defense articles. Simply put, the Court discerns from the legislative scheme that Congress has precluded judicial review of the commodity jurisdiction procedure.

Judicial non-reviewability of the defendants' commodity control decision is also consistent with the structure of the United States' export control scheme. Items not regulated by the ITAR but which have both commercial and potential military application -- "dual use" items -- are regulated by the Secretary of Commerce pursuant to the Export Administration Regulations (EAR), 15 C.F.R. §§ 768-799, which were promulgated pursuant to the Export Administration Act (EAA), 50 U.S.C. App. §§ 2401-20. Like the munitions list contained in Part 121 of the ITAR, the EAR contains a description of items, the commodity control list (CCL), subject to the licensing requirements of the EAR. See 15 C.F.R. part 799. Similarly, the EAA also contains a judicial review prohibition. Section 2412(a) of the EAA states that "the functions exercised under this Act are excluded from the operation of [the APA.]" In a criminal appeal before the Ninth Circuit, brought by a defendant who exported laser mirrors without obtaining a license from the Secretary of Commerce pursuant to the EAA and EAR, the court held that section 2412(a) of the EAA precluded judicial review of the Secretary of Commerce's determination that the type of laser mirrors exported by the defendant were in fact covered by the language of the CCL. See United States v. Spawr Optical Research, Inc., 864 F.2d 1467 (9th Cir. 1988), cert. denied, 493 U.S. 809, 107 L. Ed. 2d 20, 110 S. Ct. 51 (1989). In other words, under the EAR and EAA judicial review is precluded for the act of describing items on the CCL and for the act of determining whether an item is covered by the CCL descriptions. (12)

In addition to the obvious analogy between the schemes of the EAA and the AECA, these statutes are in fact part of a singular export scheme, in that the commodity jurisdiction procedure set forth in § 120.4 of the ITAR not only determines whether an item is a defense article covered by the munitions list, it also determines whether an item is covered by the EAR. See Lowell Decl., Tab 2. In fact, Part 121 of the ITAR, containing the published munitions list which the plaintiff concedes is not subject to judicial review, states the following:

NOTE: A procedure has been established to facilitate the expeditious transfer to the Commodity Control List of mass market software products with encryption that meet specified criteria regarding encryption for the privacy of data ... Requests to transfer commodity jurisdiction of mass market software products designed to meet the specified criteria may be submitted in accordance with the commodity jurisdiction provisions of § 120.4.

22 C.F.R. § 121.1, Category XIII(b)(1). To achieve consistency between the EAA and the AECA, the Court concludes that decisions made pursuant to the commodity jurisdiction procedure should not be reviewable. Furthermore, considering the deference afforded the President in matters of foreign policy, it would be strange indeed if Congress precluded judicial review of the determination that an item has merely a potential for military use (i.e., subject to the commodity control list), but permitted review of the determination that an item was in fact a defense article (i.e., subject to the munitions list).

The Court has reviewed House Report No. 101-296, Senate Report No. 101-173, and the Congressional Record for legislative history regarding the Anti-Terrorism and Arms Export Amendments Act of 1989, which added § 2778(h) to the AECA. The legislative history regarding § 2778(h) is scant. However, one of the stated purposes of the Act -- to eliminate "overlapping standards that lead to confusion and misinterpretation," including "identifying which arms are subject to restrictions," see S. Rep. No. 173, 101st Cong., 1st Sess. 2 (1989); see also H.R. Rep. No. 296, 101st Cong., 1st Sess. 3 (1989) -- further supports finding consistency between the AECA and EAA.

Finally, the plaintiff's challenge is not of a nature that commands a heightened presumption in favor of judicial review. Courts have held that "the presumption of judicial review is particularly strong" where the plaintiff alleges that the agency facially violated its authority delegated under the statute. Dart v. United States, 270 U.S. App. D.C. 160, 848 F.2d 217, 223 (D.C. Cir. 1988). One rationale for this general rule is that such "facial" challenges typically raise "a discrete issue, unrelated to the facts of the case, that only needs to be resolved once," and therefore, entertaining the challenge does not "open the floodgates to litigation." Id.; see also Bowen v. Michigan Acad. of Family Phys., 476 U.S. 667, 677, 680 n.11, 90 L. Ed. 2d 623, 106 S. Ct. 2133 (1986); Johnson v. Robison, 415 U.S. 361, 370, 39 L. Ed. 2d 389, 94 S. Ct. 1160 (1974). Another rationale for presuming the reviewability of such facial challenges is that, "when Congress limits its delegation of power, courts infer (unless the statute clearly directs otherwise) that Congress expects this limitation to be judicially enforced." Dart, 848 F.2d at 223.

In the case at bar, the plaintiff's APA claim does not raise a facial challenge to the agency's action in the context of the agency's statutory authority, (13) but instead disputes whether the Karn diskette constitutes a defense article subject to the licensing restrictions of the ITAR. Permitting judicial review of the plaintiff's APA claim would in fact open the floodgates to litigation; every time a potential exporter is informed through the commodity jurisdiction procedure that the item he wishes to export (or has already exported) is designated in the munitions list, the exporter could seek judicial review of that decision.

Based on the authorities cited in the plaintiff's Opposition, the plaintiff maintains that it is incumbent upon courts to essentially torture the language of finality provisions in order to permit judicial review whenever possible. See Plaint's Opp. 34-38, 41-42. Each of the cases cited by the plaintiff involved either legislative history (14) or a statutory scheme (15) that raised substantial doubt as to whether Congress intended to preclude judicial review. Moreover, some of the cases involved a facial challenge (16) to a statute or to the agency's action. Thus, the Court finds that the authorities relied upon by the plaintiff are inapposite to the circumstances presented in the present case. For the reasons discussed above (i.e., the express language of the AECA, the arms export control scheme, and the nature of the plaintiff's challenge) the Court holds that Congress has barred judicial review.

II. THE DEFENDANTS ARE ENTITLED TO SUMMARY JUDGMENT AS A MATTER OF LAW ON THE PLAINTIFF'S FIRST AND FIFTH AMENDMENT CLAIMS.

Although the Court holds that the AECA precludes judicial review of the Department of State's determination that the Karn diskette is a designated defense article subject to the ITAR, the plaintiff also asserts that the regulation of the diskette violates the plaintiff's First and Fifth Amendment rights. Both parties agree that such constitutional challenges are not barred by § 2778(h). See Webster v. Doe, 486 U.S. 592, 602-05, 100 L. Ed. 2d 632, 108 S. Ct. 2047 (1988). Accordingly, the Court will now proceed to address the defendants' Motion for Summary Judgment with respect to these claims.

A party is entitled to summary judgment when there are no material facts in dispute and its position is correct as a matter of law. Anderson v. Liberty Lobby, Inc., 477 U.S. 242, 247-48, 91 L. Ed. 2d 202, 106 S. Ct. 2505 (1986); Celotex v. Catrett, 477 U.S. 317, 321-22, 91 L. Ed. 2d 265, 106 S. Ct. 2548 (1986). Material facts are those "facts that might affect the outcome of the suit under the governing law ...." Anderson, 477 U.S. at 248. The Court finds that there are no material facts in dispute with respect to the plaintiff's First and Fifth Amendment claims, and for the reasons set forth below, the defendants are entitled to summary judgment as a matter of law.

A. The Court Shall Grant The Defendants' Motion For Summary Judgment On The Plaintiff's First Amendment Claim Because The Regulation Is Content-Neutral And Meets The O'Brien Test.

1. Regulation Of The Diskette Is Subject To The O'Brien Test Because The Governmental Interest At Stake Is Unrelated To The Content Of Any Protected Speech Contained On The Diskette.

The plaintiff contends that the defendants' regulation of the Karn diskette (17) constitutes a restraint on free speech in violation of the plaintiff's First Amendment rights. The plaintiff argues the diskette should be considered "speech" for the purpose of First Amendment analysis because the computer language source codes contained on the diskette are comprehensible to human beings when viewed on a personal computer, because the diskette contains "comments" interspersed throughout the source code which are useful only to a human and are ignored by the computer, and because the source code and comments taken together teach humans how to speak in code. (18)
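
A hypothetical fragment (ours, not drawn from the Karn diskette) illustrates the division the plaintiff describes: the comments address only the human reader and are discarded by the compiler, while the statements alone direct the machine:

    #include <stdio.h>

    int main(void)
    {
        /* Everything between these comment markers is for the human
         * reader only; the compiler strips it before any machine
         * instructions are produced. The ciphertext below was made
         * with a Caesar cipher using a key of 1, so decryption means
         * shifting each letter back one place. */
        const char *ciphertext = "ibm";
        for (const char *p = ciphertext; *p; p++)
            putchar(*p - 1);         /* shift back by one: prints hal */
        putchar('\n');
        return 0;
    }

Strip the comments and the program behaves identically, which is why the parties could plausibly dispute whether the expressive value resides in the comments, the code, or both.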

As a threshold matter, for the purpose of addressing the dispositive issue whether the regulation is justified and permissible, the Court will assume that the protection of the First Amendment extends to the source code (19) and the comments on the plaintiff's diskette. (20) The Supreme Court has described the First Amendment right to free speech as that which "generally prevents the government from proscribing speech because of disapproval of the ideas expressed." R.A.V. v. City of St. Paul, 505 U.S. 377, 112 S. Ct. 2538, 2542, 120 L. Ed. 2d 305 (1992). Assuming the source codes and comments are within the arena of protected speech, the Court must then determine the basis for the regulation at issue in this case.

The rationale for a regulation determines the level of scrutiny to be applied to said regulation; if the regulation is content-based, the regulation will be "presumptively invalid," whereas if the regulation is content-neutral, then the government may justify the regulation if certain other criteria are met. 112 S. Ct. at 2542-54. These additional criteria -- whether the regulation is (1) within the constitutional power of the government, (2) "furthers an important or substantial governmental interest," and (3) is narrowly tailored to the governmental interest -- have been referred to as the O'Brien test after the Supreme Court upheld the government's prohibition against burning draft cards based on these criteria in United States v. O'Brien, 391 U.S. 367, 20 L. Ed. 2d 672, 88 S. Ct. 1673 (1968).

The plaintiff disputes this characterization of the law, arguing that the nature of the matter regulated (e.g., whether "conduct" or "pure speech"), as opposed to the rationale for the regulation, actually dictates the level of scrutiny to be applied. The plaintiff submits that the O'Brien criteria are inapplicable because they apply only to the regulation of "conduct," and that the Karn diskette is "pure speech," the regulation of which should require strict scrutiny review. The Court disagrees, as the plaintiff's argument places form over substance. Pursuant to extensive First Amendment jurisprudence, the government's rationale for the regulation controls, regardless of the form of the speech or expression regulated. See Ward v. Rock Against Racism, 491 U.S. 781, 791, 105 L. Ed. 2d 661, 109 S. Ct. 2746 (1989) ("Time, place, and manner" restriction on music permitted where, among other things, regulation was content-neutral); Clark v. Community for Creative Non-Violence, 468 U.S. 288, 298, 82 L. Ed. 2d 221, 104 S. Ct. 3065 (1984) (Standard for evaluating expressive conduct, including requirement that regulation be content-neutral, "is little, if any, different from standard applied to time, place, or manner restrictions"); O'Brien, 391 U.S. at 377 (Government prohibition against burning of draft cards sufficiently justified if, among other things, "the governmental interest is unrelated to the suppression of free expression"). Accordingly, it is unnecessary for the Court to make any finding regarding the nature of the matter contained on the Karn diskette.

The government regulation at issue here is clearly content-neutral. The defendants' rationale for regulating the export of the diskette is that "the proliferation of [cryptographic hardware and software] will make it easier for foreign intelligence targets to deny the United States Government access to information vital to national security interests." Crowell Decl. at 3. The defendants are not regulating the export of the diskette because of the expressive content of the comments and/or source code, but instead are regulating because of the belief that the combination of encryption source code and machine-readable media will make it easier for foreign intelligence sources to encode their communications. The government considers the cryptographic source code contained on machine-readable media as cryptographic software and has chosen to regulate it accordingly.

The plaintiff does not dispute this motive for regulating the export of the diskette, but instead questions the logic of such a motive in light of the plaintiff's allegations that, without compiling the source code and without further programming, the Karn diskette does not perform a cryptographic function, and that there is no actual danger to national security because the source codes can be obtained abroad through the book or on the Internet. Such issues are not material to the determination of content neutrality. In this case, the plaintiff has not presented any evidence to suggest the bad faith of the government, or that the government's expressed motive is a pretense. Accordingly, the Court finds that the rationale expressed by the government is content-neutral and the regulation is subject to the standards set forth by the Supreme Court in O'Brien.

2. The Regulation Of The Diskette Meets The O'Brien Test Because It Is Within The Power Of The Government To Control The Export Of Defense Articles, It Furthers The Significant Governmental Interest Of Preventing The Proliferation Of Cryptographic Products, And It Is Narrowly Tailored To Meet That Interest.

As stated previously, a content-neutral regulation is justified under the O'Brien test if it is within the constitutional power of the government, it "furthers an important or substantial governmental interest," and "the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest." O'Brien, 391 U.S. at 377. The plaintiff does not dispute that regulating the export of cryptographic software is within the constitutional power of the government. (21) Nor does the plaintiff expressly dispute the second requirement, that the government has an important interest at stake. The defendants have submitted evidence, which the plaintiff does not dispute, stating that the interception of communication made by foreign intelligence targets is "essential to the national defense, national security, and the conduct of the foreign affairs of the United States." Crowell Decl. at 3. In the context of this factual backdrop, the defendants have expressed the following government interest for justifying the regulation of the plaintiff's diskette: "the proliferation of cryptographic products will make it easier for foreign intelligence targets to deny the United States Government access to information vital to national security interests." Crowell Decl. at 3.

The plaintiff argues instead that the third prong of the O'Brien test is not satisfied because the cryptographic algorithms contained on the Karn diskette are "already widely available in other countries [through the Internet and other sources] or are so 'weak' that they can be broken by the [National Security Agency]." (22) Plaint's Opp. 15-16. Although the plaintiff has labeled his argument as one concerning the "narrowly tailored restriction" requirement of the O'Brien test, the plaintiff's argument implicates the second O'Brien requirement by questioning whether the government has a legitimate interest at stake. Indeed, the plaintiff contends that his argument constitutes a factual dispute with the defendants, making the plaintiff's First Amendment claim inappropriate for summary judgment as a matter of law.

The Court does not agree. The plaintiff attempts to disguise a disagreement with the foreign policy judgment of the President as a factual dispute. By placing cryptographic products on the munitions list pursuant to the ITAR, the President has determined that the proliferation of cryptographic products will harm the United States. This policy judgment exists despite the availability of cryptographic software through the Internet and the National Security Agency's alleged ability to break certain codes. Even if this were a factual dispute, it is not one into which this Court can or will delve. The Court will not scrutinize the President's foreign policy decision. As the Supreme Court stated in Chicago & Southern Air Lines v. Waterman SS. Corp., 333 U.S. 103, 92 L. Ed. 568, 68 S. Ct. 431 (1948), such decisions:

are delicate, complex, and involve large elements of prophecy. They are and should be undertaken only by those directly responsible to the people whose welfare they advance or imperil. They are decisions of a kind for which the Judiciary has neither aptitude, facilities nor responsibility and which has long been held to belong in the domain of political power not subject to judicial intrusion or inquiry.

333 U.S. at 111. The plaintiff also suggests that the Court balance any First Amendment harms created through regulation of the diskette against the injury caused to national security if the export of the diskette were not regulated. See Plaint's Opp. 16. However, unlike Tinker v. Des Moines Independent Community School District, 393 U.S. 503, 508, 21 L. Ed. 2d 731, 89 S. Ct. 733 (1969), where the Supreme Court applied an ad hoc balancing test, such a test in the case at bar would require the Court to scrutinize the actual injury to national security. Again, the Court declines to do so. See United States v. Mandel, 914 F.2d 1215, 1223 (9th Cir. 1990) ("Whether the export of a given commodity would make a significant contribution to the military potential of other countries ... is a political question not subject to review to determine whether [it] had a basis in fact"); United States v. Martinez, 904 F.2d 601, 602 (11th Cir. 1990) ("The question whether a particular item should have been placed on the Munitions List possesses nearly every trait that the Supreme Court has enumerated traditionally renders a question 'political'"). Furthermore, the plaintiff cannot genuinely dispute that, absent the restriction on the export of cryptographic products and the plaintiff's diskette, the actual number of cryptographic products (23) available to foreign intelligence sources will be greater.

Finally, the plaintiff has not advanced any argument that the regulation is "substantially broader than necessary" to prevent the proliferation of cryptographic products. City Council of Los Angeles v. Taxpayers for Vincent, 466 U.S. 789, 808, 80 L. Ed. 2d 772, 104 S. Ct. 2118 (1984). Nor has the plaintiff articulated any present barrier to the spreading of information on cryptography "by any other means" other than those containing encryption source code on machine-readable media. Clark, 468 U.S. at 295. Therefore, the Court holds that the regulation of the plaintiff's diskette is narrowly tailored to the goal of limiting the proliferation of cryptographic products and that the regulation is justified.

3. The Court Shall Deny The Plaintiff's First Amendment Claim Regarding The "Technical Data" Provisions Of The ITAR Because The Plaintiff Does Not Have Standing And The Defendants Have Limited Their Application Of Said Provisions.

As a last-ditch argument, the plaintiff contends that the Court should invalidate the ITAR under the First Amendment because certain provisions of the ITAR regulating the export of "technical data," as defined in 22 C.F.R. § 120.10, constitute "an unconstitutional system of vague prior restraints." The plaintiff bases his argument on internal government memoranda that express concern regarding the possible overbreadth and vagueness of said provisions.

The plaintiff has no standing to bring this claim regarding the unconstitutionality of the "technical data" provisions. (24) The defendants have not applied the "technical data" restrictions to the plaintiff -- in fact, the crux of the plaintiff's APA claim is that although the plaintiff believes the diskette is exempted from the ITAR because it is in the public domain, the defendants have arbitrarily concluded that the public domain exemption does not apply because the diskette is not "technical data." The Court will not accept the plaintiff's invitation to address the constitutionality of the "technical data" provisions of the ITAR where the plaintiff's injury is not causally connected to said provisions. See Lujan v. Defenders of Wildlife, 504 U.S. 555, 112 S. Ct. 2130, 2136, 119 L. Ed. 2d 351 (1992) (Court held that case-or-controversy standing required, among other things, that the plaintiff must have suffered an injury causally connected to the challenged action of the defendant). While courts have departed from traditional rules of standing with respect to certain First Amendment claims, claims of facial overbreadth and vagueness are rarely entertained with respect to content-neutral regulations. See Broadrick v. Oklahoma, 413 U.S. 601, 613, 37 L. Ed. 2d 830, 93 S. Ct. 2908 (1973).

Furthermore, the plaintiff's overbreadth concerns are not genuine. In United States v. Edler Indus., 579 F.2d 516 (9th Cir. 1978), the Ninth Circuit addressed the argument that the definition of "technical data" in the ITAR is "susceptible to an overbroad interpretation." Id. at 520. The court chose to read the technical data provision narrowly to avoid finding a constitutional violation. Id. at 521. The Department of State has since limited its application of the provision in practice to the interpretation expressed by the court in Edler Indus. See Preamble to Revisions of International Traffic in Arms Regulations (Final Rule), 49 Fed. Reg. 47682, 47683 (Dec. 6, 1984). In evaluating the plaintiff's overbreadth claim, the Court "'must ... consider any limiting construction that a ... court or enforcement agency has proffered.'" Ward, 491 U.S. at 796 (quoting Hoffman Estates v. The Flipside, Hoffman Estates, Inc., 455 U.S. 489, 494 n.5, 71 L. Ed. 2d 362, 102 S. Ct. 1186 (1982)). Accordingly, based on the plaintiff's lack of standing, and in light of the limitations to the "technical data" provisions adopted by the Department of State, the plaintiff cannot prevail on his First Amendment claim that the ITAR is vague and overbroad.

B. REGULATING THE EXPORT OF THE PLAINTIFF'S DISKETTE IS RATIONAL AND, ACCORDINGLY, DOES NOT VIOLATE THE SUBSTANTIVE DUE PROCESS RIGHTS GUARANTEED THE PLAINTIFF UNDER THE FIFTH AMENDMENT OF THE CONSTITUTION.

As stated previously, § 2778(h) of the AECA precludes the APA claim asserted by the plaintiff, but it cannot bar a constitutional attack. Recognizing the significant possibility that this Court might hold the plaintiff's "arbitrary and capricious" challenge under the APA nonjusticiable, see Part I of this Memorandum Opinion, the plaintiff asserts the same "arbitrary and capricious" challenge under the legal theory that the defendants' actions violated his right to substantive due process as guaranteed by the Fifth Amendment. However, the plaintiff may not backdoor his APA claim through the Fifth Amendment as that would render the judicial review preclusion in § 2778(h) absolutely meaningless. See Sylvia Develop. Corp. v. Calvert County, 48 F.3d 810, 829 n.7 (4th Cir. 1995) ("As the courts have consistently recognized, the inquiry into 'arbitrariness' under the Due Process Clause is completely distinct from and far narrower than the inquiry into arbitrariness under state or federal administrative law").

The substantive due process provided in the Fifth Amendment, absent the assertion of a fundamental right, merely requires a reasonable fit between governmental purpose and the means chosen to advance that purpose. See Usery v. Turner Elkhorn Mining Co., 428 U.S. 1, 19, 49 L. Ed. 2d 752, 96 S. Ct. 2882 (1976) ("Under the deferential standard of review applied in substantive due process challenges to economic legislation there is no need for mathematical precision in the fit between justification and means"). Given this "extremely limited scope of permissible judicial inquiry," Association of Accredited Cosmetic Schools v. Alexander, 298 U.S. App. D.C. 310, 979 F.2d 859, 866 (D.C. Cir. 1992), the plaintiff's due process claim lacks any merit. The government clearly has an interest in preventing the proliferation of cryptographic software to foreign powers, and the regulation of the export of the cryptographic software is a rational means of achieving that goal. The Court will not substitute its policy judgments for that of the President, see Bowen v. Gilliard, 483 U.S. 587, 597, 97 L. Ed. 2d 485, 107 S. Ct. 3008 (1987), especially in the area of national security. See Martinez, 904 F.2d at 602.

Likewise, the regulation of the plaintiff's diskette as cryptographic software is rational, even when considered in conjunction with the defendants' decision not to subject the book Applied Cryptography to the ITAR. As stated by the plaintiff in his commodity jurisdiction application for Applied Cryptography, the book contains "no machine-readable media," while the diskette is precisely that. See Lowell Decl., Tab 4. Although Part Five of the book could be placed on machine-readable media through the use of optical character recognition technology or through direct typing, the plaintiff concedes that using the source code in Part Five of Applied Cryptography to encode material takes greater effort and time than using the Karn diskette. Karn Decl. ¶¶ 10-12. Accordingly, treating the book and diskette differently is not in violation of the plaintiff's substantive due process rights. Finally, to the extent that the plaintiff's substantive due process rights require the Court to review the defendants' interpretation of the public domain exemption to the ITAR, the Court finds the defendants' interpretation reasonable as a matter of law.

For the reasons discussed above, the Court shall dismiss the plaintiff's APA claim, and the defendants are entitled to summary judgment on the plaintiff's First and Fifth Amendment claims. The Court shall issue an Order of even date herewith consistent with the foregoing Opinion.

Endnotes to Karn.

(1) "Cryptography" is "the art and science of keeping messages secure ... [, and] the process of disguising a message in such a way as to hide its substance is called encryption." Bruce Schneier, Applied Cryptography, at 1 (1994). A "cryptographic algorithm" is a mathematical function or equation that can be applied to transform data into an unintelligible form (e.g., into ciphertext). Joint St. ¶ 15. "Cryptographic source code" expresses a cryptographic algorithm in computer programming language, such as the "C" language, and is a precise set of operating instructions to a computer that, when compiled into object code by using commercially available software, enables a computer to perform cryptographic functions. Joint St. ¶ 16.
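
To make the endnote's chain of terms concrete, here is a minimal sketch of our own (not code from the case record): the C text below is "source code" expressing a trivial, insecure XOR transformation, and a commercially available compiler (an invocation along the lines of cc xor.c) translates it into executable "object code":

    /* Source code for a trivial, insecure cryptographic transformation:
     * XOR each byte of the message with a repeating key. Applying the
     * same function twice with the same key restores the original. */
    #include <stdio.h>

    static void xor_cipher(unsigned char *buf, unsigned long len,
                           const unsigned char *key, unsigned long keylen)
    {
        for (unsigned long i = 0; i < len; i++)
            buf[i] ^= key[i % keylen];
    }

    int main(void)
    {
        unsigned char msg[] = "secret message";
        const unsigned char key[] = { 0x5a, 0x13, 0xc4 };
        xor_cipher(msg, sizeof msg - 1, key, sizeof key); /* encipher */
        xor_cipher(msg, sizeof msg - 1, key, sizeof key); /* decipher */
        printf("%s\n", (char *)msg);    /* prints: secret message */
        return 0;
    }

Printed on paper, the same characters were treated by the ODTC as an unregulated book; recorded on a diskette, they were treated as a "defense article" -- the distinction at the heart of Karn's challenge.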

(2) Parts 120 and 121 of the ITAR provide a "commodity jurisdiction" procedure so that an exporter may obtain a determination whether the export of an item falls within the jurisdiction of the Department of State pursuant to the ITAR, or the Department of Commerce pursuant to the Export Administration Regulations. See 22 C.F.R. §§ 120.4 and 121.1, category XIII(b)(1), Note.

(3) Other subparts address the registration of exporters, the licensing of exporters, general policies and procedures, and violations and penalties.

(4) See 22 C.F.R. § 121.1, category VII(b).

(5) See id., category IX(a).

(6) See id., category X.

(7) See id., category XI.

(8) 22 C.F.R. § 125.1 provides:

(a) The controls of this part apply to the export of technical data and the export of classified defense articles. Information which is in the public domain (see § 120.11 of this subchapter and § 125.4(b)(13)) is not subject to the controls of this subchapter.

(9) 22 C.F.R. § 120.11 provides:

(a) Public domain means information which is published and which is generally accessible or available to the public ....

(10) 22 C.F.R. § 120.10 provides:

(a) Technical data means, for purposes of this subchapter: ... (4) Software as defined in § 121.8(f) of this subchapter directly related to defense articles.

(11) 22 C.F.R. § 121.8(f) provides:

Software includes but is not limited to the system functional design, logic flow, algorithms, application programs, operating systems and support software for design, implementation, test, operation, diagnosis and repair. A person who intends to export software only should, unless it is specifically enumerated in § 121.1 (e.g., XIII(b)), apply for a technical data license pursuant to part 125 of this subchapter.

(11) Based on Dart v. United States, 270 U.S. App. D.C. 160, 848 F.2d 217 (D.C. Cir. 1988), the plaintiff contends that courts in this Circuit construe finality provisions more narrowly than the Ninth Circuit did in Spawr. See Plaint's Opp. at 41-2. This is incorrect. Dart involved a facial challenge to the agency's action, which is quite different and requires greater scrutiny than the type of challenge brought in Spawr. See discussion of facial challenges, infra.

(12) However, the plaintiff does raise constitutional claims which the Court addresses, infra.

(13) See Lindahl v. Office of Personnel Mgt., 470 U.S. 768, 778-91, 84 L. Ed. 2d 674, 105 S. Ct. 1620 (1985) (Court refused to expand preclusion of finality provision beyond that previously applied by courts for almost 30 years where legislative history of new provision acknowledged past application and failed to express disagreement or amend wording of said finality clause).

(14) See McNary v. Haitian Refugee Center, 498 U.S. 479, 491-94, 112 L. Ed. 2d 1005, 111 S. Ct. 888 (1991) (Court read provision precluding judicial review narrowly to apply only to direct review of individual determinations regarding a type of immigration status, "rather [than reading the finality provision to apply] to general collateral challenges to unconstitutional practices and policies used by the agency in processing applications," because the broader reading would make no sense in light of other provisions in the Immigration and Naturalization Act); Traynor v. Turnage, 485 U.S. 535, 542-45, 99 L. Ed. 2d 618, 108 S. Ct. 1372 (1988) (Court held that bar to judicial review of decisions of Veterans Administrator on questions of law or fact would not bar constitutional or statutory challenge to VA decision because such challenges would not disrupt statutory scheme).

(15) See Traynor, 485 U.S. at 540 (Court permitted cases against the VA to go forward based on Rehabilitation Act of 1973 because such actions would not burden the courts or frustrate uniformity of technical and complex determinations and applications of VA policy connected with veterans' benefits decisions); Dart, 848 F.2d at 222-3 (Court permitted review of ALJ decision where the agency went beyond its delegated authority); Bartlett v. Bowen, 259 U.S. App. D.C. 391, 816 F.2d 695, 699 (D.C. Cir. 1987) (Court held constitutional challenge to agency action not foreclosed by provision barring judicial review).

(16) The defendants have regulated only the plaintiff's diskette, not the book. The Court will not address whether the defendants can regulate the book pursuant to the AECA and the ITAR because that issue is not properly before this Court.

(17) The plaintiff's reliance on Minneapolis Star & Tribune Co. v. Comm'r of Revenue, 460 U.S. 575, 103 S. Ct. 1365, 75 L. Ed. 2d 295 (1983), for the proposition that the diskette is a "tool of speech" which cannot be regulated, is entirely misplaced. In that case, the Supreme Court invalidated a statute that singled out publications for disparate taxation. Id. at 581-82. Such circumstances are simply not present in this case.

(18) The Court makes no ruling as to whether source codes, without the comments, fall within the protection of the First Amendment. Source codes are merely a means of commanding a computer to perform a function.

(19) The plaintiff cites Supreme Court authority holding that the teaching of a foreign language is "speech," in an apparent attempt to add some additional First Amendment protection to the diskette other than that provided for the communication of scientific or mathematical information. Whether the source code and comments convey information on speaking in a foreign language (and it is not clear that cryptography qualifies) or convey information on the science of cryptographic algorithms, the analysis is the same. The plaintiff also relies upon an unpublished Ninth Circuit opinion declaring Arizona's "English-only" statute unconstitutional. See Yniguez v. Arizona, 69 F.3d 920, 1995 WL 583414 (9th Cir. 1995). Yniguez does not apply to the case at bar, however, because the defendants have not passed a regulation prohibiting speaking in code.

(20) The plaintiff cannot dispute this point in good faith, as "the federal government undeniably possesses the power to regulate the international arms traffic." United States v. Edler Indus., 579 F.2d 516, 520 (9th Cir. 1978).

(21) The plaintiff also argues that certain "hashing" algorithms, which do not encrypt at all, pose no danger to national security. The government agrees, and in fact, hashing "functions are expressly exempted from ITAR regulation." Plaint's Opp. at 32. These algorithms are only affected by the regulation because they are included for export on the same disk as the cryptographic source code algorithms.

(22) The Court makes no finding with respect to the "functionality" of the Karn diskette as a cryptographic device. The parties agree that the source codes on the diskette can be "executed to encrypt communications on a computer by: (a) writing additional instructions to the computer called 'input and output routines' ... (b) compiling the total source code into object code by using commercially available software; and (c) typing a command to the computer ...." Joint St. P 20. Given this stipulation, the defendants still consider the diskette a cryptographic product; the defendants have therefore made the policy decision that the proliferation of this type of product is harmful to the national security.
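
The stipulated steps can be pictured with a hypothetical sketch; the file names, key, and commands are invented for illustration and do not describe the Karn diskette itself. Step (a) supplies "input and output routines" around a cipher function such as the toy one sketched in endnote (1); step (b) compiles the source code into object code; step (c) invokes the resulting program:

    /* encrypt.c -- hypothetical "input and output routines" (step (a)),
     * wrapped around the toy xor_cipher function sketched in endnote (1).
     * The file names, the key, and the commands below are invented. */
    #include <stdio.h>
    #include <stddef.h>

    void xor_cipher(unsigned char *buf, size_t len,
                    const unsigned char *key, size_t keylen);

    int main(void)
    {
        unsigned char key[] = "hypothetical-key";
        unsigned char buf[4096];
        size_t n;

        /* Read plaintext from standard input, transform it, and write
         * the result to standard output. */
        while ((n = fread(buf, 1, sizeof buf, stdin)) > 0) {
            xor_cipher(buf, n, key, sizeof key - 1);
            fwrite(buf, 1, n, stdout);
        }
        return 0;
    }

    /* Step (b), compiling into object code:  cc -o encrypt encrypt.c xorcipher.c
     * Step (c), typing a command:            ./encrypt < letter.txt > letter.enc */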

(23) Although not raised by either party, the Court is "obliged to examine the standing" of the plaintiff to assert his claim. Juidice v. Vail, 430 U.S. 327, 331, 51 L. Ed. 2d 376, 97 S. Ct. 1211 (1977).

VIII. Summary of Subsequent Events

On September 16, 1999, the White House announced that, first, it would ease export restrictions on unlimited strength cryptographic programs, except to "seven terrorist countries (Iran, Iraq, Libya, Syria, Sudan, North Korea and Cuba)," after a one-time "technical" review by the Commerce Department. Second, Key Recovery products (which allow network administrators and law enforcement access) would be exportable under Commerce Department licensing.

In the same 1999 press conference where the export regulations were eased regarding strong crypto products, the White House also announced its support for giving $80 million to the FBI to augment its ongoing code-cracking efforts. The White House also reiterated its strong support for a draft piece of legislation, the "Cyberspace Electronic Security Act" or CESA, which had been initially released in early August and is yet another attempt to legislatively mandate and implement a Key Recovery/Key Escrow scheme similar to the Clipper Chip initiative of the early 1990s. For the White House's analysis of CESA, see <http://www.eff.org/pub/Crypto/ITAR_export/1999_export_policy/19990913_admin_anlys_cesa.html>. CESA would allow federal law enforcement agents to intercept encrypted computer messages for 90 days or longer without notice to the computer user.

For an early draft of CESA with provisions to allow law enforcement officials to make surreptitious entries into servers in order to install monitoring programs, see the Center for Democracy and Technology's (CDT) website at <http://www.cdt.org>. For a version that incorporates the modifications related to the September 16, 1999 White House announcement, see the Electronic Privacy Information Center's (EPIC) website at <http://www.epic.org> and the Electronic Frontier Foundation's website at <http://www.eff.org/>.

For comments about the CESA, the White House's apparent change in position with regard to crypto export and a little more on what the "one-time technical review" of encryption products might entail, see the transcript of a September 16, 1999 press briefing with Attorney General Janet Reno, Secretary of Commerce William Daley, OMB Counsel Peter Swire, Deputy Defense Secretary John Hamre and Deputy Ass't for National Security Affairs James Steinberg at <http://www.eff.org/pub/Crypto/ITAR_export/1999_export_policy/19990916_wh_briefing_transcript.html>.

So, what do these seemingly significant developments have to do with the state of crypto export at the end of 1999 and the beginning of 2000?

Actually, very little.

First, the White House announcement does not necessarily moot the Bernstein IV holding that the Commerce Department EAR amendments were an unconstitutional "prior restraint" licensing scheme as applied to Professor Bernstein's protected First Amendment speech.

The White House was careful to maintain the ability to require a one-time technical review of all exports containing strong (56-bit plus) encryption technologies. As of September 1999, the U.S. government had sought a rehearing en banc of the holding of the three-judge panel that wrote the May 1999 Bernstein IV opinion. Although the Ninth Circuit granted the rehearing en banc and scheduled arguments for March 2000, subsequent policy shifts by the Clinton Administration led the Ninth Circuit to remand the case to the district court in early 2000 without an en banc rehearing.

One may view the "one-time technical review" by the Commerce Department as the crucial point at which the government may exert pressure on crypto developers seeking to enter the global market. Imagine that you are a developer of a product containing an unlimited strength cryptosystem. Further, imagine that the global market for such products is changing swiftly, so that time is of the essence for market entry. Now consider what happens if the NSA and FBI, in concert with the Commerce Department, suggest that you "voluntarily" build in a trap door for law enforcement entry, and that if you don't, the "one-time technical review" could drag on long enough that you lose your commercial edge in the market. The "prior restraint" problem of Bernstein IV is still present. A more cynical view might suggest that the NSA would never allow widespread release of cryptographic tools that it could not easily crack; on that view, the administration's relaxed attitude is nothing to be triumphant or sanguine about.

Second, the timing of the White House announcement in September 1999 seemed calculated to help Vice-President Al Gore, somewhat beleaguered in his quest for the 2000 Democratic Presidential nomination, raise campaign money in Silicon Valley, where he was on a campaign swing on September 19th and 20th. It does not appear, however, that crypto export will be a major issue in the 2000 presidential election.

Third, the administration's proposed legislation, CESA, demonstrates its unwavering commitment to implementing a Key Recovery/Key Escrow system that allows law enforcement to gain access to encryption keys without notice to keyholders, a position the Clinton Administration has steadfastly held since 1993. The suggestion that preferential export treatment will be given to products with Key Recovery/Key Escrow systems is a not-so-subtle attempt to nudge the market (at least within the US) in a particular direction.

The September 1999 White House announcement and the retooled Commerce Department export regulations of January 2000 did constitute a belated and partial recognition that US encryption export policy has not stemmed the availability of unlimited strength cryptography worldwide. For example, at the July 1999 DefCon 7 convention in Las Vegas, a Montreal-based company named Zero-Knowledge Systems released 1,000 beta copies of an Internet privacy product called "Freedom 1.0" that offers multiple on-line pseudonyms, Byzantine encrypted rerouting, and public key crypto at 128- to 4,096-bit strengths. Freedom 1.0 allows users to establish separate pseudonyms and scrambles data coming from a user's computer, hiding the source and destination of Internet traffic routed through its service. A message is initially sent to Zero-Knowledge's servers, where it is encrypted; the message is then bounced among numerous independently owned relay stations. Eventually the message arrives at its intended destination, but no one, including the final recipient, can trace the message's origins. If Zero-Knowledge Systems were a US-based company, its encryption product would run afoul of the pre-Bernstein IV Commerce Department regulations.

Many other countries have taken the lead in marketing unlimited strength encryption products, and the White House's announcement is a belated recognition of this fact. For a New York Times article detailing this, see John Markoff, "Encryption Products Found to Grow in Foreign Markets," New York Times, June 10, 1999, available at <http://www.nytimes.com/library/tech99/06/biztech/articles/10code.html>. In important ways, trying to contain the genie of strong encryption places the Clinton Administration (or any other government) in the position of King Canute attempting to order the waves of the sea to hold themselves from rolling onto the shore.
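
The delivery process described above resembles layered ("onion"-style) routing and can be sketched in miniature. The following toy model is not Zero-Knowledge's actual protocol: it substitutes invented single-byte XOR "keys" for the 128- to 4,096-bit public-key cryptography Freedom 1.0 employs, and the three-relay route is hypothetical. Even so, the point of the layering is visible: each relay can remove only its own layer, so no single relay, and no eavesdropper at any single hop, can connect the sender to the readable message.

    /* Toy model of layered ("onion"-style) encrypted rerouting, invented
     * for illustration. A real system such as Freedom 1.0 uses public-key
     * cryptography, under which the order of the layers matters; the
     * single-byte XOR stand-in here merely exposes the wrap-and-peel
     * structure of the delivery process described above. */
    #include <stdio.h>
    #include <string.h>

    #define NUM_RELAYS 3

    static void xor_layer(unsigned char *buf, size_t len, unsigned char key)
    {
        for (size_t i = 0; i < len; i++)
            buf[i] ^= key;
    }

    int main(void)
    {
        unsigned char msg[] = "meet me at the usual place";
        size_t len = strlen((char *)msg);
        unsigned char relay_keys[NUM_RELAYS] = { 0x5a, 0xc3, 0x7e }; /* invented */

        /* Sender: add one layer per relay; the innermost layer belongs
         * to the last relay on the route. */
        for (int i = NUM_RELAYS - 1; i >= 0; i--)
            xor_layer(msg, len, relay_keys[i]);

        /* Each relay peels exactly one layer and forwards the rest, so no
         * single relay sees both the origin and the readable message. */
        for (int i = 0; i < NUM_RELAYS; i++)
            xor_layer(msg, len, relay_keys[i]);

        printf("recovered after %d relays: %s\n", NUM_RELAYS, (char *)msg);
        return 0;
    }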

Other significant developments include the establishment by RSA Data Security, the leading public key encryption software producer, of an Australian subsidiary that enables worldwide distribution of encryption products without dealing with US export restrictions. On May 27, 1999, the United Kingdom rejected Key Recovery as ineffective and inconsistent with the UK's desire to be a leading e-commerce nation. Similarly, on June 2, 1999, Germany also rejected the idea of placing restrictions on strong encryption. In early 1999, the 56-bit DES encryption code was cracked by a group of encryption enthusiasts in 22 hours and 15 minutes, and the US National Institute of Standards and Technology (successor to the National Bureau of Standards, which had adopted DES as its encryption standard in the 1970s) recommended the use of Triple DES.

Finally, the CESA proposal by the White House does not mean the withdrawal of the pending Security and Freedom through Encryption (SAFE) bill, HR 850, re-introduced on February 25, 1999 by Representatives Bob Goodlatte (R-VA) and Zoe Lofgren (D-CA) (it had previously been introduced as HR 695 in the 105th Congress). Following the September 16, 1999 White House announcement, Representatives Goodlatte and Lofgren reaffirmed that they would continue to push for the SAFE bill's passage. The major provisions of the SAFE bill would guarantee US citizens the right to use any kind of crypto anywhere in the world, as well as prohibit the government from requiring a 'back door' into email and computer files via mandatory Key Recovery. SAFE would also liberalize export controls to allow export if a product with comparable security is already available from foreign sources. The proposed SAFE bill would call on the US Attorney General to compile and provide examples of instances where the use of strong encryption has interfered with law enforcement, and would also call on the President to convene an international conference to reach an encryption policy agreement. For more information on the SAFE bill, see CDT's website at <http://www.cdt.org/crypto/legis_106/SAFE/index.shtml#provisions>.

However, many commentators have noted that by the end of the 1998-99 legislative session, the SAFE bill had been so gutted in committee that groups like the CDT were opposing its passage. For example, in July 1999, the House Armed Services Committee adopted an amendment to the SAFE bill that would give the President unconstrained authority to deny crypto exports and dictate the level of crypto eligible for license restrictions, as well as impose reporting requirements on crypto exporters. For more information on the House Armed Services Committee's July 1999 change to the SAFE bill, see EPIC's website at <http://www.epic.org/crypto/>.

It is against this background of the Ninth Circuit's Bernstein IV decision and the reintroduced SAFE bill that the White House proposed CESA and made its mid-September 1999 and January 2000 regulatory and policy shifts. While critics of SAFE point to its watered-down export licensing language, the bill still remains opposed to Key Recovery. The Ninth Circuit's Bernstein IV case and the April 2000 Sixth Circuit Junger case (counter to the older 1996 district court Karn case) suggest that Commerce Department export licensing requirements will be considered an unconstitutional prior restraint on protected speech. CESA's focus on promoting Key Recovery systems within the US remains a crucial difference between the two proposed pieces of legislation.

IX. Selected Bibliography

Hal Abelson, et al., Questions and Answers About MIT's Release of PGP, available at <http://web.mit.edu/afs/net/mit/jis/www/pgpfaq.htm>

James Bamford, THE PUZZLE PALACE: A REPORT ON AMERICA'S MOST SECRET AGENCY (1982)

Albrecht Beutelspacher, CRYPTOLOGY (1994)

Herbert Burkert, Privacy-Enhancing Technologies: Typology, Critique, Vision in TECHNOLOGY AND PRIVACY: THE NEW LANDSCAPE at 125 (Philip E. Agre and Marc Rotenberg eds., 1997)

James J. Carter, The Devil and Daniel Bernstein: Constitutional Flaws and Practical Fallacies in the Encryption Export Controls, 76 Oregon L. Rev. 981 (1997)

John P. Collins, Note, Speaking in Code, 106 Yale L. J. 2961 (1997)

Whitfield Diffie & Martin E. Hellman, New Directions in Cryptography, IT-22 IEEE Transactions Info. Theory 644 (1976)

Whitfield Diffie, The First Ten Years of Public-Key Cryptography, 76 Proc. IEEE 560 (May 1988)

Whitfield Diffie & Susan Landau, PRIVACY ON THE LINE (1998)

David Chaum, Achieving Electronic Privacy, Scientific American, August 1992

W.V. Davies, READING THE PAST: EGYPTIAN HIEROGLYPHICS (1997)

Charles L. Evans, U.S. Export Control of Encryption Software: Efforts to Protect National Security Threaten the U.S. Software Industry's Ability to Compete in Foreign Markets, 19 N.C. J. Int'l & Com. Reg. 469 (1994)

A. Michael Froomkin, The Metaphor is the Key: Cryptography, the Clipper Chip, and the Constitution, 143 U. Pa. L. Rev. 709 (1995)

A. Michael Froomkin, Flood Control on the Information Ocean: Living With Anonymity, Cash, and Distributed Databases, 15 J. L. & Comm. 395 (Spring 1995)

Helen F. Gaines, CRYPTANALYSIS (1956)

Martin Gardner, A New Kind of Cipher That Would Take Millions of Years to Break, 237 Scientific American 120-124 (August 1977)

Simson Garfinkel, PRETTY GOOD PRIVACY (1995)

Mark B. Hartzler, National Security Export Controls on Data Encryption: How They Limit U.S. Competitiveness, 29 Tex. Int'l L. J. 438 (1994)

M.E. Hellman, The Mathematics of Public-Key Cryptography, 241 Scientific American 130-139 (August 1979)

F.H. Hinsley, BRITISH INTELLIGENCE IN THE SECOND WORLD WAR: ITS INFLUENCE ON STRATEGY AND OPERATIONS (1975)

Andrew Hodges, ALAN TURING: THE ENIGMA (1992)

BUILDING BIG BROTHER: THE CRYPTOGRAPHY POLICY DEBATE (Lance Hoffman ed., 1994)

David Kahn, THE CODEBREAKERS : THE STORY OF SECRET WRITING (rev. ed. 1996)

David Kahn, SEIZING THE ENIGMA (1996)

Jerry Kang, Information Privacy in Cyberspace Transactions, 50 Stan. L. Rev. 1193 (1998)

Charlie Kaufman, Radia Perlman, and Mike Speciner, NETWORK SECURITY: PRIVATE COMMUNICATION IN A PUBLIC WORLD (1996)

Henry R. King, Note, Big Brother, The Holding Company: A Review of Key-Escrow Encryption Technology, 21 Rutgers Computer & Tech. J. 224 (1995)

Donald E. Knuth, THE ART OF COMPUTER PROGRAMMING (2d ed. 1974)

Bert-Jaap Koops, THE CRYPTO CONTROVERSY (1998)

Timothy B. Lennon, Comment, The Fourth Amendment's Prohibitions on Encryption Limitation: Will 1995 Be Like 1984?, 58 Alb. L. Rev. 467 (1994)

Steven Levy, Crypto Rebels, WIRED, May/June 1993.

Ralph C. Merkle, Secure Communications Over Insecure Channels, Comm. ACM, April 1978 at 294.

David E. Newton, ENCYCLOPEDIA OF CRYPTOLOGY (1997)

Yvonne C. Ocrant, A Constitutional Challenge To Encryption Export Regulations: Software Is Speechless, 48 DePaul L. Rev. 503 (1998)

Kenneth J. Pierce, Public Cryptography, Arms Export Controls and the First Amendment: A Need for Legislation, 17 Cornell Int'l L. J. 197 (1984)

Laura M. Pilkington, First and Fifth Amendment Challenges to Export Controls on Encryption: Bernstein and Karn, 37 Santa Clara L. Rev. 159 (1996)

Maurice Pope, THE STORY OF DECIPHERMENT (1975)

National Research Council, Committee to Study National Cryptography Policy, CRYPTOGRAPHY'S ROLE IN SECURING THE INFORMATION SOCIETY (1996)

Ronald L. Rivest, Adi Shamir & Leonard Adleman, A Method for Obtaining Digital Signatures and Public-Key Cryptosystems, 21 Comm. ACM 120 (Feb. 1978)

Daniel R. Rua, Cryptobabble: How Encryption Export Disputes Are Shaping Free Speech for the New Millennium, 24 N.C. J. Int'l L. & Com. Reg. 125 (Fall 1998)

Jill M. Ryan, Note, Freedom to Speak Unintelligibly: The First Amendment Implications of Government-Controlled Encryption, 4 Wm. & Mary Bill of Rts. J. 1165 (1996)

Shawn Rosenheim, THE CRYPTOGRAPHIC IMAGINATION (1997)

Bruce Schneier, APPLIED CRYPTOGRAPHY: PROTOCOLS, ALGORITHMS, AND SOURCE CODE IN C (2d ed. 1996)

Simon Singh, THE CODE BOOK: THE EVOLUTION OF SECRECY FROM MARY QUEEN OF SCOTS TO QUANTUM CRYPTOGRAPHY (1999)

Lawrence Dwight Smith, CRYPTOGRAPHY (1943)

Neal Stephenson, CRYPTONOMICON (1999)

Lee Tien, Who's Afraid of Anonymous Speech? McIntyre and the Internet, 75 Oregon L. Rev. 117 (1996)

Philip R. Zimmermann, Cryptography for the Internet, Scientific American, October 1998

Philip R. Zimmermann, THE OFFICIAL PGP USER'S GUIDE (1996)