Computers & Security 2 (1983) 41-48
North-Holland Publishing Company
0167-4048/83/$03.00 © 1983 Elsevier Science Publishers B.V.

Controlling Cryptographic Publication *

F. Weingarten
Office of Technology Assessment, Congress of the United States, Washington, DC 20510, USA

Keywords: Cryptography, export controls, information policy, cryptology.

* The views expressed in this article are those of Dr. Weingarten and do not reflect those of the Office of Technology Assessment or the U.S. Congress.

1. Background - Why the Controversy?
Over the last decade, controversy has simmered over the degree to which the U.S. Federal government should control the publication of cryptographic research. The debate is not particularly heated at the present time; researchers seem to be waiting to see the results of a system of voluntary controls recommended last year by a panel of the American Council on Education.

Despite the quiet, however, it is instructive to examine some of the history underlying the debate for several reasons beyond sheer curiosity. In the first place, such controls, whether voluntary or mandatory sanctions, could influence the future directions of civilian cryptographic research. Secondly, controversy or confrontation could still arise were a researcher to disagree with the results of the voluntary review. Finally, a recent speech by Admiral Inman, since retired as Deputy Director of the CIA, to the American Association for the Advancement of Science suggests that some in the U.S. Government would like to extend such controls to a much wider range of computer science and engineering research.

This article is a brief and personal discussion of the principal components of the controversy by an author who found himself occasionally watching it from the sidelines and in some instances uncomfortably in the middle as an NSF project director responsible for some of the research under debate.

For many years, governments have attempted to control the dissemination of technical information that they consider militarily relevant on the grounds of national security. In the past, the military importance of certain technical information has been obvious - the design of a new bomb, an armoured vehicle, or gun. For many years, courts in the USA have tended to agree with the U.S. Government that the First Amendment right of speech needs to be balanced against other important societal interests, such as national security, particularly in time of war. They have, however, also tended to lay a very stringent burden of proof on the government to demonstrate a serious and immediate danger.

In many fields, for example certain areas of physics relevant to atomic weapons design, researchers have long since learned to live with the possibility or even the certainty that their work

Fred Weingarten is Program Manager of the Communication and Information Technology Program at the Office of Technology Assessment (OTA). Prior to his appointment, Dr. Weingarten had gained recognition in the USA as an authority on information policy in academia, as a private consultant, and in the U.S. Federal Government. At OTA, he was the principal author of the report Computer-Based National Information Systems, which projected for Congress the general trends in information policy over the next decade. In 1971, he joined the National Science Foundation (NSF) to form and direct a program of research in the impacts of computers on society. In an earlier tour of duty with the NSF, Dr. Weingarten formed and directed the first program of support for computer science research and was also deeply involved with programs of support for research and educational computing in higher education. He then left NSF to teach computer science and direct computing at The Claremont Colleges.

Dr. Weingarten earned a BS degree in Engineering from the California Institute of Technology in 1962, and a PhD in Mathematics from Oregon State University in 1966. From there, he accepted a post-doctoral fellowship at the Lawrence Laboratory at Livermore. He has written and spoken both nationally and internationally on issues of computer impacts and information policy. His books and articles range in subject matter from pure computer science to computer security and information policy.

would never see the light of day in the unclassified scientific literature. They have tended to cooperate with these restrictions, partly because they appreciated the military importance of their work and the dangers of publication, and partly because acceptance of controls was the agreed-upon price for getting government support of research and development in these areas. Despite this history of concern on the part of the U.S. Government and an accompanying sympathy and cooperation from the scientific community, the mid-seventies saw a surge of controversy focused on attempts by the government to impose certain controls on a group of scientists working in areas of computer science and engineering related to cryptography.

That such arguments arose was probably surprising to the defense community. After all, the field of cryptography had been for a long time the nearly exclusive domain of the military and diplomatic corps. In addition, there already were areas of computer and telecommunications technology directly related to military interests being investigated under defense classification restrictions. Highly sophisticated computer/communications systems had become the central nervous system of a modern military establishment.
A number of reasons have been proposed to explain the controversy; among them:
* Trends in computer science and cryptography were converging in some areas, so that research in computing theory was suddenly and unexpectedly relevant to cryptography and, hence, potentially subject to control.
* In computer science, the boundaries between basic and applied research are particularly blurred. The elapsed time between a theoretical result and the implementation of a new piece of useful software or hardware can be very short. Hence, controls based on relevancy threatened to reach back into basic research much further than was usually the case with classified research.
* Domestic applications of computer/communication systems were growing rapidly, along with an interest on the part of users in securing their systems. Thus a potential need and market for security technology, including cryptography, was appearing in the civilian sector.
* In the aftermath of Vietnam and Watergate, public distrust of government institutions was particularly intense in the USA. The National Security Agency, having been thrust into the limelight by virtue of its primary responsibility for communications security, was a natural focal point for this hostility.
* The controversy appeared to pit the National Science Foundation, a small civilian agency dedicated to support of unclassified, academic basic research, against the Defense Department in a bureaucratic battle over control of research in areas of computer science.
* Finally, the U.S. government appeared to be asserting, and even expanding, its authority to suppress publication of so-called "private" ideas through extension of laws that were little known and little understood in the basic research community. Such authority to censor appeared to conflict with First Amendment rights and fundamental precepts of academic freedom.

Each of these reasons found echoes in the policy debate over government controls that has taken place in the USA over the last few years.
2. The Major Events

The controversy over cryptographic research that was played out in the press and in government during the late 1970s was triggered and sustained by four principal events: (1) a public debate over the strength of the data encryption standard (DES) proposed by the National Bureau of Standards; (2) an attempt by an employee of the National Security Agency (NSA) to block an IEEE meeting on cryptography; (3) U.S. government attempts to classify two patent applications involving cryptographic technology; and (4) a series of articles in the New York Times alleging that NSA was putting improper pressure on the National Science Foundation (NSF).

These events were accompanied by coverage in the national news media in the U.S.A., debates and hearings in Congress, letters exchanged between professional societies and the White House, some rare public statements and appearances by the Director of the NSA, and repeated attempts by the U.S. Federal government to set and promulgate a clear policy statement on the matter. Finally, responding to calls from both the academic community and defense officials for a broader dialogue, the American Council on Education obtained support from NSF for a working group to explore possible voluntary controls on cryptographic publication.


3. The DES Debate

Starting in 1973, the National Bureau of Standards, seeing the need to protect U.S. Federal data communications, began the search for an encryption algorithm to serve as a standard for use by the U.S. Federal government. While the full origins of the encryption standard finally chosen by the Bureau must remain somewhat cloudy, it can be traced back, at least in part, to research work done by IBM, which submitted a candidate algorithm in 1974. This algorithm was submitted for review by experts in the field. Part of that review included consultation with the NSA.

In the eyes of the Bureau, such consultation made good sense. Not only did NSA have unchallengeable expertise in the field of cryptography; it also had responsibility for protecting the security of all government communications. But NSA involvement was a red flag to some outside critics who did not feel that the algorithm was secure enough to qualify as a standard. While arguments over the appropriateness and quality of the DES were generally based on technical grounds, the subject of NSA inevitably entered the discussion.

The loudest technical complaint from the critics of DES, one that is still heard, is that the key length, 56 bits, is too short to protect communications against a brute force attack, trying all possible keys until the correct one is found. This argument focused on the cost of building a specialized computer for breaking the DES code - a debate that inevitably turned to discussion of whether any private or government agency could afford or would be inclined to invest in such expensive hardware. The NBS conceded that sometime in the future, technology would be cheap enough to make a brute force processor theoretically affordable, but contended that such a time was sufficiently far off in the future not to matter in the deliberations taking place then. The algorithm could be reviewed later, and, if it seemed appropriate, strengthened.

Critics, on the other hand, said that the time when chip technology would invalidate the standard was not so far off. Furthermore, even if such an event were a decade or more in the future, by that time substantial investment would have been made by both government and the private sector in DES-based security. Changing over to another, more secure system then could be very expensive and meet with great resistance.
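The key-length objection is, at bottom, arithmetic: a 56-bit key admits 2^56 possible keys, and the cost of a brute force attack is governed entirely by how fast candidate keys can be tried. The sketch below works through that arithmetic; the search rates are illustrative assumptions for this sketch, not figures from the article or from the historical proposals.

```python
# Back-of-envelope arithmetic behind the 56-bit key-length debate.
# The search rates below are illustrative assumptions for this sketch,
# not figures taken from the contemporary DES discussion.

DES_KEY_BITS = 56
KEYSPACE = 2 ** DES_KEY_BITS  # 72,057,594,037,927,936 possible keys

SECONDS_PER_YEAR = 3600 * 24 * 365

def years_to_exhaust(keys_per_second: float) -> float:
    """Worst-case time to try every key, in years."""
    return KEYSPACE / keys_per_second / SECONDS_PER_YEAR

# A single device testing one key per microsecond (10^6 keys/s):
# exhausting the keyspace takes on the order of millennia.
slow_years = years_to_exhaust(1e6)

# A hypothetical massively parallel machine at 10^12 keys/s - the kind
# of specialized hardware the critics argued would one day be cheap:
# exhaustion drops to well under a day.
fast_hours = years_to_exhaust(1e12) * 365 * 24

print(f"keyspace: {KEYSPACE:,} keys")
print(f"at 10^6 keys/s:  {slow_years:,.0f} years")
print(f"at 10^12 keys/s: {fast_hours:.1f} hours")
```

The argument on both sides turned on which of these two rates better described the machines an adversary could plausibly build, and how soon.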


A second criticism, less concrete but generating even more heat, was the assertion that there could be a trapdoor in the code, a flaw known only to the Government that could be exploited to make the job of breaking the code far easier. Those suspicions still are expressed in some quarters. However, other observers point out that enormous damage could be done to national interests were such a flaw to exist and were it to be discovered by criminal elements or a hostile foreign power. That potential for harm would seem to make it very unlikely and out of character for a defense agency to promulgate a flawed algorithm, especially an algorithm specifically designed for government use.

According to David Kahn, writing in Foreign Affairs [1], the DES algorithm is a compromise between the desires to promulgate a solid algorithm and to not have too strong a code available commercially. That compromise, according to Kahn, was to make the algorithm strong but keep the key length relatively small. Regardless of these issues, since July of 1977, DES has been the officially recognized algorithm for U.S. Federal use, and it has become a de facto standard for civilian encryption [2].

4. The IEEE Meeting

In July of 1977, the Institute of Electrical and Electronics Engineers (IEEE) received a letter from one J.A. Meyer [3]. Though signed and including a membership number, the letter did not indicate any institutional affiliation. The letter expressed concern that recent IEEE activities, publications and symposia contained publications and talks about cryptography. He warned that such publication could constitute a violation of the International Traffic in Arms Regulations (ITAR, 22 CFR 121-128).

A particular concern of the letter seemed to be a symposium on information theory scheduled to be held in Ithaca (New York) in October of 1977. According to Meyer's interpretation of ITAR, if foreign visitors were in the audience, export licenses would be required for all papers delivered on cryptographic-related topics.

The IEEE Director of Technical Activities routinely passed along the warning to the scheduled speakers, saying that the responsibility to comply with ITAR regulations lay, not with the


IEEE, but with the authors themselves [4]. Not addressed was the question, soon to rise as the most important one, of whether basic research publication was covered under the ITAR definition of "technical data." That question was more immediately apparent to the speakers. Was the Government suggesting that research scientists working on unclassified basic research needed government permission to speak at scientific meetings and publish in scientific journals? Could the Government, in fact, bar them from presenting unclassified research results in such fora? And, if they spoke, might they actually face criminal prosecution?

The controversy soon reached the press. Reporters identified Meyer as an NSA employee, although the agency denied that he was stating official policy. In this case, too, the NSA was drawn reluctantly into a public controversy - this time pitting national security against concerns over constitutional guarantees of free speech and long traditions of academic freedom in the sciences.

The meeting went on as planned. The applicability of export controls to basic research information continues to be in debate. To date, no scientist in the USA languishes in jail for publishing a research paper; however, a number of thorny issues have yet to be resolved. Among them:
* Constitutionality. To what degree can ITAR controls be extended from their traditional applicability to hardware and detailed descriptions of hardware (such as blueprints), to more general restrictions on scientific publication without violating First Amendment protections? The question is particularly difficult in computer science because of the close connection between basic research and applications.
* Degree of threat. How much threat to national security is posed by unbridled publication of unclassified research in cryptology-related fields of computer science? The dilemma is that, since the information necessary to make that decision is highly classified, one can only take the word of NSA - an interested party in the debate.
* Scope. What fields of investigation constitute "related research," since ITAR specifically exempts basic research without attempting to define it? What constitutes basic research in computer-related fields? Some initial attempts to define the scope have been very broad, indeed. The tight boundaries that policy makers would like to find may well not exist.


* Enforceability. How far does ITAR reach into the daily activity of scientists, and how enforceable is it in practical terms? ITAR restrictions have been claimed to extend far beyond international meetings and publications - they include the classroom and even informal discussions around the department coffee pot. Are there courses in computer science and engineering that may not be taught to foreign students (who in the USA constitute nearly half the graduate student population in some departments)? How about meetings, such as the National Computer Conference, that run exhibitions in which hardware demonstrations and sales material are available freely to all attendees? Must a university professor on sabbatical at a university outside the USA continually monitor his words and thoughts?
* Effect on U.S. science. What would be the effect on U.S. science of the rigorous application of ITAR controls on scientific publication? Some say that it could seriously inhibit research in a vital field at a time when international economic competition is facing the U.S. with unprecedented challenges. Further, it could impede international science cooperation and result in a restricted flow of information into this country [5-7].

A careful distinction should always be made between government control of publication when it concerns research that has been funded by the Government and publication that reflects work done privately. While the net effect is the same - restriction of the flow of scientific information - the grounds on which controls are applied are substantially different. Usually, in a research grant or contract, the rules governing the rights of the researcher and of the Government with regard to publication are described in detail. If that is the case, it is fair to assume that the investigator accepting the money subscribes to the accompanying limitations. While one could question the wisdom of any particular government restriction, it would seem to be legitimate. On the other hand, it seems fair to enquire about the constitutionality of attempts by the Government to restrict publication or speech when it concerns research findings or ideas developed independently of the Government [8].
Clearly NSA is concerned about the leakage of valuable technical data that has national security implications. That concern has been serious enough that Admiral Inman, then Director of NSA, took the nearly unprecedented step of speaking publicly about his views, trying to engage the academic community in a debate that might result in some balanced consensus [9]. One of the approaches resulting from this public discussion was the work of a panel, funded by NSF and sponsored by the American Council on Education, that finally proposed a system of voluntary controls. The work of this panel is described in another paper in Volume 1, Number 2 of Computers and Security.

In the meantime, U.S. Government claims of the right to restrict publication of so-called "private" ideas have expanded into other areas of computer science as well as into other fields of engineering and science [10].
5. Classification of Patents

In 1977, the Wisconsin Alumni Research Foundation applied for a patent on a cryptographic device developed by George Davida, a professor of computer science at the University of Wisconsin at Milwaukee. Davida's work was based on research done under NSF basic research grants in the area of computer security. In April of 1978, Davida received a notice that the patent application had been classified. Of most concern to Davida was that the letter from the Patent Office could be read to extend the classification to all of his related research effort - all notes, reports, and even published documents. Davida and administrators of the University protested the order vociferously, maintaining that it had a chilling effect on the conduct of their research. (See Davida's testimony in [11].)

At about the same time, a patent secrecy order was also imposed on an application filed by a group of Seattle, Washington inventors who had developed a telephone scrambling device. Shortly afterward, both secrecy orders were lifted by the Patent and Trademark Office [12].
The authority under which the secrecy order was based was a 1951 law, the Invention Secrecy Act of 1951 (35 USC 181-188). That law replaced a number of temporary laws dealing with patent classification, dating as far back as 1917. It says, basically, that the Commissioner of Patents may show an application to defense agencies that might have an interest in the technology concerned; and, if an agency finds that publication would be detrimental to national security, the Commissioner may order it to be kept secret. The period of classification was strictly limited to no more than two years, except during periods of national emergency, as declared by the U.S. President. Such a period had been declared by President Truman in 1950 and remained in effect until 1978.

The secrecy orders raised two questions: (1) Given the vague wording of the order and the consequent uncertain applicability to related speech, publications, and research notes, was there a violation of First Amendment freedom? (2) Were there due process violations, in that the investigators found few avenues available for appeal and review of the decisions except by the very agencies and people who made them in the first place? As seems to be the general rule with such cases, the orders were revoked before such questions could be explored in the courts.
Furthermore, although patents may seem to be a special case, the U.S. Government interest in maintaining its patent rights in research funded by it has caused agencies to express interest in reviewing research prior to publication [13]. (In some countries, submission of manuscripts for publication is considered to be public disclosure and invalidating for patents.) However, access by the U.S. Government to research results well in advance of publication, even if done in the name of investigating patentability, also may afford to agencies so inclined an additional opportunity to apply pressure inhibiting publication [13].

6. NSA/NSF Interaction

In late 1977, another related controversy erupted in the press with a New York Times report that NSA had been putting undue pressure on NSF to restrict the funding or the type of research supported for some principal investigators who had been looking at cryptography. The reports were particularly embarrassing to NSF and alarming to the research support community. For many years, the Foundation had been a principal government contact with the academic science community, particularly at a time when defense contracts were anathema on many college campuses. Any suggestion that NSF policies, priorities, or granting decisions were being tampered with by NSA on grounds other than scientific quality was bound to raise an outcry.


Some researchers funded by the computer science program had begun making breakthroughs in encryption, particularly in the area known as "public key." It is interesting to note that none of the investigators was funded explicitly to look at encryption. Hellman, at Stanford, was working in coding theory (which sounds like, but is not, a theory of encryption), and Rivest, at MIT, was working in complexity theory. Davida, at the University of Wisconsin, had been working on the general problem of securing data banks. There was not then, nor is there now, an explicit NSF program of research in encryption. In fact, even computer security experts thought that there were other areas of research that held much higher priority in terms of utility to the industry.

While NSA had contacted NSF concerning certain grants that had been made by the Computer Science Section, the author never experienced anything that could be construed as improper pressure. NSF had started routinely sending proposals that clearly related to cryptography to NSA for technical review. This practice was felt to be consistent with the goal of eliciting the best expert opinion about the scientific quality of proposed research. Especially at that time, such expertise resided almost exclusively in NSA. In the course of review of a few such proposals, NSA expressed informally to the author their concerns that some of the work might have national security implications. NSF's response was simply that scientific review was not an appropriate channel to promote those concerns, and that there were proper bureaucratic channels to express such a concern - channels that lead all the way to the White House.

This interference was one of the items investigated by the Senate Intelligence Committee, and the Committee reported that they found no evidence of improper conduct by either agency. But they further recommended that NSA and NSF establish some formal mechanisms for coordination, since it was not implausible that some day a proposal for cryptographic research might come along that might be clearly hazardous to U.S. national security if funded by an unclassified basic research program [14].

It must be realized that, even when a program of research support is designated as unclassified, the U.S. Government retains some rights to change its mind if research results appear to be sensitive. This fact holds true even in the case of NSF


grantees, although the fact that NSF does not, as a habit, prereview research publications renders the policy somewhat moot in their case.

In addition to continuing the review arrangement, NSF began talking with NSA about establishing their own program of support for unclassified research in encryption. Such a program is now in operation, and a number of researchers have taken advantage of it. The only problem cropped up when NSF appeared to be forcing a researcher to accept NSA rather than NSF support for a project [15,16]. NSF's official policy as now stated is that, if NSA is interested in a proposal, they will offer a researcher the opportunity to seek NSA support, but not require it.

Surely more problems and questions regarding relations between NSA and NSF lie down the road. They would appear to be inevitable in light of the close connection between basic science of interest to the academic community, technology of interest to the civilian market, and information that could be highly sensitive from a national security perspective. The cases of Hellman, Rivest, and Davida cited above illustrate that, even if all direct support of cryptographic research were to be spun off to another agency, results of vital importance to NSA could crop up unexpectedly from a wide range of mathematics, computer science, or engineering research programs.

7. Conclusion

The four controversies discussed above were closely related, both in time and in the more general debate in the USA. Although each raises some specific and unique questions, there are also a few important and still unresolved issues that cut across all of them.

7.1. Classification of Private Ideas

The U.S. Government's authority to classify its own work is not questioned, so long as certain standards are applied and procedures are followed to protect the public's right to know. However, the right of the U.S. Government to appropriate and classify other, so-called "private" ideas is much more in debate. Proponents of such a right often point for an example at the "born classified" doctrine in the atomic energy field that asserts just such a right. Yet, according to an official historian of the AEC, that principle had a murky birth and relatively little testing in court for constitutionality [8]. Opponents doubt that such a doctrine could satisfy a constitutional test on First Amendment grounds, at least as implemented in the export control regulations or patent secrecy act and applied to scientific research publication.
7.2. Impact and Effectiveness of Controls on Technical Data

There is a great deal of current public debate in the USA, much of it triggered by the Soviet gas pipeline issue, as to whether export controls are effective commercially - do they achieve a political goal or merely hurt U.S. business by locking them out of international markets? [17].

The same question can be asked with respect to technical data. In that case, there is significant debate as to whether controls have any effect on the rate of technological advance. It is argued that technology transfer is a highly complex process beyond merely communicating research results. Secondly, information and the media conveying it (not to mention science, itself) do not recognize national borders. It would seem difficult to set up any barriers to international flow that did not also impede domestic flow. Would any benefits to U.S. national security from restrictions be offset by damage to U.S. science through constricting domestic communication of research and cutting off valuable international ties?
7.3. Definitions of Relevant Data

What sort of information is covered by export control, patent secrecy, or the voluntary ACE agreement? It is not so easy to find definitions. For example, basic research is excluded. Yet, we all know that the line between basic research and applied research can be hazy or even non-existent. The same piece of pure mathematical work can be "basic" in the eye of the mathematician performing it and "applied" to someone who sees in it a new technique to encode, or analyze the strength of, or break an existing code. Furthermore, the fields of science related to cryptography and, more broadly, signals intelligence, can be broad indeed, ranging from mathematics and computer science to physics and chemistry.


The question is more than academic. If one is willing to accept some cost of impeding domestic technological development in order to improve national security, the magnitude of that cost depends on the range of science affected. Perhaps the U.S.A. does not need many more encryption algorithms, at least for the present. But it could less afford restriction on work much more broadly related to the improvement of computer and communication technology. In those areas, we in the U.S. are in a full-fledged economic race with the rest of the world and could ill afford to hobble ourselves.
7.4. NSA and the Civilian World

NSA is reputed to be the most secret organization in the United States. Certainly, much of their work is highly sensitive. At one time, even the very existence of the agency was classified. They are not, by nature or primary mission, oriented to dealing with the unclassified civilian world. Yet, they have responsibility for securing government communications, including those of civilian agencies. It was the revolution in information technology and the way U.S. society uses it that caused NSA to suddenly be exposed to the public light on this issue. Their problems with NSF illustrate the difficulty faced when the activities of two agencies with wildly diverging charters, operating styles, and constituencies intersect unexpectedly.

If this analysis of the fundamental problem is correct, it implies that the future may hold further problems, sporadic brushfires between NSA and the academic community that will involve NSF. Perhaps another small agency or office - say, in the National Bureau of Standards - should be established to sit at this interface between the defense department and the civilian community in the whole area of computer and communication security. Such a group could mediate internal Federal policy and deal with both communities, neither of which, in general, trusts the other.

References

[1] David Kahn, "Cryptology Goes Public," Foreign Affairs, August 1979, pp. 141-159.
[2] U.S. Government National Bureau of Standards, Data Encryption Standard, FIPS Pub 46, 15 January 1977.
[3] J.A. Meyer, Letter to E.K. Gannet (IEEE Editorial Board), 7 July 1977.
[4] N.P. Dwivedi, Letter to several IEEE officials and members, 8 August 1977.
[5] Paul E. Gray, "Technology Transfer at Issue: The Academic Viewpoint," IEEE Spectrum, Vol. 19-5, May 1982, pp. 64-68.
[6] Stephen H. Unger, "The Growing Threat of Government Secrecy," Technology Review, Vol. 85-2, Feb/March 1982, p. 30.
[7] Paul Wallich, "Technology Transfer at Issue: The Industry Viewpoint," IEEE Spectrum, Vol. 19-5, May 1982, pp. 69-73.
[8] Richard G. Hewlett, "Born Classified in the AEC: A Historian's View," Bulletin of the Atomic Scientists, December 1981, pp. 20-30.
[9] B.R. Inman, "The NSA Perspective on Telecommunications Protection in the Nongovernmental Sector," Signal, March 1979, pp. 6-11.
[10] William D. Carey and Frank Carlucci, "Scientific Exchanges and U.S. National Security," Science, Vol. 215-8, January 1982, pp. 139-141.
[11] U.S. Congress House Committee on Government Operations, The Government's Classification of Private Ideas, H. Rep. 96-1540, 96th Congress, 2nd Session, U.S.G.P.O., 1980.
[12] N.Y. Times, "Commerce Officials Lift Secrecy Order."
[13] Science, "New Patent Rule Upsets Universities," Vol. 213, 11 September 1981, pp. 1234-1235.
[14] U.S. Senate Select Committee on Intelligence, Involvement of NSA in the Development of the Data Encryption Standard (Unclassified Summary), April 1978.
[15] New York Times, "Science Agency Blocks Funds to Aid Research on Computer Coding," 27 August 1980, p. 1, col. 1.
[16] Science, "Cryptography: A New Clash Between Academic Freedom and National Security," Vol. 209, 29 August 1980, pp. 995-996.
[17] U.S. Congress Office of Technology Assessment, East-West Trade, U.S. Government Printing Office, 1979.
