
#WeWinCyberwar2.0 (ST)

Notes
Brought to you by KWei and Amy from the SWS heg lab.
Email me at ghskwei@gmail.com for help/with questions.
The thing about backdoor Affs is that all of their evidence will talk about past
attacks. Press them on why their scenario is different and how these past
attacks prove that empirically, there is no impact to break-ins through
backdoors.
Also, a lot of their ev about mandating backdoors is in the context of future
legislation, not the squo.
Also, their internal links are totally fabricated.
Links to networks, neolib, and the gender privacy K can be found in the
generics.

Links
Some links I don't have time to cut but that I think will have good args/cards:
Going dark terrorism links:
http://judiciary.house.gov/_files/hearings/printers/112th/112-59_64581.PDF
Front doors CP: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes
Military DA i/l ev: https://cyberwar.nl/d/20130200_Offensive-CyberCapabilities-are-Needed-Because-of-Deterrence_Jarno-Limnell.pdf
http://www.inss.org.il/uploadImages/systemFiles/MASA4-3Engc_Cilluffo.pdf
Military DA Iran impact:
http://www.sobiad.org/ejournals/journal_ijss/arhieves/2012_1/sanghamitra_nath.pdf
Military DA Syria impact: http://nationalinterest.org/commentary/syriapreparing-the-cyber-threat-8997

T-Domestic

1NC
NSA spies on foreign corporations through backdoors
NYT 14
(David E. Sanger and Nicole Perlroth. "N.S.A. Breached Chinese Servers Seen as Security Threat," New
York Times. 3-22-2014. http://www.nytimes.com/2014/03/23/world/asia/nsa-breached-chinese-serversseen-as-spy-peril.html//ghs-kw)
WASHINGTON - American officials have long considered Huawei, the Chinese telecommunications giant, a security threat, blocking it from business deals in the United States for fear that the company would create back doors in its equipment that could allow the Chinese military or Beijing-backed hackers to steal corporate and government secrets. But even as the United States made a public case about the dangers of buying from Huawei, classified documents show that the National Security Agency was creating its own back doors directly into Huawei's networks. The agency pried its way into the servers in Huawei's sealed headquarters in Shenzhen, China's industrial heart, according to N.S.A. documents provided by the former contractor Edward J. Snowden. It obtained information about the workings of the giant routers and complex digital switches that Huawei boasts connect a third of the world's population, and monitored communications of the company's top executives. One of the goals of the operation, code-named Shotgiant, was to find any links between Huawei and the People's Liberation Army, one 2010 document made clear. But the plans went further: to exploit Huawei's technology so that when the company sold equipment to other countries, including both allies and nations that avoid buying American products, the N.S.A. could roam through their computer and telephone networks to conduct surveillance and, if ordered by the president, offensive cyberoperations.

NSA targets foreign systems with backdoors


Zetter 13
(Kim Zetter. "NSA Laughs at PCs, Prefers Hacking Routers and Switches," WIRED. 9-4-2013.
http://www.wired.com/2013/09/nsa-router-hacking///ghs-kw)

THE NSA RUNS a massive, full-time hacking operation targeting foreign systems, the latest leaks from Edward Snowden show. But unlike conventional cybercriminals, the agency is less interested in hacking PCs and Macs. Instead, America's spooks have their eyes on the internet routers and switches that form the basic infrastructure of the net, and are largely overlooked as security vulnerabilities. Under a $652-million program codenamed "Genie," U.S. intel agencies have hacked into foreign computers and networks to monitor communications crossing them and to establish control over them, according to a secret black budget document leaked to the Washington Post. U.S. intelligence agencies conducted 231 offensive cyber operations in 2011 to penetrate the computer networks of targets abroad. This included not only installing covert implants in foreign desktop computers but also on routers and firewalls, tens of thousands of machines every year in all. According to the Post, the government planned to expand the program to cover millions of additional foreign machines in the future and preferred hacking routers to individual PCs because it gave agencies access to data from entire networks of computers instead of just individual machines. Most of the hacks targeted the systems and communications of top adversaries like China, Russia, Iran and North Korea and included activities around nuclear proliferation. The NSA's focus on routers highlights an often-overlooked attack vector with huge advantages for the intruder, says Marc Maiffret, chief technology officer at security firm Beyond Trust. Hacking routers is an ideal way for an intelligence or military agency to maintain a persistent hold on network traffic because the systems aren't updated with new software very often or patched in the way that Windows and Linux systems are. "No one updates their routers," he says. "If you think people are bad about patching Windows and Linux (which they are) then they are horrible about updating their networking gear because it is too critical, and usually they don't have redundancy to be able to do it properly." He also notes that routers don't have security software that can help detect a breach. "The challenge [with desktop systems] is that while antivirus don't work well on your desktop, they at least do something [to detect attacks]," he says. "But you don't even have an integrity check for the most part on routers and other such devices like IP cameras." Hijacking routers and switches could allow the NSA to do more than just eavesdrop on all the communications crossing that equipment. It would also let them bring down networks or prevent certain communication, such as military orders, from getting through, though the Post story doesn't report any such activities. With control of routers, the NSA could re-route traffic to a different location, or intelligence agencies could alter it for disinformation campaigns, such as planting information that would have a detrimental political effect or altering orders to re-route troops or supplies in a military operation. According to the budget document, the CIA's Tailored Access Programs and NSA's software engineers possess templates for breaking into common brands and models of routers, switches and firewalls. The article doesn't say it, but this would likely involve pre-written scripts or "backdoor" tools and root kits for attacking known but unpatched vulnerabilities in these systems, as well as for attacking zero-day vulnerabilities that are yet unknown to the vendor and customers. "[Router software is] just an operating system and can be hacked just as Windows or Linux would be hacked," Maiffret says. "They've tried to harden them a little bit more [than these other systems], but for folks at a place like the NSA or any other major government intelligence agency, it's pretty standard fare of having a ready-to-go backdoor for your [off-the-shelf] Cisco or Juniper models."

T-Surveillance

1NC
Backdoors are also used for cyberwarfare, not surveillance
Gellman and Nakashima 13
(Barton Gellman. Barton Gellman writes for the national staff. He has contributed to three Pulitzer
Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service. He is also a
senior fellow at the Century Foundation and visiting lecturer at Princeton's Woodrow Wilson School.
After 21 years at The Post, where he served tours as legal, military, diplomatic, and Middle East
correspondent, Gellman resigned in 2010 to concentrate on book and magazine writing. He returned
on temporary assignment in 2013 and 2014 to anchor The Post's coverage of the NSA disclosures after
receiving an archive of classified documents from Edward Snowden. Ellen Nakashima is a national
security reporter for The Washington Post. She focuses on issues relating to intelligence, technology
and civil liberties. She previously served as a Southeast Asia correspondent for the paper. She wrote
about the presidential candidacy of Al Gore and co-authored a biography of Gore, and has also covered
federal agencies, Virginia state politics and local affairs. She joined the Post in 1995. "U.S. spy
agencies mounted 231 offensive cyber-operations in 2011, documents show," Washington Post. 8-30-2013. https://www.washingtonpost.com/world/national-security/us-spy-agencies-mounted-231offensive-cyber-operations-in-2011-documents-show/2013/08/30/d090a6ae-119e-11e3-b4cbfd7ce041d814_story.html//ghs-kw)

Sometimes an implant's purpose is to create a back door for future access. "You pry open the window somewhere and leave it so when you come back the owner doesn't know it's unlocked, but you can get back in when you want to," said one intelligence official, who was speaking generally about the topic and was not privy to the budget. The official spoke on the condition of anonymity to discuss sensitive technology. Under U.S. cyberdoctrine, these operations are known as "exploitation," not "attack," but they are essential precursors both to attack and defense. By the end of this year, GENIE is projected to control at least 85,000 implants in strategically chosen machines around the world. That is quadruple the number, 21,252, available in 2008, according to the U.S. intelligence budget. The NSA appears to be planning a rapid expansion of those numbers, which were limited until recently by the need for human operators to take remote control of compromised machines. Even with a staff of 1,870 people, GENIE made full use of only 8,448 of the 68,975 machines with active implants in 2011. For GENIE's next phase, according to an authoritative reference document, the NSA has brought online an automated system, code-named TURBINE, that is capable of managing potentially millions of implants for intelligence gathering and "active attack."

T-Surveillance (ST)

1NC
Undermining encryption standards includes commercial fines
against illegal exports
Goodwin Procter 14
(Goodwin Procter, law firm. Software Companies Now on Notice That Encryption Exports May Be
Treated More Seriously: $750,000 Fine Against Intel Subsidiary, Client Alert, 10-15-2014.
http://www.goodwinprocter.com/Publications/Newsletters/Client-Alert/2014/1015_Software-CompaniesNow-on-Notice-That-Encryption-Exports-May-Be-Treated-More-Seriously.aspx//ghs-kw)

On October 8, 2014, the Department of Commerce's Bureau of Industry and Security (BIS) announced the issuance of a $750,000 penalty against Wind River Systems, an Intel subsidiary, for the unlawful exportation of encryption software products to foreign government end-users and to organizations on the BIS Entity List. Wind River Systems exported its software to China, Hong Kong, Russia, Israel, South Africa, and South Korea. BIS significantly mitigated what would have been a much larger fine because the company voluntarily disclosed the violations. We believe this to be the first penalty BIS has ever issued for the unlicensed export of encryption software that did not also involve comprehensively sanctioned countries (e.g., Cuba, Iran, North Korea, Sudan or Syria). This suggests a fundamental change in BIS's treatment of violations of the encryption regulations. Historically, BIS has resolved voluntarily disclosed violations of the encryption regulations with a warning letter but no material consequence, and has shown itself unlikely to pursue such violations that were not disclosed. This fine dramatically increases the compliance stakes for software companies, a message that BIS seemed intent upon making in its announcement. Encryption is ubiquitous in software products. Companies making these products should reexamine their product classifications, export eligibility, and internal policies and procedures regarding the export of software that uses or leverages encryption (even open source or third-party encryption libraries), particularly where a potential transaction on the horizon (e.g., an acquisition, financing, or initial public offering) will increase the likelihood that violations of these laws will be identified.


CPs

Foreign Backdoors CP

CX
In the world of the AFF, does the government no longer have access to
backdoors? So we don't use or possess backdoors in the world of the AFF,
right?

1NC
(KQ) Counterplan: the United States federal government
should ban the creation of backdoors as outlined in the Secure
Data Act of 2015 but should not ban surveillance through
backdoors, and should mandate clandestine corporate
disclosure of foreign-government-mandated backdoors to the
United States federal government.
(CT) Counterplan: The United States federal government
should not mandate the creation of surveillance backdoors in
products or request private keys, and should terminate current
backdoors created either by government mandates or
government-requested keys, but should not cease the use of
backdoors.
Backdoors are inevitable; we'll use backdoors created by
foreign governments
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-andgoing-dark-part-ii-debate-merits//ghs-kw)

Still another approach is to let other governments do the dirty work. The computer scientists' report cites the possibility of other sovereigns adopting their own extraordinary access regimes as a reason for the U.S. to go slow: "Building in exceptional access would be risky enough even if only one law enforcement agency in the world had it. But this is not only a US issue. The UK government promises legislation this fall to compel communications service providers, including US-based corporations, to grant access to UK law enforcement agencies, and other countries would certainly follow suit. China has already intimated that it may require exceptional access. If a British-based developer deploys a messaging application used by citizens of China, must it provide exceptional access to Chinese law enforcement? Which countries have sufficient respect for the rule of law to participate in an international exceptional access framework? How would such determinations be made? How would timely approvals be given for the millions of new products with communications capabilities? And how would this new surveillance ecosystem be funded and supervised? The US and UK governments have fought long and hard to keep the governance of the Internet open, in the face of demands from authoritarian countries that it be brought under state control. Does not the push for exceptional access represent a breathtaking policy reversal?" I am certain that the computer scientists are correct that foreign governments will move in this direction, but I think they are misreading the consequences of this. China and Britain will do this irrespective of what the United States does, and that fact may well create potential opportunity for the U.S. After all, if China and Britain are going to force U.S. companies to think through the problem of how to provide extraordinary access without compromising general security, perhaps the need to do business in those countries will provide much of the incentive to think through the hard problems of how to do it. Perhaps countries far less solicitous than ours of the plight of technology companies or the privacy interests of their users will force the research that Comey can only hypothesize. Will Apple then take the view that it can offer phones to users in China which can be decrypted for Chinese authorities when they require it but that it's technically impossible to do so in the United States?

2NC O/V
Counterplan solves 100% of the case: we mandate the USFG
publicly stop creating backdoors but instead use backdoors
that are inevitably mandated by foreign nations for
surveillance. Solves perception and doesn't link to the net
benefit; that's Wittes.

2NC Backdoors Inev


India has backdoors
Ragan 12
(Steve Ragan. Steve Ragan is a security reporter and contributor for SecurityWeek. Prior to joining the
journalism world in 2005, he spent 15 years as a freelance IT contractor focused on endpoint security
and security training. "Hackers Expose India's Backdoor Intercept Program," SecurityWeek. 1-9-2012.
http://www.securityweek.com/hackers-expose-indias-backdoor-intercept-program//ghs-kw)
Symantec confirmed with SecurityWeek on Friday that hackers did access source code from Symantec Endpoint Protection 11.0 and Symantec Antivirus 10.2. According to a Symantec spokesperson, SEP 11 was [released] four years ago, to be exact. In addition, Symantec Antivirus 10.2 has been discontinued, though the company continues to service it. "We're taking this extremely seriously and are erring on the side of caution to develop a long-range plan to take care of customers still using those products," Cris Paden, Senior Manager of Corporate Communications at Symantec, told SecurityWeek. Over the weekend, the story expanded. The Lords of Dharmaraja released a purported memo outlining the intercept program known as RINOA, which earns its name from the vendors involved - RIM, Nokia, and Apple. The memo said the vendors provided India with backdoors into their technology in order for them to maintain a presence in the local market space. India's Ministry of Defense has an agreement with all major device vendors to provide the country with the source code and information needed for their SUR (surveillance) platform, the memo explains. These backdoors allowed the military to conduct surveillance (RINOA SUR) against the US-China Economic and Security Review Commission. Personnel from Indian Naval Military Intelligence were dispatched to the People's Republic of China to undertake Telecommunications Surveillance (TESUR) using the RINOA backdoors and CYCADA-based technologies.

China has backdoors in 80% of global communications


Protalinski 12
(Emil Protalinski. Reporter for CNet and ZDNet. "Former Pentagon analyst: China has backdoors to
80% of telecoms," ZDNet. 7-14-2012. http://www.zdnet.com/article/former-pentagon-analyst-china-hasbackdoors-to-80-of-telecoms///ghs-kw)

The Chinese government reportedly has "pervasive access" to some 80 percent of the world's communications, thanks to backdoors it has ordered to be installed in devices made by Huawei and ZTE Corporation. That's according to sources cited by Michael Maloof, a former senior security policy analyst in the Office of the Secretary of Defense, who now writes for WND: In 2000, Huawei was virtually unknown outside China, but by 2009 it had grown to be one of the largest, second only to Ericsson. As a consequence, sources say that any information traversing "any" Huawei-equipped network isn't safe unless it has military encryption. One source warned, "even then, there is no doubt that the Chinese are working very hard to decipher anything encrypted that they intercept." Sources add that most corporate telecommunications networks use "pretty light encryption" on their virtual private networks, or VPNs. I found out about Maloof's report via this week's edition of The CyberJungle podcast. Here's my rough transcription of what he says, at about 18 minutes and 30 seconds: The Chinese government and the People's Liberation Army are so much into cyberwarfare now that they have looked at not just Huawei but also ZTE Corporation as providing, through the equipment that they install in about 145 countries around the world, and in 45 of the top 50 telecom centers around the world, the potential for backdooring into data. Proprietary information could be not only spied upon but also could be altered and in some cases could be sabotaged. That's coming from technical experts who know Huawei, they know the company and they know the Chinese. Since that story came out I've done a subsequent one in which sources tell me that it's giving Chinese access to approximately 80 percent of the world telecoms and it's working on the other 20 percent now.

China is mandating backdoors


Mozur 1/28
(Paul Mozur. Reporter for the NYT. "New Rules in China Upset Western Tech Companies," New York
Times. 1-28-2015. http://www.nytimes.com/2015/01/29/technology/in-china-new-cybersecurity-rulesperturb-western-tech-companies.html//ghs-kw)
HONG KONG - The Chinese government has adopted new regulations requiring companies that sell computer equipment to Chinese banks to turn over secret source code, submit to invasive audits and build so-called back doors into hardware and software, according to a copy of the rules obtained by foreign technology companies that do billions of dollars' worth of business in China. The new rules, laid out in a 22-page document approved at the end of last year, are the first in a series of policies expected to be unveiled in the coming months that Beijing says are intended to strengthen cybersecurity in critical Chinese industries. As copies have spread in the past month, the regulations have heightened concern among foreign companies that the authorities are trying to force them out of one of the largest and fastest-growing markets. In a letter sent Wednesday to a top-level Communist Party committee on cybersecurity, led by President Xi Jinping, foreign business groups objected to the new policies and complained that they amounted to protectionism. The groups, which include the U.S. Chamber of Commerce, called for "urgent discussion and dialogue" about what they said was a growing trend toward policies that cite cybersecurity in requiring companies to use only technology products and services that are developed and controlled by Chinese companies. The letter is the latest salvo in an intensifying tit-for-tat between China and the United States over online security and technology policy. While the United States has accused Chinese military personnel of hacking and stealing from American companies, China has pointed to recent disclosures of United States snooping in foreign countries as a reason to get rid of American technology as quickly as possible. Although it is unclear to what extent the new rules result from security concerns, and to what extent they are cover for building up the Chinese tech industry, the Chinese regulations go far beyond measures taken by most other countries, lending some credibility to industry claims that they are protectionist. Beijing also has long used the Internet to keep tabs on its citizens and ensure the Communist Party's hold on power. Chinese companies must also follow the new regulations, though they will find it easier since for most, their core customers are in China. China's Internet filters have increasingly created a world with two Internets, a Chinese one and a global one. The new policies could further split the tech world, forcing hardware and software makers to sell either to China or the United States, or to create significantly different products for the two countries. While the Obama administration will almost certainly complain that the new rules are protectionist in nature, the Chinese will be able to make a case that they differ only in degree from Washington's own requirements.

2NC AT Perm do Both


Permutation links to the net benefit: the AFF stops the use of
backdoors; that was 1AC cross-ex.

2NC AT Perm do the CP


The counterplan bans the creation of backdoors but not the
use of them; that's different from the plan; that was cross-ex.
The permutation is severance; that's a voting issue:
1. NEG ground: makes the AFF a shifting target, which
makes it impossible to garner offense. Stop copying K
AFFs; vote NEG to be Dave Strauss.
2. Kills advocacy skills: they never have to defend
implementation of an advocacy.

Cyberterror Advantage CP

1NC
Counterplan: the United States federal government should
substantially increase its support for renewable energy
technologies and grid decentralization.
Grid decentralization and renewables solve terror attacks
Lawson 11
(Lawson, Sean. Sean Lawson is an assistant professor in the Department of Communication at the
University of Utah. He holds a PhD in Science and Technology Studies from Rensselaer Polytechnic
Institute, a MA in Arab Studies from Georgetown University, and a BA in History from California State
University, Stanislaus. BEYOND CYBER-DOOM: Cyberattack Scenarios and the Evidence of History,
Mercatus Center at George Mason University. Working Paper No. 11-01, January 2011.
http://mercatus.org/sites/default/files/publication/beyond-cyber-doom-cyber-attack-scenariosevidence-history_1.pdf//ghs-kw)

Cybersecurity policy should promote decentralization and self-organization in efforts to prevent, defend against, and respond to cyberattacks. Disaster researchers have shown that victims are often themselves the first responders and that centralized, hierarchical, bureaucratic responses can hamper their ability to respond in the decentralized, self-organized manner that has often proved to be more effective (Quarantelli, 2008: 895-896). One way that officials often stand in the way of decentralized self-organization is by hoarding information (Clarke & Chess, 2009: 1000-1001). Similarly, over the last 50 years, U.S. military doctrine increasingly has identified decentralization, self-organization, and information sharing as the keys to effectively operating in ever-more complex conflicts that move at an ever-faster pace and over ever-greater geographical distances (LeMay & Smith, 1968; Romjue, 1984; Cebrowski & Garstka, 1998; Hammond, 2001). In the case of preventing or defending against cyberattacks on critical infrastructure, we must recognize that most cyber and physical infrastructures are owned by private actors. Thus, a centralized, military-led effort to protect the fortress at every point will not work. A combination of incentives, regulations, and public-private partnerships will be necessary. This will be complex, messy, and difficult. But a cyberattack, should it occur, will be equally complex, messy, and difficult, occurring instantaneously over global distances via a medium that is almost incomprehensible in its complex interconnections and interdependencies. The owners and operators of our critical infrastructures are on the front lines and will be the first responders. They must be empowered to act. Similarly, if the worst should occur, average citizens must be empowered to act in a decentralized, self-organized way to help themselves and others. In the case of critical infrastructures like the electrical grid, this could include the promotion of alternative energy generation and distribution methods. In this way, "Instead of being passive consumers, [citizens] can become actors in the energy network. Instead of waiting for blackouts, they can organize alternatives and become less vulnerable to either terror or natural catastrophe" (Nye, 2010: 203).

2NC O/V
Counterplan solves all of their grid and cyber-terrorism
impacts: we mandate the USFG provide incentives,
regulations, and P3s for widespread adoption of alt energy and
grid decentralization. This means each building has its own
microgrid, which allows for local, decentralized responses to
cyberterror attacks and solves their impact; that's Lawson.

2NC CP>AFF
Only the CP solves: a centralized grid results in inevitable
failures and kills the economy
Warner 10
(Guy Warner. Guy Warner is a leading economist and the founder and CEO of Pareto Energy. "Moving
U.S. energy policy to a decentralized grid," Grist. 6-4-2010. http://grist.org/article/2010-06-03-movingu-s-energy-policy-to-a-decentralized-grid-rethinking-our///ghs-kw)

And, while the development of renewable energy technology has sped up rapidly in recent years, the technology to deliver this energy to the places where it is most needed is decades behind. America's current electricity transmission and distribution grid was built more than a century ago. Relying on the grid to relay power from wind farms in the Midwest to cities on the east and west coast is simply not feasible. Our dated infrastructure cannot handle the existing load; power outages and disruptions currently cost the nation an estimated $164 billion each year. Wind and solar power produce intermittent power, which, in small doses, has little impact on grid operations. As we introduce increasingly larger amounts of intermittent power, our transmission system will require significant upgrades and perhaps even a total grid infrastructure redesign, which could take decades and cost billions. With 9,200 power plants that link homes and business via 164,000 miles of lines, a national retrofit is both cost-prohibitive and improbable. One solution to this challenge is the development of microgrids. Also known as distributed generation, microgrids produce energy closer to the user rather than transmitting it from remote power plants. Power is generated and stored locally and works in parallel with the main grid, providing power as needed and utilizing the main grid at other times. Microgrids offer a decentralized power source that can be introduced incrementally in modules now, without having to deal with the years of delay realistically associated with building central generation facilities (e.g. nuclear) and their associated transmission and distribution system add-ons. There is also a significant difference in the up-front capital costs that are ultimately assigned the consumer. Introducing generation capacity into a microgrid as needed is far less capital intensive, and some might argue more economical, than building a new nuclear plant at a cost of $5-12 billion dollars. Technological advancements in connectivity mean that microgrids can now be developed for high energy use building clusters, such as trading floors and hospitals, relieving stress on the macrogrid, and providing more reliable power. In fact, microgrids can be viewed as the ultimate smart grid, providing local power that meets local needs and utilizing energy sources, including renewables, that best fit the location and use profile. For example, on the East Coast, feasibility studies are underway to retrofit obsolete paper mills into biomass fuel generators utilizing left over pulp wood. Pulp wood, the waste left over from logging, can be easily pelletized, is inexpensive to produce, easy to transport, and has a minimal net carbon output. Wood pellets are also easily adaptable to automated combustion systems, making them a valuable domestic resource that can supplement and replace our use of fossil fuels, particularly in microgrids which can be designed to provide heating and cooling from these biomass products.

2NC Terror Solvency


Decentralization solves terror threats
Verclas 12
(Verclas, Kirsten. Kirsten Verclas works as International Program Officer at the National Association of
Regulatory Utility Commissioners (NARUC) in Washington, DC. She holds a BA in International
Relations with a Minor in Economics from Franklin and Marshall College and an MA in International
Relations with a concentration in Security Studies from The Elliott School at The George Washington
University. She also earned an MS in Energy Policy and Climate from Johns Hopkins University in
August 2013. "The Decentralization of the Electricity Grid - Mitigating Risk in the Energy Sector,"
American Institute for Contemporary German Studies at Johns Hopkins University. 4-27-2012.
http://www.aicgs.org/publication/the-decentralization-of-the-electricity-grid-mitigating-risk-in-theenergy-sector///ghs-kw)

A decentralized electricity grid has many environmental and security benefits. Microgrids in combination with distributed energy generation provide a system of small power generation and storage systems, which are located in a community or in individual houses. These small power generators produce on average about 10 kW (for individual homes) to 2 MW (for communities) of electricity. While connected to and able to feed excess energy into the grid, these generators are simultaneously independent from the grid in that they can provide power even when power from the main grid is not available. Safety benefits from a decentralized grid are immense, as it has built-in redundancies. These redundancies are needed should the main grid become inoperable due to a natural disaster or terrorist attack. Communities or individual houses can then rely on microgrids with distributed electricity generation for their power supply. Furthermore, having less centralized electricity generation and fewer main critical transmission lines reduces targets for terrorist attacks and natural disasters. Fewer people would then be impacted by subsequent power outages. Additionally, decentralized power reduces the obstacles to disaster recovery by allowing the focus to shift first to critical infrastructure and then to flow outward to less integrated outlets.[10] Thus critical facilities such as hospitals or police stations would be the first to have electricity restored, while non-essential infrastructure would have energy restored at a later date. Power outages are not only dangerous for critical infrastructure, they also cost money to business and the economy overall. EPRI reported that power outages and quality disturbances cost American businesses $119 billion per year.[11] Decentralized grids are also more energy efficient than centralized electricity grids because as electricity streams through a power line a small fraction of it is lost to various factors. The longer the distance the greater the loss.[12] Savings that are realized by having shorter transmission lines could be used to install the renewable energy sources close to homes and communities. The decrease of transmission costs and the increase in efficiency would cause lower electricity usage overall. A decrease in the need to generate electricity would also increase energy security; fewer imports of energy would be needed. The U.S. especially has been concerned with energy dependence in the last decades; decentralized electricity generation could be one of the policies to address this issue.

Decentralization solves cyberattacks


Kiger 13
(Patrick J. Kiger. "Will Renewable Energy Make Blackouts Into a
Thing of the Past?," National Geographic Channel. 10-2-2013.
http://channel.nationalgeographic.com/americanblackout/articles/will-renewable-energy-make-blackouts-into-athing-of-the-past///ghs-kw)
The difference is that Germany's grid of the future, unlike the present U.S. system, won't rely on big power plants and long transmission lines. Instead, Germany is creating a decentralized smart grid: essentially, a system composed of many small, potentially self-sufficient grids, that will obtain much of their power at the local level from renewable energy sources, such as solar panels, wind turbines and biomass generators. And the system will be equipped with sophisticated information and communications technology (ICT) that will enable it to make the most efficient use of its energy resources. Some might scoff at the idea that a nation could depend entirely upon renewable energy for its electrical needs, because both sunshine and wind tend to be variable, intermittent producers of electricity. But the Germans plan to get around that problem by using linked renewables, that is, by combining multiple sources of renewable energy, which has the effect of smoothing out the peaks and valleys of the supply. As Kurt Rohrig, the deputy director of Germany's Fraunhofer Institute for Wind Energy and Energy System Technology, explained in a recent article on Scientific American's website: "Each source of energy, be it wind, sun or bio-gas, has its strengths and weaknesses. If we manage to skillfully combine the different characteristics of the regenerative energies, we can ensure the power supply for Germany." A decentralized smart grid powered by local renewable energy might help protect the U.S. against a catastrophic blackout as well, proponents say. "A more diversified supply with more distributed generation inherently helps reduce vulnerability," Mike Jacobs, a senior energy analyst at the Union of Concerned Scientists, noted in a recent blog post on the organization's website. According to the U.S. Department of Energy's SmartGrid.gov website, such a system would have the ability to bank surplus electricity from wind turbines and solar panels in numerous storage locations around the system. Utility operators could tap into those reserves if electricity generation ebbed. Additionally, in the event of a large-scale disruption, a smart grid would have the ability to switch areas over to power generated by utility customers themselves, such as solar panels that neighborhood residents have installed on their roofs. By combining these "distributed generation" resources, a community could keep its health center, police department, traffic lights, phone system, and grocery store operating during emergencies, DOE's website notes. "There are lots of resources that contribute to grid resiliency and flexibility," Allison Clements, an official with the Natural Resource Defense Council, wrote in a recent blog post on the NRDC website. "Happily, they are the same resources that are critical to achieving a clean energy, low carbon future." Joel Gordes, electrical power research director for the U.S. Cyber Consequences Unit, a private-sector organization that investigates terrorist threats against the electrical grid and other targets, also thinks that such a decentralized grid "could carry benefits not only for protecting us to a certain degree from cyber-attacks but also providing power during any number of natural hazards." But Gordes does offer a caveat: such a system might also offer more potential points of entry for hackers to plant malware and disrupt the entire grid. Unless that vulnerability is addressed, he warned in an e-mail, "full deployment of [smart grid] technology could end up to be disastrous."

Patent Reform Advantage CP

Notes
Specify reform + look at law reviews
Read the 500 bil card in the 1NC
Cut different versions w/ different mechanisms

1NC Comprehensive Reform


Counterplan: the United States federal government should
comprehensively reform its patent system for the purpose of
eliminating non-practicing entities.
Patent trolls cost the economy half a trillion and counting;
larger internal link to tech and the economy
Lee 11
(Timothy B. Lee. Timothy B. Lee covers tech policy for Ars, with a particular focus on patent and
copyright law, privacy, free speech, and open government. While earning his CS master's degree at
Princeton, Lee was the co-author of RECAP, a Firefox plugin that helps users liberate public documents
from the federal judiciary's paywall. Before grad school, he spent time at the Cato Institute, where he
is an adjunct scholar. He has written for both online and traditional publications, including Slate,
Reason, Wired.com, and the New York Times. When not screwing around on the Internet, he can be
seen rock climbing, ballroom dancing, and playing soccer. He lives in Philadelphia. He has a blog at
Forbes and you can follow him on Twitter. "Study: patent trolls have cost innovators half a trillion
dollars," Ars Technica. xx-xx-xxxx. http://arstechnica.com/tech-policy/2011/09/study-patent-trolls-havecost-innovators-half-a-trillion-bucks///ghs-kw)

By now, the story of patent trolls has become well-known: a small company with no products of its own threatens lawsuits against larger companies who inadvertently infringe its portfolio of broad patents. The scenario has become so common that we don't even try to cover all the cases here at Ars. If we did, we'd have little time to write about much else. But anecdotal evidence is one thing. Data is another. Three Boston University researchers have produced a rigorous empirical estimate of the cost of patent trolling. And the number is breath-taking: patent trolls ("non-practicing entity" is the clinical term) have cost publicly traded defendants $500 billion since 1990. And the problem has become most severe in recent years. In the last four years, the costs have averaged $83 billion per year. The study says this is more than a quarter of US industrial research and development spending during those years. Two of the study's authors, James Bessen and Mike Meurer, wrote Patent Failure, an empirical study of the patent system that has been widely read and cited since its publication in 2008. They were joined for this paper by a colleague, Jennifer Ford. It's hard to measure the costs of litigation directly. The most obvious costs for defendants are legal fees and payouts to plaintiffs, but these are not necessarily the largest costs. Often, indirect costs like employee distraction, legal uncertainty, and the need to redesign or drop key products are even more significant. The trio use a clever method known as a stock market event study to estimate these costs. The theory is simple: a company's stock price represents the stock market's best estimation of the company's value. If the company's stock drops by, say, two percent in the days after a lawsuit is filed, then the market thinks the lawsuit will cost the company two percent of its market capitalization. Of course, this wouldn't be a very rigorous technique if they were looking at a single lawsuit. Any number of factors could have affected the firm's stock price that same week. Maybe the company released a bad earnings report the next day. But with a large sample of companies, these random factors should mostly cancel each other out, leaving the market's rough estimate of how much patent lawsuits cost their targets. The authors used a database of 1,630 patent troll lawsuits compiled by Patent Freedom. Because many of the lawsuits had multiple defendants, there was a total of 4,114 plaintiff-defendant pairs. The median defendant over all of these pairs lost $20.4 million in market capitalization, while the mean loss was $122 million.

2NC Solvency
Hatch 15
(Senator Orrin Hatch. "Senator Hatch: It's Time to Kill Patent Trolls for Good,"
WIRED. 3-16-2015. http://www.wired.com/2015/03/opinion-must-finallylegislate-patent-trolls-existence///ghs-kw)
There is broad agreement, among both big and small businesses, that any
serious solution must include:

Fee shifting, which will require patent trolls to pay legal fees when their
suits are unsuccessful;

Heightened pleading and discovery standards, which will raise the bar
on litigation procedure, making it increasingly difficult for trolls to file
frivolous lawsuits;

Demand letter reforms, which will require those sending demand
letters to be more specific and transparent;

Stays of customer suits, which will allow a manufacturer's case to
move forward first, without binding the end user to the result of that case;

A mechanism to enable recovery of fees, which will prevent insolvent
plaintiffs from litigating and dashing.

Some critics argue that these proposals will help only large technology
companies and might even hurt startups and small businesses. In my
discussions with stakeholders, however, I have repeatedly been told that a
multi-pronged approach that tackles each of these issues is needed to
effectively combat patent trolls across all levels of industry. These
stakeholder discussions have included representatives from the hotel,
restaurant, retail, real estate, financial services, and high-tech industries, as
well as start-up and small business owners.

Enacting legislation on any topic is a major undertaking, and the added
complexities inherent in patent law make passing patent reforms especially
challenging. Crucially, we will probably have only one chance to do so for a
long while, so whatever we do must work. We must not pass any bill that
fails to provide an effective deterrent against patent trolls at all stages of
litigation.

It is my belief that any viable legislation must ensure that those who
successfully defend against abusive patent litigation and are awarded fees
will actually get paid. Even when a patent troll is a shell company with no
assets, there are usually other parties with an interest in the litigation who
do have assets. These parties, however, often keep themselves beyond the
jurisdiction of the courts. They reap benefits if the plaintiff forces a
settlement, but are protected from any liability if they lose.

Right now, that's a win-win situation for these parties, and a lose-lose
situation for America's innovators.

Because Congress cannot force parties outside a court's jurisdiction to join in
a case, we must instead incentivize interested parties to do the right thing
and pay court-ordered fee awards. This is why we must pass legislation that
includes a recovery provision. Fee shifting without recovery is like writing a
check on an empty account. It's purporting to convey something that isn't
there. Only fee shifting coupled with a recovery provision will stop patent
trolls from litigating-and-dashing.

There is no question that American ingenuity fuels our economy. We must
ensure that our patent system is strong and vibrant and helps to protect our
country's premier position in innovation.

Reform solves patent trolling


Roberts 14
(Jeff John Roberts. Jeff reports on legal issues that impact the future of the tech industry, such as
privacy, net neutrality and intellectual property. He previously worked as a reporter for Reuters in
Paris and New York, and his free-lance work includes clips for the Economist, the New York Times and
the Globe & Mail. A frequent guest on media outlets like NPR and Fox, Jeff is also a lawyer, having
passed the bar in New York and Ontario. "Patent reform is likely in 2015. Here's what it could look
like," Gigaom. 11-19-2014. https://gigaom.com/2014/11/19/patent-reform-is-likely-in-2015heres-what-it-could-look-like///ghs-kw)

As patent scholar Dennis Crouch notes, the question is how far the new law will go. In particular, real reform will depend on changing the economic asymmetries in patent litigation that allow trolls to flourish, and that lead troll victims to simply pay up rather than engage in costly litigation. Here are some measures we are likely to see under the Goodlatte bill, according to Crouch and legal sources like IAM and Law.com (subscription required): Fee-shifting: Right now, trolls typically have nothing to lose by filing a lawsuit since they are shell companies with no assets. New fee-shifting measures, however, could put them on the hook for their victims' legal fees. Discovery limits: Currently, trolls can exploit the discovery process, in which each side must offer up documents and depositions, by drowning their targets in expensive and time-consuming requests. Limiting the scope of discovery could take that tactic off the table. Heightened pleading requirements: Right now, patent trolls don't have to specify how exactly a company is infringing their technology, but can simply serve cookie-cutter complaints that list the patents and the defendant. Pleading reform would force the trolls to explain what exactly they are suing over, and give defendants a better opportunity to assess the case. Identity requirements: This reform proposal is known as "real party of interest" and would make it harder for those filing patent lawsuits (often lawyers working with private equity firms) to hide behind shell companies, and require them instead to identify themselves. Crouch also notes the possibility of expanded post-grant review, which gives defendants a fast and cheaper tool to invalidate bad patents at the Patent Office rather than in federal court.

2NC O/V
The status quo patent system is hopelessly broken and allows
patent trolls to game the system by obtaining broad patents on
practices like selling objects on the internet. Those firms sue
innovators and startups who violate their patents, costing
the US economy half a trillion dollars and stifling innovation;
that's Lee.
The counterplan eliminates patent trolls through the set of
comprehensive reforms we'll describe below. It solves their
innovation arguments and is independently a bigger internal
link to innovation and the economy.
Patent reform is key to prevent patent trolling that stifles
innovation and reduces R&D by half
Bessen 14
(James Bessen. Bessen is a Lecturer in Law at the Boston
University School of Law. Bessen was also a Fellow at the
Berkman Center for Internet and Society. "The Evidence Is In:
Patent Trolls Do Hurt Innovation," Harvard Business Review.
November 2014. https://hbr.org/2014/07/the-evidence-is-inpatent-trolls-do-hurt-innovation//ghs-kw)
Over the last two years, much has been written about patent trolls, firms that make their money asserting patents against other companies, but do not make a useful product of their own. Both the White House and Congressional leaders have called for patent reform to fix the underlying problems that give rise to patent troll lawsuits. Not so fast, say Stephen Haber and Ross Levine in a Wall Street Journal Op-Ed ("The Myth of the Wicked Patent Troll"). We shouldn't reform the patent system, they say, because there is no evidence that trolls are hindering innovation; these calls are being driven just by a few large companies who don't want to pay inventors. But there is evidence of significant harm. The White House and the Congressional Research Service both cited many research studies suggesting that patent litigation harms innovation. And three new empirical studies provide strong confirmation that patent litigation is reducing venture capital investment in startups and is reducing R&D spending, especially in small firms. Haber and Levine admit that patent litigation is surging. There were six times as many patent lawsuits last year as in the 1980s. The number of firms sued by patent trolls grew nine-fold over the last decade; now a majority of patent lawsuits are filed by trolls. Haber and Levine argue that this is not a problem: it might instead reflect a healthy, dynamic economy. They cite papers finding that patent trolls tend to file suits in innovative industries and that during the nineteenth century, new technologies such as the telegraph were sometimes followed by lawsuits. But this does not mean that the explosion in patent litigation is somehow normal. It's true that plaintiffs, including patent trolls, tend to file lawsuits in dynamic, innovative industries. But that's just because they follow the money. Patent trolls tend to sue cash rich companies, and innovative new technologies generate cash. The economic burden of today's patent lawsuits is, in fact, historically unprecedented. Research shows that patent trolls cost defendant firms $29 billion per year in direct out-of-pocket costs; in aggregate, patent litigation destroys over $60 billion in firm wealth each year. While mean damages in a patent lawsuit ran around $50,000 (in today's dollars) at the time of the telegraph, mean damages today run about $21 million. Even taking into account the much larger size of the economy today, the economic impact of patent litigation today is an order of magnitude larger than it was in the age of the telegraph. Moreover, these costs fall disproportionately on innovative firms: the more R&D a firm performs, the more likely it is to be sued for patent infringement, all else equal. And, although this fact alone does not prove that this litigation reduces firms' innovation, other evidence suggests that this is exactly what happens. A researcher at MIT found, for example, that medical imaging businesses sued by a patent troll reduced revenues and innovations relative to comparable companies that were not sued. But the biggest impact is on small startup firms; contrary to Haber and Levine, most patent trolls target firms selling less than $100 million a year. One survey of software startups found that 41% reported significant operational impacts from patent troll lawsuits, causing them to exit business lines or change strategy. Another survey of venture capitalists found that 74% had companies that experienced significant impacts from patent demands. Three recent econometric studies confirm these negative effects. Catherine Tucker of MIT analyzed venture capital investing relative to patent lawsuits in different industries and different regions of the country. Controlling for the influence of other factors, she estimates that lawsuits from frequent litigators (largely patent trolls) were responsible for a decline of $22 billion in venture investing over a five-year period. That represents a 14% decline. Roger Smeets of Rutgers looked at R&D spending by small firms, comparing firms that were hit by extensive lawsuits to a carefully chosen comparable sample. The comparison sample allowed him to isolate the effect of patent lawsuits from other factors that might also influence R&D spending. Prior to the lawsuit, firms devoted 20% of their operating expenditures to R&D; during the years after the lawsuit, after controlling for other factors, they reduced that spending by 3% to 5% of operating expenditures, representing about a 19% reduction in relative R&D spending. And researchers from Harvard and the University of Texas recently examined R&D spending of publicly listed firms that had been sued by patent trolls. They compared firms where the suit was dismissed, representing a clear win for the defendant, to those where the suit was settled or went to final adjudication (typically much more costly). As in the previous paper, this comparison helped them isolate the effect of lawsuits from other factors. They found that when lawsuits were not dismissed, firms reduced their R&D spending by $211 million and reduced their patenting significantly in subsequent years. The reduction in R&D spending represents a 48% decline. Importantly, these studies are initial releases of works in progress; the researchers will refine their estimates of harm over the coming months. Perhaps some of the estimates may shrink a bit. Nevertheless, across a significant number of studies using different methodologies and performed by different researchers, a consistent picture is emerging about the effects of patent litigation: it costs innovators money; many innovators and venture capitalists report that it significantly impacts their businesses; innovators respond by investing less in R&D and venture capitalists respond by investing less in startups. Haber and Levine might not like the results of this research. But the weight of the evidence from these many studies cannot be ignored; patent trolls do, indeed, cause harm. It's time for Congress to do something about it.

2NC Comprehensive Reform


Comprehensive reform solves patent trolling
Downes 7/6
(Larry Downes. Larry Downes is an author and project director at the Georgetown Center for Business
and Public Policy. His new book, with Paul Nunes, is Big Bang Disruption: Strategy in the Age of
Devastating Innovation. Previous books include the best-selling Unleashing the Killer App: Digital
Strategies for Market Dominance. "What would 'real' patent reform look like?," CNET. 7-6-2015.
http://www.cnet.com/news/what-does-real-patent-reform-look-like///ghs-kw)

And a new report (PDF) from technology think tank Lincoln Labs argues that reversing the damage to the innovation economy caused by years of overly generous patent policies requires far stronger medicine than Congress is considering or the courts seem willing to swallow on their own. The bills making their way through Congress, for example, focus almost entirely on curbing abuses by companies that buy up often overly broad patents and then, rather than produce goods, simply sue manufacturers and users they argue are infringing their patents. These nonpracticing entities, referred to derisively as patent trolls, are widely seen as a serious drag on innovation, particularly in fast-evolving technology industries. Trolling behavior, according to studies from Stanford Law School professor and patent expert Mark Lemley, does little to nothing to promote the Constitutional goal of patents to encourage innovation by granting inventors temporary monopolies during which they can recover their investment. The House of Representatives passed antitrolling legislation in 2013, but a Senate version was killed by then-Majority Leader Harry Reid (D-Nev.) in May 2014. "Patent trolls," said Gary Shapiro, president and CEO of the Consumer Electronics Association, "bleed $1.5 billion a week from the US economy -- that's almost $120 billion since the House passed a patent reform bill in December of 2013." A call for 'real' patent reform: The Lincoln Labs report agrees with these and other criticisms of patent trolling, but argues for more fundamental changes to the system, or what the report calls "real" patent reform. The report, authored by former Republican Congressional staffer Derek Khanna, urges a complete overhaul of the process by which the Patent Office reviews applications, as well as the elimination of patents for software, business methods, and a special class of patents for design elements -- a category that figured prominently in the smartphone wars. Khanna claims that the Patent Office has demonstrated an "abject failure" to enforce fundamental legal requirements that patents only be granted for inventions that are novel, nonobvious and useful. To reverse that trend, the report calls on Congress to change incentives for patent examiners that today weigh the scales in favor of approval, add a requirement for two examiners to review the most problematic categories of patents, and allow crowdsourced contributions to Patent Office databases of "prior art" to help filter out nonnovel inventions. Khanna estimates these reforms alone "would knock out a large number of software patents, perhaps 75-90%, where the economic argument for patents is exceedingly difficult to sustain." The report also calls for the elimination of design patents, which offer protection for ornamental features of manufactured products, such as the original design of the Coca-Cola bottle.

Reg-Neg CP

1NC Shell
Text: the United States federal government should enter into a
process of negotiated rulemaking over _______<insert
plan>______________ and implement the results of negotiation.
The CP is plan minus: it doesn't mandate the plan, just that a regulatory negotiations committee is created to discuss the plan.
And, it competes: reg neg is not normal means
USDA 06
(The U.S. Department of Agricultures Agricultural Marketing Service administers programs that
facilitate the efficient, fair marketing of U.S. agricultural products, including food, fiber, and specialty
crops What is Negotiated Rulemaking?. Last updated June 6th 2014.
http://www.ams.usda.gov/AMSv1.0/getfile?dDocName=STELPRDC5089434) //ghs-kw)

How reg-neg differs from traditional notice-and-comment rulemaking
The traditional notice-and-comment rulemaking provided in the Administrative Procedure Act (APA) requires an agency planning to adopt a rule on a particular subject to publish a proposed rule (NPRM) in the Federal Register and to offer the public an opportunity to comment. The APA does not specify who is to draft the proposed rule nor any particular procedure to govern the drafting process. Ordinarily, agency staff performs this function, with discretion to determine how much opportunity is allowed for public input. Typically, there is no opportunity for interchange of views among potentially affected parties, even where an agency chooses to conduct a hearing. The traditional notice-and-comment rulemaking can be very adversarial. The dynamics encourage parties to take extreme positions in their written and oral statements in both pre-proposal contacts as well as in comments on any published proposed rule as well as withholding of information that might be viewed as damaging. This adversarial atmosphere may contribute to the expense and delay associated with regulatory proceedings, as parties try to position themselves for the expected litigation. What is lacking is an opportunity for the parties to exchange views, share information, and focus on finding constructive, creative solutions to problems. In negotiated rulemaking, the agency, with the assistance of one or more neutral advisors known as convenors, assembles a committee of representatives of all affected interests to negotiate a proposed rule. Sometimes the law itself will specify which interests are to be included on the committee. Once assembled, the next goal is for members to receive training in interest-based problem-solving and consensus-decision making. They then must make sure that all views are heard and that each committee member agrees to a set of ground rules for the negotiated rulemaking process. The ultimate goal is to reach consensus on a text that all parties can accept. The agency is represented at the table by an official who is sufficiently senior to be able to speak authoritatively on its behalf. Negotiating sessions are chaired by a neutral mediator or facilitator skilled in assisting in the resolution of multiparty disputes. The Checklist

Advantages as well as Misperceptions The advantages of negotiated rulemaking include: Producing greater
information sharing and better communication; Enhancing public awareness and involvement; Providing a
reality check to agencies and other interests; Encouraging discovery of more creative options for rulemaking;
Increasing compliance with rules; Saving time, money and effort in the long run; Allowing earlier
implementation dates; Building cooperative relationships among key parties; Increasing the certainty of the
outcome for all and thus enabling better planning; Producing superior rules on technically complex topics
because of the input of all parties; Giving rise to fewer legislative end runs against the rule; and Reducing
post-issuance contentiousness and litigation. What negotiating rulemaking does not do: It does not cause the
agency to delegate its ultimate obligation to determine the content of the proposed and final regulations; It does
not exempt the agency from any statutory or other requirements; It does not eliminate the agencys obligation to
produce any economic analysis; paperwork or other regulatory analysis requirements imposed by law or agency
policy; It does not require parties or non-parties to set aside their legal or political rights as a condition of
participating; and It is not compulsory, participation is voluntary, for the agency and for others.

<Insert specific solvency advocate>


Reg neg solves: empirics prove
Knaster 10
(Alana Knaster is the Deputy Director of the Resource Management Agency. She was Senior Executive
in the Monterey County Planning Department for five years with responsibility for planning, building,
and code enforcement programs. Prior to joining Monterey County, Alana was the President of the
Mediation Institute, a national non-profit firm specializing in the resolution of complex land use
planning and environmental disputes. Many of the disputes that she successfully mediated, involved
dozens of stakeholder groups including government agencies, major corporations and public interest
groups. She served in that capacity for 15 years. Alana was Mayor of the City of Hidden Hills,
California from 1981-88 and represented her City on a number of regional planning agencies and
commissions. She also has been on the faculty of Pepperdine University Law School since 1989,
teaching courses in environmental and public policy mediation. Knaster, A. Resolving Conflicts Over
Climate Change Solutions: Making the Case for Mediation, Pepperdine Dispute Resolution Law
Journal, Vol 10, No 3, 2010. 465-501. http://law.pepperdine.edu/dispute-resolution-law-journal/issues/volumeten/Knaster%20Article.pdf//ghs-kw)

Federal and international dispute resolution process models. There are also models in
U.S. and Canadian legislation supporting the use of consensus-based processes. These
processes have been successfully applied to resolve dozens of disputes
that involved multiple stakeholder interests, on technically and politically
complex environmental and public policy issues. For example, the Negotiated
Rulemaking Act of 1990 was enacted by Congress to formalize a process for negotiating contentious new
regulations.118 The Act provides a process called reg neg by which representatives of interest
groups that could be substantially affected by the provisions of a regulation, and
agency staff negotiate the provisions.119 The meetings are open to the public;
however, the process does enable negotiators to hold private interest group caucuses. If a consensus is
reached on the provisions of the rule, the Agency commits to publish the consensus
rule in the Federal Register for public comment. 120 The participants in the reg neg
agree that as long as the final regulation is consistent with what they have jointly
recommended, they will not challenge it in court. The assumption is that parties will
support a product that they negotiated.121 Reg neg has been utilized by
numerous federal agencies to negotiate rules pertaining to a diverse
range of topics including safe drinking water, fugitive gasoline emissions,
eligibility for educational loans, and passenger safety.122 In 1991, in Canada, an
initiative was launched by the National Task Force on Consensus and Sustainability to develop a guidance document
that would govern how federal, provincial, and municipal governments would address resource management
disputes. The document that was negotiated, Building Consensus for a Sustainable Future: Guiding Principles, was
adopted by consensus in 1994.123 The document outlined principles for building a consensus and process steps.
The ten principles included provisions regarding inclusivity of the process (this was particularly important in Canada
with respect to inclusion of Aboriginal peoples), voluntary participation, accountability to constituencies, respect for diverse interests, and commitment to any agreement adopted.124 The consensus principles were subsequently utilized to resolve disputes over issues that included sustainable forest management, siting of solid waste facilities, impacts of pulp mill expansion, and economic diversification based on sustainable wildlife resources.125 The reg neg and Consensus for Sustainable Future model represent codified mediated negotiation processes that have withstood the test of legal challenge and have been strongly endorsed by the groups that have participated in these processes.

1NC Ptix NB
Doesn't link to politics: empirics prove
USDA 6/6
(The U.S. Department of Agricultures Agricultural Marketing Service administers programs that
facilitate the efficient, fair marketing of U.S. agricultural products, including food, fiber, and specialty
crops What is Negotiated Rulemaking?. Last updated June 6th 2014 @
http://www.ams.usda.gov/AMSv1.0/getfile?dDocName=STELPRDC5089434)

History In 1990, Congress endorsed use by federal agencies of an alternative procedure known as "negotiated rulemaking," also called "regulatory negotiation," or "reg-neg." It has been used by agencies to bring interested parties into the rule-drafting process at an early stage, under circumstances that foster cooperative efforts to achieve solutions to regulatory problems. Where successful, negotiated rulemaking can lead to better, more acceptable rules, based on a clearer understanding of the concerns of all those affected. Negotiated rules may be easier to enforce and less likely to be challenged in litigation. The results of reg-neg usage by the federal government, which began in the early 1980s, are impressive: large-scale regulators as the Environmental Protection Agency, Nuclear Regulatory Commission, Federal Aviation Administration, and the Occupational Safety and Health Administration used the process on many occasions. Building on these positive experiences, several states, including Massachusetts, New York, and California, have also begun using the procedure for a wide range of rules. The very first negotiated rule-making was convened by the Federal Mediation and Conciliation Service (FMCS) working with the Department of Transportation, the Federal Aviation Administration, airline pilots and other interested groups to deal with regulations concerning flight and duty time for pilots. The negotiated rulemaking was a success and a draft rule was agreed upon that became the final rule. Since that first reg-neg, FMCS has assisted in both the convening and facilitating stages in many such procedures at the Departments of Labor, Health and Human Services (HRSA), Interior, Housing and Urban Development, and the EPA, as well as state-level processes, and other forms of consensus-based decision-making programs such as public policy dialogues, hearings, focus groups, and meetings.

1NC Fism NB
Failure to use reg neg results in a federalism crisis: REAL ID proves
Ryan 11
(Erin Ryan holds a B.A. 1991 Harvard-Radcliffe College, cum laude, M.A. 1994 Wesleyan University, J.D.
2001 Harvard Law School, cum laude. Erin Ryan teaches environmental and natural resources law,
property and land use, water law, negotiation, and federalism. She has presented at academic and
administrative venues in the United States, Europe, and Asia, including the Ninth Circuit Judicial
Conference, the U.S.D.A. Office of Ecosystem Services and Markets, and the United Nations Institute
for Training and Research. She has advised National Sea Grant multilevel governance studies
involving Chesapeake Bay and consulted with multiple institutions on developing sustainability
programs. She has appeared in the Chicago Tribune, the London Financial Times, the PBS Newshour
and Christian Science Monitors Patchwork Nation project, and on National Public Radio. She is the
author of many scholarly works, including Federalism and the Tug of War Within (Oxford, 2012).
Professor Ryan is a graduate of Harvard Law School, where she was an editor of the Harvard Law
Review and a Hewlett Fellow at the Harvard Negotiation Research Project. She clerked for Chief Judge
James R. Browning of the U.S. Court of Appeals for the Ninth Circuit before practicing environmental,
land use, and local government law in San Francisco. She began her academic career at the College of
William & Mary in 2004, and she joined the faculty at the Northwestern School of Law at Lewis & Clark
College in 2011. Ryan spent 2011-12 as a Fulbright Scholar in China, during which she taught
American law, studied Chinese governance, and lectured throughout Asia. Ryan, E. Boston Law Review,
2011. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583132//ghs-kw)

b. A Cautionary Tale: The REAL ID Act
The value of negotiated rulemaking to federalism bargaining may be best understood in relief against the failure of alternatives in federalism-sensitive [*57] contexts. Particularly informative are the strikingly different state responses to the two approaches Congress has recently taken in tightening national security through identification reform--one requiring regulations through negotiated rulemaking, and the other through traditional notice and comment. After the 9/11 terrorist attacks, Congress ordered the Department of Homeland Security (DHS) to establish rules regarding valid identification for federal purposes (such as boarding an aircraft or accessing federal buildings). n291 Recognizing the implications for state-issued driver's licenses and ID cards, Congress required DHS to use negotiated rulemaking to forge consensus among the states about how best to proceed. n292 States leery of the staggering costs associated with proposed reforms participated actively in the process. n293 However, the subsequent REAL ID Act of 2005 repealed the ongoing negotiated rulemaking and required DHS to prescribe top-down federal requirements for state-issued licenses. n294 The resulting DHS rules have been bitterly opposed by the majority of state governors, legislatures, and motor vehicle administrations, n295 prompting a virtual state rebellion that cuts across the red-state/blue-state political divide. n296 No state met the December 2009 deadline initially contemplated by the statute, and over half have enacted or considered legislation prohibiting compliance with the Act, defunding its implementation, or calling for its repeal. n297 In the face of this unprecedented state hostility, DHS has extended compliance deadlines even for those that did not request extensions, and bills have been introduced in both houses of Congress to repeal the Act. n298 Efforts to repeal what is increasingly referred to as a "failed" policy have won endorsements [*58] from organizations across the political spectrum. n299 Even the Executive Director of the ACLU, for whom federalism concerns have not historically ranked highly, opined in USA Today that the REAL ID Act violates the Tenth Amendment. n300

US federalism will be modelled globally: solves human rights, free trade, war, and economic growth
Calabresi 95
(Steven G. Calabresi is a Professor of Law at Northwestern University and is a graduate of the Yale
Law School (1983) and of Yale College (1980). Professor Calabresi was a Scholar in Residence at
Harvard Law School from 2003 to 2005, and he has been a Visiting Professor of Political Science at
Brown University since 2010. Professor Calabresi was also a Visiting Professor at Yale Law School in
the Fall of 2013. Professor Calabresi served as a Law Clerk to Justice Antonin Scalia of the United
States Supreme Court, and he also clerked for U.S. Court of Appeals Judges Robert H. Bork and Ralph
K. Winter. From 1985 to 1990, he served in the Reagan and first Bush Administrations working both in

the West Wing of the Reagan White House and before that in the U.S. Department of Justice. In 1982,
Professor Calabresi co-founded The Federalist Society for Law & Public Policy Studies, a national
organization of lawyers and law students, and he currently serves as the Chairman of the Societys
Board of Directors a position he has held since 1986. Since joining the Northwestern Faculty in 1990,
he has published more than sixty articles and comments in every prominent law review in the country.
He is the author with Christopher S. Yoo of The Unitary Executive: Presidential Power from
Washington to Bush (Yale University Press 2008); and he is also a co-author with Professors Michael
McConnell, Michael Stokes Paulsen, and Samuel Bray of The Constitution of the United States (2nd ed.
Foundation Press 2013), a constitutional law casebook. Professor Calabresi has taught Constitutional
Law I and II; Federal Jurisdiction; Comparative Law; Comparative Constitutional Law; Administrative
Law; Antitrust; a seminar on Privatization; and several other seminars on topics in constitutional law.
Calabresi, S. G. Government of Limited and Enumerated Powers: In Defense of United States v.
Lopez, A Symposium: Reflections on United States v. Lopez, Michigan Law Review, Vol 92, No 3,
December 1995. Ghs-kw)

We have seen that a desire for both international and devolutionary federalism has swept across the world in recent years. To a significant extent, this is due to global fascination with and emulation of our own American federalism success story. The global trend toward federalism is an enormously positive development that greatly increases the likelihood of future peace, free trade, economic growth, respect for social and cultural diversity, and protection of individual human rights. It depends for its success on the willingness of sovereign nations to strike federalism deals in the belief that those deals will be kept.233 The U.S. Supreme Court can do its part to encourage the future striking of such deals by enforcing vigorously our own American federalism deal. Lopez could be a first step in that process, if only the Justices and the legal academy would wake up to the importance of what is at stake.

Federalism solves economic growth


Brueckner 05
(Jan K. Brueckner is a Professor of Economics at the University of California, Irvine. He is a member of the Institute of Transportation Studies, Institute for Mathematical Behavioral Sciences, and a former editor of the Journal of Urban Economics. Brueckner, J. K. Fiscal Federalism and Economic Growth, CESifo Working Paper No. 1601, November 2005. https://www.cesifogroup.de/portal/page/portal/96843357AA7E0D9FE04400144FAFBA7C//ghs-kw)

The analysis in this paper suggests that faster economic growth may constitute an additional benefit of fiscal federalism beyond those already well recognized. This result, which matches the conjecture of Oates (1993) and the expectations of most empirical researchers who have studied the issue, arises from an unexpected source: a greater incentive to save when public-good levels are tailored under federalism to suit the differing demands of young and old consumers. This effect grows out of a novel interaction between the rules of public-good provision which apply cross-sectionally at a given time and involve the young and old consumers of different generations, and the savings decision of a given generation, which is intertemporal in nature. This cross-sectional/intertemporal interaction yields the link between federalism and economic growth. While it is encouraging that the paper's results match recent empirical findings showing a positive growth impact from fiscal decentralization, additional theoretical work exploring other possible sources of such a link is clearly needed. The present results emerge from a model based on very minimal assumptions, but exploration of richer models may also be fruitful.

US economic growth solves war, collapse ensures instability


National Intelligence Council, 12 (December, Global Trends 2030:
Alternative Worlds
http://www.dni.gov/files/documents/GlobalTrends_2030.pdf)

Big Stakes for the International System
The optimistic scenario of a reinvigorated US economy would increase the prospects that the growing global and regional challenges would be addressed. A stronger US economy dependent on trade in services and cutting-edge technologies would be a boost for the world economy, laying the basis for stronger multilateral cooperation. Washington would have a stronger interest in world trade, potentially leading a process of World Trade Organization reform that streamlines new negotiations and strengthens the rules governing the international trading system. The US would be in a better position to boost support for a more democratic Middle East and prevent the slide of failing states. The US could act as balancer ensuring regional stability, for example, in Asia where the rise of multiple powers -- particularly India and China -- could spark increased rivalries. However, a reinvigorated US would not necessarily be a panacea. Terrorism, proliferation, regional conflicts, and other ongoing threats to the international order will be affected by the presence or absence of strong US leadership but are also driven by their own dynamics. The US impact is much more clear-cut in the negative case in which the US fails to rebound and is in sharp economic decline. In that scenario, a large and dangerous global power vacuum would be created -- and in a relatively short space of time. With a weak US, the potential would increase for the European economy to unravel. The European Union might remain, but as an empty shell around a fragmented continent. Progress on trade reform as well as financial and monetary system reform would probably suffer. A weaker and less secure international community would reduce its aid efforts, leaving impoverished or crisis-stricken countries to fend for themselves, multiplying the chances of grievance and peripheral conflicts. In this scenario, the US would be more likely to lose influence to regional hegemons -- China and India in Asia and Russia in Eurasia. The Middle East would be riven by numerous rivalries which could erupt into open conflict, potentially sparking oil-price shocks. This would be a world reminiscent of the 1930s when Britain was losing its grip on its global leadership role.

2NC O/V
The counterplan convenes a regulatory negotiation committee to discuss the implementation of the plan. Stakeholders decide how and if the plan is implemented -- then implements the decision -- solves better than the AFF:
1. Collaboration -- reg neg facilitates government-civilian cooperation, results in greater satisfaction with regulations and better compliance after implementation -- social psychology and empirics prove
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford
University, a Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a
Doctors of Jurisdictional Science from Harvard University. She served as Counselor for Energy and
Climate Change in the Obama White House in 2009-2010. Freeman is a prominent scholar of
regulation and institutional design, and a leading thinker on collaborative and contractual
approaches to governance. After leaving the White House, she advised the National Commission
on the Deepwater Horizon oil spill on topics of structural reform at the Department of the Interior.
She has been appointed to the Administrative Conference of the United States, the government
think tank for improving the effectiveness and efficiency of federal agencies, and is a member of
the American College of Environmental Lawyers. Laura I Langbein is the Professor of Quantitative
Methods, Program Evaluation, Policy Analysis, and Public Choice and American College. She holds
a PhD in Political Science from the University of North Carolina, a BA in Government from Oberlin
College. Freeman, J. Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U.
Environmental Journal, Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy
%20benefit.pdf/)
D. Compliance The compliance implications of consensus-based processes remain a matter of speculation.360 No one has yet produced empirical data on the relationship between negotiated rulemaking and compliance, let alone data comparing the compliance implications of negotiated and conventional rules.361 However, the Phase II results introduce interesting new findings into the debate. The data shows reg-neg participants to be significantly more likely than conventional rulemaking participants to report the perception that others will be able to comply with the final rule.362 Perceiving that others will comply might induce more compliance among competitors, along the lines of game theoretic models, at least until evidence of defection emerges.363 Moreover, to the extent that compliance failures are at least partly due to technical and information deficits -- rather than to mere political resistance -- it seems plausible that reports of the learning effect and more horizontal sharing of information might help to improve compliance in the long run.364 The claim that reg-neg could improve compliance is consistent with social psychology studies showing that in both legal and organizational settings, fair procedures lead to greater compliance with the rules and decisions with which they are associated.365 Similarly, negotiated rulemaking might facilitate compliance by bringing to the surface some of the contentious issues earlier in the rulemaking process, where they might be solved collectively rather than dictated by the agency. Although speculative, these hypotheses seem to fit better with Kerwin and Langbein's data than do the rather negative expectations about compliance. Higher satisfaction could well translate into better long-term compliance, even if litigation rates remained the same. Consistent with our contention that process matters, we expect it to matter to compliance as well. In any event, empirical studies of compliance should no longer be so difficult to produce. A number of negotiated rules are now several years old, with some in the advanced stages of implementation. A study of compliance might compare numbers of enforcement actions for negotiated as compared to conventional rules, measured by notices of violation, or penalties, for example.366 It might investigate as well whether compliance methods differ between the two types of rules: perhaps the enforcement of negotiated rules occurs more cooperatively, or informally, than enforcement of conventional rules. Possibly, relationships struck during the negotiated rulemaking make a difference at the compliance stage.367 To date, the effects of how the rule is developed on eventual compliance remain a matter of speculation, even though it is ultimately an empirical issue on which both theory and empirical evidence must be brought to bear.

And, we'll win new net benefits here that ALL turn the aff
a. Delays -- the CP's regulatory negotiation means that rules won't be challenged during the regulation creation process -- empirics prove the CP solves faster than the AFF
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl
F. Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the
design of many of the major developments of administrative law in the past 40 years. He is the
author of more than 50 papers and books on administrative law and has been a visiting professor
or guest lecturer internationally, including at the University of Paris II, Humboldt University
(Berlin) and the University of the Western Cape (Cape Town). He has consulted on environmental
mediation and public participation in rulemaking in China, including a project sponsored by the
Supreme Peoples Court. He has received multiple awards for his achievements in administrative
law. He is listed in Who's Who in America and is a member of the Administrative Conference of the
United States.Harter, P. J. Assessing the Assessors: The Actual Performance of Negotiated
Rulemaking, December 1999. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghskw)

Properly understood, therefore, the average length of EPA's negotiated rulemakings -- the time it took EPA to fulfill its goal -- was 751 days or 32% faster than traditional rulemaking. This knocks a full year off the average time it takes EPA to develop rule by the traditional method. And, note these are highly complex and controversial rules and that one of them survived Presidential intervention. Thus, the dynamics surrounding these rules are by no means average. This means that reg neg's actual performance is much better than that. Interestingly and consistently, the average time for all of EPA's reg negs when viewed in context is virtually identical to that of the sample drawn by Kerwin and Furlong77 differing by less than a month. Furthermore, if all of the reg negs that were conducted by all the agencies that were included in Coglianese's table78 were analyzed along the same lines as discussed here,79 the average time for all negotiated rulemakings drops to less than 685 days.80 No Substantive Review of Rules Based on Reg Neg Consensus. Coglianese argues that negotiated rules are actually subjected to a higher incident of judicial review than are rules developed by traditional methods, at least those issued by EPA.81 But, like his analysis of the time it takes to develop rules, Coglianese fails to look at either what happened in the negotiated rulemaking itself or the nature of any challenge. For example, he makes much of the fact that the Grand Canyon visibility rule was challenged by interests that were not a party to the negotiations;82 yet, he also points out that this rule was not developed under the Negotiated Rulemaking Act83 which explicitly establishes procedures that are designed to ensure that each interest can be represented. This challenge demonstrates the value of convening negotiations.84 And, it is significantly misleading to include it when discussing the judicial review of negotiated rules since the process of reg neg was not followed. As for Reformulated Gasoline, the rule as issued by EPA did not reflect the consensus but rather was modified by EPA under the direction of President Bush.85 There were, indeed, a number of challenges to the application of the rule,86 but amazingly little to the rule itself given its history. Indeed, after the proposal was changed, many members of the committee continued to meet in an effort to put Humpty Dumpty back together again, which they largely did; the fact that the rule had been negotiated not only resulted in a much better rule,87 it enabled the rule to withstand in large part a massive assault. Coglianese also somehow attributes a challenge within the World Trade Organization to a shortcoming of reg neg even though such issues were explicitly outside the purview of the committee; to criticize reg neg here is like saying surgery is not effective when the patient refused to undergo it. While the Underground Injection rule was challenged, the committee never reached an agreement88 and, moreover, the convening report made clear that there were very strong disagreements over the interpretation of the governing statute that would likely have to be resolved by a Court of Appeals. Coglianese also asserts that the Equipment Leaks rule was the subject of review; it was, but only because the Clean Air requires parties to file challenges in a very short period, and a challenger therefore filed a defensive challenge while it worked out some minor details over the regulation. Those negotiations were successful and the challenge was withdrawn. The Chemical Manufacturers Association, the challenger, had no intention of a substantive challenge.89 Moreover, a challenge to other parts of the HON should not be ascribed to the Equipment Leaks part of the rule. The agreement in the Asbestos in Schools negotiation explicitly contemplated judicial review -- strange, but true -- and hence it came as no surprise and as no violation of the agreement. As for the Wood Furniture Rule, the challenges were withdrawn after informal negotiations in which EPA agreed to propose amendments to the rule.90 Similarly, the challenge to EPA's Disinfectant By-Products Rule91 was withdrawn. In short, the rules that have emerged from negotiated rulemaking have been remarkably resistant to substantive challenges. And, indeed, this far into the development of the process, the standard of review and the extent to which an agreement may be binding on either a signatory or someone whom a party purports to represent are still unknown -- the speculation of many an administrative law class.92 Thus, here too, Coglianese paints a substantially misleading picture by failing to distinguish substantive challenges to rules that are based on a consensus from either challenges to issues that were not the subject of negotiations or were filed while some details were worked out. Properly understood, reg negs have been phenomenally successful in warding off substantive review.

B. More democratic -- reg neg encourages private sector participation -- means that regulations aren't unilaterally created by the USFG -- CP results in a fair playing field for the entirety of the private sector
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate
Change in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to
governance. Laura Langbein is the Professor of Quantitative Methods, Program Evaluation, Policy
Analysis, and Public Choice and American College. She holds a PhD in Political Science from the
University of North Carolina, a BA in Government from Oberlin College. Freeman, J. Langbein, R. I.
Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal, Volume 9,
2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

2. Negotiated Rulemaking Is Fairer to Regulated Parties than Conventional Rulemaking
To test whether reg neg was fairer to regulated parties, Kerwin and Langbein asked respondents whether EPA solicited their participation and whether they believed anyone was left out of the process. They also examined how much the parties learned in each process, and whether they experienced resource or information disparities. Negotiated rule participants were significantly more likely to say that the EPA encouraged their participation than conventional rule participants (65% versus 33% respectively). Although a higher proportion of conventional rulemaking participants reported that a party that should have been represented in the rulemaking was omitted, the difference is not statistically significant. Specifically, "a majority of both negotiated and conventional rule participants believed that the parties who should have been involved were involved (66% versus 52% respectively)." In addition, as reported above, participants in regulatory negotiations reported significantly more learning than their conventional rulemaking counterparts. Indeed, the disparity between the two types of participants in terms of their reports about learning was one of the study's most striking results. At the same time, the resource disadvantage of poorer, smaller groups was no greater in negotiated rulemaking than in conventional rulemaking. So, while smaller groups did report suffering from a lack of resources during regulatory negotiation, they reported the same in conventional rulemakings; no disparity existed between the two processes on this score. Finally, the data suggest that the agency is equally responsive to the parties in both negotiated and conventional rulemakings. This result, together with the finding that participants in regulatory negotiations perceived disproportionate influence to be about evenly distributed, suggests that reg neg is at least as fair to the parties as conventional rulemaking. Indeed, because participant learning was so much greater in regulatory negotiation, the process may in fact be more fair.

2NC Solves Better


Reg neg is better for complex rules
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I Langbein is the Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice and American College. She holds a PhD in Political
Science from the University of North Carolina, a BA in Government from Oberlin College. Freeman, J.
Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

4. Complex Rules Are More Likely To Be Settled Through Negotiated Rulemaking
Recall that theorists disagree over whether complex or simple issues are best suited for negotiation. The data suggest that negotiated and conventional rules differ in systematic ways, indicating that EPA officials do not select just any rule for negotiation. When asked how the issues for rulemaking were established, reg neg participants reported more often than their counterparts that the participants established at least some of them (44% versus 0%). Conventional rulemaking participants more often admitted to being uninformed of the process for establishing issues (17% versus 0%) or offered that regulated entities set the issues (11% to 0%). A majority of both groups reported that the EPA or the governing legislation established at least some of the issues. Kerwin and Langbein found that the types of issues indeed appeared to differ between negotiated and conventional rules. When asked about the type of issues to be decided, 52% of participants in conventional groups identified issues regarding the standard, including its level, timing, or measurement (compared to 31% of negotiated rule participants), while 58% of the negotiating group identified compliance and implementation issues (compared to 39% of participants in the conventional group). More reg neg participants (53%) also cited compliance issues as causing the greatest conflict, compared to 32% of conventional participants. Conventional participants more often reported that the rulemaking failed to resolve all of the issues (30% versus 14%), but also more often reported that they encountered no "surprise" issues (74% versus 44%). Participants perceived negotiated rules to be more complex, with more issues and more sides per issue than conventional rules. Kerwin and Langbein learned in interviews that reg neg participants tended to develop a more detailed view about the issues to be decided than did their conventional counterparts. The researchers interpreted this disparity in reported detail as a perception of complexity. To measure it they computed a complexity score: the more issues and the more sides to each issue that respondents in a rulemaking could identify, relative to the number of respondents, the more nuanced or complex the rulemaking. Using this calculation, the rules ranged in complexity from 1.9 to 5.0, with a mean complexity score of 3.6. The mean complexity score for reg negs (4.1) was significantly higher than the score (2.5) for conventional rulemaking. Reg neg participants also presented a clearer understanding of the issues to be decided than did conventional participants. To test clarity, Kerwin and Langbein developed a measure that would reflect the striking variation among respondents in the number of different issues and different sides they perceived in their rulemaking. Some respondents could identify very few separate issues and sides (e.g., "the level of the standard is the single issue and the sides are business, environmentalists, and EPA"), while others detected as many as four different issues, with three sides on some and two on others. Kerwin and Langbein's measurement was in units of issue/sides, representing a combination of the two variables, the recognition of which they were measuring; the mentions ranged from 3 to 10 issue/sides, with a mean of 7.9. Negotiated rulemaking participants mentioned an average of 8.9 issue/sides, compared to an average of 6 issue/sides mentioned by their conventional counterparts, a statistically significant difference. To illustrate the difference between complexity and clarity: If a party identified the compliance standard as the sole issue, but failed to identify a number of sub-issues, they would be classified as having a clear understanding but not a complex one. Similarly, if the party identified two sides (business vs. environment) without recognizing distinctions among business participants or within an environmental coalition, they would also be classified as clear but not complex in their understanding. The differences in complexity might be explained by the higher reported rates of learning by reg neg participants, rather than by differences in the types of rules processed by reg neg versus conventional rulemaking. Kerwin and Langbein found that complexity and clarity were both positively and significantly correlated with learning by respondents, but the association between learning and complexity/clarity disappeared when the type of rulemaking was held constant. However, when the amount learned was held constant, the association between complexity/clarity and the type of rulemaking remained positive and significant. This signifies that the association between learning and complexity/clarity was due to the negotiation process. In other words, the differences in complexity/clarity are not attributable to higher learning but rather to differences between the processes. The evidence is consistent with the hypothesis that issues selected for regulatory negotiation are different from and more complicated than those chosen for conventional rulemaking. The data associating reg negs with complexity, together with the finding that more issues settle in reg negs, are consistent with the proposition that issues with more (and more diverse) sub-issues and sides settle more easily than simple issues.

Reg neg is better than conventional rulemaking


Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I Langbein is the Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice and American College. She holds a PhD in Political
Science from the University of North Carolina, a BA in Government from Oberlin College. Freeman, J.
Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)
In this article, we present an original analysis and summary of new empirical evidence from Neil Kerwin and Laura
Langbein's two-phase study of Environmental Protection Agency (EPA) negotiated rulemakings. n5 Their qualitative
and (*62) quantitative data reveal more about reg neg than any empirical study to date; although not published in
a law review article until now, they unquestionably bear upon the ongoing debate among legal scholars over the
desirability of negotiating rules. Most importantly, this is the first study to compare participant attitudes toward
negotiated rulemaking with attitudes toward conventional rulemaking. The findings of the studies tend, on balance,
to undermine arguments made by the critics of regulatory negotiation and to bolster the claims of proponents.

Kerwin and Langbein found that, according to participants in the study, reg neg generates more learning, better quality rules, and higher satisfaction compared to conventional rulemaking. n6 At the same time, stakeholder influence on the agency remains about the same using either approach. n7 Based on the results, we recommend more frequent use of regulatory negotiation, accompanied by further comparative and empirical study, for the purposes of establishing regulatory standards and resolving implementation and compliance issues. This recommendation contradicts the prevailing view that the process is best used sparingly, n8 and even then, only for narrow questions of implementation. n9

Reg negs solve better


Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme Peoples Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing

the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.


http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)
The Primary Objective of Negotiated Rulemaking Is To Create Better and More Widely Accepted Rules. Coglianese argues throughout his article that the primary benefits of negotiated rules were seen by its advocates as being the reduction in time and in the incidence of litigation.93 While, both benefits have been realized, neither was seen by those who established it as the predominant factor in its use. For example, Peter Schuck wrote an important early article in which he described the benefits of negotiated solutions over those imposed by a hierarchy.94 Schuck emphasized a number of shortcomings of the adjudicatory nature of hybrid rulemaking and many benefits of direct negotiations among the affected parties. The tenor of his thinking is reflected by his statement, a bargained solution depends for its legitimacy not upon its objective rationality, inherent justice, or the moral capital of the institution that fashioned it, but upon the simple fact that it was reached by consent of the parties affected.95 And, it encourages diversity, stimulates the parties to develop relevant information about facts and values, provides a counter-weight to concentrations of power, and advances participation by those the decisions affect.96 Nowhere in his long list of benefits was either speed or reduced litigation, except by implication of the acceptability of the results. My own article that developed the recommendations97 on which the ACUS Recommendation,98 the Negotiated Rulemaking Act, and the practice itself are based describes the anticipated benefits of negotiated rulemaking: Negotiating has many advantages over the adversarial process. The parties participate directly and immediately in the decision. They share in its development and concur in it, rather than participate by submitting information that the decisionmaker considers in reaching the decision. Frequently, those who participate in the negotiations are closer to the ultimate decisionmaking authority of the interest they represent than traditional intermediaries that represent the interest in an adversarial proceeding. Thus, participants in negotiations can make substantive decisions, rather than acting as experts in the decisionmaking process. In addition, negotiation can be a less expensive means of decisionmaking because it reduces the need to engage in defensive research in anticipation of arguments made by adversaries. Undoubtedly the prime benefit of direct negotiations is that it enables the participants to focus squarely on their respective interests.99 The article quotes John Dunlop, a true pioneer in using negotiations among the affected interests in the public sphere,100 as saying In our society, a rule that is developed with the involvement of the parties who are affected is more likely to be accepted and to be effective in accomplishing its intended purposes.101 Reducing time and litigation exposure was not emphasized if even mentioned directly. To be sure, the Congressional findings that precede the Negotiated Rulemaking Act mention the savings of time and litigation, but they are largely the by-product of far more significant benefits:102 (2) Agencies currently use rulemaking procedures that may discourage the affected parties from meeting and communicating with each other, and may cause parties with different interest to assume conflicting and antagonistic positions and to engage in expensive and time-consuming litigation over agency rules. (3) Adversarial rulemaking deprives the affected parties and the public of the benefits of face-to-face negotiations and cooperation in developing and reaching agreement on a rule. It also deprives them of the benefits of shared information, knowledge, expertise, and technical abilities possessed by the affected parties. (4) Negotiated rulemaking, in which the parties who will be significantly affected by a rule participate directly in the development of the rule, can provide significant advantages over adversarial rulemaking. (5) Negotiated rulemaking can increase the acceptability and improve the substance of rules, making it less likely that the affected parties will resist enforcement or challenge such rules in court. It may also shorten the amount of time needed to issue final rules. Thus, those who were present at the creation of reg neg sought neither expedition nor a shield against litigation. Rather, they saw direct negotiations among the parties -- a form of representational democracy not explicitly recognized in the Administrative Procedure Act -- as resulting in rules that are substantively better and more widely accepted. Those benefits were seen as flowing from the participation of those affected who bring with them a practical insight and expertise that can result in rules that are better informed, more tailored to achieving the actual regulatory goal and hence more effective, and able to be enforced.

Reg negs are the best type of negotiations


Hsu 02
(Shi-Ling Hsu is the Larson Professor of Law at the Florida State University College of Law. Professor
Hsu has a B.S. in Electrical Engineering from Columbia University, and a J.D. from Columbia Law
School. He also has an M.S. in Ecology and a Ph.D. in Agricultural and Resource Economics, both from
the University of California, Davis. Professor Hsu has taught in the areas of environmental and natural
resource law, law and economics, quantitative methods, and property. Prior to his current
appointment, Professor Hsu was a Professor of Law and Associate Dean for Special Projects at the
University Of British Columbia Faculty Of Law. He has also served as an Associate Professor at the
George Washington University Law School, a Senior Attorney and Economist for the Environmental
Law Institute in Washington D.C, and a Deputy City Attorney for the City and County of San Francisco.
A Game Theoretic Approach to Regulatory Negotiation: A Framework for Empirical Analysis, Harvard
Environmental Law Review, Vol 26, No 2, February2002. http://papers.ssrn.com/sol3/papers.cfm?
abstract_id=282962//ghs-kw)

There are reasons to be optimistic about what regulatory negotiations can produce in even a troubled administrative state. Jody Freeman noted that one important finding from the Kerwin and Langbein studies were that parties involved in negotiated rulemaking were able to use the face-to-face contact as a learning experience.49 Barton Thompson has noted in his article on common-pool resources problems50 that one reason that resource users resist collective action solutions is that it is evidently human nature to blame others for the existence of resource shortages. That in turn leads to an extreme reluctance by resource users to agree to a collective action solution if it involves even the most minimal personal sacrifices. Thompson suggests that the one hope for curing resource users of such self-serving myopia is face-to-face contact and the exchange of views. The vitriol surrounding some environmental regulatory issues suggests that there is a similar human reaction occurring with respect to some resource conflicts.51 Solutions to environmental problems and resource conflicts on which regulated parties and environmental organizations hold such strong and disparate views may require face-to-face contact to defuse some of the tension and remove some of the demonization that has arisen in these conflicts. Reinvention, with the emphasis on negotiations and face-to-face contact, provides such an opportunity.52 Farber has argued for making the best of this trend towards regulatory negotiation characterizing negotiated rulemaking and reinvention.53 Faced with the reality that some negotiation will inevitably take place because of the slippage inherent in our system of regulation, Farber argues that the best model for allowing it to go forward is a bilateral one. A system of bilateral negotiation would clearly be superior to a system of self-regulation, as such a system would inevitably descend into a tragedy of the commons.54 But a system of bilateral negotiation between agencies and regulated parties would even be superior to a system of multilateral negotiation, due to the transaction costs of assembling all of the affected stakeholders in a multilateral effort, and the difficulties of reaching a consensus among a large number of parties. Moreover, multilateral negotiation gives rise to the troubling idea that there should be joint governance among the parties. Since environmental organizations lack the resources to participate in post-negotiation governance, there is a heightened danger of regulatory capture by the better-financed regulated parties.55 The correct balance between regulatory flexibility and accountability, argues Farber, is to allow bilateral negotiation but with built-in checks to ensure that the negotiation process is not captured by regulated parties. Built-in checks would include transparency, so that environmental organizations can monitor regulatory bargains, and the availability of citizen suits, so that environmental organizations could remedy regulatory bargains that exceed the dictates of the underlying statute. Environmental organizations would thus play the role of the watchdog, rather than the active participant in negotiations. The finding of Kerwin and Langbein that resource constraints sometimes caused environmental organizations, especially smaller local ones, to skip negotiated rulemakings would seem to support this conclusion.56 A much more efficient use of limited resources would require that the environmental organization attempt to play a deterrent role in monitoring negotiated rulemakings.

2NC Cybersecurity Solvency


Reg neg solves cybersecurity
Sales 13
(Sales, Nathan Alexander. Assistant Professor of Law, George Mason University School of Law.
REGULATING CYBERSECURITY, Northwestern University Law Review. 2013.
http://www.rwu.edu/sites/default/files/downloads/cyberconference/cyber_threats_and_cyber_realities_r
eadings.pdf//ghs-kw)

An alternative would be a form of enforced self-regulation 324 in which private


companies develop the new cybersecurity protocols in tandem with the
government.325 These requirements would not be handed down by
administrative agencies, but rather would be developed through a
collaborative partnership in which both regulators and regulated would
play a role. In particular, firms might prepare sets of industrywide security
standards. (The National Industrial Recovery Act, famously invalidated by the Supreme Court in 1935, contained such a
mechanism,326 and today the energy sector develops reliability standards in the same way.327) Or agencies could
sponsor something like a negotiated rulemaking in which regulators, firms, and
other stakeholders forge a consensus on new security protocols. 328 In either
case, agencies then would ensure compliance through standard administrative
techniques like audits, investigations, and enforcement actions. 329 This approach
would achieve all four of the benefits of private action mentioned above: It avoids (some) problems with
information asymmetries, takes advantage of distributed private sector
knowledge about vulnerabilities and threats, accommodates rapid
technological change, and promotes innovation. On the other hand, allowing firms to help set
the standards that will be enforced against them may increase the risk of regulatory capture--the danger that agencies will come to promote the interests of the companies they regulate instead of the public's interests.330 The risk of capture is always present in
regulatory action, but it is probably even more acute when regulated entities are expressly invited to the decisionmaking table.331

2NC Encryption Advocate


Here's a solvency advocate
DMCA 05
(Digital Millenium Copyright Act, Supplement in 2005. https://books.google.com/books?id=nL0s81xgVwC&pg=PA481&lpg=PA481&dq=encryption+AND+(+%22regulatory+negotiation%22+OR+
%22negotiated+rulemaking%22)&source=bl&ots=w9mrCaTJs4&sig=1mVsh_Kzk1p26dmT9_DjozgVQI&hl=en&sa=X&ved=0CB4Q6AEwAGoVChMIxtPG5YH9xgIVwx0eCh2uEgMJ#v=onepa
ge&q&f=false//ghs-kw)

Some encryption supporters advocate use of advisory committee and negotiated


rulemaking procedures to achieve consensus around an encryption
standard. See Motorola Comments at 10-11; Veridian Reply Comments at 20-23.

Reg negs are key to wireless technology innovation


Chamberlain 09
(Chamberlain, Inc. Comments before the Federal Communications Commission. 11-05-2009.
https://webcache.googleusercontent.com/search?
q=cache:dfYcw45dQZsJ:apps.fcc.gov/ecfs/document/view%3Bjsessionid
%3DSQnySfcTVd22hL6ZYShTpQYGY1X27xB14p3CS1y01XW15LQjS1jj!-1613185479!153728702%3Fid
%3D7020245982+&cd=2&hl=en&ct=clnk&gl=us//ghs-kw)

Chamberlain supports solutions that will balance the needs of stakeholders in both
the licensed and unlicensed bands. Chamberlain and other manufacturers of unlicensed
devices such as Panasonic are also uniquely able to provide valuable contributions
from the perspective of unlicensed operators with a long history of innovation in the
unlicensed bands. Moreover, as the Commission has recognized in recent proceedings,
alternative mechanisms for gathering data and evaluating options may assist the
Commission in reaching a superior result.19 For these reasons, Chamberlain would
support a negotiated rulemaking process, the use of workshops -both large and small- or any other
alternative process that ensures the widest level of participation from stakeholders across
the wireless market.

2NC Privacy Solvency


Reg neg is key to privacy
Rubinstein 09
(Rubinstein, Ira S. Adjunct Professor of Law and Senior Fellow, Information Law Institute, New York
University School of Law. PRIVACY AND REGULATORY INNOVATION: MOVING BEYOND VOLUNTARY
CODES, Workshop for Federal Privacy Regulation, NYU School of Law. 10/2/2009.
https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-commentproject-no.p095416-544506-00103/544506-00103.pdf//ghs-kw)

Whatever its shortcoming, and despite its many critics, self-regulation is a recurrent theme in the US approach to online privacy and perhaps a permanent part of the regulatory landscape. This Article's goal has been to consider new strategies for overcoming observed weaknesses in self-regulatory privacy programs. It began by examining the FTC's intermittent embrace of self-regulation, and found that the Commission's most recent foray into self regulatory guidelines for online behavioral advertising is not very different from earlier efforts, which ended in frustration and a call for legislation. It also reviewed briefly the more theoretical arguments of privacy scholars for and against self-regulation, but concluded that the market oriented views of those who favor open information flows clashed with the highly critical views of those who detect a market failure and worry about the damaging consequences of profiling and surveillance not only to individuals, but to society and to democratic self-determination. These views seem irreconcilable and do not pave the way for any applied solutions. Next, this Article presented three case studies of mandated self-regulation. This included overviews of the NAI Principles and the SHA, as well as a more empirical analysis of the CARU safe harbor program. An assessment of these case studies against five criteria (completeness, free rider problems, oversight and enforcement, transparency, and formation of norms) concluded that self-regulation undergirded by law--in other words, a statutory safe harbor--is a more effective and efficient instrument than any self-regulatory guidelines in which industry is chiefly responsible for developing principles and/or enforcing them. In a nutshell, well-designed safe harbors enable policy makers to imagine new forms of self-regulation that build on its strengths while compensating for its weaknesses.268 This embrace of statutory safe harbors led to a discussion of how to improve them by importing second-generation strategies from environmental law. Rather than summarizing these strategies and how they translate into the privacy domain, this Article concludes with a set of specific recommendations based on the ideas discussed in Part III.C. If Congress enacts comprehensive privacy legislation based on FIPPs, the first recommendation is that the new law include a safe harbor program, which should echo the COPPA safe harbor to the extent of encouraging groups to submit self-regulatory guidelines and, if approved by the FTC, treat compliance with these guidelines as deemed compliance with statutory requirements. The FTC should be granted APA rulemaking powers to implement necessary rules including a safe harbor rule. Congress should also consider whether to mandate a negotiated rulemaking for an OBA safe harbor or for safe harbor programs more generally. In any case, FTC should give serious thought to using the negotiated rulemaking process in developing a safe harbor program or approving specific guidelines. In addition, the safe harbor program should be overhauled to reflect second-generation strategies. Specifically, the statute should articulate default requirements but allow FTC more discretion in determining whether proposed industry guidelines achieve desired outcomes, without firms having to match detailed regulatory requirements on a point by point basis.

2NC Fism NB
Reg negs are better and solve federalism--plan fails
Ryan 11
(Erin Ryan holds a B.A. 1991 Harvard-Radcliffe College, cum laude, M.A. 1994 Wesleyan University, J.D.
2001 Harvard Law School, cum laude. Erin Ryan teaches environmental and natural resources law,
property and land use, water law, negotiation, and federalism. She has presented at academic and
administrative venues in the United States, Europe, and Asia, including the Ninth Circuit Judicial
Conference, the U.S.D.A. Office of Ecosystem Services and Markets, and the United Nations Institute
for Training and Research. She has advised National Sea Grant multilevel governance studies
involving Chesapeake Bay and consulted with multiple institutions on developing sustainability
programs. She has appeared in the Chicago Tribune, the London Financial Times, the PBS Newshour
and Christian Science Monitors Patchwork Nation project, and on National Public Radio. She is the
author of many scholarly works, including Federalism and the Tug of War Within (Oxford, 2012).
Professor Ryan is a graduate of Harvard Law School, where she was an editor of the Harvard Law
Review and a Hewlett Fellow at the Harvard Negotiation Research Project. She clerked for Chief Judge
James R. Browning of the U.S. Court of Appeals for the Ninth Circuit before practicing environmental,
land use, and local government law in San Francisco. She began her academic career at the College of
William & Mary in 2004, and she joined the faculty at the Northwestern School of Law at Lewis & Clark
College in 2011. Ryan spent 2011-12 as a Fulbright Scholar in China, during which she taught
American law, studied Chinese governance, and lectured throughout Asia. Ryan, E. Boston Law Review,
2011. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583132//ghs-kw)
1. Negotiated Rulemaking. Although the most conventional of the less familiar forms, "negotiated rulemaking" between federal agencies and state stakeholders is a sparingly used tool that holds promise for facilitating sound administrative policymaking in disputed federalism contexts, such as those implicating environmental law, national security, and consumer safety. Under the Administrative Procedure Act, the traditional "notice and comment" administrative rulemaking process allows for a limited degree of participation by state stakeholders who comment on a federal agency's proposed rule. The agency publishes the proposal in the Federal Register, invites public comments critiquing the draft, and then uses its discretion to revise or defend the rule in response to comments. n256 Even this iterative process constitutes a modest negotiation, but it leaves participants so frequently unsatisfied that many agencies began to informally use more extensive negotiated rulemaking in the 1970s. n257 In 1990, Congress passed the Negotiated Rulemaking Act, amending the Administrative Procedure Act to allow a more dynamic [*52] and inclusive rulemaking process, n258 and a subsequent Executive Order required all federal agencies to consider negotiated rulemaking when developing regulations. n259 Negotiated rulemaking allows stakeholders much more influence over unfolding regulatory decisions. Under notice and comment, public participation is limited to criticism of well-formed rules in which the agency is already substantially invested. n260 By contrast, stakeholders in negotiated rulemaking collectively design a proposed rule that takes into account their respective interests and expertise from the beginning. n261 The concept, outline, and/or text of a rule is hammered out by an advisory committee of carefully balanced representation from the agency, the regulated public, community groups and NGOs, and state and local governments. n262 A professional intermediary leads the effort to ensure that all stakeholders are appropriately involved and to help interpret problem-solving opportunities. n263 Any consensus reached by the group becomes the basis of the proposed rule, which is still subject to public comment through the normal notice-and-comment procedures. n264 If the group does not reach consensus, then the agency proceeds through the usual notice-and-comment process. n265 The negotiated rulemaking process, a tailored version of interest group bargaining within established legislative constraints, can yield important benefits. n266 The process is usually more subjectively satisfying [*53] for all stakeholders, including the government agency representatives. n267 More cooperative relationships are established between the regulated parties and the agencies, facilitating future implementation and enforcement of new rules. n268 Final regulations include fewer technical errors and are clearer to stakeholders, so that less time, money and effort is expended on enforcement. n269 Getting a proposed rule out for public comment takes more time under negotiated rulemaking than standard notice and comment, but thereafter, negotiated rules receive fewer and more moderate public comment, and are less frequently challenged in court by regulated entities. n270 Ultimately, then, final regulations can be implemented more quickly following their debut in the Federal Register, and with greater compliance from stakeholders. n271 The process also confers valuable learning benefits on participants, who come to better understand the concerns of other stakeholders, grow invested in the consensus they help create, and ultimately campaign for the success of the regulations within their own constituencies. n272 Negotiated rulemaking offers additional procedural benefits because it ensures that agency personnel will be unambiguously informed about the full federalism implications of a proposed rule by the impacted state interests. Federal agencies are already required by executive order to prepare a federalism impact statement for rulemaking with federalism implications, n273 but the quality of state-federal communication within negotiated rulemaking enhances the likelihood that federal agencies will appreciate and understand the full extent of state [*54] concerns. Just as the consensus-building process invests participating stakeholders with respect for the competing concerns of other stakeholders, it invests participating agency personnel with respect for the federalism concerns of state stakeholders. n274 State-side federalism bargainers interviewed for this project consistently reported that they always prefer negotiated rulemaking to notice and comment--even if their ultimate impact remains small--because the products of fully informed federal consultation are always preferable to the alternative. n275

Reg negs solve federalism--traditional rulemaking fails


Ryan 11
(Erin Ryan holds a B.A. 1991 Harvard-Radcliffe College, cum laude, M.A. 1994 Wesleyan University, J.D.
2001 Harvard Law School, cum laude. Erin Ryan teaches environmental and natural resources law,
property and land use, water law, negotiation, and federalism. She has presented at academic and
administrative venues in the United States, Europe, and Asia, including the Ninth Circuit Judicial
Conference, the U.S.D.A. Office of Ecosystem Services and Markets, and the United Nations Institute
for Training and Research. She has advised National Sea Grant multilevel governance studies
involving Chesapeake Bay and consulted with multiple institutions on developing sustainability
programs. She has appeared in the Chicago Tribune, the London Financial Times, the PBS Newshour
and Christian Science Monitors Patchwork Nation project, and on National Public Radio. She is the
author of many scholarly works, including Federalism and the Tug of War Within (Oxford, 2012).
Professor Ryan is a graduate of Harvard Law School, where she was an editor of the Harvard Law
Review and a Hewlett Fellow at the Harvard Negotiation Research Project. She clerked for Chief Judge
James R. Browning of the U.S. Court of Appeals for the Ninth Circuit before practicing environmental,
land use, and local government law in San Francisco. She began her academic career at the College of
William & Mary in 2004, and she joined the faculty at the Northwestern School of Law at Lewis & Clark
College in 2011. Ryan spent 2011-12 as a Fulbright Scholar in China, during which she taught
American law, studied Chinese governance, and lectured throughout Asia. Ryan, E. Boston Law Review,
2011. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583132//ghs-kw)

Unsurprisingly, bargaining in which the normative leverage of federalism values heavily influences the exchange offers the most reliable interpretive tools, smoothing out leverage imbalances and focusing bargainers' interlinking interests. n619 Negotiations in which participants are motivated by shared regard for checks, localism, accountability, and synergy naturally foster constitutional process and hedge against non-consensual dealings. All federalism bargaining trades on the normative values of federalism to some degree, and any given negotiation may feature it more or less prominently based on the factual particulars. n620 Yet the taxonomy reveals several forms in which federalism values predominate by design, and which may prove especially valuable in fraught federalism contexts: negotiated rulemaking, policymaking laboratory negotiations, and iterative federalism. n621 These examples indicate the potential for purposeful federalism engineering to reinforce procedural regard for state and federal roles within the American system. (1) Negotiated Rulemaking between state and federal actors improves upon traditional administrative rulemaking in fostering participation, localism, and synergy by incorporating genuine state input into federal regulatory planning. n622 Most negotiated rulemaking also uses professional intermediaries to ensure that all stakeholders are appropriately engaged and to facilitate the search for outcomes that meet parties' dovetailing interests. n623 For example, after discovering that extreme local variability precluded a uniform federal program, Phase II stormwater negotiators invited municipal dischargers to design individually [*123] tailored programs within general federal limits. n624 Considering the massive number of municipalities involved, the fact that the rule faced legal challenge from only a handful of Texas municipalities testifies to the strength of the consensus through which it was created. By contrast, the iterative exchange within standard notice-and-comment rulemaking--also an example of federalism bargaining--can frustrate state participation by denying participants meaningful opportunities for consultation, collaborative problem-solving, and real-time accountability. The contrast between notice-and-comment and negotiated rulemaking, exemplified by the two phases of REAL ID rulemaking, demonstrates the difference between more and less successful instances of federalism bargaining. n625 Moreover, the difficulty of asserting state consent to the products of the REAL ID notice-and-comment rulemaking (given the outright rebellion that followed) limits its interpretive potential. Negotiated rulemakings take longer than other forms of administrative rulemaking, but are more likely to succeed over time. Regulatory matters best suited for state-federal negotiated rulemaking include those in which a decisive federal rule is needed to overcome spillover effects, holdouts, and other collective action problems, but unique and diverse state expertise is needed for the creation of wise policy. Matters in contexts of overlap least suited for negotiated rulemaking include those in which the need for immediate policy overcomes the need for broad participation--but even these leave open possibilities for incremental rulemaking, in which the initial federal rule includes mechanisms for periodic reevaluation with local input.

2NC Fism NB Heg Impact


Fast growth promotes US leadership and solves great power
war
Khalilzad 11 PhD, Former Professor of Political Science @ Columbia,
Former ambassador to Iraq and Afghanistan
(Zalmay Khalilzad was the United States ambassador to Afghanistan, Iraq,
and the United Nations during the presidency of George W. Bush and the
director of policy planning at the Defense Department from 1990 to 1992.
"The Economy and National Security" Feb 8
http://www.nationalreview.com/articles/259024/economy-and-nationalsecurity-zalmay-khalilzad)//BB
Today, economic and fiscal trends pose the most severe long-term threat to the United States' position as global leader. While the United States suffers from fiscal imbalances and low economic growth, the economies of rival powers are developing rapidly. The continuation of these two trends could lead to a shift from American primacy toward a multi-polar global system, leading in turn to increased geopolitical rivalry and even war among the great powers.

The current recession is the result of a deep financial crisis, not a mere fluctuation in the business cycle. Recovery is likely to be protracted. The crisis was preceded by the buildup over two decades of enormous amounts of debt throughout the U.S. economy--ultimately totaling almost 350 percent of GDP--and the development of credit-fueled asset bubbles, particularly in the housing sector. When the bubbles burst, huge amounts of wealth were destroyed, and unemployment rose to over 10 percent. The decline of tax revenues and massive countercyclical spending put the U.S. government on an unsustainable fiscal path. Publicly held national debt rose from 38 to over 60 percent of GDP in three years.

Without faster economic growth and actions to reduce deficits, publicly held national debt is projected to reach dangerous proportions. If interest rates were to rise significantly, annual interest payments--which already are larger than the defense budget--would crowd out other spending or require substantial tax increases that would undercut economic growth. Even worse, if unanticipated events trigger what economists call a sudden stop in credit markets for U.S. debt, the United States would be unable to roll over its outstanding obligations, precipitating a sovereign-debt crisis that would almost certainly compel a radical retrenchment of the United States internationally. Such scenarios would reshape the international order.

It was the economic devastation of Britain and France during World War II, as well as the rise of other powers, that led both countries to relinquish their empires. In the late 1960s, British leaders concluded that they lacked the economic capacity to maintain a presence east of Suez. Soviet economic weakness, which crystallized under Gorbachev, contributed to their decisions to withdraw from Afghanistan, abandon Communist regimes in Eastern Europe, and allow the Soviet Union to fragment. If the U.S. debt problem goes critical, the United States would be compelled to retrench, reducing its military spending and shedding international commitments.

We face this domestic challenge while other major powers are experiencing rapid economic growth. Even though countries such as China, India, and Brazil have profound political, social, demographic, and economic problems, their economies are growing faster than ours, and this could alter the global distribution of power. These trends could in the long term produce a multi-polar world. If U.S. policymakers fail to act and other powers continue to grow, it is not a question of whether but when a new international order will emerge. The closing of the gap between the United States and its rivals could intensify geopolitical competition among major powers, increase incentives for local powers to play major powers against one another, and undercut our will to preclude or respond to international crises because of the higher risk of escalation.

The stakes are high. In modern history, the longest period of peace among the great powers has been the era of U.S. leadership. By contrast, multi-polar systems have been unstable, with their competitive dynamics resulting in frequent crises and major wars among the great powers. Failures of multi-polar international systems produced both world wars. American retrenchment could have devastating consequences. Without an American security blanket, regional powers could rearm in an attempt to balance against emerging threats. Under this scenario, there would be a heightened possibility of arms races, miscalculation, or other crises spiraling into all-out conflict. Alternatively, in seeking to accommodate the stronger powers, weaker powers may shift their geopolitical posture away from the United States. Either way, hostile states would be emboldened to make aggressive moves in their regions.

Slow growth leads to hegemonic wars--relative gap is key


Goldstein 7 - Professor of Global Politics and International Relations @
University of Pennsylvania,
(Avery Goldstein, Power transitions, institutions, and China's rise in East
Asia: Theoretical expectations and evidence, Journal of Strategic Studies,
Volume30, Issue 4 & 5 August, EBSCO)
Two closely related, though distinct, theoretical arguments focus explicitly on the consequences for international politics of a shift in power between a dominant state and a rising power. In War and Change in World Politics, Robert Gilpin suggested that peace prevails when a dominant state's capabilities enable it to govern an international order that it has shaped. Over time, however, as economic and technological diffusion proceeds during eras of peace and development, other states are empowered. Moreover, the burdens of international governance drain and distract the reigning hegemon, and challengers eventually emerge who seek to rewrite the rules of governance. As the power advantage of the erstwhile hegemon ebbs, it may become desperate enough to resort to the ultima ratio of international politics, force, to forestall the increasingly urgent demands of a rising challenger. Or as the power of the challenger rises, it may be tempted to press its case with threats to use force. It is the rise and fall of the great powers that creates the circumstances under which major wars, what Gilpin labels hegemonic wars, break out.13 Gilpin's argument logically encourages pessimism about the implications of a rising China. It leads to the expectation that international trade, investment, and technology transfer will result in a steady diffusion of American economic power, benefiting the rapidly developing states of the world, including China. As the US simultaneously scurries to put out the many brushfires that threaten its far-flung global interests (i.e., the classic problem of overextension), it will be unable to devote sufficient resources to maintain or restore its former advantage over emerging competitors like China. While the erosion of the once clear American advantage plays itself out, the US will find it ever more difficult to preserve the order in Asia that it created during its era of preponderance. The expectation is an increase in the likelihood for the use of force either by a Chinese challenger able to field a stronger military in support of its demands for greater influence over international arrangements in Asia, or by a besieged American hegemon desperate to head off further decline. Among the trends that alarm those who would look at Asia through the lens of Gilpin's theory are China's expanding share of world trade and wealth (much of it resulting from the gains made possible by the international economic order a dominant US established); its acquisition of technology in key sectors that have both civilian and military applications (e.g., information, communications, and electronics linked with the revolution in military affairs); and an expanding military burden for the US (as it copes with the challenges of its global war on terrorism and especially its struggle in Iraq) that limits the resources it can devote to preserving its interests in East Asia.14 Although similar to Gilpin's work insofar as it emphasizes the importance of shifts in the capabilities of a dominant state and a rising challenger, the power-transition theory A. F. K. Organski and Jacek Kugler present in The War Ledger focuses more closely on the allegedly dangerous phenomenon of crossover--the point at which a dissatisfied challenger is about to overtake the established leading state.15 In such cases, when the power gap narrows, the dominant state becomes increasingly desperate to forestall, and the challenger becomes increasingly determined to realize, the transition to a new international order whose contours it will define. Though suggesting why a rising China may ultimately present grave dangers for international peace when its capabilities make it a peer competitor of America, Organski and Kugler's power-transition theory is less clear about the dangers while a potential challenger still lags far behind and faces a difficult struggle to catch up. This clarification is important in thinking about the theory's relevance to interpreting China's rise because a broad consensus prevails among analysts that Chinese military capabilities are at a minimum two decades from putting it in a league with the US in Asia.16 Their theory, then, points with alarm to trends in China's growing wealth and power relative to the United States, but especially looks ahead to what it sees as the period of maximum danger--that time when a dissatisfied China could be in a position to overtake the US on dimensions believed crucial for assessing power. Reports beginning in the mid-1990s that offered extrapolations suggesting China's growth would give it the world's largest gross domestic product (GDP aggregate, not per capita) sometime in the first few decades of the twenty-first century fed these sorts of concerns about a potentially dangerous challenge to American leadership in Asia.17 The huge gap between Chinese and American military capabilities (especially in terms of technological sophistication) has so far discouraged prediction of comparably disquieting trends on this dimension, but inklings of similar concerns may be reflected in occasionally alarmist reports about purchases of advanced Russian air and naval equipment, as well as concern that Chinese espionage may have undermined the American advantage in nuclear and missile technology, and speculation about the potential military purposes of China's manned space program.18 Moreover, because a dominant state may react to the prospect of a crossover and believe that it is wiser to embrace the logic of preventive war and act early to delay a transition while the task is more manageable, Organski and Kugler's power-transition theory also provides grounds for concern about the period prior to the possible crossover.19

2NC Ptix NB
Reg negs are bipartisan
Copeland 06
(Curtis W. Copeland, PhD, was formerly a specialist in American government at the Congressional
Research Service (CRS) within the U.S. Library of Congress. Copeland received his PhD degree in
political science from the University of North Texas.His primary area of expertise is federal rulemaking
and regulatory policy. Before coming to CRS in January 2004, Dr. Copeland worked at the U.S. General
Accounting Office (GAO, now the Government Accountability Office) for 23 years on a variety of issues,
including federal personnel policy, pay equity, ethics, procurement policy, management reform, the
Office of Management and Budget (OMB), and, since the mid-1990s, multiple aspects of the federal
rulemaking process. At CRS, he wrote reports and testified before Congress on such issues as federal
rulemaking, regulatory reform, the Congressional Review Act, negotiated rulemaking, the Paperwork
Reduction Act, the Regulatory Flexibility Act, OMBs Office of Information and Regulatory Affairs,
Executive Order 13422, midnight rulemaking, peer review, and risk assessment. He has also written
and testified on federal personnel policies, the federal workforce, GAOs pay-for-performance system,
and efforts to oversee the implementation of the Troubled Asset Relief Program. From 2004 until 2007,
Dr. Copeland headed the Executive Branch Operations section within CRSs Government and Finance
Division. Copeland, C. W. Negotiated Rulemaking, Congressional Research Service, September 18,
2006. http://crs.wikileaks-press.org/RL32452.pdf//ghs-kw)

Negotiated rulemaking (sometimes referred to as regulatory negotiation or reg-neg) is a supplement to


the traditional APA rulemaking process in which agency representatives and representatives of affected parties
work together to develop what can ultimately become the text of a proposed rule.1 In this approach,

negotiators try to reach consensus by evaluating their priorities and making


tradeoffs, with the end result being a draft rule that is mutually acceptable.
Negotiated rulemaking has been encouraged (although not usually required) by both congressional and
executive branch actions, and has received bipartisan support as a way to involve
affected parties in rulemaking before agencies have developed their proposals. Some
questions have been raised, however, regarding whether the approach actually speeds rulemaking or reduces
litigation.

Reg neg solves controversy--no link to ptix


Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme Peoples Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Recent Agency Use of Reg Neg. And, indeed, in the past few years agencies have used reg neg to develop some of their most contentious rules. For example, the Federal Aviation Administration and the National Park Service used a variant of the process to write the regulations and policies governing sightseeing flights over national parks; the issue had been sufficiently controversial that the President had to intervene and direct the two agencies to develop rules for the management of sightseeing aircraft in the National Parks where it is deemed necessary to reduce or prevent the adverse effects of such aircraft.22 The Department of Transportation used it to write a regulation governing the delivery of propane and other compressed gases when the regulation became ensnared in litigation and Congressional action.23 The Occupational Safety and Health Administration used it to address the erection of steel structures, an issue that had been on its docket for more than a decade with two abortive attempts at rulemaking when OSHA turned to reg neg.24 The Forest Service has just published a notice of intent to establish a reg neg committee to develop policies governing the use of fixed anchors for rock climbing in designated wilderness areas administered by the Forest Service.25 This issue has become extremely controversial.26 Negotiated rulemaking has proven enormously successful in developing agreements in highly polarized situations and has enabled the parties to address the best, most effective or efficient way of solving a regulatory controversy. Agencies have therefore turned to it to help resolve particularly difficult, contentious issues that have eluded closure by means of traditional rulemaking procedures

2NC CP Solves Ptix Link


The counterplan breaks down adversarialism, is seen as
legitimate, and is key to effective regulation
Mee 97
(Siobhan, Jd, An Attorney In The Complex And Class Action
Litigation Group, Focuses Her Practice On A Broad Range Of
Commercial Litigation, Negotiated Rulemaking And Combined
Sewer Overflows (Csos): Consensus Saves Ossification?, Fall,
1997 25 B.C. Envtl. Aff. L. Rev. 213, Pg Lexis//Um-Ef)
Benefits that accrue to negotiated rulemaking participants correspond to the criticisms of traditional
rulemaking. n132 In particular, proponents of negotiated rulemaking claim that it increases public participation , n133
fosters nonadversarial relationships, n134 and reduces long-term regulatory costs.
n135 Traditionally, agencies have limited the avenues for public participation in
the rulemaking process to reaction and criticism, releasing rules for the public's comment after they have been
developed [*229] internally. n136 In contrast, negotiated rulemaking elicits wider involvement at
the early stages of production . n137 Input from non-agency and nongovernmental actors, who may possess the most relevant knowledge and
who will be most affected by the rule, is a prerequisite to effective
regulation. n138 Increased participation also leads to what Professor Harter considers the overarching benefit of
negotiations: greater legitimacy. n139 Whereas traditional rulemaking lends itself to
adversarialism, n140 negotiated rulemaking is designed to foster cooperation and
accommodation. n141 Rather than clinging to extreme positions, parties prioritize the underlying issues
and seek trade-offs to maximize their overall interests . n142 Participants, including the
agency, discover and address one another's concerns directly . n143 The give-and-take of this process
provides an opportunity for parties with differing viewpoints to test data and arguments directly. n144 The resultant exploration of
different approaches is more likely than the usual notice and comment process to
generate creative solutions and avoid ossification. n145 [*230] Whether or not it results in a rule,
negotiated rulemaking establishes valuable links between groups that otherwise
would only communicate in an adversarial context . n146 Rather than trying to outsmart one another, former
competitors become part of a team which must consider the needs of each member. n147 Working relationships developed during negotiations give
participants an understanding of the other side. n148 As one negotiator reflected, in "working with the opposition you find they're not quite the ogres you

The chance to iron out what are often


long-standing disagreements can only improve future interactions . n150
thought they were, and they don't hate you as much as you thought." n149

2NC AT Perm do Both


Perm do both links to the net benefit--it does the entirety of the AFF, which _____________

2NC AT Perm do the CP


CP is plan minus since it only mandates the creation of a reg neg committee--it only does the plan if and only if the committee decides to do so--that means that the CP is uncertain. Perm severs the certainty of the plan:
Substantially means certain and real
Words and Phrases 1964 (40 W&P 759) (this edition of W&P is out of print;
the page number no longer matches up to the current edition and I was
unable to find the card in the new edition. However, this card is also
available on google books, Judicial and statutory definitions of words and
phrases, Volume 8, p. 7329)
The words outward, open, actual, visible, substantial, and exclusive, in connection with a change of possession, mean substantially the same thing. They mean not concealed; not hidden; exposed to view; free from concealment, dissimulation, reserve, or disguise; in full existence; denoting that which not merely can be, but is opposed to potential, apparent, constructive, and imaginary; veritable; genuine; certain; absolute; real at present time, as a matter of fact, not merely nominal; opposed to form; actually existing; true; not including, admitting, or pertaining to any others; undivided; sole; opposed to inclusive. Bass v. Pease, 79 Ill. App. 308, 318.
or pertaining to any others; undivided; sole; opposed to inclusive. Bass v. Pease, 79 Ill. App. 308, 318.

Should means must--it's certain


Supreme Court of Oklahoma 94
(Kelsey v. Dollarsaver Food Warehouse of Durant, Supreme
Court of Oklahoma, 1994.
http://www.oscn.net/applications/oscn/DeliverDocument.asp?
CiteID=20287#marker3fn14//ghs-kw)
The turgid phrase - "should be and the same hereby is" - is a tautological absurdity. This is so because "should" is synonymous with ought or must and is in itself sufficient to effect an in praesenti ruling - one that is couched in "a present indicative synonymous with ought." See infra
note 15. 3 Carter v. Carter, Okl., 783 P.2d 969, 970 (1989); Horizons, Inc. v. Keo Leasing Co., Okl., 681 P.2d 757, 759 (1984); Amarex, Inc. v. Baker, Okl.,
655 P.2d 1040, 1043 (1983); Knell v. Burnes, Okl., 645 P.2d 471, 473 (1982); Prock v. District Court of Pittsburgh County, Okl., 630 P.2d 772, 775 (1981);
Harry v. Hertzler, 185 Okl. 151, 90 P.2d 656, 659 (1939); Ginn v. Knight, 106 Okl. 4, 232 P. 936, 937 (1925). 4 "Recordable" means that by force of 12 O.S.
1991 24 an instrument meeting that section's criteria must be entered on or "recorded" in the court's journal. The clerk may "enter" only that which is
"on file." The pertinent terms of 12 O.S. 1991 24 are: "Upon the journal record required to be kept by the clerk of the district court in civil cases . . . shall
be entered copies of the following instruments on file: 1. All items of process by which the court acquired jurisdiction of the person of each defendant in
the case; and 2. All instruments filed in the case that bear the signature of the and judge and specify clearly the relief granted or order made." [Emphasis
added.] 5 See 12 O.S. 1991 1116 which states in pertinent part: "Every direction of a court or judge made or entered in writing, and not included in a
judgment is an order." [Emphasis added.] 6 The pertinent terms of 12 O.S. 1993 696.3 , effective October 1, 1993, are: "A. Judgments, decrees and
appealable orders that are filed with the clerk of the court shall contain: 1. A caption setting forth the name of the court, the names and designation of the
parties, the file number of the case and the title of the instrument; 2. A statement of the disposition of the action, proceeding, or motion, including a
statement of the relief awarded to a party or parties and the liabilities and obligations imposed on the other party or parties; 3. The signature and title of
the court; . . ." 7 The court holds that the May 18 memorial's recital that "the Court finds that the motions should be overruled" is a "finding" and not a
ruling. In its pure form, a finding is generally not effective as an order or judgment. See, e.g., Tillman v. Tillman, 199 Okl. 130, 184 P.2d 784 (1947), cited in
the court's opinion. 8 When ruling upon a motion for judgment n.o.v. the court must take into account all the evidence favorable to the party against
whom the motion is directed and disregard all conflicting evidence favorable to the movant. If the court should conclude the motion is sustainable, it must
hold, as a matter of law, that there is an entire absence of proof tending to show a right to recover. See Austin v. Wilkerson, Inc., Okl., 519 P.2d 899, 903
(1974). 9 See Bullard v. Grisham Const. Co., Okl., 660 P.2d 1045, 1047 (1983), where this court reviewed a trial judge's "findings of fact", perceived as a
basis for his ruling on a motion for judgment n.o.v. (in the face of a defendant's reliance on plaintiff's contributory negligence). These judicial findings were
held impermissible as an invasion of the providence of the jury and proscribed by OKLA. CONST. ART, 23, 6 . Id. at 1048. 10 Everyday courthouse
parlance does not always distinguish between a judge's "finding", which denotes nisi prius resolution of fact issues, and "ruling" or "conclusion of law". The
latter resolves disputed issues of law. In practice usage members of the bench and bar often confuse what the judge "finds" with what that official
"concludes", i.e., resolves as a legal matter. 11 See Fowler v. Thomsen, 68 Neb. 578, 94 N.W. 810, 811-12 (1903), where the court determined a ruling that
"[1] find from the bill of particulars that there is due the plaintiff the sum of . . ." was a judgment and not a finding. In reaching its conclusion the court
reasoned that "[e]ffect must be given to the entire in the docket according to the manifest intention of the justice in making them." Id., 94 N.W. at 811. 12
When the language of a judgment is susceptible of two interpretations, that which makes it correct and valid is preferred to one that would render it
erroneous. Hale v. Independent Powder Co., 46 Okl. 135, 148 P. 715, 716 (1915); Sharp v. McColm, 79 Kan. 772, 101 P. 659, 662 (1909); Clay v. Hildebrand,
34 Kan. 694, 9 P. 466, 470 (1886); see also 1 A.C. FREEMAN LAW OF JUDGMENTS 76 (5th ed. 1925). 13 "Should" not only is used as a "present indicative"
synonymous with ought but also is the past tense of "shall" with various shades of meaning not always easy to analyze. See 57 C.J. Shall 9, Judgments
121 (1932). O. JESPERSEN, GROWTH AND STRUCTURE OF THE ENGLISH LANGUAGE (1984); St. Louis & S.F.R. Co. v. Brown, 45 Okl. 143, 144 P. 1075, 1080-81 (1914). For a more detailed explanation, see the Partridge quotation infra note 15. Certain contexts mandate a construction of the term "should" as more than merely indicating preference or desirability. Brown, supra at 1080-81 (jury instructions stating that jurors "should" reduce the amount of damages in proportion to the amount of contributory negligence of the plaintiff was held to imply an obligation and to be more than advisory); Carrigan v. California Horse Racing Board, 60 Wash. App. 79, 802 P.2d 813 (1990) (one of the Rules of Appellate Procedure requiring that a party "should devote a section of the brief to the request for the fee or expenses" was interpreted to mean that a party is under an obligation to include the requested segment); State v. Rack, 318 S.W.2d 211, 215 (Mo. 1958) ("should" would mean the same as "shall" or "must" when used in an instruction to the jury which tells the triers they "should disregard false testimony").

2NC AT Theory
Counterinterp: process CPs are legitimate if we have a
solvency advocate
AND, process CPs good:
1. Key to education--we need to be able to debate the desirability of the plan's regulatory process; testing all angles of the AFF is key to determine the best policy option
2. Key to neg ground--it's the only CP we can run against regulatory AFFs
3. Predictability and fairness--there's a huge lit base and a solvency advocate ensures it's predictable
Applegate 98
(John S. Applegate holds a law degree from Harvard Law School and a bachelors degree in
English from Haverford College. Nationally recognized for his work in environmental risk
assessment and policy analysis, Applegate has written books and articles on the regulation of
toxic substances, defense nuclear waste, public participation in environmental decisions, and
international environmental law. He serves on the National Academy of Sciences Nuclear and
Radiation Studies Board. In addition, he is an award-winning teacher, known for his ability to
present complex information with an engaging style and wry wit. Before coming to IU,
Applegate was the James B. Helmer, Jr. Professor of Law at the University of Cincinnati College
of Law. He also was a visiting professor at the Vanderbilt University School of Law. From 1983
to 1987, Applegate practiced environmental law in Washington, D.C., with the law firm of
Covington & Burling. He clerked for the late Judge Edward S. Smith of the U.S. Court of
Appeals for the Federal Circuit. John S. Applegate was named Indiana Universitys first vice
president for planning and policy in July 2008. In March 2010, his portfolio was expanded and
his title changed to vice president for university regional affairs, planning, and policy. In
February 2011, he became executive vice president for regional affairs, planning, and policy.
As Executive Vice President for University Academic Affairs since 2013, his office ensures
coordination of university academic matters, strategic plans, external academic relations,
enterprise systems, and the academic policies that enable the university to most effectively
bring its vast intellectual resources to bear in serving the citizens of the state and nation. The
regional affairs mission of OEVPUAA is to lead the development of a shared identity and
mission for all of IU's regional campuses that complements each campus's individual identity
and mission. In addition, Executive Vice President Applegate is responsible for public safety
functions across the university, including police, emergency management, and environmental
health and safety. In appointing him in 2008, President McRobbie noted that "John Applegate
has proven himself to be very effective at many administrative and academic initiatives that
require a great deal of analysis and coordination within the university and with external
agencies, including the Indiana Commission for Higher Education. His experience and
understanding of both academia and the law make him almost uniquely suited to take on
these responsibilities. In 2006, John Applegate was appointed Indiana Universitys first
Presidential Fellow, a role in which he served both President Emeritus Adam Herbert and
current President Michael McRobbie. A distinguished environmental law scholar, Applegate
joined the IU faculty in 1998. He is the Walter W. Foskett Professor of Law at the Indiana
University Maurer School of Law in Bloomington and also served as the schools executive
associate dean for academic affairs from 2002-2009. Applegate, J. S. Beyond the Usual
Suspects: The Use of Citizen Advisory Boards in Environmental Decisionmaking, Indiana Law
Journal, Volume 73, Issue 3, July 1, 1998.
http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=1939&context=ilj//ghs-kw)

There is substantial literature on negotiated rulemaking . The


interested reader might begin with the Negotiated Rulemaking Act of 1990, 5
U.S.C. 561-570 (1994 & Supp. II 1996), Freeman, supra note 53, Philip J. Harter,
Negotiating Regulations: A Cure for Malaise , 71 GEO. L.J. I (1982), Henry E. Perritt,
Jr., Negotiated Rulemaking Before Federal Agencies: Evaluation of the
Recommendations by the Administrative Conference of the United States , 74
GEO. L.J. 1625 (1986), Lawrence Susskind & Gerard McMahon, The Theory and

Practice of Negotiated Rulemaking, 3 YALE J. ON REG. 133 (1985), and an excellent,


just-published issue on regulatory negotiation, Twenty-Eighth Annual
Administrative Law Issue, 46 DUKE L.J. 1255 (1997)

4. Decision making skills--reg neg is uniquely key to decision making skills
Fiorino 88
(Daniel J. Fiorino holds a PhD & MA in Political Science from Johns Hopkins University and a BA
in Political Science & Minor in Economics from Youngstown State University. Daniel J. Fiorino is
the Director of the Center for Environmental Policy and Executive in Residence in the School of
Public Affairs at American University. As a faculty member in the Department of Public
Administration and Policy, he teaches courses on environmental policy, energy and climate
change, environmental sustainability, and public management. Dan is the author or co-author
of four books and some three dozen articles and book chapters in his field. According to
Google Scholar, his work has been cited some 2300 times in the professional literature. His
book, The New Environmental Regulation, won the Brownlow Award of the National Academy
of Public Administration (NAPA) for excellence in public administration literature in 2007.
Altogether his publications have received nine national and international awards from the
American Society for Public Administration, Policy Studies Organization, Academy of
Management, and NAPA. His most recent refereed journal articles were on the role of
sustainability in Public Administration Review (2010); explanations for differences in national
environmental performance in Policy Sciences (2011); and technology innovation in renewable
energy in Policy Studies Journal (2013). In 2009 he was a Public Policy Scholar at the Woodrow
Wilson International Center for Scholars. He also serves as an advisor on environmental and
sustainability issues for MDB, Inc., a Washington, DC consulting firm. Dan joined American
University in 2009 after a career at the U.S. Environmental Protection Agency (EPA). Among
his positions at EPA were the Associate Director of the Office of Policy Analysis, Director of the
Waste and Chemicals Policy Division, Senior Advisor to the Assistant Administrator for Policy,
and the Director of the National Environmental Performance Track. The Performance Track
program was selected as one of the top 50 innovations in American government 2006 and
recognized by Administrator Christine Todd Whitman with an EPA Silver Medal in 2002. In
1993, he received EPAs Lee M. Thomas Award for Management Excellence. He has appeared
on or been quoted in several media outlets: the Daily Beast, Newsweek, Christian Science
Monitor, Australian Broadcasting Corporation, Agence France-Presse, and CCTV, on such topics
as air quality, climate change, the BP Horizon Oil Spill, carbon trading, EPA, and U.S.
environmental and energy politics. He currently is co-director of a project on Conceptual
Innovations in Environmental Policy with James Meadowcroft of Carleton University, funded
by the Canada Research Council on Social Sciences and the Humanities. He is a member of the
Partnership on Technology and the Environment with the Heinz Center, Environmental Defense
Fund, Nicholas Institute, EPA, and the Wharton School. He is conducting research on the role
of sustainability in policy analysis and the effects of regulatory policy design and
implementation on technology innovation. In 2013, he created the William K. Reilly Fund for
Environmental Governance and Leadership within the Center for Environmental Policy, working
with associates of Mr. Reilly and several corporate and other sponsors. He is a Fellow of the
National Academy of Public Administration. Dan is co-editor, with Robert Durant, of the
Routledge series on Environmental Sustainability and Public Administration. He is often is
invited to speak to business and academic audiences, most recently as the keynote speaker at
a Tel Aviv University conference on environmental regulation in May 2013. In the summer of
2013 he will present lectures and take part in several events as the Sir Frank Holmes Visiting
Fellow at Victoria University in New Zealand. Fiorino, D. J. Regulatory Negotiations as a Policy
Process, Public Administration Review, Vol 48, No 4, pp 764-772, July-August 1988.
http://www.jstor.org/discover/10.2307/975600?
uid=3739728&uid=2&uid=4&uid=3739256&sid=21104541489843//ghs-kw)

Thus, in its premises, objectives, and techniques, regulatory negotiation reflects the trend toward alternative dispute settlement. However, because regulatory negotiation is prospective and general in its application rather than limited to a specific dispute, it also reflects another theme in American public policy making. That theme is pluralism, or what Robert Reich has described in the context of administrative rulemaking as "interest-group mediation" (Reich 1985, pp. 1619-1620).[20] Reich's analysis sheds light on negotiation as a form of regulatory policy making, especially its contrasts with more analytical policy models. Reich proposes interest-group mediation and net-benefit maximization as the two visions that dominate administrative policy making. The first descends from pluralist political science and was more influential in the 1960s and early 1970s. The second descends from decision theory and micro-economics, and it was more influential in the late 1970s and early 1980s. In the first, the administrator is a referee who brings affected interests into the policy process to reconcile their demands and preferences. In the net-benefit model, the administrator is an analyst who defines policy options, quantifies the likely consequences of each, compares them to a given set of objectives, and then selects the option offering the greatest net benefit or social utility. Under the interest-group model, objectives emerge from the bargaining among influential groups, and a good decision is one to which the parties will agree. Under the net-benefit model, objectives are articulated in advance as external guides to the policy process. A good decision is one that meets the criterion of economic efficiency, defined ideally as a state in which no one party can improve its position without worsening that of another.[21]

5. Policy education - reg negs are a key part of the policy process
Spector 99,
(Bertram I. Spector, Senior Technical Director at Management Systems International (MSI) and
Executive Director of the Center for Negotiation Analysis. Ph.D. in Political Science from New
York University, May, 1999, "Negotiated Rulemaking: A Participative Approach to Consensus-Building for Regulatory Development and Implementation," Technical Notes: A Publication of
USAID's Implementing Policy Change Project, http://www.negotiations.org/Tn-10%20%20Negotiated%20Rulemaking.pdf) AJ

Why use negotiated rulemaking? What are the implications for policy reform, the
implementation of policy changes, and conflict between stakeholders and government? First, the
process generates an environment for dialogue that facilitates the reality
testing of regulations before they are implemented. It enables policy reforms
to be discussed in an open forum by stakeholders and for tradeoffs to be
made that expedite compliance among those who are directly impacted by
the reforms. Second, negotiated rulemaking is a process of
empowerment. It encourages the participation and enfranchisement of parties that have a stake in
reform. It provides voice to interests, concerns and priorities that otherwise
might not be heard or considered in devising new policy. Third, it is a
process that promotes creative but pragmatic solutions. By encouraging a
holistic examination of the policy area, negotiated rulemaking asks the participants to
assess the multiple issues and subissues involved, set priorities among them,
and make compromises. Such rethinking often yields novel and unorthodox
answers. Fourth, negotiated rulemaking offers an efficient mechanism
for policy implementation. Experience shows that it results in earlier
implementation; higher compliance rates; reduced time, money and effort
spent on enforcement; increased cooperation between the regulator and
regulated parties; and reduced litigation over the regulations. Regulatory
negotiations can yield both better solutions and more efficient
compliance.

6. At worst, reject the argument, not the team

2NC - AT Agency Responsiveness


No difference in agency responsiveness
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctor of
Juridical Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I. Langbein is a Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice at American University. She holds a PhD in Political
Science from the University of North Carolina, a BA in Government from Oberlin College. Freeman, J.
Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)
3. Negotiated Rulemaking Does Not Abrogate the Agency's Responsibility to Execute Delegated Authority Overall,
the evidence from Phase II is generally inconsistent with the theoretical but empirically untested claim that EPA has
failed to retain its responsibility for writing rules in negotiated settings. Recall that theorists disagree over whether
reg neg will increase agency responsiveness. Most scholars assume that EPA retains more authority in conventional
rulemaking, and that participants exert commensurately less influence over conventional as opposed to negotiated
rules. To test this hypothesis, Kerwin and Langbein asked participants about disproportionate influence and about
agency responsiveness to the respondent personally, as well as agency responsiveness to the public in general. The results suggest that the agency is equally responsive to participants in conventional and negotiated rulemaking, consistent with the hypothesis that the agency listens to the affected parties regardless of the method of rule development. Further, when asked what they disliked about the process, less than 10% of both negotiated and conventional participants volunteered "disproportionate influence." When asked whether any party had disproportionate influence during rule development, 44% of conventional respondents answered "yes," compared to 48% of reg neg respondents. In addition, EPA was as likely to be viewed as having disproportionate influence in negotiated as conventional rules (25% versus 32% respectively). It follows that roughly equal proportions of participants in negotiated and conventional rules viewed other participants, and especially EPA, as having disproportionate influence. Kerwin and Langbein asked those who reported disproportionate influence what about the rule led them to believe that lopsided influence existed. In response, negotiated rulemaking participants were significantly more likely to see excessive influence by one party in the process rather than in the rule itself, as compared to conventional participants (55% versus 13% respectively). However, when asked what it was about the process that fostered disproportionate influence, conventional rule participants were twice as likely as negotiated rule participants to point to the central role of EPA (63% versus 30% respectively). By contrast, negotiated rule participants pointed to other participants who were particularly vocal and active during the negotiation sessions (26% of negotiated rule respondents versus no conventional respondents). When asked about agency responsiveness, negotiated rule participants were significantly more likely than conventional rule participants to view both general participation, and their personal participation, as having a "major" impact on the proposed rule. By contrast, conventional participants were more likely to see "major" differences between the proposed and final rule and to believe that public participation and their own participation had a "moderate" or "major" impact on that change. These results conform to the researchers' expectations: negotiated rules are designed so that public participation should have its greatest impact on the proposed rule; conventional rules are structured so that public participation should have its greatest impact on the final rule. Given these differences in how the two processes are designed, Kerwin and Langbein sought to measure agency responsiveness overall, rather than at the two separate moments of access. Although the differences were not statistically significant, the results suggest that conventional participants perceived their public and personal contribution to rulemaking to have had slightly more impact than negotiated rule participants perceived their contribution to have had. Still, given the absence of statistical significance, we agree with the researchers that it is safer to conclude that the agency is equally responsive to both conventional and negotiated rule participants.

2NC - AT Cost
Reg negs are more cost effective
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme People's Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Negotiated Rulemaking Has Fulfilled its Goals. If better rules were the aspirations for negotiated
rulemaking, the question remains as to whether the process has lived up to the expectations. From my own
personal experience, the

rules that emerge from negotiated rulemaking tend to be


both more stringent and yet more cost effective to implement. That
somewhat paradoxical result comes precisely from the practical orientation of the
committee: it can figure out what information is needed to make a reasonable,
responsible decision and then what actions will best achieve the goal; it can,
therefore, avoid common regulatory mistakes that are costly but do not contribute
substantially to accomplishing the task. The only formal evaluation of negotiated rulemaking that
has been conducted supports these observations. After his early article analyzing the time required for negotiated
rulemaking, Neil Kerwin undertook an evaluation of negotiated rulemaking at the Environmental Protection Agency with Dr. Laura Langbein.103 Kerwin and Langbein conducted a study of negotiated rulemaking by examining what actually occurs in a reg neg versus the development of rules by conventional means. To establish the requisite comparison, they collected data on litigation, data from the comments on proposed rules, and data from systematic, open-ended interviews with participants in 8 negotiated rules . . . and in 6 comparable conventional rules.104 They interviewed 51 participants of conventional rulemaking and 101 from various negotiated rulemaking committees.105 Kerwin and Langbein's important work provides the only rigorous, empirical evaluation that compares a number of factors of conventional and negotiated rulemaking. Their overall conclusion is: Our research contains strong but qualified support for the continued use of negotiated rulemaking. The strong support comes in the form of positive assessments provided by participants in negotiated rulemaking compared to assessments offered by those involved in conventional form of regulation development. Further, there is no evidence that negotiated rules comprise an abrogation of agency authority, and negotiated rules appear no more (or less) subject to litigation than conventional rules. It is also true that negotiated rulemaking at the EPA is used largely to develop rules that entail particularly complex issues regarding the implementation and enforcement of legal obligations rather than those that set the substantive standards themselves. However, participants' assessments of the resulting rules are more positive when the issues to be decided entail those of establishing rather than enforcing the standard. Further, participants' assessments are also more positive when the issues to be decided are relatively more complex. Our research would support a recommendation that negotiated rulemaking continue to be applied to complex issues, and more widely applied to include those entailing the standard itself.106 Their findings are particularly powerful when comparing individual attributes of negotiated and conventional rules. Table 3 contains a summary of those comparisons. Importantly, negotiated rules were viewed more favorably in every criteria, and significantly so in several dimensions that are often contentious in regulatory debates - the economic efficiency of the rule and its cost effectiveness - the quality of the scientific evidence and the incorporation of appropriate technology, and - personal experience is not usually considered in dialogues over regulatory procedure, Kerwin and Langbein's findings here too favor negotiated rules. Conclusion. The benefits envisioned by the proponents of negotiated rulemaking have indeed been realized. That is demonstrated both by Coglianese's own methodology when properly understood and by the only careful and comprehensive comparative study. Reg neg has proven to be an enormously powerful tool in addressing highly complex, politicized rules. These are the very kind that stall agencies when using traditional or conventional procedures.107 Properly understood and used appropriately, negotiated rulemaking does indeed fulfill its expectations.

Reg negs are cheaper


Langbein and Kerwin 00
(Laura I. Langbein is a quantitative methodologist and professor of public administration and policy at
American University in Washington, D.C. She teaches quantitative methods, program evaluation,
policy analysis, and public choice. Her articles have appeared in journals on politics, economics, policy
analysis and public administration. Langbein received a BA in government from Oberlin College in
1965 and a PhD in political science from the University of North Carolina at Chapel Hill in 1972. She
has taught at American University since 1973: until 1978 as an assistant professor in the School of
Government and Public Administration; from 1978 to 1983 as an associate professor in the School of
Government and Public Administration; and since 1983 as a professor in the School of Public Affairs.
She is also a private consultant on statistics, research design, survey research, and program
evaluation and an accomplished clarinetist. Cornelius Martin "Neil" Kerwin (born April 10, 1949) is
an American educator in public administration and president of American University. A 1971
undergraduate alumnus of American University, Kerwin continued his education with a Master of Arts
degree in political science from the University of Rhode Island in 1973. In 1975, Kerwin returned to his
alma mater and joined the faculty of the American University School of Public Affairs, then the School
of Government and Public Administration. Kerwin completed his doctorate in political science from
Johns Hopkins University in 1978 and continued to teach until 1989, when he became the dean of the
school. Langbein, L. I. Kerwin, C. M. Regulatory Negotiation versus Conventional Rule Making: Claims,
Counterclaims, and Empirical Evidence, Journal of Public Administration Research and Theory, July
2000. http://jpart.oxfordjournals.org/content/10/3/599.full.pdf//ghs-kw)

Our research contains strong but qualified support for the continued use of negotiated rule making. The strong support comes in the form of positive assessments provided by participants in negotiated rule making compared to assessments offered by those involved in conventional forms of regulation development. There is no evidence that negotiated rules comprise an abrogation of agency authority, and negotiated rules appear no more (or less) subject to litigation than conventional rules. It is also true that negotiated rule making at the EPA is used largely to develop rules that entail particularly complex issues regarding the implementation and enforcement of legal obligations rather than rules that set substantive standards. However, participants' assessments of the resulting rules are more positive when the issues to be decided entail those of establishing rather than enforcing the standard. Participants' assessments are also more positive when the issues to be decided are relatively less complex. But even when these and other variables are controlled, reg neg participants' overall assessments are significantly more positive than those of participants in conventional rule making. In short, the process itself seems to affect participants' views of the rule making, independent of differences between the types of rules chosen for conventional and negotiated rule making, and independent of differences among the participants, including differences in their views of the economic net benefits of the particular rule. This finding is consistent with theoretical expectations regarding the importance of participation and the importance of face-to-face communication to increase the likelihood of Pareto-improving social outcomes. With respect to participation, previous research indicates that compliance with a law or regulation and support for policy choice are more likely to be forthcoming not only when it is economically rational but also when the process by which the decision is made is viewed as fair (Tyler 1990; Kunreuther et al. 1993; Frey and Oberholzer-Gee 1996). While we did not ask respondents explicitly to rate the fairness of the rule-making process in which they participated, evidence presented in this study shows that reg neg participants rated the overall process (with and without statistical controls in exhibits 9 and 1 respectively) and the ability of EPA equitably to implement the rule (exhibit 1) significantly higher than conventional rule-making participants did. Further, while conventional rule-making participants were more likely to say that there was no party with disproportionate influence during the development of the rule, reg neg participants volunteered significantly more positive comments and significantly fewer negative comments about the process overall. In general, reg neg appears more likely than conventional rule making to leave participants with a warm glow about the decision-making process. While the regression results show that the costs and benefits of the rule being promulgated figure prominently into the respondents' overall assessment of the final rule, process matters too. Participants care not only about how rules and policies affect them economically, they also care about how the authorities who make and implement rules and policies treat them (and others). In fact, one reg neg respondent, the owner of a small shop that manufactured wood burning stoves, remarked about the woodstoves rule, which would put him out of business, that he felt satisfied even as he participated in his own "wake." It remains for further research to show whether this warm glow affects long term compliance and whether it extends to affected parties who were not direct participants in the negotiation process. It is unclear from our research whether greater satisfaction with negotiated rules implies that negotiated rules are Pareto-superior to conventionally written rules.13 Becker's (1983) theory of political competition among interest groups implies that in the absence of transactions costs, groups that bear large costs and opposing groups that reap large benefits have directly proportional and equal incentives to lobby. Politicians who seek to maximize net political support respond by balancing costs and benefits at the margin, and the resulting equilibrium will be no worse than market failure would be. Transactions costs, however, are not zero, and they may not be equal for interests on each side of an issue. For example, in many environmental policy issues, the benefits are dispersed and occur in the future, while some, but not all, costs are concentrated and occur now. The consequence is that transactions costs are different for beneficiaries than for losers. If reg neg reduces transactions costs compared to conventional rule making, or if reg neg reduces the imbalance in transactions costs between winners and losers, or among different kinds of winners and losers, then it might be reasonable to expect negotiated rules to be Pareto-superior to conventionally written rules. Reg neg may reduce transactions costs in two ways. First, participation in writing the proposed rule (which sets the agenda that determines the final rule) is direct, at least for the participants. In conventional rule making, each interest has a repeated, bilateral relation with the rule-making agency; the rule-making agency proposes the rule (and thereby controls the agenda for the final rule), and affected interests respond separately to what is in the agency proposal. In negotiated rule making, each interest (including the agency) is in a repeated N-person set of mutual relations; the negotiating group drafts the proposed rule, thereby setting the agenda for the final rule. Since the agency probably knows less about each group's costs and benefits than the group knows about its own costs and benefits, the rule that emerges from direct negotiation should be a more accurate reflection of net benefits than one that is written by the agency (even though the agency tries to be responsive to the affected parties). In effect, reg neg can be expected to better establish a core relationship of trust, reputation, and reciprocity that Ostrom (1998) argues is central to improving net social benefits. Reg neg may reduce transactions costs not only by entailing repeated mutual rather than bilateral relations, but also by face to face communication. Ostrom (1998, 13) argues that face-to-face communication reduces transactions costs by making it easier to assess trustworthiness and by lowering the decision costs of reaching a "contingent agreement," in which "individuals agree to contribute x resources to a common effort so long as at least y others also contribute." In fact, our survey results show that reg neg participants are significantly more likely than conventional rule-making participants to believe that others will comply with the final rule (exhibit 1). In the absence of outside assessments that compare net social benefits of the conventional and negotiated rules in this study,15 the hypothesis that reg neg is Pareto superior to conventional rule making remains an untested speculation. Nonetheless, it seems to be a plausible hypothesis based on recent theories regarding the importance of institutions that foster participation in helping to effect Pareto-preferred social outcomes.

2NC - AT Consensus
Negotiating parties fear the alternative, which is worse than
reg neg
Perritt 86
(Professor Perritt earned his B.S. in engineering from MIT in 1966, a master's degree in management
from MIT's Sloan School in 1970, and a J.D. from Georgetown University Law Center in 1975. Henry H.
Perritt, Jr., is a professor of law at IIT Chicago-Kent College of Law. He served as Chicago-Kent's dean
from 1997 to 2002 and was the Democratic candidate for the U.S. House of Representatives in the
Tenth District of Illinois in 2002. Throughout his academic career, Professor Perritt has made it
possible for groups of law and engineering students to work together to build a rule of law, promote
the free press, assist in economic development, and provide refugee aid through "Project Bosnia,"
"Operation Kosovo" and "Destination Democracy." Professor Perritt is the author of more than 75 law
review articles and 17 books on international relations and law, technology and law, employment law,
and entertainment law, including Digital Communications Law, one of the leading treatises on Internet
law; Employee Dismissal Law and Practice, one of the leading treatises on employment-at-will; and
two books on Kosovo: Kosovo Liberation Army: The Inside Story of an Insurgency, published by the
University of Illinois Press, and The Road to Independence for Kosovo: A Chronicle of the Ahtisaari
Plan, published by Cambridge University Press. He is active in the entertainment field, as well, writing
several law review articles on the future of the popular music industry and of video entertainment. He
also wrote a 50-song musical about Kosovo, You Took Away My Flag, which was performed in Chicago
in 2009 and 2010. A screenplay for a movie about the same story and characters has a trailer online
and is being shopped to filmmakers. His two new plays, Airline Miles and Giving Ground, are scheduled
for performances in Chicago in 2012. His novel, Arian, was published by Amazon.com in 2012. He has
two other novels in the works. He served on President Clinton's Transition Team, working on
telecommunications issues, and drafted principles for electronic dissemination of public information,
which formed the core of the Electronic Freedom of Information Act Amendments adopted by Congress
in 1996. During the Ford administration, he served on the White House staff and as deputy under
secretary of labor. Professor Perritt served on the Computer Science and Telecommunications Policy
Board of the National Research Council, and on a National Research Council committee on "Global
Networks and Local Values." He was a member of the interprofessional team that evaluated the FBI's
Carnivore system. He is a member of the bars of Virginia (inactive), Pennsylvania (inactive), the
District of Columbia, Maryland, Illinois and the United States Supreme Court. He is a member of the
Council on Foreign Relations and served on the board of directors of the Chicago Council on Foreign
Relations, on the Lifetime Membership Committee of the Council on Foreign Relations, and as
secretary of the Section on Labor and Employment Law of the American Bar Association. He is vice-president and a member of the board of directors of The Artistic Home theatre company, and is
president of Mass. Iota-Tau Association, the alumni corporation for the SAE fraternity chapter at MIT.
Perritt, H. H. Negotiated Rulemaking Before Federal Agencies: Evaluation of Recommendations By the
Administrative Conference of the United States, Georgetown Law Journal, Volume 74. August, 1976.
http://www.kentlaw.edu/perritt/publications/74_GEO._L.J._1625.htm//ghs-kw)

The negotiations moved slowly until the FAA submitted a draft rule to the
participants. This reinforced the view that the FAA would move unilaterally. It also
reminded the parties that there would be things in a unilaterally
promulgated rule that they would not like--thus reminding them that their
BATNAs were worse than what was being considered at the negotiating
table. Participation by the Vice President's Office, the Office of the Secretary of Transportation, and the OMB at
the initial session discouraged participants from thinking they could influence the contents of the rule outside the
negotiation process. One attempt to communicate with the Administrator while the negotiations were underway was rebuffed. [FN263] The participants tacitly agreed that it would not be feasible to develop a 'total package' to which the participants formally could agree. Instead, their objectives were to narrow differences, explore alternative ways of achieving objectives at less disruption to operational exigencies, and educate the FAA on practical issues. The mediator had an acute sense that the negotiation process should stop before agreement began to erode. Accordingly, he forbore to force explicit agreement on difficult issues, took few votes, and adjourned the negotiations when things began to unravel. In addition, the FAA, the mediator, and participants were tolerant of the political need of participants to adhere to positions formally, even though signals were given that participants could live with something else. Agency participation in the negotiating sessions was crucial to the usefulness of this type of process. Because the agency was there, it could form its own impressions of what a party's real position was, despite adherence to formal positions. In addition, it was easy for the agency to proceed with a consensus standard because it had an evolving sense of the consensus. Without agency participation, a more formal step would have been necessary to communicate negotiating group views to the agency. Taking this formal step could have proven difficult or impossible because it would have necessitated more formal participant agreement. In addition, the presence of an outside contractor who served as drafter was of some assistance. The drafter, a former FAA employee, assisted informally in resolving internal FAA disagreements over the proposed rule after negotiations were adjourned.

Reg neg produces participant satisfaction and reduces conflict - consensus will happen
Langbein and Kerwin 00
(Laura I. Langbein is a quantitative methodologist and professor of public administration and policy at
American University in Washington, D.C. She teaches quantitative methods, program evaluation,
policy analysis, and public choice. Her articles have appeared in journals on politics, economics, policy
analysis and public administration. Langbein received a BA in government from Oberlin College in
1965 and a PhD in political science from the University of North Carolina at Chapel Hill in 1972. She
has taught at American University since 1973: until 1978 as an assistant professor in the School of
Government and Public Administration; from 1978 to 1983 as an associate professor in the School of
Government and Public Administration; and since 1983 as a professor in the School of Public Affairs.
She is also a private consultant on statistics, research design, survey research, and program
evaluation and an accomplished clarinetist. Cornelius Martin "Neil" Kerwin (born April 10, 1949) is
an American educator in public administration and president of American University. A 1971
undergraduate alumnus of American University, Kerwin continued his education with a Master of Arts
degree in political science from the University of Rhode Island in 1973. In 1975, Kerwin returned to his
alma mater and joined the faculty of the American University School of Public Affairs, then the School
of Government and Public Administration. Kerwin completed his doctorate in political science from
Johns Hopkins University in 1978 and continued to teach until 1989, when he became the dean of the
school. Langbein, L. I. Kerwin, C. M. Regulatory Negotiation versus Conventional Rule Making: Claims,
Counterclaims, and Empirical Evidence, Journal of Public Administration Research and Theory, July
2000. http://jpart.oxfordjournals.org/content/10/3/599.full.pdf//ghs-kw)

Our research contains strong but qualified support for the continued use of negotiated rule making. The strong support comes in the form of positive assessments provided by participants in negotiated rule making compared to assessments offered by those involved in conventional forms of regulation development. There is no evidence that negotiated rules comprise an abrogation of agency authority, and negotiated rules appear no more (or less) subject to litigation than conventional rules. It is also true that negotiated rule making at the EPA is used largely to develop rules that entail particularly complex issues regarding the implementation and enforcement of legal obligations rather than rules that set substantive standards. However, participants' assessments of the resulting rules are more positive when the issues to be decided entail those of establishing rather than enforcing the standard. Participants' assessments are also more positive when the issues to be decided are relatively less complex. But even when these and other variables are controlled, reg neg participants' overall assessments are significantly more positive than those of participants in conventional rule making. In short, the process itself seems to affect participants' views of the rule making, independent of differences between the types of rules chosen for conventional and negotiated rule making, and independent of differences among the participants, including differences in their views of the economic net benefits of the particular rule. This finding is consistent with theoretical expectations regarding the importance of participation and the importance of face-to-face communication to increase the likelihood of Pareto-improving social outcomes. With respect to participation, previous research indicates that compliance with a law or regulation and support for policy choice are more likely to be forthcoming not only when it is economically rational but also when the process by which the decision is made is viewed as fair (Tyler 1990; Kunreuther et al. 1993; Frey and Oberholzer-Gee 1996). While we did not ask respondents explicitly to rate the fairness of the rule-making process in which they participated, evidence presented in this study shows that reg neg participants rated the overall process (with and without statistical controls in exhibits 9 and 1 respectively) and the ability of EPA equitably to implement the rule (exhibit 1) significantly higher than conventional rule-making participants did. Further, while conventional rule-making participants were more likely to say that there was no party with disproportionate influence during the development of the rule, reg neg participants volunteered significantly more positive comments and significantly fewer negative comments about the process overall. In general, reg neg appears more likely than conventional rule making to leave participants with a warm glow about the decision-making process. While the regression results show that the costs and benefits of the rule being promulgated figure prominently into the respondents' overall assessment of the final rule, process matters too. Participants care not only about how rules and policies affect them economically, they also care about how the authorities who make and implement rules and policies treat them (and others). In fact, one reg neg respondent, the owner of a small shop that manufactured wood burning stoves, remarked about the woodstoves rule, which would put him out of business, that he felt satisfied even as he participated in his own "wake." It remains for further research to show whether this warm glow affects long term compliance and whether it extends to affected parties who were not direct participants in the negotiation process. It is unclear from our research whether greater satisfaction with negotiated rules implies that negotiated rules are Pareto-superior to conventionally written rules.13 Becker's (1983) theory of political competition among interest groups implies that in the absence of transactions costs, groups that bear large costs and opposing groups that reap large benefits have directly proportional and equal incentives to lobby. Politicians who seek to maximize net political support respond by balancing costs and benefits at the margin, and the resulting equilibrium will be no worse than market failure would be. Transactions costs, however, are not zero, and they may not be equal for interests on each side of an issue. For example, in many environmental policy issues, the benefits are dispersed and occur in the future, while some, but not all, costs are concentrated and occur now. The consequence is that transactions costs are different for beneficiaries than for losers. If reg neg reduces transactions costs compared to conventional rule making, or if reg neg reduces the imbalance in transactions costs between winners and losers, or among different kinds of winners and losers, then it might be reasonable to expect negotiated rules to be Pareto-superior to conventionally written rules. Reg neg may reduce transactions costs in two ways. First, participation in writing the proposed rule (which sets the agenda that determines the final rule) is direct, at least for the participants. In conventional rule making, each interest has a repeated, bilateral relation with the rule-making agency; the rule-making agency proposes the rule (and thereby controls the agenda for the final rule), and affected interests respond separately to what is in the agency proposal. In negotiated rule making, each interest (including the agency) is in a repeated N-person set of mutual relations; the negotiating group drafts the proposed rule, thereby setting the agenda for the final rule. Since the agency probably knows less about each group's costs and benefits than the group knows about its own costs and benefits, the rule that emerges from direct negotiation should be a more accurate reflection of net benefits than one that is written by the agency (even though the agency tries to be responsive to the affected parties). In effect, reg neg can be expected to better establish a core relationship of trust, reputation, and reciprocity that Ostrom (1998) argues is central to improving net social benefits. Reg neg may reduce transactions costs not only by entailing repeated mutual rather than bilateral relations, but also by face to face communication. Ostrom (1998, 13) argues that face-to-face communication reduces transactions costs by making it easier to assess trustworthiness and by lowering the decision costs of reaching a "contingent agreement," in which "individuals agree to contribute x resources to a common effort so long as at least y others also contribute." In fact, our survey results show that reg neg participants are significantly more likely than conventional rule-making participants to believe that others will comply with the final rule (exhibit 1). In the absence of outside assessments that compare net social benefits of the conventional and negotiated rules in this study,15 the hypothesis that reg neg is Pareto superior to conventional rule making remains an untested speculation. Nonetheless, it seems to be a plausible hypothesis based on recent theories regarding the importance of institutions that foster participation in helping to effect Pareto-preferred social outcomes.

A consensus will be reached - parties have incentives to cooperate and compromise
Harter 09
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.

Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme People's Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States. Harter, P. J.
Collaboration: The Future of Governance, Journal of Dispute Resolution, Volume 2009, Issue 2,
Article 7. 2009. http://scholarship.law.missouri.edu/cgi/viewcontent.cgi?
article=1581&context=jdr//ghs-kw)

Consensus is often misunderstood. It is typically used, derisively, to mean a group decision that is the

consequence of a "group think" that resulted from little or no exploration of the issues, with neither general inquiry,
discussion, nor deliberation. A common example would be the boss's saying, "Do we all agree? . . . Good, we have
a consensus!" In this context, consensus is the acquiescence to an accepted point of view. It is, as is often alleged,
the lowest common denominator that is developed precisely to avoid controversy as opposed to generating a better
answer. It is a decision resulting from the lack of diversity. It is in fact actually a cascade that may be more extreme
than the views of any member! Thus, the question legitimately is, if this is the understanding of the term, would
you want it if you could get it, or would the result to too diluted? A number of articles posit, with neither
understanding nor research, that it always results in the least common denominator. Done right, however,

consensus is exactly the opposite: it is the wisdom of crowds. It builds on the insights and
experiences of diversity. And it is a vital element of collaborative governance in terms of actually reaching
agreement and in terms of the quality of the resulting agreement. That undoubtedly sounds
counterintuitive, especially for the difficult, complex, controversial matters that are
customarily the subject of direct negotiations among governments and their constituents.
Indeed, you often hear that it can't be done. One would expect that the controversy
would make consensus unlikely or that if concurrence were obtained, it would likely be so watered down
- that least common denominator again - that it would not be worth much. But, interestingly, it has exactly the opposite effect. Consensus can mean many things so it is important to understand what is consensus for these purposes. The default definition of consensus in the Negotiated Rulemaking Act is the "unanimous concurrence among the interests represented on [the] . . . committee." Thus, each interest has a veto over the decision, and any party may block a final agreement by withholding concurrence. Consensus has a significant impact on how the negotiations actually function: It makes it "safe" to come to the table. If the committee were to make decisions by voting, even if a supermajority were required, a party might fear being outvoted. In that case, it would logically continue to build power to achieve its will outside the negotiations. Instead, it has the power inside the room to prevent something from happening that it cannot live with. Thus, at least for the duration of the negotiations, the party can focus on the substance of the policy and not build political might. The committee is converted from a group of disparate, often antagonistic, interests into one with a common purpose: reaching a mutually acceptable agreement. During a policy negotiation such as this, you can actually feel the committee snap together into a coherent whole when the members realize that. It forces the parties to deal with each other which prevents "rolling" someone: "OK, I have the votes, so shut up and let's vote." Rolling someone in a negotiation is a very good way to create an opponent, to you and to any resulting agreement. Having to actually listen to each other also creates a friction of ideas that results in better decisions - instead of a cascade, it generates the "wisdom of crowds." It enables the parties to make sophisticated proposals in which they agree to do something, but only if other parties agree to do something in return. These "if but only if" offers cannot be made in a voting situation for fear that the offeror would not obtain the necessary quid pro quo. It also enables the parties to develop and present information they might otherwise be reluctant to share for fear of its being misused or used against them. A veto prevents that. If a party cannot control the decision, it will logically amass as much factual information as possible in order to limit the discretion available to the one making the decision; the theory is that if you win on the facts, the range of choices as to what to do on the policy is considerably narrowed. Thus, records are stuffed with data that may well be irrelevant to the outcome or on which the parties largely agree. If the decision is made by consensus, the parties do control the outcome, and as a result, they can concentrate on making the final decision. The question for the committee then becomes, how much information do we need to make a responsible resolution? The committee may not need to resolve many of the underlying facts before a policy choice is clear. Interestingly, therefore, the use of consensus can significantly reduce the amount of defensive (or probably more accurately, offensive) record-building that customarily attends adversarial processes. It forces the parties to look at the agreement as a whole - consensus is reached only on the entire package, not its individual elements. The very essence of negotiation is that different parties value issues differently. What is important to one party is not so important to another, and that makes for trades that maximize overall value. The resulting agreement can be analogized to buying a house: something is always wrong with any house you would consider buying (price, location, kitchen needs repair, etc.), but you cannot buy only part of a house or move it to another location; the choice must be made as to which house - the entire thing - you will purchase. It also means that the resulting decision will not stray from the statutory mandate. That is because one of the parties to the negotiation is very likely to benefit from an adherence to the statutory requirements and would not concur in a decision that did not implement it. Finally, if all of the parties represented concur in the outcome, the likelihood of a successful challenge is greatly reduced so that the decision has a rare degree of finality.

2NC - AT Speed
Reg neg is better - solves faster
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme People's Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Properly understood, therefore, the average length of EPA's negotiated rulemakings - the time it took EPA to fulfill its goal - was 751 days or 32% faster than traditional rulemaking. This knocks a full year off the average time it takes EPA to develop a rule by the traditional method. And, note these are highly complex and controversial rules and that one of them survived Presidential intervention. Thus, the dynamics surrounding these rules are by no means average. This means that reg neg's actual performance is much better than that. Interestingly and consistently, the average time for all of EPA's reg negs when viewed in context is virtually identical to that of the sample drawn by Kerwin and Furlong,77 differing by less than a month. Furthermore, if all of the reg negs that were conducted by all the agencies that were included in Coglianese's table78 were analyzed along the same lines as discussed here,79 the average time for all negotiated rulemakings drops to less than 685 days.80
No Substantive Review of Rules Based on Reg Neg Consensus. Coglianese argues that negotiated rules are actually
subjected to a higher incident of judicial review than are rules developed by traditional methods, at least those
issued by EPA.81 But, like his analysis of the time it takes to develop rules, Coglianese fails to look at either what
happened in the negotiated rulemaking itself or the nature of any challenge. For example, he makes much of the
fact that the Grand Canyon visibility rule was challenged by interests that were not a party to the negotiations;82
yet, he also points out that this rule was not developed under the Negotiated Rulemaking Act83 which explicitly
establishes procedures that are designed to ensure that each interest can be represented. This challenge
demonstrates the value of convening negotiations.84 And, it is significantly misleading to include it when discussing
the judicial review of negotiated rules since the process of reg neg was not followed. As for Reformulated Gasoline,
the rule as issued by EPA did not reflect the consensus but rather was modified by EPA under the direction of
President Bush.85 There were, indeed, a number of challenges to the application of the rule,86 but amazingly little
to the rule itself given its history. Indeed, after the proposal was changed, many members of the committee continued to meet in an effort to put Humpty Dumpty back together again, which they largely did; the fact that the rule had been negotiated not only resulted in a much better rule,87 it enabled the rule to withstand in large part a massive assault. Coglianese also somehow
were explicitly outside the purview of the committee; to criticize reg neg here is like saying surgery is not effective
when the patient refused to undergo it. While the Underground Injection rule was challenged, the committee never
reached an agreement88 and, moreover, the convening report made clear that there were very strong
disagreements over the interpretation of the governing statute that would likely have to be resolved by a Court of
Appeals. Coglianese also asserts that the Equipment Leaks rule was the subject of review; it was, but only because
the Clean Air requires parties to file challenges in a very short period, and a challenger therefore filed a defensive
challenge while it worked out some minor details over the regulation. Those negotiations were successful and the
challenge was withdrawn. The Chemical Manufacturers Association, the challenger, had no intention of a
substantive challenge.89 Moreover, a challenge to other parts of the HON should not be ascribed to the Equipment
Leaks part of the rule. The agreement in the Asbestos in Schools negotiation explicitly contemplated judicial review
strange, but true and hence it came as no surprise and as no violation of the agreement. As for the Wood
Furniture Rule, the challenges were withdrawn after informal negotiations in which EPA agreed to propose
amendments to the rule.90 Similarly, the challenge to EPAs Disinfectant By-Products Rule91 was withdrawn. In
short, the rules that have emerged from negotiated rulemaking have been remarkably resistant to substantive
challenges. And, indeed, this far into the development of the process, the standard of review and the extent to
which an agreement may be binding on either a signatory or someone whom a party purports to represent are still

Coglianese paints a
substantially misleading picture by failing to distinguish substantive challenges to
unknown the speculation of many an administrative law class.92 Thus, here too,

rules that are based on a consensus from either challenges to issues that were not
the subject of negotiations or were filed while some details were worked out.
Properly understood, reg negs have been phenomenally successful in
warding off substantive review.

Reg negs solve faster and better - Coglianese's results concluded neg when properly interpreted
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme People's Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Negotiated Rulemaking Has Fulfilled its Goals. If better rules were the aspirations for negotiated
rulemaking, the question remains as to whether the process has lived up to the expectations. From my own
personal experience, the

rules that emerge from negotiated rulemaking tend to be


both more stringent and yet more cost effective to implement. That
somewhat paradoxical result comes precisely from the practical orientation of the
committee: it can figure out what information is needed to make a reasonable,
responsible decision and then what actions will best achieve the goal; it can,
therefore, avoid common regulatory mistakes that are costly but do not contribute
substantially to accomplishing the task. The only formal evaluation of negotiated rulemaking that
has been conducted supports these observations. After his early article analyzing the time required for negotiated
rulemaking, Neil Kerwin undertook an evaluation of negotiated rulemaking at the Environmental Protection Agency with Dr. Laura Langbein.103 Kerwin and Langbein conducted a study of negotiated rulemaking by examining what actually occurs in a reg neg versus the development of rules by conventional means. To establish the requisite comparison, they collected data on litigation, data from the comments on proposed rules, and data from systematic, open-ended interviews with participants in 8 negotiated rules . . . and in 6 comparable conventional rules.104 They interviewed 51 participants of conventional rulemaking and 101 from various negotiated rulemaking committees.105 Kerwin and Langbein's important work provides the only rigorous, empirical evaluation that compares a number of factors of conventional and negotiated rulemaking. Their overall conclusion is: Our research contains strong but qualified support for the continued use of negotiated rulemaking. The strong support comes in the form of positive assessments provided by participants in negotiated rulemaking compared to assessments offered by those involved in conventional form of regulation development. Further, there is no evidence that negotiated rules comprise an abrogation of agency authority, and negotiated rules appear no more (or less) subject to litigation than conventional rules. It is also true that negotiated rulemaking at the EPA is used largely to develop rules that entail particularly complex issues regarding the implementation and enforcement of legal obligations rather than those that set the substantive standards themselves. However, participants' assessments of the resulting rules are more positive when the issues to be decided entail those of establishing rather than enforcing the standard. Further, participants' assessments are also more positive when the issues to be decided are relatively more complex. Our research would support a recommendation that negotiated rulemaking continue to be applied to complex issues, and more widely applied to include those entailing the standard itself.106 Their findings are particularly powerful when comparing individual attributes of negotiated and conventional rules. Table 3 contains a summary of those comparisons. Importantly, negotiated rules were viewed more favorably in every criteria, and significantly so in several dimensions that are often contentious in regulatory debates - the economic efficiency of the rule and its cost effectiveness - the quality of the scientific evidence and the incorporation of appropriate technology, and - personal experience is not usually considered in dialogues over regulatory procedure, Kerwin and Langbein's findings here too favor negotiated rules. Conclusion. The benefits envisioned by the proponents of negotiated rulemaking have indeed been realized. That is demonstrated both by Coglianese's own methodology when properly understood and by the only careful and comprehensive comparative study. Reg neg has proven to be an enormously powerful tool in addressing highly complex, politicized rules. These are the very kind that stall agencies when using traditional or conventional procedures.107 Properly understood and used appropriately, negotiated rulemaking does indeed fulfill its expectations.

2NC AT Transparency
The process is transparent
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of Arts from Stanford University, a Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctor of Juridical Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I. Langbein is a Professor of Quantitative Methods, Program Evaluation, Policy Analysis, and Public Choice at American University. She holds a PhD in Political Science from the University of North Carolina and a BA in Government from Oberlin College. Freeman, J. and Langbein, L. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Law Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

Defenders of reg neg retorted that negotiated rules were far from secret deals. The Negotiated Rulemaking Act of 1990 (NRA) requires federal agencies to provide notice of regulatory negotiations in the Federal Register,50 to formally charter reg neg committees,51 and to observe the transparency and accountability requirements52 of the Federal Advisory Committee Act.53 Any individual or organization that might be significantly affected by a proposed rule can apply for membership in a reg neg committee,54 and even if the agency rejects their application, they remain free to attend as spectators.55 Most significantly, the NRA requires that the agency submit negotiated rules to traditional notice and comment.56

2NC AT Undemocratic
The process is equal and fair
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of Arts from Stanford University, a Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctor of Juridical Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I. Langbein is a Professor of Quantitative Methods, Program Evaluation, Policy Analysis, and Public Choice at American University. She holds a PhD in Political Science from the University of North Carolina and a BA in Government from Oberlin College. Freeman, J. and Langbein, L. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Law Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

On balance, the combined results of Phase I and II of the study suggest that reg neg is superior to conventional rulemaking on virtually all of the measures that were considered. Strikingly, the process engenders a significant learning effect, especially compared to conventional rulemaking; participants report, moreover, that this learning has long-term value not confined to a particular rulemaking. Most significantly, the negotiation of rules appears to enhance the legitimacy of outcomes. Kerwin and Langbein's data indicate that process matters to perceptions of legitimacy. Moreover, as we have seen, reg neg participant reports of higher satisfaction could not be explained by their assessments of the outcome alone. Instead, higher satisfaction seems to arise in part from a combination of process and substance variables. This suggests a link between procedure and satisfaction, which is consistent with the mounting evidence in social psychology that "satisfaction is one of the principal consequences of procedural fairness." This potential for procedure to enhance satisfaction may prove especially salutary precisely when participants do not favor outcomes. As Tyler and Lind have suggested, "hedonic glee" over positive outcomes may "obliterate" procedural effects; perceptions of procedural fairness may matter more, however, "when outcomes are negative (and) organizations have the greatest need to render decisions more palatable, to blunt discontent, and to give losers reasons to stay committed to the organization." At a minimum, the data call into question, and sometimes flatly contradict, most of the theoretical criticisms of reg neg that have surfaced in the scholarly literature over the last twenty years. There is no evidence that negotiated rulemaking abrogates an agency's responsibility to implement legislation. Nor does it appear to exacerbate power imbalances or increase the risk of capture. When asked whether any party seemed to have disproportionate influence during the development of the rule, about the same proportion of reg neg and conventional participants said yes. Parties perceived their influence to be about the same for conventional and negotiated rules, undermining the hypothesis that reg neg exacerbates capture.

Commissions CP

1NC
Counterplan: The United States Congress should establish an
independent commission empowered to submit to Congress
recommendations regarding domestic federal government
surveillance. Congress will have 60 days to pass legislation overriding the recommendations by a two-thirds majority. If Congress doesn't vote within the specified period, those recommendations will become law. The Commission should
recommend to Congress that _____<insert the plan>_______
Commission solves the plan
RWB 13
(Reporters Without Borders is a non-profit organization with consultant status at the UN and UNESCO. US congress urged to create commission to investigate mass snooping, RWB, 06-10-2013.
https://en.rsf.org/united-states-us-congress-urged-to-create-10-06-2013,44748.html//ghs-kw)

Reporters Without Borders calls on the US Congress to create a commission of enquiry into the links between US intelligence agencies and nine leading Internet sector companies that are alleged to have given them access to their servers. The commission should also identify all the countries and organizations that have contributed to the mass digital surveillance machinery that, according to reports in the Washington Post and Guardian newspapers in the past few days, the US authorities have created. According to these reports, the telephone company Verizon hands over the details of the phone calls of millions of US and foreign citizens every day to the National Security Agency (NSA), while nine Internet majors including Microsoft, Yahoo, Facebook, Google and Apple have given the FBI and NSA direct access to their users' data under a secret programme called Prism. US intelligence agencies are reportedly able to access all of the emails, audio and video files, instant messaging conversations and connection data transiting through these companies' servers. According to The Guardian, Government Communication Headquarters (GCHQ), the NSA's British equivalent, also has access to data collected under Prism. The proposed congressional commission should evaluate the degree to which the collected data violates privacy and therefore also freedom of expression and information. The commission's findings must not be classified as defence secrets. These issues, protection of privacy and freedom of expression, are matters of public interest.

2NC O/V
Counterplan solves 100% of the case--Congress creates an independent commission composed of experts to debate the merits of the plan, and the commission recommends to Congress that it pass the plan--Congress must pass legislation specifically blocking those recommendations within 60 days or the commission's recommendations become law
AND, that solves the AFF--commissions are empowered to debate Internet backdoors and submit recommendations--that's RWB

2NC Solvency
Empirics prove commissions solve
FT 10
(Andrews, Edmund. Deficit Panel Faces Obstacles in Poisonous Political Atmosphere, Fiscal Times.
02-18-2010. http://www.thefiscaltimes.com/Articles/2010/02/18/Fiscal-Commission-Faces-BigObstacles?page=0%2C1//ghs-kw)

Supporters of a bipartisan deficit commission note that at least two previous presidential commissions succeeded at breaking through intractable political problems when Congress was paralyzed. The 1983 Greenspan commission, headed by Alan Greenspan, who later became chairman of the Federal Reserve, reached an historic agreement to gradually raise Social Security taxes and gradually increase the minimum age at which workers qualify for Social Security retirement benefits. Those recommendations passed both the House and Senate, and averted a potentially catastrophic financial crisis with Social Security.

2NC Solves Better


CP solves better--technical complexity
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)

Obtaining Expertise. Congress may choose to establish a commission when legislators and their staffs do not currently have sufficient knowledge or expertise in a complex policy area.22 By assembling experts with backgrounds in particular policy areas to focus on a specific mission, legislators might efficiently obtain insight into complex public policy problems.23

2NC Cybersecurity Solvency


Commissions are key--solves legitimacy and perception
Abrahams and Bryen 14
(Rebecca Abrahams and Dr. Stephen Bryen, CCO and Chairman of Ziklag Systems, respectively. "Investigating Heartbleed,"
Huffington Post. 04-11-2014. http://www.huffingtonpost.com/rebecca-abrahams/investigating-heartbleed_b_5134404.html//ghs-kw)

But who can investigate the matter? This is a non-trivial question because the government is no longer
trustworthy. Congress could set up an independent commission to investigate compromises to computer
security. It should be staffed by experts in cryptography and by national security specialists. The
Commission, if empowered, should also make recommendations on a way forward for internet security.
What is needed is a system that is accountable, where the participants are reliable, and where there is
security from interference of any kind. Right now, no one can, or should, trust the Internet.

2NC Politics NB
No link to politics--commissions result in bipartisanship and
bypass Congressional politics
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)
Overcoming Political Complexity. Complex policy issues may also create institutional problems because they do not fall neatly within the jurisdiction of any particular committee in Congress.26 By virtue of their ad hoc status, commissions may circumvent such issues. Similarly, a commission may allow particular legislation or policy solutions to bypass the traditional development process in Congress, potentially removing some of the impediments inherent in a decentralized legislature.27 Consensus Building. Legislators seeking policy changes may be confronted by an array of political interests, some in favor of proposed changes and some against. When these interests clash, the resulting legislation may encounter gridlock in the highly structured political institution of the modern Congress.28 By creating a commission, Congress can place policy debates in a potentially more flexible environment, where congressional and public attention can be developed over time.29 Reducing Partisanship. Solutions to policy problems produced within the normal legislative process may also suffer politically from charges of partisanship.30 Similar charges may be made against investigations conducted by Congress.31 The non-partisan or bipartisan character of most congressional commissions may make their findings and recommendations less susceptible to such charges and more politically acceptable to diverse viewpoints. The bipartisan or nonpartisan arrangement can potentially give their recommendations strong credibility, both in Congress and among the public, even when dealing with divisive issues of public policy.32 Commissions may also give political factions space to negotiate compromises in good faith, bypassing the short-term tactical political maneuvers that accompany public negotiations.33 Similarly, because commission members are not elected, they may be better suited to suggesting unpopular, but necessary, policy solutions.34 Solving Collective Action Problems. A commission may allow legislators to solve collective action problems, situations in which all legislators individually seek to protect the interests of their own district, despite widespread agreement that the collective result of such interests is something none of them prefer. Legislators can use a commission to jointly tie their hands in such circumstances, allowing general consensus about a particular policy solution to avoid being impeded by individual concerns about the effect or implementation of the solution.35 For example, in 1988 Congress established the Base Closure and Realignment Commission (BRAC) as a politically and geographically neutral body to make independent decisions about closures of military bases.36 The list of bases slated for closure by the commission was required to be either accepted or rejected as a whole by Congress, bypassing internal congressional politics over which individual bases would be closed, and protecting individual Members from political charges that they didn't save their district's base.37

CP avoids the focus link to politics


Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)
Overcoming Issue Complexity

Complex policy issues may cause time management

challenges for Congress. Legislators often keep busy schedules and may not have
time to deal with intricate or technical policy problems, particularly if the issues
require consistent attention over a period of time. 24 A commission can devote itself
to a particular issue full-time, and can focus on an individual problem without
distraction.25

No link to politics--commissions create bipartisan negotiations


Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

The third major reason for Congress to delegate to a commission is the strategy of distancing itself from a politically risky decision. These instances generally occur when Congress faces redistributive policy problems, such as Social Security, military base closures, Medicare, and welfare. Such problems are the most difficult because legislators must take a clear policy position on something that has greater costs to their districts than benefits, or that shifts resources visibly from one group to another. Institutionally, Congress has to make national policy that has a collective benefit, but the self-interest of lawmakers often gets in the way. Members realize that their individual interests, based on constituents' demands, may be at odds with the national interest, and this can lead to possible electoral repercussions.55 Even when pursuing policies that are in the interests of the country as a whole, legislators do not want to be blamed for causing losses to their constituents. In such an event, the split characteristics of the institution come into direct conflict. Many on Capitol Hill endorse a commission for effectively resolving a policy problem rather than the other machinery available to Congress. A commission finds remedies when the normal decision making process has stalled. A long-time Senate staff director said of the proposed Second National Blue Ribbon Commission to Eliminate Waste in Government: "At their most effective, these panels allow Congress to realize purposes most members cannot find the confidence to do unless otherwise done behind the words of the commission."56 When an issue imposes concentrated costs on individual districts yet provides dispersed benefits to the nation, Congress responds by masking legislators' individual contributions and delegates responsibility to a commission for making unpleasant decisions.57 Members avoid blame and promote good policy by saying something is out of their hands. This method allows legislators, especially those aiming for reelection, to vote for the general benefit of something without ever having to support a plan that directly imposes large and traceable geographic costs on their constituents. The avoidance or share-the-blame route was much of the way Congress and the president finally dealt with the problem of financially shoring up Social Security in the late 1980s. One senior staff assistant to a western Republican representative observed that the creation of the Social Security Commission was largely for avoidance: "There are sacred cows and then there is Social Security. Neither party or any politician wants to cut this. Regardless of what you say or do about it, in the end, you defer. Everyone backs away from this." Similarly, a legislative director to a southern Democratic representative summarized: "So many people are getting older and when you take a look at who turns out, who registers, people over sixty-five have the highest turnout and they vote like clockwork." The Commission on Executive, Legislative, and Judicial Salaries, later referred to as the Quadrennial Commission (1967), is another example. Lawmakers delegated to a commission the power to set pay for themselves and other top federal officials, whose pay they linked to their own, to help them avoid blame. Increasing their own pay is a decision few politicians willingly endorse. Because the proposal made by the commission would take effect unless Congress voted to oppose it, the use of the commission helped insulate legislators from political hazards.58 That is, because it was the commission that granted pay raises, legislators could tell their constituents that they would have voted against the increase if given the chance. Members could get the pay raise and also the credit for opposing it. Redistribution is the most visible public policy type because it involves the most conspicuous, long run allocations of values and resources. Most divisive socioeconomic issues (affirmative action, medical care for the aged, aid to depressed geographic areas, public housing, and the elimination of identifiable governmental actions) involve debates over equality or inequality and degrees of redistribution. "These are political hot potatoes, in which a commission is a good means of putting a fire wall between you [the lawmaker] and that hot potato," the chief of staff to a midwestern Democratic representative acknowledged. Base closing took on a redistributive character as federal expenditures outpaced revenues. It was marked not only by extreme conflict but also by techniques to mask or sugarcoat the redistributions or make them more palatable. The Base Closure Commission (1991) was created with an important provision that allowed for silent congressional approval of its recommendations. Congress required the commission to submit its reports of proposed closures to the secretary of defense. The president had fifteen days to approve or disapprove the list in its entirety. If approved, the list of recommended base closures became final unless both houses of Congress adopted a joint resolution of disapproval within forty-five days. Congress had to consider and vote on the recommendations en bloc rather than one by one, thereby giving the appearance of spreading the misery equally to affected clienteles. A former staff aide for the Senate Armed Services Committee who was active in the creation of the Base Closure Commission contended, "There was simply no political will by Congress. The then-secretary of defense started the process [base closing] with an in-house commission [within the Defense Department]. Eventually, however, Congress used the commission idea as a scheme for a way out of a box." CONCLUSION Many congressional scholars attribute delegation principally to electoral considerations.59 For example, in the delegation of legislative authority to standing committees, legislators, keen on maximizing their reelection prospects, request assignments to committees whose jurisdictions coincide with the interests of key groups in their districts. Delegation of legislative functions to the president, to nonelected officials in the federal bureaucracy, or to ad hoc commissions also grows out of electoral motives. Here, delegation fosters the avoidance of blame.60 Mindful that most policies entail both costs and benefits, and apprehensive that those suffering the costs will hold them responsible, members of Congress often find that the most attractive option is to let someone else make the tough choices. Others see congressional delegation as unavoidable (and even desirable) in light of basic structural flaws in the design of Congress.61 They argue that Congress is incapable of crafting policies that address the full complexity of modern-day problems.62 Another charge is that congressional action can be stymied at several junctures in the legislative policymaking process. Congress is decentralized, having few mechanisms for integrating or coordinating its policy decisions; it is an institution of bargaining, consensus-seeking, and compromise. The logic of delegation is broad: to fashion solutions to tough problems, to broker disputes, to build consensus, and to keep fragile coalitions together. The commission co-opts the most publicly ideological and privately pragmatic, the liberal left and the conservative right. Leaders of both parties or their designated representatives can negotiate a deal without the media, the public, or interest groups present. When deliberations are private, parties can make offers without being denounced either by their opponents or by affected groups. Removing external contact reduces the opportunity to use an offer from the other side to curry favor with constituents.

2NC Commissions Popular


Commissions give political cover--result in compromise
Fiscal Seminar 9
(The Fiscal Seminar is a group of scholars who meet on a regular basis, under the auspices of The
Brookings Institution and The Heritage Foundation, to discuss federal budget and fiscal policy issues.
The members of the Fiscal Seminar acknowledge the contributions of Paul Cullinan, a former colleague
and Brookings scholar, in the development of this paper, and the editorial assistance of Emily Monea.
THE POTENTIAL ROLE OF ENTITLEMENT OR BUDGET COMMISSIONS IN ADDRESSING LONG-TERM
BUDGET PROBLEMS, The Fiscal Seminar. 06-2009.)

In contrast, the Greenspan Commission provided a forum for developing a political compromise on a set of politically unsavory changes. In this case, the political parties shared a deep concern about the impending insolvency of the Social Security system but feared the exposure of promoting their own solutions. The commission created political cover for the serious background negotiations that resulted in the ultimate compromise. The structure of the commission reflected these concerns and was composed of fifteen members, with the President, the Senate Majority Leader, and the Speaker of the House each appointing five members to the panel.

2NC AT Perm do the CP


Permutation is severance:
1. Severance: CP's mechanism is distinct--delegates to the commission and isn't Congressional action
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He
received his Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from
California State University, Chico. Prior to joining the National War College, Dr. Campbell was a
Legislative Aide to Representative Mike Thompson (CA-01), chair of the House Intelligence
Committee's Subcommittee on Terrorism, Analysis and Counterintelligence, where he handled
Appropriations, Defense and Trade matters for the congressman. Before that, he was an
Analyst in American National Government at the Congressional Research Service, an Associate
Professor of Political Science at Florida International University, and an American Political
Science Association Congressional Fellow, where he served as a policy adviser to Senator Bob
Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of 11 books on
Congress, most recently the Guide to Political Campaigns in America, and Impeaching Clinton:
Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT,
USA: Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

So why and when does Congress formulate policy by commissions rather than by the normal legislative process? Lawmakers have historically delegated authority to others who could accomplish ends they could not. Does this form of congressional delegation thus reflect the particularities of an issue area? Or does it mirror deeper structural reasons such as legislative organization, time, or manageability? In the end, what is the impact on representation versus the effectiveness of delegating discretionary authority to temporary entities composed largely of unelected officials, or are both attainable together?

2. Severs resolved: resolved means to enact by law--not the counterplan mandate
Words and Phrases 64 vol 37A
Definition of the word resolve, given by Webster is to express an opinion or determination
by resolution or vote; as it was resolved by the legislature; It is of similar force to the word
enact, which is defined by Bouvier as meaning to establish by law.

3. Severs should: Should requires immediate action


Summers 94 (Justice Oklahoma Supreme Court, Kelsey v.
Dollarsaver Food Warehouse of Durant, 1994 OK 123, 11-8,
http://www.oscn.net/applications/oscn/DeliverDocument.asp?
CiteID=20287#marker3fn13)

The legal question to be resolved by the court is whether the word "should"13 in the May 18 order connotes futurity or may be deemed a ruling in praesenti.14 The answer to this query is not to be divined from rules of grammar;15 it must be governed by the age-old practice culture of legal professionals and its immemorial language usage. To determine if the omission (from the critical May 18 entry) of the turgid phrase, "and the same hereby is", (1) makes it an in futuro ruling - i.e., an expression of what the judge will or would do at a later stage - or (2) constitutes an in praesenti resolution of a disputed law issue, the trial judge's intent must be garnered from the four corners of the entire record.16 [CONTINUES TO FOOTNOTE] 13 "Should" not only is used as a "present indicative" synonymous with ought but also is the past tense of "shall" with various shades of meaning not always easy to analyze. See 57 C.J. Shall 9, Judgments 121 (1932). O. JESPERSEN, GROWTH AND STRUCTURE OF THE ENGLISH LANGUAGE (1984); St. Louis & S.F.R. Co. v. Brown, 45 Okl. 143, 144 P. 1075, 1080-81 (1914). For a more detailed explanation, see the Partridge quotation infra note 15. Certain contexts mandate a construction of the term "should" as more than merely indicating preference or desirability. Brown, supra at 1080-81 (jury instructions stating that jurors "should" reduce the amount of damages in proportion to the amount of contributory negligence of the plaintiff was held to imply an obligation and to be more than advisory); Carrigan v. California Horse Racing Board, 60 Wash. App. 79, 802 P.2d 813 (1990) (one of the Rules of Appellate Procedure requiring that a party "should devote a section of the brief to the request for the fee or expenses" was interpreted to mean that a party is under an obligation to include the requested segment); State v. Rack, 318 S.W.2d 211, 215 (Mo. 1958) ("should" would mean the same as "shall" or "must" when used in an instruction to the jury which tells the triers they "should disregard false testimony"). 14 In praesenti means literally "at the present time." BLACK'S LAW DICTIONARY 792 (6th Ed. 1990). In legal parlance the phrase denotes that which in law is presently or immediately effective, as opposed to something that will or would become effective in the future [in futuro]. See Van Wyck v. Knevals, 106 U.S. 360, 365, 1 S.Ct. 336, 337, 27 L.Ed. 201 (1882).

4. Severs should again: should is mandatory


Summers 94 (Justice Oklahoma Supreme Court, Kelsey v.
Dollarsaver Food Warehouse of Durant, 1994 OK 123, 11-8,
http://www.oscn.net/applications/oscn/DeliverDocument.asp?
CiteID=20287#marker3fn13)
The legal question to be resolved by the court is whether the word "should"13 in the May 18 order connotes futurity or may be deemed a ruling in praesenti.14 The answer to this query is not to be divined from rules of grammar;15 it must be governed by the age-old practice culture of legal professionals and its immemorial language usage. To determine if the omission (from the critical May 18 entry) of the turgid phrase, "and the same hereby is", (1) makes it an in futuro ruling - i.e., an expression of what the judge will or would do at a later stage - or (2) constitutes an in praesenti resolution of a disputed law issue, the trial judge's intent must be garnered from the four corners of the entire record.16 [CONTINUES TO FOOTNOTE] 13 "Should" not only is used as a "present indicative" synonymous with ought but also is the past tense of "shall" with various shades of meaning not always easy to analyze. See 57 C.J. Shall 9, Judgments 121 (1932). O. JESPERSEN, GROWTH AND STRUCTURE OF THE ENGLISH LANGUAGE (1984); St. Louis & S.F.R. Co. v. Brown, 45 Okl. 143, 144 P. 1075, 1080-81 (1914). For a more detailed explanation, see the Partridge quotation infra note 15. Certain contexts mandate a construction of the term "should" as more than merely indicating preference or desirability. Brown, supra at 1080-81 (jury instructions stating that jurors "should" reduce the amount of damages in proportion to the amount of contributory negligence of the plaintiff was held to imply an obligation and to be more than advisory); Carrigan v. California Horse Racing Board, 60 Wash. App. 79, 802 P.2d 813 (1990) (one of the Rules of Appellate Procedure requiring that a party "should devote a section of the brief to the request for the fee or expenses" was interpreted to mean that a party is under an obligation to include the requested segment); State v. Rack, 318 S.W.2d 211, 215 (Mo. 1958) ("should" would mean the same as "shall" or "must" when used in an instruction to the jury which tells the triers they "should disregard false testimony"). 14 In praesenti means literally "at the present time." BLACK'S LAW DICTIONARY 792 (6th Ed. 1990). In legal parlance the phrase denotes that which in law is presently or immediately effective, as opposed to something that will or would become effective in the future [in futuro]. See Van Wyck v. Knevals, 106 U.S. 360, 365, 1 S.Ct. 336, 337, 27 L.Ed. 201 (1882).

Severance is a reason to reject the team:


1. Neg ground--makes the AFF a shifting target and allows them to spike out of offense
2. Unpredictable--kills clash, which destroys advocacy skills and education

2NC AT Perm do Both


Permutation do both links to politics:
1. Congressional debates--CP means Congress doesn't debate the substance of the plan, only the commission report--perm forces Congress to debate the plan, triggering the link over partisan inclinations and electoral pressures--that's the politics net benefit ev
2. Time crunch--perm forces the plan now, doesn't give the commission time to generate political support, and links to politics
Biggs 09
(Biggs, Andrew G. Andrew G. Biggs is a resident scholar at the American Enterprise Institute,
where his work focuses on Social Security and pensions. From 2003 through 2008, he served
at the Social Security Administration, as Associate Commissioner for Retirement Policy, Deputy
Commissioner for Policy, and ultimately the principal Deputy Commissioner of the agency.
During 2005, he worked at the White House National Economic Council on Social Security
reform, and in 2001 was on the staff of the President's Commission to Strengthen Social
Security. He blogs on Social Security-related issues at Notes on Social Security Reform.
Rumors Of Obama Social Security Reform Commission, Frum Forum. 02-17-2009.
http://www.frumforum.com/rumors-of-obama-social-security-reform-commission///ghs-kw)

One problem with President Bush's 2001 Commission was that it didn't represent the reasonable spectrum of beliefs on Social Security reform. This didn't make it a dishonest commission; like President Roosevelt's Committee on Economic Security, it was designed to put flesh on the bones laid out by the President. In this case, the Commission was tasked with designing a reform plan that included personal accounts and excluded tax increases. That said, a commission only builds political capital toward enacting reform if it's seen as building a consensus through a process in which all views have been heard. In both the 2001 Commission and the later 2005 reform drive, Democrats didn't feel they were part of the process. They clearly will be a central part of the process this time, but the goal will now be to include Republicans. Just as Republicans shouldn't reflexively oppose any Obama administration reform plans for political reasons, so Democrats shouldn't seek to exclude Republicans from the process. Second, a reform task force should include a variety of different players, including members of government, both legislative and executive, representatives of outside interest groups, and experts who can provide technical advice and help ensure the integrity of the reforms decided upon. The 2001 Bush Commission didn't include any sitting Members of Congress and only a small fraction of commissioners had the technical expertise needed to make the plans the best they could be. A broader group would be helpful. Third, any task force or commission needs time. The 2001 Commission ran roughly from May through December of that year and had to conduct a number of public hearings. This was simply too much to do in too little time, and as a result the plans were fairly bare bones. There is plenty else on the policy agenda at the moment, so there's no reason not to give a working group a year or more to put things together.

2NC AT Theory
Counterinterp: process CPs are legitimate if we have a
solvency advocate
AND, process CPs good:
1. Key to neg ground--agent CPs are the only generics we have on this topic
2. Policy education--commissions are key to understanding the policy process
Schwalbe, 03
(Schwalbe, Steve. PhD in Public Policy from Auburn, former professor at the Air War College and Col. in the USAF. Independent Commissions: Their History, Utilization and Effectiveness)

FIFTH BRANCH Many analysts characterize commissions as an unofficial, separate branch of government, much like the news media. Campbell referred to commissions as the fifth arm of government, after the media, the often-referred-to fourth arm.17 However, the media and independent commissions have as many similarities as differences. They are similar in that neither is mentioned in the Constitution. Both conduct oversight functions. Both serve to educate and inform the public. Both allow elites to participate in shaping government policy. On the other hand, the media and independent commissions are dissimilar in many ways. Where the news media responds to market forces, and hence will likely operate in perpetuity, independent commissions respond to a federal requirement to resolve a difficult problem. Therefore, they exist for a relatively short period of time, expiring once a final report is published and disseminated. Where the media's primary functions are reporting and analyzing the news, a commission's primary responsibilities can range from developing a recommended solution to a difficult problem to regulating an entire department of the executive branch. The media receives its funding primarily from advertisers, where commissions receive their funding from Congress, the President, or from private sources. The news media deal with issues foreign and domestic, while independent commissions generally focus on domestic issues. PURPOSE Commissions serve numerous purposes in the U.S. Government. Campbell cited three primary reasons for the establishment of federal independent commissions. First, they are established to provide expertise the Congress does not have among its own elected officials or their staffs. Next, he noted that the second most frequently cited reason by members of Congress for establishing a commission was to reduce the workload in Congress. Finally, they are formed to provide a convenient scapegoat to deflect the wrath of the electorate; i.e., blame avoidance.18 Fisher found three advantages of regulatory commissions. First, commission members bring essential expert insights to a commission because the regulated industries are normally complex and highly technical. Second, appointing commissioners for extended terms of full-time work allows commissioners to become very familiar with the technical aspects of an industry, through periodic contacts that Congress would not be able to accomplish. As a result of their tenure, varied membership, and shared responsibility, commissioners would be resistant to external pressures. Finally, regulatory commissions provide policy continuity essential to the stability of a regulated industry.19 What the taxpayers are primarily looking for from independent commissions are non-partisan solutions to current problems. A good example of establishing a commission to find non-partisan solutions is Congress regulating its own ethical behavior. University of Florida Professor Beth Rosenson researched this issue and concluded that authorizing an ethics commission may be based on the fear of electoral retaliation if legislators do not take aggressive action to regulate their own ethics.20 Campbell noted that commissions perform several other functions besides providing recommendations to the President and Congress. The most common reason provided by analysts is that members of Congress generally want to avoid making difficult decisions that may adversely affect their chances for reelection. As he noted, "Incentives to avoid blame lead members of Congress to adopt a distinctive set of political strategies, such as passing the buck or deflection."21 Another technique legislators use to avoid incurring the wrath of the voters is to schedule any controversial independent commissions for after the next election. Establishing a commission to research the issue and come up with recommendations after a preset period of time is an effective way to do that. The most clear-cut example demonstrating this technique is the timing of the BRAC commissions in the 1990s: all three made their base closure recommendations in non-election years (1991, 1993, and 1995). Even the next BRAC commission, established by the National Defense Authorization Act for Fiscal Year 2002, is not required to submit its base closure recommendations until 2005. Congress certainly is not the most efficient organization in the U.S.; hence, there are times when an independent commission is the more efficient and effective way to go. Lawmakers are almost always short on time and information, which makes the option of delegating authority to a commission very appealing. Oftentimes, the expertise and necessary information is very costly for Congress to acquire. Commissions are generally the most inexpensive way for Congress to solve complex problems. From 1993-1997, Campbell found that 92 congressional offices introduced legislation that included proposals to establish ad hoc commissions.22 There are numerous other reasons for establishing independent commissions. They are created as a symbolic response to a crisis or to satisfy the electorate at home. They have served as trial balloons to test the political waters, or to make political gains with the voters. They can be created to gain public or political consensus. Often, when Congress has exhausted all its other options, a commission serves as an option of last resort.23 Commissions are a relatively impartial way to help resolve problems between the executive and legislative branches of government, especially during periods of congressional gridlock. Wolanin also noted that commissions are particularly useful for problems and in circumstances marked by federal executive branch incapacity. Federal bureaucracies suffer from many of the same shortcomings attributed to Congress when considering commissions. They often lack the expertise, information, and time to conduct the research and make recommendations to resolve internal problems. They can be afflicted by groupthink, not being able to think outside the box, or by not being able to see the big picture. Commissions offer a non-partisan, neutral option to address bureaucratic policy problems.24 Defense Secretary Donald Rumsfeld has decided to implement the recommendations of the congressionally-chartered Commission on Space, which he chaired prior to being appointed Secretary of Defense!25 One of the more important functions of independent commissions is educating and persuading. Due to the high visibility of most appointed commissioners, a policy issue will automatically tend to gain public attention. According to Wolanin, the prestige and visibility of commissions give them the capability to focus attention on a problem, and to see that thinking about it permeates more rapidly. A recent example of a high-visibility commission chair appointment was Henry Kissinger, selected to chair the commission to look into the perceived intelligence failure regarding the September 11, 2001 terrorist attack on the U.S.26 Wolanin cited four educational impacts of commissions: 1) educating the general public; 2) educating government officials; 3) serving as intellectual milestones; and, 4) educating the commission members themselves. Regarding education of the general public, he stated that, "Commissions have helped to place broad new issues on the national agenda, to elevate them to a level of legitimate and pressing matters about which government should take affirmative action." Regarding educating government officials, he noted that, "The educational impact of commissions within government ... make it safer for congressmen and federal executives to openly discuss or advocate a proposal that has been sanctioned by such an august group." Commission reports have often been so influential that they serve as milestones in affected fields. Such reports have become source material for analysts, commentators, and even students, particularly when commission reports are widely published and disseminated. Finally, by serving on a commission, members also learn much about the issue, and about the process of analyzing a problem and coming up with viable recommendations. Commissioners also learn from one another.27

3. Predictability--commissions are widely used and predictable and solvency advocate checks
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He
received his Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from
California State University, Chico. Prior to joining the National War College, Dr. Campbell was a
Legislative Aide to Representative Mike Thompson (CA-01), chair of the House Intelligence
Committee's Subcommittee on Terrorism, Analysis and Counterintelligence, where he handled
Appropriations, Defense and Trade matters for the congressman. Before that, he was an
Analyst in American National Government at the Congressional Research Service, an Associate
Professor of Political Science at Florida International University, and an American Political
Science Association Congressional Fellow, where he served as a policy adviser to Senator Bob
Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of 11 books on
Congress, most recently the Guide to Political Campaigns in America, and Impeaching Clinton:
Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT,
USA: Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

Ad hoc commissions as instruments of government have a long history. They are used by almost all units and levels of government for almost every conceivable task. Ironically, the use which Congress makes of commissions (preparing the groundwork for legislation, bringing public issues into the spotlight, whipping legislation into shape, and giving priority to the consideration of complex, technical, and critical developments) receives relatively little attention from political scientists. As noted in earlier chapters, following the logic of rational choice theory, individual decisions to delegate are occasioned by imperfect information; legislators who want to develop effective policies, but who lack the necessary expertise, often delegate fact-finding and policy development. Others contend that some commissions are set up to shift blame in order to maximize benefits and minimize losses.

4. At worst, reject the argument, not the team

2NC AT Certainty
Counterplan solves your certainty args--expertise
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

By delegating some of its policymaking authority to expertise commissions,


Congress creates institutions that reduce uncertainty. Tremendous gains accrue as a
result of delegating tasks to other organizations with a comparative advantage in
performing them. Commissions are especially adaptable devices for addressing problems that do not fall
neatly within committees jurisdictional boundaries. They can complement and supplement the
regular committees. In the 1990s, it became apparent that committees were ailing beset by mounting
workloads, duplication and jurisdictional battles, and conflicts between program and funding panels. But relevant
expertise can be mobilized by a commission that brings specialized information to
its tasks, especially if commission members and staff are selected on the basis of
education, their training, and their experience in the area which cross-cut the
responsibilities of several standing committees.

2NC AT Commissions Bad


No disadscommissions are inevitable due to Congressional
structure
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

Others see congressional delegation as unavoidable (and even desirable) in light of basic
structural flaws in the design of Congress. 61 They argue that Congress is incapable of
crafting policies that address the full complexity of modern-day problems.
62 Another charge is that congressional action can be stymied at several junctures in the
legislative policymaking process. Congress is decentralized, having few mechanisms
for integrating or coordinating its policy decisions; it is an institution of bargaining, consensus-seeking, and compromise. The logic of delegation is broad: to fashion solutions to tough
problems, to broker disputes, to build consensus, and to keep fragile coalitions
together. The commission co-opts the most publicly ideological and privately pragmatic, the liberal left and the
conservative right. Leaders of both parties or their designated representatives can negotiate a deal without the
media, the public, or interest groups present. When deliberations are private, parties can make offers without being
denounced either by their opponents or by affected groups. Removing external contact reduces the opportunity to
use an offer from the other side to curry favor with constituents.

2NC AT Congress Doesn't Pass Recommendations


Recommendations are passed--either bipartisan or perceived
as non-partisan
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)

Solutions to policy problems produced within the normal legislative


process may also suffer politically from charges of partisanship.30 Similar charges
may be made against investigations conducted by Congress. 31 The non-partisan
or bipartisan character of most congressional commissions may make their
findings and recommendations less susceptible to such charges and more
politically acceptable to diverse viewpoints. The bipartisan or nonpartisan
arrangement can potentially give their recommendations strong
credibility, both in Congress and among the public, even when dealing
with divisive issues of public policy.32 Commissions may also give political
factions space to negotiate compromises in good faith, bypassing the short-term
tactical political maneuvers that accompany public negotiations. 33 Similarly,
because commission members are not elected, they may be better suited to
suggesting unpopular, but necessary, policy solutions. 34

Recommendations are passed--BRAC Commission proves


Fiscal Seminar 9
(The Fiscal Seminar is a group of scholars who meet on a regular basis, under the auspices of The
Brookings Institution and The Heritage Foundation, to discuss federal budget and fiscal policy issues.
The members of the Fiscal Seminar acknowledge the contributions of Paul Cullinan, a former colleague
and Brookings scholar, in the development of this paper, and the editorial assistance of Emily Monea.
THE POTENTIAL ROLE OF ENTITLEMENT OR BUDGET COMMISSIONS IN ADDRESSING LONG-TERM
BUDGET PROBLEMS, The Fiscal Seminar. 06-2009.)

the success of BRAC seems to have resulted more from the defined
structure and process of the commission.5 Under BRAC, a package of recommendations
originated with the Department of Defense, was modified by the BRAC commission, and was
then reviewed by the President. Congress then had to consider the package as a whole with no
amendments allowed; if it failed to pass a resolution of disapproval, the recommendations
would be implemented as if they had been enacted in law. Not one of the five sets
of BRAC recommendations has been rejected by the Congress. 6,
On the other hand,

2NC AT No Authority
Commissions have broad authority
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

Congressional commissions have reached the point where they can take over various fact-finding
functions formerly performed by Congress itself. Once the facts have been found by a
commission, it is possible for Congress to subject those facts to the scrutiny of
cross-examination and debate. And if the findings stand up under such scrutiny, there remains for

Congress the major task of determining the policy to be adopted with reference to the known factual situation. Once
it was clear, for example, that the acquired immune deficiency syndrome (AIDS) yielded an extraordinary range of
newfound political and practical difficulties, the need for legislative action was readily apparent. The question that
remained was one of policy: how to prevent the spread of AIDS. Should it be by accelerated research? By public
education? By facilitating housing support for people living with AIDS? Or by implementing a program of AIDS
counseling and testing? The AIDS Commission could help Congress answer such questions.

2NC AT Perception
CP solves your perception arguments
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)

By establishing a commission, Congress can often provide a


highly visible forum for important issues that might otherwise receive
scant attention from the public.38 Commissions often are composed of notable public
figures, allowing personal prestige to be transferred to policy solutions. 39 Meetings
and press releases from a commission may receive significantly more
attention in the media than corresponding information coming directly
from members of congressional committees. Upon completion of a commissions work
product, public attention may be temporarily focused on a topic that otherwise would
receive scant attention, thus increasing the probability of congressional action within the policy area.40

Private Sector CP

1NC
Counterplan: the private sector should implement and enforce
default encryption standards on a level equivalent with those
announced by Apple in 2014.
Apple's new standards are unhackable even by Apple--eliminates backdoors
Green 10/4
(Green, Matthew D. Matthew D. Green is an Assistant Research Professor at the Johns Hopkins
Information Security Institute. He completed his PhD in 2008. His research includes techniques for
privacy-enhanced information storage, anonymous payment systems, and bilinear map-based
cryptography. "A Few Thoughts on Cryptographic Engineering: Why can't Apple decrypt your iPhone?
10-4-2014. http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-youriphone.html//ghs-kw)
In the rest of this post I'm going to talk about how these protections may work and how

Apple can

realistically claim not to possess a back door.

One caveat: I should probably point out that


Apple isn't known for showing up at parties and bragging about their technology -- so while a fair amount of this is
based on published information provided by Apple, some of it is speculation. I'll try to be clear where one ends and
the other begins. Password-based encryption 101 Normal password-based file encryption systems take in a
password from a user, then apply a key derivation function (KDF) that converts a password (and some salt) into an
encryption key. This approach doesn't require any specialized hardware, so it can be securely implemented purely in
software provided that (1) the software is honest and well-written, and (2) the chosen password is strong, i.e., hard
to guess. The problem here is that nobody ever chooses strong passwords. In fact, since most passwords are
terrible, it's usually possible for an attacker to break the encryption by working through a 'dictionary' of likely
passwords and testing to see if any decrypt the data. To make this really efficient, password crackers often use
special-purpose hardware that takes advantage of parallelization (using FPGAs or GPUs) to massively speed up the
process. Thus a common defense against cracking is to use a 'slow' key derivation function like PBKDF2 or scrypt.
Each of these algorithms is designed to be deliberately resource-intensive, which does slow down normal login
attempts -- but hits crackers much harder. Unfortunately, modern cracking rigs can defeat these KDFs by simply
throwing more hardware at the problem. There are some approaches to dealing with this -- this is the approach of memory-hard KDFs like scrypt -- but this is not the direction that Apple has gone.

How Apple's encryption works

Apple doesn't use scrypt. Their approach is to add a 256-bit device-unique secret key called a UID to the mix, and to store that key in hardware where it's hard to extract from the phone. Apple claims that it does not record these keys nor can it access them. On recent devices (with A7 chips), this key and the mixing process are protected within a cryptographic co-processor called the Secure Enclave. The Apple Key Derivation function 'tangles' the password with the UID key by running both through PBKDF2-AES -- with an iteration count tuned to require about 80ms on the device itself.** The result is the 'passcode key'. That key is then used as an anchor to secure much of the data on the phone. Overview of Apple key derivation and encryption (iOS Security Guide, p.10). Since only the device itself knows UID -- and the UID can't be removed from the Secure Enclave -- this means all password cracking attempts have to run on the device itself. That rules out the use of FPGA or ASICs to crack passwords. Of course Apple could write a custom firmware that attempts to crack the keys on the device but even in the best case such cracking could be pretty time consuming, thanks to the 80ms PBKDF2 timing. (Apple pegs such cracking attempts at 5 1/2 years for a random 6-character password consisting of lowercase letters and numbers. PINs will obviously take much less time, sometimes as little as half an hour. Choose a good passphrase!) So one view of Apple's process is that it depends on the user picking a strong password. A different view is that it also depends on the attacker's inability to obtain the UID. Let's explore this a bit more.

Securing the Secure Enclave

The Secure Enclave is designed to prevent exfiltration of the UID key. On earlier Apple devices this key lived in the application processor itself. Secure Enclave provides an extra level of protection that holds even if the software on the application processor is compromised -- e.g., jailbroken. One worrying thing about this approach is that, according to Apple's documentation, Apple controls the signing keys that sign the Secure Enclave firmware. So using these keys, they might be able to write a special "UID extracting" firmware update that would undo the protections described above, and potentially allow crackers to run their attacks on specialized hardware. Which leads to the following question: How does Apple avoid holding a backdoor signing key that allows them to extract the UID from the Secure Enclave? It seems to me that there are a few possible ways forward here. No software can extract the UID. Apple's documentation even claims that this is the case; that software can only see the output of encrypting something with UID, not the UID itself. The problem with this explanation is that it isn't really clear that this guarantee covers malicious Secure Enclave firmware written and signed by Apple. Update 10/4: Comex and others (who have forgotten more about iPhone internals than I've ever known) confirm that #1 is the right answer. The UID appears to be connected to the AES circuitry by a dedicated path, so software can set it as a key, but never extract it. Moreover this appears to be the same for both the Secure Enclave and older pre-A7 chips. So ignore options 2-4 below.
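Note (not part of the Green card): below is a minimal sketch of the key-derivation pattern the evidence describes -- a user passcode "tangled" with a device-unique secret and run through a deliberately slow PBKDF2 loop, so guessing has to happen on the device that holds the secret. This is an illustrative approximation only, not Apple's implementation: the salt handling, iteration count, and variable names are assumptions for the example, and Python's standard hashlib stands in for the dedicated hardware path the card describes.

# Illustrative sketch only -- approximates the "passcode tangled with a
# device-unique secret via a slow KDF" pattern described above.
# Parameters are assumptions, not Apple's real ones.
import hashlib
import os
import secrets

DEVICE_UID = secrets.token_bytes(32)   # stand-in for the 256-bit hardware UID

def derive_passcode_key(passcode: str, salt: bytes) -> bytes:
    # Mix the user passcode with the device-unique secret, then run a
    # deliberately slow PBKDF2 loop so guesses must happen on-device.
    tangled = passcode.encode("utf-8") + DEVICE_UID
    return hashlib.pbkdf2_hmac("sha256", tangled, salt, 200_000, dklen=32)

salt = os.urandom(16)
key = derive_passcode_key("123456", salt)
print(key.hex())

As a sanity check on the card's "5 1/2 years" figure: a random 6-character password over lowercase letters and digits gives 36^6 (about 2.18 billion) candidates; at 80ms per on-device PBKDF2 attempt, an exhaustive search takes roughly 2.18 billion x 0.08 seconds, or about 174 million seconds -- approximately 5.5 years, matching Apple's estimate.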

2NC O/V
The counterplan solves 100% of the case--private corporations will institute strong encryption standards on all their products and store decryption mechanisms on individual devices without retaining separate decryption programs--this means nobody but the owner of the device can decrypt the information--that's Green.

AND, solves backdoors--companies are technologically incapable of providing backdoors in the world of the CP--solves the AFF--that's Green.

AT Perception
Other companies follow--solves their credibility internal links
Whittaker 14
(Zack Whittaker. "Apple doubles-down on security, shuts out law enforcement from accessing iPhones,
iPads," ZDNet. 9-18-2014. http://www.zdnet.com/article/apple-doubles-down-on-security-shuts-outlaw-enforcement-from-accessing-iphones-ipads///ghs-kw)

The new encryption methods prevent even Apple from accessing even the relatively
small amount of data it holds on users. "Unlike our competitors, Apple cannot bypass your
passcode and therefore cannot access this data ," the company said in its new privacy policy,
updated Wednesday. "So it's not technically feasible for us to respond to government warrants for the extraction of
this data from devices in their possession running iOS 8." There are some caveats, however. For the iCloud data it
stores, Apple still has the ability (and the legal responsibility) to turn over data it stores on its own servers, or third-party servers it uses to support the service. iCloud data can include photos, emails, music, documents, and contacts. In the wake of the Edward Snowden disclosures, Apple has set itself apart from the rest of the crowd by bolstering its encryption efforts in such a way that makes it impossible for it to decrypt the data. Apple chief executive Tim Cook said in a recent interview with PBS' Charlie Rose that if the government "laid a subpoena" at its doors, Apple "can't provide" the data. He said, bluntly: "We don't have a key. The door is closed." Although the iPhone and iPad maker was late to

the transparency report party, the company has rocketed up the ranks of the civil liberties table. The Electronic
Frontier Foundation's annual reports for 2012 and 2013 showed Apple as having poor privacy practices around user
data, gaining just one star out of five each year. In 2014, Apple scored the full five stars -- a massive turnaround from two years prior. In the meantime, Yahoo is bolstering encryption between its datacenters, and recently turned on encryption-by-default on its email service. Microsoft is also encrypting its network traffic amid reports of the National Security Agency's datacenter tapping program. And Google is working hard to crack down on government spies cracking into its networks and cables. Privacy and security are, and have been for a while, the pinnacle of tech credibility. And Apple just scored about a billion points on that scale, leaving most of its Silicon Valley partners in the dust.

AT Links to Terror
No link to their disads--other sources of data
NYT 14
(David E. Sanger and Brian X. Chen. "Signaling Post-Snowden Era, New iPhone Locks Out N.S.A. ," New
York Times. 9-26-2014. http://www.nytimes.com/2014/09/27/technology/iphone-locks-out-the-nsasignaling-a-post-snowden-era-.html?_r=0//ghs-kw)

Mr. Zdziarski said that concerns about Apple's new encryption to hinder law enforcement seemed overblown. He said there were still plenty of ways for the police to get customer data for investigations. In the example of a kidnapping victim, the police can still request information on call records and geolocation information from phone carriers like AT&T and Verizon Wireless. "Eliminating the iPhone as one source I don't think is going to wreck a lot of cases," he said. "There is such a mountain of other evidence from call logs, email logs, iCloud, Gmail logs. They're tapping the whole Internet."

XO CP

1NC
XOs solve the Secure Data Act
Castro and McQuinn 15
(Castro, Daniel and McQuinn, Alan. Information Technology and Innovation Foundation. The
Information Technology and Innovation Foundation (ITIF) is a Washington, D.C.-based think tank at the
cutting edge of designing innovation strategies and technology policies to create economic
opportunities and improve quality of life in the United States and around the world. Founded in 2006,
ITIF is a 501(c) 3 nonprofit, non-partisan organization that documents the beneficial role technology
plays in our lives and provides pragmatic ideas for improving technology-driven productivity, boosting
competitiveness, and meeting todays global challenges through innovation. Daniel Castro is the vice
president of the Information Technology and Innovation Foundation. His research interests include
health IT, data privacy, e-commerce, e-government, electronic voting, information security, and
accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability
Office (GAO) where he audited IT security and management controls at various government agencies.
He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security
Technology and Management from Carnegie Mellon University. Alan McQuinn is a research assistant
with the Information Technology and Innovation Foundation. Prior to joining ITIF, Mr. McQuinn was a
telecommunications fellow for Congresswoman Anna Eshoo and an intern for the Federal
Communications Commission in the Office of Legislative Affairs. He got his B.S. in Political
Communications and Public Relations from the University of Texas at Austin. Beyond the USA
Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness, ITIF. June 2015.
http://www2.itif.org/2015-beyond-usa-freedom-act.pdf//ghs-kw)

Second, the U.S. government should draw a clear line in the sand and declare
that the policy of the U.S. government is to strengthen not weaken
information security. The U.S. Congress should pass legislation, such as the Secure Data Act introduced
by Sen. Wyden (D-OR), banning any government efforts to introduce backdoors in software or weaken encryption.43 In the short term ,
President Obama, or his successor, should sign an executive order
formalizing this policy as well. In addition, when U.S. government agencies discover vulnerabilities in software or hardware products, they should

responsibly notify these companies in a timely manner so that the companies can fix these flaws. The best way to protect U.S. citizens from digital threats is to promote strong
cybersecurity practices in the private sector.

Zero-Days Adv CP

1NC
Counterplan: the United States federal government should
legalize and regulate the zero-day exploit market.
Regulation is key to stop zero days from falling into enemy
hands
Gallagher 13
(Ryan Gallagher. "The Secretive Hacker Market for Software Flaws," Slate Magazine. 1-16-2013.
http://www.slate.com/articles/technology/future_tense/2013/01/zero_day_exploits_should_the_hacker_g
ray_market_be_regulated.html//ghs-kw)

Behind computer screens from France to Fort Worth, Texas, elite hackers hunt for security vulnerabilities worth thousands of dollars on a secretive unregulated marketplace.

Using sophisticated techniques to detect weaknesses in widely used programs like Google Chrome, Java, and Flash,

they spend hours crafting zero-day exploits -- complex codes custom-made to


target a software flaw that has not been publicly disclosed, so they can bypass antivirus or firewall detection to help infiltrate a computer system. Like most technologies, the
exploits have a dual use. They can be used as part of research efforts to help strengthen computers against
intrusion. But they can also be weaponized and deployed aggressively for everything from
government spying and corporate espionage to flat-out fraud. Now, as cyberwar escalates across the globe,
there are fears that the burgeoning trade in finding and selling exploits is spiralling out of
control -- spurring calls for new laws to rein in the murky trade. Some legitimate
companies operate in a legal gray zone within the zero-day market, selling exploits to governments
and law enforcement agencies in countries across the world. Authorities can use
them covertly in surveillance operations or as part of cybersecurity or espionage
missions. But because sales are unregulated, there are concerns that some
gray market companies are supplying to rogue foreign regimes that may
use exploits as part of malicious targeted attacks against other countries
or opponents. There is also an anarchic black market that exists on invite-only
Web forums, where exploits are sold to a variety of actors -- often for criminal
purposes. The importance of zero-day exploits, particularly to governments, has become
increasingly apparent in recent years. Undisclosed vulnerabilities in Windows played a crucial role in
how Iranian computers were infiltrated for surveillance and sabotage when the
countrys nuclear program was attacked by the Stuxnet virus (an assault reportedly launched
by the United States and Israel). Last year, at least eight zero days in programs like Flash and Internet Explorer
were discovered and linked to a Chinese hacker group dubbed the Elderwood gang, which targeted more than
1,000 computers belonging to corporations and human rights groups as part of a shady intelligence-gathering effort
allegedly sponsored by China. The most lucrative zero days can be worth hundreds of thousands of dollars in both
the black and gray markets. Documents released by Anonymous in 2011 revealed Atlanta-based security firm
Endgame Systems offering to sell 25 exploits for $2.5 million. Emails published alongside the documents showed
the firm was trying to keep a very low profile due to feedback we've received from our government clients. (In
keeping with that policy, Endgame didnt respond to questions for this story.) But not everyone working in the
business of selling software exploits is trying to fly under the radar -- and some have decided to blow the whistle on
what they see as dangerous and irresponsible behavior within their secretive profession. Adriel Desautels, for one,
has chosen to speak out. The 36-year-old exploit broker from Boston runs a company called Netragard, which
buys and sells zero days to organizations in the public and private sectors. (He wont name names, citing
confidentiality agreements.) The lowest-priced exploit that Desautels says he has sold commanded $16,000; the

Unlike other companies and sole traders operating in the zeroday trade, Desautels has adopted a policy to sell his exploits only domestically
within the United States, rigorously vetting all those he deals with. If he didnt have
this principle, he says, he could sell to anyone he wanted -- even Iran or China
because the field is unregulated. And thats exactly why he is concerned. As technology
advances, the effect that zero-day exploits will have is going to become more
physical and more real, he says. The software becomes a weapon. And if you
dont have controls and regulations around weapons, youre really open to

introducing chaos and problems. Desautels says he knows of greedy and


irresponsible people who will sell to anybody, to the extent that some exploits
might be sold by the same hacker or broker to two separate governments not on
friendly terms. This can feasibly lead to these countries unwittingly targeting each
others computer networks with the same exploit, purchased from the same seller.
If I take a gun and ship it overseas to some guy in the Middle East and he uses it to
go after American troops -- it's the same concept, he says. The position Desautels has
taken casts him as something of an outsider within his trade. Frances Vupen, one of
the foremost gray-market zero-day sellers, takes a starkly different approach. Vupen
develops and sells exploits to law enforcement and intelligence agencies across the
world to help them intercept communications and conduct offensive cyber security
missions, using what it describes as extremely sophisticated codes that bypass
all modern security protections and exploit mitigation technologies. Vupens latest
financial accounts show it reported revenue of about $1.2 million in 2011, an overwhelming majority of which (86
percent) was generated from exports outside France. Vupen says it will sell exploits to a list of more than 60
countries that are members or partners of NATO, provided these countries are not subject to any export sanctions.
(This means Iran, North Korea, and Zimbabwe are blacklisted -- but the likes of Kazakhstan, Bahrain, Morocco, and
Russia are, in theory at least, prospective customers, as they are not subject to any sanctions at this time.) As a
European company, we exclusively work with our allies and partners to help them protect their democracies and
citizens against threats and criminals, says Chaouki Bekrar, Vupens CEO, in an email. He adds that even if a given
country is not on a sanctions list, it doesn't mean Vupen will automatically work with it, though he declines to name specific countries or continents where his firm does or does not have customers.

Vupens policy of selling


to a broad range of countries has attracted much controversy, sparking furious
debate around zero-day sales, ethics, and the law. Chris Soghoian of the ACLU -- a prominent
privacy and security researcher who regularly spars with Vupen CEO Bekrar on Twitter -- has accused Vupen of
being modern-day merchants of death selling the bullets for cyberwar. Just as
the engines on an airplane enable the military to deliver a bomb that kills people, so
too can a zero day be used to deliver a cyberweapon that causes physical harm or
loss of life, Soghoian says in an email. He is astounded that governments are sitting on flaws
by purchasing zero-day exploits and keeping them secret. This ultimately entails exposing their own

citizens to espionage, he says, because it means that the government knows about software vulnerabilities but is
not telling the public about them. Some claim, however, that the zero-day issue is being overblown and politicized.
You dont need a zero day to compromise the workstation of an executive, let alone an activist, says Wim Remes,
a security expert who manages information security for Ernst & Young. Others argue that the U.S. government in
particular needs to purchase exploits to keep pace with what adversaries like China and Iran are doing. If were
going to have a military to defend ourselves, why would you disarm our military? says Robert Graham at the
Atlanta-based firm Errata Security. If the government cant buy exploits on the open market, they will just develop
them themselves, Graham says. He also fears that regulation of zero-day sales could lead to a crackdown on
legitimate coding work. Plus, digital arms don't exist -- it's an analogy. They don't kill people. Bad things really don't
happen with them. * * * So are zero days really a danger? The overwhelming majority of compromises of computer
systems happen because users failed to update software and patch vulnerabilities that are already known about.
However, there are a handful of cases in which undisclosed vulnerabilities -- that is, zero days -- have been used to target organizations or individuals.

It was a zero day, for instance, that was recently used by


malicious hackers to compromise Microsofts Hotmail and steal emails and details of
the victims' contacts. Last year, it was reported that a zero day was used to target a flaw in
Internet Explorer and hijack Gmail accounts. Noted offensive
security companies such as

Italys Hacking Team and the England-based Gamma Group are among those to make use of zero-day exploits to

companies
have been accused of supplying their technologies to countries with an authoritarian
bent. Tracking and communications interception can have serious real-world
consequences for dissidents in places like Iran, Syria, or the United Arab Emirates.
In the wrong hands, it seems clear, zero days could do damage. This potential has been
recognized in Europe, where Dutch politician Marietje Schaake has been crusading for
groundbreaking new laws to curb the trade in what she calls digital weapons.
Speaking on the phone from Strasbourg, France*, Schaake tells me shes concerned about security exploits,
help law enforcement agencies install advanced spyware on target computersand both of these

particularly where they are being sold with the intent to help enable access to computers or mobile devices not authorized by the owner. She adds that she is considering pressing for the European Commission, the EU's executive body, to bring in a whole new regulatory framework that would encompass the trade in zero days, perhaps by looking at incentives for companies or hackers to report vulnerabilities that they find. Such a move would likely be welcomed by the handful

of organizations already working to encourage hackers and security researchers to responsibly disclose
vulnerabilities they find instead of selling them on the black or gray markets. The Zero Day Initiative, based in
Austin, Texas, has a team of about 2,700 researchers globally who submit vulnerabilities that are then passed on to
software developers so they can be fixed. ZDI, operated by Hewlett-Packard, runs competitions in which hackers
can compete for a pot of more than $100,000 in prize funds if they expose flaws. We believe our program is
focused on the greater good, says Brian Gorenc, a senior security researcher who works with the ZDI.

DAs

Terror

1NC - Generic
Terror risk is high--maintaining current surveillance is key
Inserra, 6/8 (David Inserra is a Research Associate for Homeland Security and Cyber
Security in the Douglas and Sarah Allison Center for Foreign and National Security Policy of
the Kathryn and Shelby Cullom Davis Institute for National Security and Foreign Policy, at
The Heritage Foundation, 6-8-2015, "69th Islamist Terrorist Plot: Ongoing Spike in Terrorism
Should Force Congress to Finally Confront the Terrorist Threat," Heritage Foundation,
http://www.heritage.org/research/reports/2015/06/69th-islamist-terrorist-plot-ongoing-spikein-terrorism-should-force-congress-to-finally-confront-the-terrorist-threat)
On June 2 in Boston, Usaamah Abdullah Rahim drew a knife and attacked police officers and
FBI agents, who then shot and killed him. Rahim was being watched by Bostons Joint
Terrorism Task Force as he had been plotting to behead police officers as part of violent
jihad. A conspirator, David Wright or Dawud Sharif Abdul Khaliq, was arrested shortly
thereafter for helping Rahim to plan this attack. This plot marks the 69th publicly known
Islamist terrorist plot or attack against the U.S. homeland since 9/11, and is part of a recent
spike in terrorist activity. The U.S. must redouble its efforts to stop terrorists before they
strike, through the use of properly applied intelligence tools. The Plot According to the criminal
complaint filed against Wright, Rahim had originally planned to behead an individual outside the state of
Massachusetts,[1] which, according to news reports citing anonymous government officials, was Pamela Geller, the
organizer of the draw Mohammed cartoon contest in Garland, Texas.[2] To this end, Rahim had purchased multiple
knives, each over 1 foot long, from Amazon.com. The FBI was listening in on the calls between Rahim

and Wright and recorded multiple conversations regarding how these weapons would be
used to behead someone. Rahim then changed his plan early on the morning of June 2. He planned to go on
vacation right here in Massachusetts. Im just going to, ah, go after them, those boys in blue. Cause, ah, its the
easiest target.[3] Rahim and Wright had used the phrase going on vacation repeatedly in their conversations as
a euphemism for violent jihad. During this conversation, Rahim told Wright that he planned to attack a police officer
on June 2 or June 3. Wright then offered advice on preparing a will and destroying any incriminating evidence.
Based on this threat, Boston police officers and FBI agents approached Rahim to question him, which prompted him
to pull out one of his knives. After being told to drop his weapon, Rahim responded with you drop yours and
moved toward the officers, who then shot and killed him. While Rahims brother, Ibrahim, initially claimed that
Rahim was shot in the back, video surveillance was shown to community leaders and civil rights groups, who have
confirmed that Rahim was not shot in the back.[4 ] Terrorism Not Going Away This 69th Islamist plot is also
the seventh in this calendar year. Details on how exactly Rahim was radicalized are still forthcoming, but
according to anonymous officials, online propaganda from ISIS and other radical Islamist groups are
the source.[5] That would make this attack the 58th homegrown terrorist plot and continue

the recent trend of ISIS playing an important role in radicalizing individuals in the United
States. It is also the sixth plot or attack targeting law enforcement in the U.S., with a recent uptick in plots aimed
at police. While the debate over the PATRIOT Act and the USA FREEDOM Act is taking a break, the terrorists are not.
The result of the debate has been the reduction of U.S. intelligence and counterterrorism capabilities, meaning that
the U.S. has to do even more with less when it comes to connecting the dots on terrorist plots.[6] Other
legitimate intelligence tools and capabilities must be leaned on now even more. Protecting the
Homeland To keep the U.S. safe, Congress must take a hard look at the U.S. counterterrorism enterprise and
determine other measures that are needed to improve it. Congress should: Emphasize community outreach. Federal
grant funds should be used to create robust community-outreach capabilities in higher-risk urban areas. These
funds must not be used for political pork, or so broadly that they no longer target those communities at greatest
risk. Such capabilities are key to building trust within these communities, and if the United States is to thwart lonewolf terrorist attacks, it must place effective community outreach operations at the tip of the spear. Prioritize local
cyber capabilities. Building cyber-investigation capabilities in the higher-risk urban areas must become a primary
focus of Department of Homeland Security grants. With so much terrorism-related activity occurring on the Internet,
local law enforcement must have the constitutional ability to monitor and track violent extremist activity on the
Web when reasonable suspicion exists to do so. Push the FBI toward being more effectively driven by intelligence.
While the FBI has made high-level changes to its mission and organizational structure, the bureau is still working on
integrating intelligence and law enforcement activities. Full integration will require overcoming inter-agency cultural
barriers and providing FBI intelligence personnel with resources, opportunities, and the stature they need to
become a more effective and integral part of the FBI . Maintain essential counterterrorism tools.

Support for important investigative tools is essential to maintaining the security of the U.S.
and combating terrorist threats. Legitimate government surveillance programs are also a
vital component of U.S. national security and should be allowed to continue. The need for
effective counterterrorism operations does not relieve the government of its obligation to

follow the law and respect individual privacy and liberty. In the American system, the
government must do both equally well. Clear-Eyed Vigilance The recent spike in terrorist plots
and attacks should finally awaken policymakersall Americans, for that matterto the seriousness
of the terrorist threat. Neither fearmongering nor willful blindness serves the United States.
Congress must recognize and acknowledge the nature and the scope of the Islamist terrorist
threat, and take the appropriate action to confront it.

Backdoors are key to stop terrorism and child predators


Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-andgoing-dark-part-ii-debate-merits//ghs-kw)
On Thursday, I described the surprisingly warm reception FBI Director James Comey got in the Senate this week
with his warning that the FBI was "going dark" because of end-to-end encryption . In this post,
I want to take on the merits of the renewed encryption debate, which seem to me complicated and multi-faceted

and not all pushing in the same direction. Let me start by breaking the encryption debate into two distinct sets of questions: One is the conceptual question of whether a world of end-to-end strong encryption is an attractive idea. The other is whether -- assuming it is not an attractive idea and that one wants to ensure that authorities retain the ability to intercept decrypted signal -- an extraordinary access scheme is technically possible without eroding other essential security and privacy objectives. These questions often get mashed together, both because tech companies are keen to market themselves as the defenders of their users' privacy interests and because of the libertarian ethos of the tech community more generally. But the questions are not the same, and it's worth considering them separately. Consider the conceptual question first. Would it be a good idea to have a world-wide communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all attackers? That is, if we could snap our fingers and make all device-to-device communications perfectly secure against interception from the Chinese, from hackers, from the FSB but also from the FBI even wielding lawful process, would that be desirable? Or, in the alternative, do we want to create an internet as secure as possible from everyone except government investigators exercising their legal authorities with the understanding that other countries may do the same? Conceptually speaking, I am with Comey on this question -- and the matter does not seem to me an especially close call. The belief in principle in creating a giant world-wide network on which surveillance is technically impossible is really an argument for the creation of the world's largest ungoverned space. I understand why techno-anarchists find this idea so appealing. I can't imagine for a moment, however, why anyone else would. Consider the comparable argument in physical space: the creation of a city in which authorities are entirely dependent on citizen reporting of bad conduct but have no direct visibility onto what happens on the streets and no ability to conduct search warrants (even with court orders) or to patrol parks or street corners. Would you want to live in that city? The idea that ungoverned spaces really suck is not controversial when you're talking about Yemen or Somalia. I see nothing more attractive about the creation of a worldwide architecture in which it is technically impossible to intercept and read ISIS communications with followers or to follow child predators into chatrooms where they go after kids. The trouble is that this conceptual position does not answer the entirety of the policy question before us. The reason is that the case against preserving some form of law enforcement access to decrypted signal is not only a conceptual embrace of the technological obsolescence of surveillance. It is also a series of arguments about the costs -- including the security costs -- of maintaining the capacity to decrypt captured signal.

Terrorists will use bioweapons- guarantees extinction


Cooper 13 (Joshua, 1/23/13, University of South Carolina, Bioterrorism and the Fermi
Paradox, http://people.math.sc.edu/cooper/fermi.pdf, 7/15/15, SM)

We may conclude that, when a civilization reaches its space-faring age, it will more or less at the same
moment (1) contain many individuals who seek to cause large-scale destruction,

and
(2) acquire the capacity to tinker with its own genetic chemistry. This is a perfect
recipe for bioterrorism, and, given the many very natural pathways for its development
and the overwhelming evidence that precisely this course has been taken by humanity , it is
hard to see how bioterrorism does not provide a neat, if profoundly unsettling, solution to Fermis paradox. One
might object that, if omnicidal individuals are successful in releasing highly virulent and
deadly genetic malware into the wild, they are still unlikely to succeed in killing everyone. However,
even if every such mass death event results only in a high (i.e., not total) kill rate
and there is a large gap between each such event (so that individuals can build
up the requisite scientific infrastructure again ), extinction would be inevitable
regardless. Some of the engineered bioweapons will be more successful than others; the inter-apocalyptic eras
will vary in length; and post-apocalyptic environments may be so war-torn, disease-

stricken, and impoverished of genetic variation that they may culminate in true
extinction events even if the initial cataclysm only results in 90% death rates ,
since they may cause the effective population size to dip below the so-called
minimum viable population. This author ran a Monte Carlo simulation using as (admittedly very
crude and poorly informed, though arguably conservative) estimates the following Earth-like parameters:
bioterrorism event mean death rate 50% and standard deviation 25% (beta distribution), initial population 10^10, minimum viable population 4000, individual omnicidal act probability 10^-7 per annum, and population growth
rate 2% per annum. One thousand trials yielded an average post-space-age time until extinction of less than 8000
years. This is essentially instantaneous on a cosmological scale, and varying the parameters by quite a bit does
nothing to make the survival period comparable with the age of the universe.

1NC - ISIS Version


ISIS will emerge as a serious threat to the US
Morell 15 (Michael Morell is the former deputy director of the CIA and has twice served
as acting director. He is the author of The Great War of Our Time: The CIA's Fight Against
Terrorism From al Qa'ida to ISIS. May 14, 2015 Time Magazine
ISIS Is a Danger on
U.S. Soil http://time.com/3858354/isis-is-a-danger-on-u-s-soil/)
The terrorist group poses a gathering threat. In the aftermath of the attempted terrorist attack on May 4 in Garland,
Texasfor which ISIS claimed responsibilitywe find ourselves again considering the question of whether or
not ISIS is a real threat. The answer is yes. A very serious one. Extremists inspired by Osama bin
Ladens ideology consider themselves to be at war with the U.S.; they want to attack us. It is
important to never forget thatno matter how long it has been since 9/11. ISIS is just the latest manifestation of bin
Ladens design. The group has grown faster than any terrorist group we can remember, and the
threat it poses to us is as wide-ranging as any we have seen. What ISIS has that al-Qaeda doesnt is
a Madison Avenue level of sophisticated messaging and social media. ISIS has a multilingual propaganda arm
known as al-Hayat, which uses GoPros and cameras mounted on drones to make videos that appeal to its followers.
And ISIS uses just about every tool in the platform boxfrom Twitter to YouTube to Instagramto great effect,
attracting fighters and funding. Digital media are one of the groups most significant strengths; they have helped
ISIS become an organization that poses four significant threats to the U.S. First, it is a threat to the stability of the
entire Middle East. ISIS is putting the territorial integrity of both Iraq and Syria at risk. And a further collapse of
either or both of these states could easily spread throughout the region, bringing with it sectarian and religious
strife, humanitarian crises and the violent redrawing of borders, all in a part of the world that remains critical to U.S.
national interests. ISIS now controls more territoryin Iraq and Syriathan any other terrorist group anywhere in the
world. When al-Qaeda in Iraq joined the fight in Syria, the group changed its name to ISIS. ISIS added Syrians and
foreign fighters to its ranks, built its supply of arms and money and gained significant battlefield experience fighting
Bashar Assads regime. Together with the security vacuum in Iraq and Nouri al-Malikis alienation of the Sunnis, this
culminated in ISISs successful blitzkrieg across western Iraq in the spring and summer of 2014, when it seized large
amounts of territory. ISIS is not the first extremist group to take and hold territory. Al-Shabab in Somalia did so a
number of years ago and still holds territory there, al-Qaeda in the Islamic Maghreb did so in Mali in 2012, and alQaeda in Yemen did so there at roughly the same time. I fully expect extremist groups to attempt to takeand
sometimes be successful in takingterritory in the years ahead. But no other group has taken so much territory so
quickly as ISIS has. Second, ISIS is attracting young men and women to travel to Syria and Iraq to join its cause. At
this writing, at least 20,000 foreign nationals from roughly 90 countries have gone to Syria and Iraq to join the fight.
Most have joined ISIS. This flow of foreigners has outstripped the flow of such fighters into Iraq during the war there
a decade ago. And there are more foreign fighters in Syria and Iraq today than there were in Afghanistan in the
1980s working to drive the Soviet Union out of that country. These foreign nationals are getting experience on the
battlefield, and they are becoming increasingly radicalized to ISISs cause. There is a particular subset of these
fighters to worry about. Somewhere between 3,500 and 5,000 jihadist wannabes have traveled to
Syria and Iraq from Western Europe, Canada, Australia and the U.S. They all have easy access to the U.S.
homeland, which presents two major concerns: that these fighters will leave the Middle East

and either conduct an attack on their own or conduct an attack at the direction of the ISIS
leadership. The former has already happened in Europe. It has not happened yet in the U.S.
but it will. In spring 2014, Mehdi Nemmouche, a young Frenchman who went to fight in Syria, returned to Europe
and shot three people at the Jewish Museum of Belgium in Brussels. The third threat is that ISIS is building a
following among other extremist groups around the world. The allied exaltation is happening at a faster pace than
al-Qaeda ever enjoyed. It has occurred in Algeria, Libya, Egypt and Afghanistan. More will follow. These groups,
which are already dangerous, will become even more so. They will increasingly target ISISs enemies (including us),
and they will increasingly take on ISISs brutality. We saw the targeting play out in early 2015 when an ISISassociated group in Libya killed an American in an attack on a hotel in Tripoli frequented by diplomats and
international businesspeople. And we saw the extreme violence play out just a few weeks after that when another
ISIS-affiliated group in Libya beheaded 21 Egyptian Coptic Christians. And fourth, perhaps most insidiously, ISISs
message is radicalizing young men and women around the globe who have never traveled to Syria or Iraq but who
want to commit an attack to demonstrate their solidarity with ISIS. These are the so-called lone wolves. Even before
May 4, such an ISIS-inspired attack had already occurred in the U.S.: an individual with sympathies for ISIS attacked
two New York City police officers with a hatchet. Al-Qaeda has inspired such U.S. attacksthe Fort Hood shootings in
late 2009 that killed 13 and the Boston Marathon bombing in spring 2013 that killed five and injured nearly 300.
The attempted attack in Texas is just the latest of these. We can expect more of these kinds of attacks in the U. S.
Attacks by ISIS-inspired individuals are occurring at a rapid pace around the worldroughly 10 since ISIS took control
of so much territory. Two such attacks have occurred in Canada, including the October 2014 attack on the
Parliament building. And another occurred in Sydney, in December 2014. Many planning such attacksin Australia,
Western Europe and the U.S.have been arrested before they could carry out their terrorist plans. Today an ISIS-

directed attack in the U. S. would be relatively unsophisticated (small-scale), but over time
ISISs capabilities will grow. This is what a long-term safe haven in Iraq and Syria would give ISIS, and it is

exactly what the group is planning to do. They have announced their intentionsjust like bin Laden did in the years
prior to 9/11.

Backdoors are key to stop ISIS recruitment


Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Jim Comey, ISIS, and "Going Dark"," Lawfare. 723-2015. http://www.lawfareblog.com/jim-comey-isis-and-going-dark//ghs-kw)

I had a lengthy conversation with FBI Director Jim Comey today about the nexus of our domestic ISIS problem and what the FBI calls the "going dark" issue. CNN the other day reported on some remarks Comey made on the subject, remarks that have not gotten enough attention but reflect a problem at the front of his mind these days: FBI Director James Comey said Thursday his agency does not yet have the capabilities to limit ISIS attempts to recruit Americans through social media. It is becoming increasingly apparent that Americans are gravitating toward the militant organization by engaging with ISIS online, Comey said, but he told reporters that "we don't have the capability we need" to keep the "troubled minds" at home. "Our job is to find needles in a nationwide haystack, needles that are increasingly invisible to us because of end-to-end encryption," Comey said. "This is the 'going dark' problem in high definition." Comey said ISIS is increasingly communicating with Americans via mobile apps that are difficult for the FBI to decrypt. He also explained that he had to balance the desire to intercept the communication with broader privacy concerns. "It is a really, really hard problem, but the collision that's going on between important privacy concerns and public safety is significant enough that we have to figure out a way to solve it," Comey said. Let's unpack this. As has been widely reported, the FBI has been busy recently dealing with ISIS threats. There have been a bunch of arrests, both because ISIS has gotten extremely good at inducing self-radicalization in disaffected souls worldwide using Twitter and because of the convergence of Ramadan and the run-up to the July 4 holiday. As has also been widely reported, the FBI is concerned about the effect of end-to-end encryption on its ability to conduct counterterrorism operations and other law enforcement functions. The concern is two-fold: It's about data at rest on devices, data that is now being encrypted in a fashion that can't easily be cracked when those devices are lawfully seized. And it's also about data in transit between devices, data encrypted such that when captured with a lawful court-ordered wiretap, the signal intercepted is undecipherable. Comey raised his concerns on both subjects at a speech at Brookings last year and has talked about them periodically since then: What was not clear to me until today, however, was the extent to which the ISIS concerns and the "going dark" concerns have converged. In his Brookings speech, Comey did not focus on counterterrorism in the examples he gave of the going dark problem. In the remarks quoted by CNN, and in his conversation with me today, however, he made clear that the landscape is changing fast. Initial recruitment may take place on Twitter, but the promising ISIS candidate quickly gets moved onto messaging platforms that are encrypted end to end. As a practical matter, that means there are people in the United States whom authorities reasonably believe to be in contact with ISIS for whom surveillance is lawful and appropriate but for whom useful signals interception is not technically feasible. That's a pretty scary thought. I don't know what the right answer is to this problem, which involves a particularly complex mix of legitimate cybersecurity, investigative, and privacy questions. I do think the problem is a very different one if the costs of impaired law enforcement access to signal is enhanced ISIS ability to communicate with its recruits than if we're dealing primarily with more routine crimes, even serious ones.

ISIS is a threat to the grid


Landsbaum 14
(Mark, 9/5/2014, OC Register, Mark Landsbaum: Attack on power grid could
bring dark days, http://www.ocregister.com/articles/emp-633883-powerattack.html, 7/15/15, SM)

It could be worse. Terrorists pose an imminent threat to the U.S. electrical grid ,
which could leave the good ol USA looking like 19th century USA for a lot longer than three days. Dont take my
word for it. Ask

Peter Pry, former CIA officer and one-time House Armed Services Committee staffer,

who served on a congressional commission investigating such eventualities. There

is an imminent
threat from ISIS to the national electric grid and not just to a single U.S.
city, Pry warns. He points to a leaked U.S. Federal Energy Regulatory Commission report in March that said a

coordinated terrorist attack on just nine of the nations 55,000 electrical


power substations could cause coast-to-coast blackouts for up to 18 months .
Consider what youll have to worry about then. If you were uncomfortable watching looting and riots on TV last
month in Ferguson, Mo., as police stood by, project such unseemly behavior nationwide. For 18 months. Its likely
phones wont be reliable, so you wont have to watch police stand idly by. Chances are, police wont show up.
Worse, your odds of needing them will be excruciatingly more likely if terrorists attack the power grid using an
electromagnetic pulse (EMP) burst of energy to knock out electronic devices. The Congressional EMP Commission, on which I served, did an extensive study of this, Pry says. We discovered to our own revulsion that

critical
systems in this country are distressingly unprotected. We calculated that,
based on current realities, in the first year after a full-scale EMP event, we
could expect about two-thirds of the national population 200 million
Americans to perish from starvation and disease, as well as anarchy in the
streets. Skeptical? Consider who is capable of engineering such measures before dismissing the likelihood. In
his 2013 book, A Nation Forsaken, Michael Maloof reported that the 2008 EMP Commission considered whether a
hostile nation or terrorist group could attack with a high-altitude EMP weapon
and determined, any number of adversaries possess both the ballistic
missiles and nuclear weapons capabilities, and could attack within 15 years. That was six
years ago. North Korea, Pakistan, India, China and Russia are all in the position
to launch an EMP attack against the United States now, Maloof wrote last year. Maybe

youll rest more comfortably knowing the House intelligence authorization bill passed in May told the intelligence
community to report to Congress within six months, on the threat posed by man-made electromagnetic pulse
weapons to United States interests through 2025, including threats from foreign countries and foreign nonstate
actors. Or, maybe thats not so comforting. In 2004 and again in 2008, separate congressional commissions gave
detailed, horrific reports on such threats. Now, Congress wants another report. In his book, Maloof quotes Clay
Wilson of the Congressional Research Service, who said, Several nations, including reported sponsors of terrorism,
may currently have a capability to use EMP as a weapon for cyberwarfare or cyberterrorism to disrupt
communications and other parts of the U.S. critical infrastructure. What would an EMP attack look like? Within an
instant, Maloof writes, we will have no idea whats happening all around us, because we will have no news. There
will be no radio, no TV, no cell signal. No newspaper delivered. Products wont flow into the nearby Wal-Mart. The
big trucks will be stuck on the interstates. Gas stations wont be able to pump the fuel they do have. Some police
officers and firefighters will show up for work, but most will stay home to protect their own families. Power lines will
get knocked down in windstorms, but nobody will care. Theyll all be fried anyway. Crops will wither in the fields
until scavenged since the big picking machines will all be idled, and there will be no way to get the crop to market
anyway. Nothing

thats been invented in the last 50 years based on computer


chips, microelectronics or digital technology will work. And it will get
worse.

Cyberterror leads to nuclear exchanges--traditional defense


doesn't apply
Fritz 9 (Jason, Master in International Relations from Bond, BS from St.
Cloud), Hacking Nuclear Command and Control, International Commission
on Nuclear Non-proliferation and Disarmament, 2009, pnnd.org)//duncan
This paper will analyse the threat of cyber terrorism in regard to nuclear weapons.

Specifically, this research will use open source knowledge to identify the structure of nuclear command and control
centres, how those structures might be compromised through computer network operations, and how doing so would fit within established cyber terrorists capabilities, strategies, and tactics.

If access to command
and control centres is obtained, terrorists could fake or actually cause one
nuclear-armed state to attack another, thus provoking a nuclear response

from another nuclear power. This may be an easier alternative for terrorist
groups than building or acquiring a nuclear weapon or dirty bomb themselves. This
would also act as a force equaliser, and provide terrorists with the asymmetric
benefits of high speed, removal of geographical distance, and a relatively low cost.
Continuing difficulties in developing computer tracking technologies which could trace
the identity of intruders, and difficulties in establishing an internationally agreed upon legal
framework to guide responses to computer network operations, point towards an inherent
weakness in using computer networks to manage nuclear weaponry . This is
particularly relevant to reducing the hair trigger posture of existing nuclear
arsenals. All computers which are connected to the internet are susceptible to
infiltration and remote control. Computers which operate on a closed network may also be compromised
by various hacker methods, such as privilege escalation, roaming notebooks, wireless access points, embedded
exploits in software and hardware, and maintenance entry points. For example, e-mail spoofing targeted at
individuals who have access to a closed network, could lead to the installation of a virus on an open network. This
virus could then be carelessly transported on removable data storage between the open and closed network.

Efforts by
militaries to place increasing reliance on computer networks , including experimental
technology such as autonomous systems, and their desire to have multiple launch
options, such as nuclear triad capability, enables multiple entry points for terrorists .
Information found on the internet may also reveal how to access these closed networks directly.

For example, if a terrestrial command centre is impenetrable, perhaps isolating one nuclear armed submarine would
prove an easier task. There is evidence to suggest multiple attempts have been made by hackers to compromise
the extremely low radio frequency once used by the US Navy to send nuclear launch approval to submerged
submarines. Additionally, the alleged Soviet system known as Perimetr was designed to automatically launch
nuclear weapons if it was unable to establish communications with Soviet leadership. This was intended as a
retaliatory response in the event that nuclear weapons had decapitated Soviet leadership; however it did not
account for the possibility of cyber terrorists blocking communications through computer network operations in an
Should a warhead be launched, damage could be further
enhanced through additional computer network operations. By using proxies, multilayered attacks could be engineered. Terrorists could remotely commandeer computers in China and

attempt to engage the system.

use them to launch a US nuclear attack against Russia. Thus Russia would believe it was under attack from the US

emergency response communications


could be disrupted, transportation could be shut down, and disinformation, such as
misdirection, could be planted, thereby hindering the disaster relief effort and
maximizing destruction. Disruptions in communication and the use of
disinformation could also be used to provoke uninformed responses. For
and the US would believe China was responsible. Further,

example, a nuclear strike between India and Pakistan could be coordinated with Distributed Denial of Service
attacks against key networks, so they would have further difficulty in identifying what happened and be forced to
respond quickly. Terrorists could also knock out communications between these states so they cannot discuss the

amidst the confusion of a traditional large-scale terrorist attack,


claims of responsibility and declarations of war could be falsified in an attempt to
instigate a hasty military response. These false claims could be posted directly on Presidential,
situation. Alternatively,

military, and government websites. E-mails could also be sent to the media and foreign governments using the IP
addresses and e-mail accounts of government officials. A sophisticated and all encompassing combination of
traditional terrorism and cyber terrorism could be enough to launch nuclear weapons on its own, without the need
for compromising command and control centres directly.

2NC UQ - ISIS
ISIS is mobilizing now and ready to take action.
DeSoto 5/7 (Randy DeSoto May 7, 2015 http://www.westernjournalism.com/isis-claims-to-have-71-trained-soldiers-in-targeted-u-s-states/ Randy DeSoto is a writer for
Western Journalism, which consistently ranks in the top 5 most popular conservative
online news outlets in the country)
Purported ISIS jihadists issued threats against the United States Tuesday, indicating the
group has trained soldiers positioned throughout the country, ready to attack any target we
desire. The online post singles out controversial blogger Pamela Geller, one of the organizers of the Draw
the Prophet Muhammad cartoon contest in Garland, Texas, calling for her death to heal the hearts of our brothers
and disperse the ones behind her. ISIS also claimed responsibility for the shooting, which marked
the first time the terror group claimed responsibility for an attack on U.S. soil , according to the
New York Daily News. The attack by the Islamic State in America is only the beginning of our efforts to
establish a wiliyah [authority or governance] in the heart of our enemy, the ISIS post reads. As for Geller, the
jihadists state: To those who protect her: this will be your only warning of housing this woman and her circus show.
Everyone who houses her events, gives her a platform to spill her filth are legitimate targets. We have been
watching closely who was present at this event and the shooter of our brothers. ISIS further claims to have

known that the Muhammad cartoon contest venue would be heavily guarded, but conducted
the attack to demonstrate the willingness of its followers to die for the Sake of Allah. The FBI
and the Department of Homeland Security, in fact, issued a bulletin on April 20 indicating the event would be a
likely terror target. ISIS drew its message to a close with an ominous threat: We have 71 trained

soldiers in 15 different states ready at our word to attack any target we desire. Out of the 71
trained soldiers 23 have signed up for missions like Sunday, We are increasing in number
bithnillah [if God wills]. Of the 15 states, 5 we will name Virginia, Maryland, Illinois, California,
and MichiganThe next six months will be interesting. Fox News reports that the U.S. intelligence
community was assessing the threat and trying to determine if the source is directly related
to ISIS leadership or an opportunist such as a low-level militant seeking to further capitalize
on the Garland incident. Former Navy Seal Rob ONeill told Fox News he believes the ISIS threat is credible,
and the U.S. must be prepared. He added that the incident in Garland is a prime example of the difference
between a gun free zone and Texas. They showed up at Charlie Hebdo, and it was a massacre. If these two guys
had gotten into that building it would have been Charlie Hebdo times ten. But these two guys showed up because
they were offended by something protected by the First Amendment, and were quickly introduced to the Second
Amendment. Geller issued a statement regarding the ISIS posting: This threat illustrates the savagery and
barbarism of the Islamic State. They want me dead for violating Sharia blasphemy laws. What remains to be seen is
whether the free world will finally wake up and stand for the freedom of speech, or instead kowtow to this evil and
continue to denounce me.

ISIS will attack - three reasons: its capabilities are growing, an attack would be good propaganda, and it basically hates all things America
Rogan 15 (Tom, panelist on The McLaughlin Group and holds the Tony Blankley Chair at
the Steamboat Institute, Why ISIS Will Attack America, National Review, 3-24-15,
http://www.nationalreview.com/article/415866/why-isis-will-attack-america-tom-rogan)//MJ

There is no good in you if they are secure and happy while you have a pulsing vein. Erupt volcanoes of jihad
everywhere. Light the earth with fire upon all the [apostate rulers], their soldiers and supporters. ISIS leader Abu
Bakr al-Baghdadi, November 2014. Those words werent idle. The Islamic State (ISIS) is still advancing,

across continents and cultures. Its attacking Shia Muslims in Yemen, gunning down Western
tourists in Tunisia, beheading Christians in Libya, and murdering or enslaving all who do not
yield in Iraq and Syria. Its black banner seen as undaunted by the international coalition against it, new
recruits still flock to its service. The Islamic States rise is, in other words, not over, and it is likely to end up
involving an attack on America. Three reasons why such an attempt is inevitable: ISISS STRATEGY
PRACTICALLY DEMANDS IT Imbued with existential hatred against the United States, the group doesnt just oppose
American power, it opposes Americas identity. Where the United States is a secular democracy that binds law to
individual freedom, the Islamic State is a totalitarian empire determined to sweep freedom from the earth. As an
ideological and physical necessity, ISIS must ultimately conquer America. Incidentally, this kind of
total-war strategy explains why counterterrorism experts are rightly concerned about nuclear proliferation. The

Islamic States strategy is also energized by its desire to replace al-Qaeda as Salafi jihadisms
global figurehead. While al-Qaeda in the Arabian Peninsula (AQAP) and ISIS had a short flirtation last year, ISIS
has now signaled its intent to usurp al-Qaedas power in its home territory. Attacks by ISIS last week against
Shia mosques in the Yemeni capital of Sanaa were, at least in part, designed to suck recruits, financial donors, and
prestige away from AQAP. But to truly displace al-Qaeda, ISIS knows it must furnish a new 9/11. ITS
CAPABILITIES ARE GROWING Today, ISIS has thousands of European citizens in its ranks. Educated at the
online University of Edward Snowden, ISIS operations officers have cut back intelligence services

ability to monitor and disrupt their communications. With EU intelligence services stretched
beyond breaking point, ISIS has the means and confidence to attempt attacks against the
West. EU passports are powerful weapons: ISIS could attack as al-Qaeda has repeatedly U.S. targets around
the world. AN ATTACK ON THE U.S. IS PRICELESS PROPAGANDA For transnational Salafi jihadists like alQaeda and ISIS, a successful blow against the U.S. allows them to claim the mantle of a global
force and strengthens the narrative that theyre on a holy mission. Holiness is especially important:
ISIS knows that to recruit new fanatics and deter its enemies, it must offer an abiding narrative of strength and
divine purpose. With the groups leaders styling themselves as Mohammeds heirs, Allahs

chosen warriors on earth, attacking the infidel United States would reinforce ISISs narrative.
Of course, attacking America wouldnt actually serve the Islamic States long-term objectives. Quite the opposite:
Any atrocity would fuel a popular American resolve to crush the group with expediency. (Make no mistake, it would
be crushed.) The problem, however, is that, until then, America is in the bulls eye.

2NC Cyber - ISIS


ISIS is a threat to the grid
Landsbaum 14
(Mark, 9/5/2014, OC Register, Mark Landsbaum: Attack on power grid could
bring dark days, http://www.ocregister.com/articles/emp-633883-powerattack.html, 7/15/15, SM)
It could be worse. Terrorists pose an imminent threat to the U.S. electrical grid, which could leave the good ol USA
looking like 19th century USA for a lot longer than three days. Don't take my word for it. Ask Peter Pry, former CIA
officer and one-time House Armed Services Committee staffer, who served on a congressional commission
investigating such eventualities. There is an imminent threat from ISIS to the national electric grid and not just to a
single U.S. city, Pry warns. He points to a leaked U.S. Federal Energy Regulatory Commission report in March that
said a coordinated terrorist attack on just nine of the nations 55,000 electrical power substations could cause
coast-to-coast blackouts for up to 18 months.

Consider what youll have to worry about then. If you were uncomfortable watching looting and riots on TV last
month in Ferguson, Mo., as police stood by, project such unseemly behavior nationwide. For 18 months. Its likely
phones wont be reliable, so you wont have to watch police stand idly by. Chances are, police wont show up.
Worse, your odds of needing them will be excruciatingly more likely if terrorists attack the power grid using an
electromagnetic pulse (EMP) burst of energy to knock out electronic devices. The Congressional EMP Commission,
on which I served, did an extensive study of this, Pry says. We discovered to our own revulsion that critical systems
in this country are distressingly unprotected. We calculated that, based on current realities, in the first year after a
full-scale EMP event, we could expect about two-thirds of the national population, 200 million Americans, to perish
from starvation and disease, as well as anarchy in the streets. Skeptical? Consider who is capable of engineering
such measures before dismissing the likelihood. In his 2013 book, A Nation Forsaken, Michael Maloof reported that
the 2008 EMP Commission considered whether a hostile nation or terrorist group could attack with a high-altitude
EMP weapon and determined, any number of adversaries possess both the ballistic missiles and nuclear weapons
capabilities, and could attack within 15 years. That was six years ago. North Korea, Pakistan, India, China and
Russia are all in the position to launch an EMP attack against the United States now, Maloof wrote last year. Maybe

youll rest more comfortably knowing the House intelligence authorization bill passed in May told the intelligence
community to report to Congress within six months, on the threat posed by man-made electromagnetic pulse
weapons to United States interests through 2025, including threats from foreign countries and foreign nonstate
actors. Or, maybe thats not so comforting. In 2004 and again in 2008, separate congressional commissions gave
detailed, horrific reports on such threats. Now, Congress wants another report. In his book, Maloof quotes Clay
Wilson of the Congressional Research Service, who said, Several nations, including reported sponsors of terrorism,
may currently have a capability to use EMP as a weapon for cyberwarfare or cyberterrorism to disrupt
communications and other parts of the U.S. critical infrastructure. What would an EMP attack look like? Within an
instant, Maloof writes, we will have no idea whats happening all around us, because we will have no news. There
will be no radio, no TV, no cell signal. No newspaper delivered. Products wont flow into the nearby Wal-Mart. The
big trucks will be stuck on the interstates. Gas stations wont be able to pump the fuel they do have. Some police
officers and firefighters will show up for work, but most will stay home to protect their own families. Power lines will
get knocked down in windstorms, but nobody will care. Theyll all be fried anyway. Crops will wither in the fields
until scavenged since the big picking machines will all be idled, and there will be no way to get the crop to market
anyway. Nothing that's been invented in the last 50 years based on computer chips, microelectronics or digital
technology will work. And it will get worse.

2NC Links
Backdoors are key to preventing terrorism
Corn 7/13
(Corn, Geoffrey S. * Presidential Research Professor of Law, South Texas College of Law; Lieutenant
Colonel (Retired), U.S. Army Judge Advocate Generals Corps. Prior to joining the faculty at South
Texas, Professor Corn served in a variety of military assignments, including as the Armys Senior Law
of War Advisor, Supervisory Defense Counsel for the Western United States, Chief of International Law
for U.S. Army Europe, and as a Tactical Intelligence Officer in Panama. Averting the Inherent Dangers
of 'Going Dark': Why Congress Must Require a Locked Front Door to Encrypted Data, SSRN. 07-132015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes//ghs-kw)

The risks related to going dark are real. When the President of the United States,60 the Prime
Minister of the United Kingdom,61 and the Director of the FBI62 all publically express deep concerns about how this
phenomenon will endanger their respective nations, it is difficult to ignore. Today, encryption
technologies that are making it increasingly easy for individual users to
prevent even lawful government access to potentially vital information
related to crimes or other national security threats. This evolution of
individual encryption capabilities represents a fundamental distortion of the balance
between government surveillance authority and individual liberty central to the Fourth
Amendment. And balance is the operative word. The right of The People to be secure against unreasonable
government intrusions into those places and things protected by the Fourth Amendment must be vehemently
protected. Reasonable searches, however, should not only be permitted, but they should be
mandated where necessary. Congress has the authority to ensure that such
searches are possible. While some argue that this could cause American manufacturers to suffer, saddled

as they will appear to be by the Snowden Effect, the rules will apply equally to any manufacturer that wishes to
do business in the United States. Considering that the United States economy is the largest in the world, it is highly
unlikely that foreign manufacturers will forego access to our market in order to avoid having to create CALEA-like
solutions to allow for lawful access to encrypted data. Just as foreign cellular telephone providers, such as T-Mobile,
are active in the United States, so too will foreign device manufacturers and other communications services adjust
their technology to comply with our laws and regulations. This will put American and foreign companies on an equal
playing field while encouraging ingenuity and competition. Most importantly, the

right of the people to


be secure in their persons, houses, papers, and effects will be protected not only against
unreasonable searches and seizures, but also against attacks by criminals and terrorists.
And is not this, in essence, the primary purpose of government?

Backdoors are key to security - terror turns the case


Goldsmith 13
(Jack Goldsmith. Jack Goldsmith, a contributing editor, teaches at Harvard Law School and is a member
of the Hoover Institution Task Force on National Security and Law. "We Need an Invasive NSA," New
Republic. 10-10-2013. http://www.newrepublic.com/article/115002/invasive-nsa-will-protect-us-cyberattacks//ghs-kw)
Ever since stories about the National Security Agencys (NSA) electronic intelligence-gathering capabilities began
tumbling out last June, The New York Times has published more than a dozen editorials excoriating the
national surveillance state. It wants the NSA to end the mass warehousing of everyones data and
the use of back doors to break encrypted communications. A major element of the Times critique is that

the NSAs domestic sweeps are not justified by the terrorist threat they aim to prevent. At the end of August, in the
midst of the Times assault on the NSA, the newspaper suffered what it described as a malicious external attack
on its domain name registrar at the hands of the Syrian Electronic Army, a group of hackers who support Syrian
President Bashar Al Assad. The papers website was down for several hours and, for some people, much longer. In
terms of the sophistication of the attack, this is a big deal, said Marc Frons, the Times chief information officer. Ten
months earlier, hackers stole the corporate passwords for every employee at the Times, accessed the computers of
53 employees, and breached the e-mail accounts of two reporters who cover China. We brought in the FBI, and the
FBI said this had all the hallmarks of hacking by the Chinese military, Frons said at the time. He also acknowledged
that the hackers were in the Times system on election night in 2012 and could have wreaked havoc on its

coverage if they wanted. Such cyber-intrusions threaten corporate America and the U.S.
government every day. Relentless assaults on Americas computer networks by
China and other foreign governments, hackers and criminals have created an urgent
need for safeguards to protect these vital systems, the Times editorial page noted last year

while supporting legislation encouraging the private sector to share cybersecurity information with the government.

It cited General Keith Alexander, the director of the NSA, who had noted a 17-fold increase in cyber-intrusions on critical infrastructure from 2009 to 2011 and who described the losses
in the United States from cyber-theft as the greatest transfer of wealth in history.
If a catastrophic cyber-attack occurs, the Times concluded, Americans will be justified
in asking why their lawmakers ... failed to protect them. When catastrophe
strikes, the public will adjust its tolerance for intrusive government
measures. The Times editorial board is quite right about the seriousness of the
cyber- threat and the federal governments responsibility to redress it. What it does
not appear to realize is the connection between the domestic NSA surveillance it
detests and the governmental assistance with cybersecurity it cherishes . To keep
our computer and telecommunication networks secure, the government
will eventually need to monitor and collect intelligence on those networks
using techniques similar to ones the Times and many others find
reprehensible when done for counterterrorism ends. The fate of domestic
surveillance is today being fought around the topic of whether it is needed to stop
Al Qaeda from blowing things up. But the fight tomorrow, and the more important
fight, will be about whether it is necessary to protect our ways of life embedded in
computer networks. Anyone anywhere with a connection to the Internet can engage in cyber-operations
within the United States. Most truly harmful cyber-operations, however, require group effort
and significant skill. The attacking group or nation must have clever hackers,
significant computing power, and the sophisticated software known as malware that
enables the monitoring, exfiltration, or destruction of information inside a computer.

The supply of all of these resources has been growing fast for many years, in governmental labs devoted to

Telecommunication networks
are the channels through which malware typically travels , often anonymized or encrypted, and
buried in the billions of communications that traverse the globe each day. The targets are the
communications networks themselves as well as the computers they connect: things
developing these tools and on sprawling black markets on the Internet.

like the Times servers, the computer systems that monitor nuclear plants, classified documents on computers in

To keep these
computers and networks secure, the government needs powerful intelligence
capabilities abroad so that it can learn about planned cyber-intrusions. It also needs
to raise defenses at home. An important first step is to correct the market failures that plague
the Pentagon, the nasdaq exchange, your local bank, and your social-network providers.

cybersecurity. Through law or regulation, the government must improve incentives for individuals to use security
software, for private firms to harden their defenses and share information with one another, and for Internet service
providers to crack down on the botnets (networks of compromised zombie computers) that underlie many cyberattacks. More, too, must be done to prevent insider threats like Edward Snowdens, and to control the stealth
introduction of vulnerabilities during the manufacture of computer components, vulnerabilities that can later be
used as windows for cyber-attacks. And yet that's still not enough.

The U.S. government can fully


monitor air, space, and sea for potential attacks from abroad. But it has limited
access to the channels of cyber-attack and cyber-theft, because they are owned by
private telecommunication firms, and because Congress strictly limits government access to private

communications. I can't defend the country until I'm into all the networks, General Alexander reportedly told senior government officials a few months ago. For Alexander,

being in the network means having


government computers scan the content and metadata of Internet communications
in the United States and store some of these communications for extended periods.
Such access, he thinks, will give the government a fighting chance to find the needle of known
malware in the haystack of communications so that it can block or degrade the attack or
exploitation. It will also allow it to discern patterns of malicious activity in the swarm
of communications, even when it doesnt possess the malwares signature. And it
will better enable the government to trace back an attacks trajectory so that it can
discover the identity and geographical origin of the threat. Alexanders domestic

cybersecurity plans look like pumped-up versions of the NSAs counterterrorism-related homeland surveillance that
has sparked so much controversy in recent months. That is why so many people in Washington think that

Alexanders vision has virtually no chance of moving forward, as the Times recently reported. Whatever trust
was there is now gone, a senior intelligence official told the Times. There are two reasons to think that these
predictions are wrong and that the government, with extensive assistance from the NSA, will one day
intimately monitor private networks. The first is that the cybersecurity threat is more
pervasive and severe than the terrorism threat and is somewhat easier to see. If the Times website
goes down a few more times and for longer periods, and if the next penetration of
its computer systems causes large intellectual property losses or a compromise in
its reporting, even the editorial page would rethink the proper balance of privacy
and security. The point generalizes: As cyber-theft and cyber-attacks continue to
spread (and they will), and especially when they result in a catastrophic
disaster (like a banking compromise that destroys market confidence, or a
successful attack on an electrical grid), the public will demand
government action to remedy the problem and will adjust its tolerance for
intrusive government measures. At that point, the nations willingness to adopt some version of

Alexanders vision will depend on the possibility of credible restraints on the NSAs activities and credible ways for

the public to monitor, debate, and approve what the NSA is doing over time. Which leads to the second
reason why skeptics about enhanced government involvement in the network might be wrong. The
public mistrusts the NSA not just because of what it does, but also because of its extraordinary secrecy. To
obtain the credibility it needs to secure permission from the American people to protect our
networks, the NSA and the intelligence community must fundamentally recalibrate their
attitude toward disclosure and scrutiny. There are signs that this is happening and

that, despite the undoubted damage he inflicted on our national security in other respects, we have Edward
Snowden to thank. Before the unauthorized disclosures, we were always conservative about discussing specifics of
our collection programs, based on the truism that the more adversaries know about what were doing, the more
they can avoid our surveillance, testified Director of National Intelligence James Clapper last month. But the
disclosures, for better or worse, have lowered the threshold for discussing these matters in public. In the last few

weeks, the NSA has done the unthinkable in releasing dozens of documents that
implicitly confirm general elements of its collection capabilities. These revelations are

bewildering to most people in the intelligence community and no doubt hurt some elements of collection. But they

are justified by the countervailing need for public debate about , and public confidence in,
NSA activities that had run ahead of what the public expected. And they suggest that secrecy about collection
capacities is one value, but not the only or even the most important one. They also show that not all revelations of
NSA capabilities are equally harmful. Disclosure that it sweeps up metadata is less damaging to its mission than
disclosure of the fine-grained details about how it collects and analyzes that metadata.
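
Analytic note (not part of the Goldsmith card): the card's image of finding "the needle of known malware in the haystack of communications" describes signature-based scanning. Below is a minimal sketch of that idea in Python. It is my own illustration under stated assumptions, not any agency's or vendor's actual system; every signature, payload, and name in it is made up for the example.

from __future__ import annotations

# Toy signature-based scanner: report which known-bad byte patterns appear in
# which packet payloads. Real systems add protocol parsing, decryption,
# anomaly detection, and far larger signature sets; this shows only the core idea.

KNOWN_BAD_SIGNATURES = {
    b"\x4d\x5a\x90\x00EVIL",      # hypothetical byte pattern from a known implant
    b"cmd.exe /c powershell",     # hypothetical suspicious command string
}

def scan_payload(payload: bytes) -> list[bytes]:
    """Return every known-bad signature found in one payload."""
    return [sig for sig in KNOWN_BAD_SIGNATURES if sig in payload]

def scan_stream(packets: list[bytes]) -> list[tuple[int, bytes]]:
    """Scan a sequence of payloads; report (packet index, matched signature)."""
    hits = []
    for i, payload in enumerate(packets):
        for sig in scan_payload(payload):
            hits.append((i, sig))
    return hits

if __name__ == "__main__":
    traffic = [
        b"GET /index.html HTTP/1.1",
        b"...cmd.exe /c powershell -enc AAAA...",
    ]
    print(scan_stream(traffic))   # -> [(1, b'cmd.exe /c powershell')]

Even this toy makes the card's limitation visible: matching only works against signatures the scanner already holds, which is why Goldsmith separately mentions the harder problem of discerning "patterns of malicious activity" when no signature exists.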

2NC AT Encryption =/= Backdoors


All our encryption args still apply
Sasso 14
(Brendan Sasso. technology correspondent for National Journal, previously covered technology policy issues for The Hill and was a
researcher and contributing writer for the 2012 edition of the Almanac of American Politics. "The NSA Isn't Just Spying on Us, It's Also
Undermining Internet Security," nationaljournal. 4-29-2014. http://www.nationaljournal.com/daily/the-nsa-isn-t-just-spying-on-us-it-salso-undermining-internet-security-20140429//ghs-kw)
According to the leaked documents, the NSA inserted a so-called back door into at least one encryption standard

that was developed by the National Institute of Standards and Technology. The NSA could use
that back door to spy on suspected terrorists, but the vulnerability was also available to any other
hacker who discovered it.
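
Analytic note (not part of the Sasso card): the NIST standard widely reported to be the one at issue is the Dual_EC_DRBG random-number generator. A rough sketch of the reported mechanism, following Shumow and Ferguson's 2007 analysis, shows why such a back door is usable by anyone who learns the trapdoor value, which is the card's point about "any other hacker":

Let $s_i$ be the generator's internal state and let $P, Q$ be the elliptic-curve points fixed by the standard. Each step computes
$$ s_{i+1} = x(s_i \cdot P), \qquad r_i = x(s_i \cdot Q), $$
and the output is $r_i$ with its top 16 bits dropped. If whoever chose the constants also knows a scalar $d$ with $P = d \cdot Q$, then recovering the full point $R = s_i \cdot Q$ from a single output (at most $2^{16}$ candidates for the dropped bits) gives
$$ x(d \cdot R) = x\big(s_i \cdot (d \cdot Q)\big) = x(s_i \cdot P) = s_{i+1}, $$
i.e., the next internal state, after which every later "random" output can be predicted. Nothing restricts this to the party that created $d$: anyone who obtains it gets the same capability.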

2NC Turns Backdoors


Cyberattacks turn the case - public pressures for backdoors
Goldsmith 13
(Jack Goldsmith. Jack Goldsmith, a contributing editor, teaches at Harvard Law School and is a member
of the Hoover Institution Task Force on National Security and Law. "We Need an Invasive NSA," New
Republic. 10-10-2013. http://www.newrepublic.com/article/115002/invasive-nsa-will-protect-us-cyberattacks//ghs-kw)
There are two reasons to think that these predictions are wrong and that the government, with extensive assistance
from the NSA, will one day intimately monitor private networks. The first is that the cybersecurity threat is more
pervasive and severe than the terrorism threat and is somewhat easier to see. If the Times website goes down a few
more times and for longer periods, and if the next penetration of its computer systems causes large intellectual
property losses or a compromise in its reporting, even the editorial page would rethink the proper balance of
privacy and security. The point generalizes: As cyber-theft and cyber-attacks continue to spread (and they will), and
especially when they result in a catastrophic disaster (like a banking compromise that destroys market confidence,
or a successful attack on an electrical grid), the public will demand government action to remedy the problem and
will adjust its tolerance for intrusive government measures.

Ptix

1NC
Backdoors are popular now - national security concerns
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark: Part I,"
Lawfare. 7-23-2015. http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-i//ghs-kw)
In other words, I think Comey and Yates inevitably are asking for legislation , at least in the
longer term. The administration has decided not to seek it now, so the conversation is taking place at
a somewhat higher level of abstraction than it would if there were a specific legislative proposal on the table.

But the current discussion should be understood as an effort to begin


building a legislative coalition for some sort of mandate that internet platform
companies retain (or build) the ability to permit, with appropriate legal process, the capture
and delivery to law enforcement and intelligence authorities of decrypted versions
of the signals they carry. This coalition does not exist yet , particularly not in the House of
Representatives. But yesterday's hearings were striking in showing how successful
Comey has been in the early phases of building it. A lot of members are clearly
concerned already. That concern will likely grow if Comey is correct about the speed
with which major investigative tools are weakening in their utility. And it could
become a powerful force in the event an attack swings the pendulum away from
civil libertarian orthodoxy.

2NC
(KQ) 1AC Macri 14 evidence magnifies the link to politics: The
U.S. Senate voted down consideration of a bill on Tuesday that
would have reigned in the NSAs powers to conduct domestic
surveillance, upping the legal hurdles for certain types of
spying Rogers repeated Thursday he was largely uninterested
in.
Even if backdoors are unpopular now, that will inevitably
change
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-andgoing-dark-part-ii-debate-merits//ghs-kw)

There's a final, non-legal factor that may push companies to work this problem as
energetically as they are now moving toward end-to-end encryption: politics. We are
at very particular moment in the cryptography debate, a moment in which law
enforcement sees a major problem as having arrived but the tech companies see
that problem as part of the solution to the problems the Snowden revelations
created for them. That is, we have an end-to-end encryption issue, in significant part, because companies
are trying to assure customers worldwide that they have their backs privacy-wise and are not simply tools of NSA. I

think those politics are likely to change. If Comey is right and we start seeing law
enforcement and intelligence agencies blind in investigating and preventing horrible
crimes and significant threats, the pressure on the companies is going to shift. And
it may shift fast and hard. Whereas the companies now feel intense pressure to
assure customers that their data is safe from NSA, the kidnapped kid with the
encrypted iPhone is going to generate a very different sort of political response. In
extraordinary circumstances, extraordinary access may well seem reasonable. And
people will wonder why it doesn't exist.

Military DA

1NC
Cyber-deterrence is strong now but keeping our capabilities in
line with other powers is key to maintaining stability
Healey 14
(Healey, Jason. Jason Healey is a Nonresident Senior Fellow for the Cyber Statecraft Initiative of the
Atlantic Council and Senior Research Scholar at Columbia University's School of International and
Public Affairs, focusing on international cooperation, competition, and conflict in cyberspace. From
2011 to 2015, he worked as the Director of the Council's Cyber Statecraft Initiative. Starting his career
in the United States Air Force, Mr. Healey earned two Meritorious Service Medals for his early work in
cyber operations at Headquarters Air Force at the Pentagon and as a plankholder (founding member)
of the Joint Task Force Computer Network Defense, the world's first joint cyber warfighting unit. He
has degrees from the United States Air Force Academy (political science), Johns Hopkins University
(liberal arts), and James Madison University (information security). "Commentary: Cyber Deterrence Is
Working," Defense News. 7-30-2014.
http://archive.defensenews.com/article/20140730/DEFFEAT05/307300017/Commentary-CyberDeterrence-Working//ghs-kw)

Despite the mainstream view of cyberwar professionals and theorists, cyber deterrence is not only


possible but has been working for decades. Cyberwar professionals are in the midst of a
decades-old debate on how America could deter adversaries from attacking us in
cyberspace. In 2010, then-Deputy Defense Secretary Bill Lynn summed up the prevailing view that Cold War

deterrence models do not apply to cyberspace because of low barriers to entry and the anonymity of Internet
attacks. Cyber attacks, unlike intercontinental missiles, don't have a return address. But this view is too narrow and technical.

The history of how nations have actually fought (or not fought) conflicts in
cyberspace makes it clear deterrence is not only theoretically possible, but is
actually keeping an upper threshold to cyber hostilities. The hidden hand of
deterrence is most obvious in the discussion of a digital Pearl Harbor. In 2012,
then-Defense Secretary Leon Panetta described his worries of such a bolt-from-the-blue attack that could cripple the United States or its military. Though his phrase raised
eyebrows among cyber professionals, there was broad agreement with the basic implication:
The United States is strategically vulnerable and potential adversaries have both
the means for strategic attack and the will to do it. But worrying about a digital
Pearl Harbor actually dates not to 2012 but to testimony by Winn Schwartau to
Congress in 1991. So cyber experts have been handwringing about a digital Pearl
Harbor for more than 20 of the 70 years since the actual Pearl Harbor. Waiting for Blow To
Come? Clearly there is a different dynamic than recognized by conventional wisdom. For over two
decades, the United States has had its throat bared to the cyber capabilities of
potential adversaries (and presumably their throats are as bared to our capabilities),
yet the blow has never come. There is no solid evidence anyone has ever been killed by any cyber

attack; no massive power outages, no disruptions of hospitals or faking of hospital records, no tampering of dams

The Internet is a fierce domain and conflicts are common


between nations. But deterrence or at least restraint has kept a lid on the
worst. Consider: Large nations have never launched strategically significant
disruptive cyber attacks against other large nations. China, Russia and the United States seem
causing a catastrophic flood.

to have plans to do so not as surprise attacks from a clear sky, but as part of a major (perhaps even existential)

Cyber attacks between equals have


always stayed below the threshold of death and destruction. Larger nations do
seem to be willing to launch significant cyber assaults against rivals but only during
larger crises and below the threshold of death and destruction, such as Russian attacks
international security crisis not unlike the original Pearl Harbor.

against Estonia and Georgia or China egging on patriotic hackers to disrupt computers in dust-ups with Japan,
Vietnam or the Philippines. The United States and Israel have perhaps come closest to the threshold with the
Stuxnet attacks but even here, the attacks were against a very limited target (Iranian programs to enrich uranium)
and hardly out of the blue. Nations seem almost completely unrestrained using cyber espionage to further their
security (and sometimes commercial) objectives and only slightly more restrained using low levels of cyber force for
small-scale disruption, such as Chinese or Russian disruption of dissidents websites or British disruption of chat
rooms used by Anonymous to coordinate protest attacks.

In a discussion about any other kind of

military power, such as nuclear weapons, we would have no problem using the word
deterrence to describe nations reluctance to unleash capabilities against one
another. Indeed, a comparison with nuclear deterrence is extremely relevant, but
not necessarily the one that Cold Warriors have recognized. Setting a Ceiling Nuclear
weapons did not make all wars unthinkable, as some early postwar thinkers had
hoped. Instead, they provided a ceiling under which the superpowers fought all
kinds of wars, regular and irregular. The United States and Soviet Union, and their allies and proxies,
engaged in lethal, intense conflicts from Korea to Vietnam and through proxies in Africa, Asia and Latin America.

Nuclear warheads did not stop these wars, but did set an upper threshold neither
side proved willing to exceed. Likewise, the most cyber capable nations (including
America, China and Russia) have been more than willing to engage in irregular cyber
conflicts, but have stayed well under the threshold of strategic cyber
warfare, creating a de facto norm. Nations have proved just as unwilling to launch
a strategic attack in cyberspace as they are in the air, land, sea or space. The new
norm is same as the old norm. This norm of strategic restraint is a blessing but still is no help
to deter cyber crime or the irregular conflicts that have long occurred under the threshold. Cyber espionage and
lesser state-sponsored cyber disruption seem to be increasing markedly in the last few years.

Backdoors are key to cyberoffensive capabilities


Schneier 13
(Schneier. Schneier is a fellow at the Berkman Center for Internet & Society at Harvard Law School and
a program fellow at the New America Foundation's Open Technology. He is an American cryptographer,
computer security and privacy specialist, and writer. He is the author of several books on general
security topics, computer security and cryptography. He is also a contributing writer for The Guardian
news organization.[ "US Offensive Cyberwar Policy. 06-21-2013.
https://www.schneier.com/blog/archives/2013/06/us_offensive_cy.html//ghs-kw)

Cyberattacks have the potential to be both immediate and devastating. They can
disrupt communications systems, disable national infrastructure , or, as in the case of
Stuxnet, destroy nuclear reactors; but only if they've been created and targeted beforehand.
Before launching cyberattacks against another country, we have to go through several
steps. We have to study the details of the computer systems they're running and
determine the vulnerabilities of those systems. If we can't find exploitable vulnerabilities,
we need to create them: leaving "back doors," in hacker speak. Then we have to build
new cyberweapons designed specifically to attack those systems. Sometimes we have to
embed the hostile code in those networks -- these are called "logic bombs" -- to be unleashed in the future. And we
have to keep penetrating those foreign networks, because computer systems
always change and we need to ensure that the cyberweapons are still effective. Like
our nuclear arsenal during the Cold War, our cyberweapons arsenal must be pretargeted and
ready to launch. That's what Obama directed the US Cyber Command to do. We can see glimpses of
how effective we are in Snowden's allegations that the NSA is currently penetrating
foreign networks around the world: "We hack network backbones -- like huge
Internet routers, basically -- that give us access to the communications of hundreds
of thousands of computers without having to hack every single one."

Loss of cyber-offensive capabilities incentivizes China to take Taiwan - turns heg and the economy
Hjortdal 11
(Magnus Hjortdal received his BSc and MSc in Political Science, with a specialization in IR, from the
University of Copenhagen. He was an Assistant Lecturer at the University of Copenhagen, a Research
Fellow at the Royal Danish Defence College, and is now the Head of the Ministry of Foreign Affairs in
Denmark. China's Use of Cyber Warfare: Espionage Meets Strategic Deterrence , Journal of Strategic
Security, Vol. 4 No. 2, Summer 2011: Strategic Security in the Cyber Age, Article 2, pp 1-24.
http://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1101&context=jss//ghs-kw)

China's military strategy mentions cyber capabilities as an area that the People's

Liberation Army (PLA) should invest in and use on a large scale. 13 The U.S. Secretary of
Defense, Robert Gates, has also declared that China's development in the cyber area
increasingly concerns him,14 and that there has been a decade-long trend of cyber attacks emanating
from China.15 Virtually all digital and electronic military systems can be attacked via cyberspace. Therefore, it is
essential for a state to develop capabilities in this area if it wishes to challenge the
present American hegemony. The interesting question then is whether China is developing capabilities in
cyberspace in order to deter the United States.16 China's military strategists describe cyber
capabilities as a powerful asymmetric opportunity in a deterrence strategy. 19
Analysts consider that an "important theme in Chinese writings on computer-network operations (CNO) is the use of computer-network attack (CNA) as the
spearpoint of deterrence."20 CNA increases the enemy's costs to become too great
to engage in warfare in the first place, which Chinese analysts judge to be essential
for deterrence.21 This could, for example, leave China with the potential ability to
deter the United States from intervening in a scenario concerning Taiwan.
CNO is viewed as a focal point for the People's Liberation Army, but it is not clear how the actual capacity functions or precisely what conditions it works under.22

If a state with superpower potential (here China) is


to create an opportunity to ascend militarily and politically in the international
system, it would require an asymmetric deterrence capability such as that described
here.23 It is said that the "most significant computer network attack is characterized
as a pre-emption weapon to be used under the rubric of the rising Chinese strategy
of [] gaining mastery before the enemy has struck." 24 Therefore, China, like other states

seeking a similar capacity, has recruited massively within the hacker milieu inside China.25 Increasing resources in
the PLA are being allocated to develop assets in relation to cyberspace.26 The improvements are visible: The PLA
has established "information warfare" capabilities,27 with a special focus on cyber warfare that, according to their
doctrine, can be used in peacetime.28 Strategists from the PLA advocate the use of virus and hacker attacks that
can paralyze and surprise its enemies.29 Aggressive and Widespread Cyber Attacks from China and the International Response

China's use of asymmetric capabilities, especially cyber warfare,


could pose a serious threat to the American economy .30 Research and development in cyber

espionage figure prominently in the 12th Five-Year Plan (2011-2015) that is being drafted by both the Chinese

China could well have the most extensive


and aggressive cyber warfare capability in the world, and that this is being driven by
China's desire for "global-power status." 32 These observations do not come out of the blue, but are a
central government and the PLA.31 Analysts say that

consequence of the fact that authoritative Chinese writings on the subject present cyber warfare as an obvious
asymmetric instrument for balancing overwhelming (mainly U.S.) power, especially in case of open conflict, but also
as a deterrent.33

Escalates to nuclear war and turns the economy


Landay 2k
(Jonathan S. Landay, National Security and Intelligence Correspondent, -2K [Top Administration
Officials Warn Stakes for U.S. Are High in Asian Conflicts, Knight Ridder/Tribune News Service, March
10, p. Lexis. Ghs-kw)

Few if any experts think China and Taiwan, North Korea and South Korea, or India and
Pakistan are spoiling to fight. But even a minor miscalculation by any of them could
destabilize Asia, jolt the global economy and even start a nuclear war. India,
Pakistan and China all have nuclear weapons, and North Korea may have a few , too.
Asia lacks the kinds of organizations, negotiations and diplomatic
relationships that helped keep an uneasy peace for five decades in Cold
War Europe. Nowhere else on Earth are the stakes as high and relationships so
fragile, said Bates Gill, director of northeast Asian policy studies at the Brookings Institution, a Washington think
tank. We see the convergence of great power interest overlaid with lingering
confrontations with no institutionalized security mechanism in place. There are
elements for potential disaster. In an effort to cool the regions tempers, President Clinton, Defense

Secretary William S. Cohen and National Security Adviser Samuel R. Berger all will hopscotch Asias capitals this
month.

For America, the stakes could hardly be higher. There are 100,000 U.S. troops

in Asia committed to defending Taiwan, Japan and South Korea, and the United
States would instantly become embroiled if Beijing moved against Taiwan or North
Korea attacked South Korea. While Washington has no defense commitments to either India or Pakistan, a
conflict between the two could end the global taboo against using nuclear
weapons and demolish the already shaky international nonproliferation
regime. In addition, globalization has made a stable Asia, with its massive markets,
cheap labor, exports and resources, indispensable to the U.S. economy. Numerous
U.S. firms and millions of American jobs depend on trade with Asia that totaled $600
billion last year, according to the Commerce Department.

2NC UQ
Cyber-capabilities strong now but it's close
NBC 13
(NBC citing Scott Borg, CEO of the US Cyber Consequences Unit, and independent, non-profit research
institute. Borg has lectured at Harvard, Yale, Columbia, London, and other leading universities.
"Expert: US in cyberwar arms race with China, Russia," NBC News. 02-20-2013.
http://investigations.nbcnews.com/_news/2013/02/20/17022378-expert-us-in-cyberwar-arms-race-withchina-russia//ghs-kw)

The United States is locked in a tight race with China and Russia to build
destructive cyberweapons capable of seriously damaging other nations
critical infrastructure, according to a leading expert on hostilities waged via the Internet. Scott Borg,
CEO of the U.S. Cyber Consequences Unit, a nonprofit institute that advises the U.S. government and businesses on cybersecurity, said

all three nations have built arsenals of sophisticated computer


viruses, worms, Trojan horses and other tools that place them atop the rest of the
world in the ability to inflict serious damage on one another, or lesser powers.
Ranked just below the Big Three, he said, are four U.S. allies: Great Britain, Germany, Israel
and perhaps Taiwan. But in testament to the uncertain risk/reward ratio in cyberwarfare, Iran has used attacks

on its nuclear program to bolster its offensive capabilities and is now developing its own "cyberarmy," Borg said.
Borg offered his assessment of the current state of cyberwar capabilities Tuesday in the wake of a report by the
American computer security company Mandiant linking hacking attacks and cyber espionage against the U.S. to a
sophisticated Chinese group known as Peoples Liberation Army Unit 61398. In todays brave new interconnected

hackers who can defeat security defenses are capable of disrupting an array of
critical services, including delivery of water, electricity and heat, or bringing transportation to a grinding halt.
world,

U.S. senators last year received a closed-door briefing at which experts demonstrated how a power company
employee could take down the New York City electrical grid by clicking on a single email attachment, the New York
Times reported. U.S. officials rarely discuss offensive capability when discussing cyberwar, though several privately

the U.S. could "shut down" the electrical grid of a smaller nation -if it chose to do so. Borg echoed that assessment, saying the U.S.
cyberwarriors, who work within the National Security Agency, are very good across the
board. There is a formidable capability. Stuxnet and Flame (malware used
to disrupt and gather intelligence on Iran's nuclear program) are demonstrations of
that, he said. (The U.S.) could shut down most critical infrastructure in potential
adversaries relatively quickly.
told NBC News recently that
Iran, for example

Cyber-deterrence works now


Healey 14
(Healey, Jason. Jason Healey is a Nonresident Senior Fellow for the Cyber Statecraft Initiative of the
Atlantic Council and Senior Research Scholar at Columbia University's School of International and
Public Affairs, focusing on international cooperation, competition, and conflict in cyberspace. From
2011 to 2015, he worked as the Director of the Council's Cyber Statecraft Initiative. Starting his career
in the United States Air Force, Mr. Healey earned two Meritorious Service Medals for his early work in
cyber operations at Headquarters Air Force at the Pentagon and as a plankholder (founding member)
of the Joint Task Force Computer Network Defense, the world's first joint cyber warfighting unit. He
has degrees from the United States Air Force Academy (political science), Johns Hopkins University
(liberal arts), and James Madison University (information security). "Commentary: Cyber Deterrence Is
Working," Defense News. 7-30-2014.
http://archive.defensenews.com/article/20140730/DEFFEAT05/307300017/Commentary-CyberDeterrence-Working//ghs-kw)

Nations have been unwilling to take advantage of each others vulnerable


infrastructures perhaps because, as Joe Nye notes in his book, The Future of Power, interstate
deterrence through entanglement and denial still exist for cyber conflicts. The most
capable cyber nations rely heavily on the same Internet infrastructure and global standards (though using
significant local infrastructure), so attacks above a certain threshold are not obviously in any nations self-interest.

In addition, both deterrence by denial and deterrence by punishment are


in force. Despite their vulnerabilities, nations may still be able to mount effective-enough
defenses to deny any benefits to the adversary. Taking down a cyber target is
spectacularly easy and well within the capability of the proverbial two-teenagers-

in-a-basement. But keeping a target down over time in the face of determined
defenses is very hard, demanding intelligence, battle damage assessment and the
ability to keep restriking targets over time. These capabilities are still largely the
province of the great cyber powers, meaning it can be trivially easy to determine
the likely attacker. During all of the most disruptive cyber conflicts (such as Estonia,
Georgia or Stuxnet) there was quick consensus on the obvious choice of which nation or
nations were behind the assault. If any of those attacks had caused large numbers
of deaths or truly strategic disruption, hiding behind Internet anonymity (It wasnt us
and you cant prove otherwise) would ring flat and invite a retaliatory strike.

2NC Link - Backdoors


Backdoors and surveillance are key to winning the cyber arms
race
Spiegel 15
(Spiegel Online, Hamburg, Germany. "The Digital Arms Race: NSA Preps America for Future Battle,"
SPIEGEL ONLINE. 1-17-2015. http://www.spiegel.de/international/world/new-snowden-docs-indicatescope-of-nsa-preparations-for-cyber-battle-a-1013409.html//ghs-kw)
Potential interns are also told that research into third party computers might include plans to "remotely degrade or
destroy opponent computers, routers, servers and network enabled devices by attacking the hardware." Using a

program called Passionatepolka, for example, they may be asked to "remotely brick network cards." With programs like Berserkr they would implant "persistent backdoors" and "parasitic drivers".

Using another piece of software called Barnfire, they would "erase the BIOS on a brand of servers that act as a
backbone to many rival governments." An intern's tasks might also include remotely destroying the functionality of
hard drives. Ultimately, the goal of the internship program was "developing an attacker's mindset." The internship
listing is eight years old, but the attacker's mindset has since become a kind of doctrine for the NSA's data spies.
And the intelligence service isn't just trying to achieve mass surveillance of Internet communication, either. The
digital spies of the Five Eyes alliance -- comprised of the United States, Britain, Canada, Australia and New Zealand -- want more. The Birth of D Weapons According to top secret documents from the archive of NSA whistleblower Edward Snowden seen exclusively by SPIEGEL, they

are planning for wars of the future in which
the Internet will play a critical role, with the aim of being able to use the net to
paralyze computer networks and, by doing so, potentially all the infrastructure they
control, including power and water supplies, factories, airports or the flow of money.

During the 20th century, scientists developed so-called ABC weapons -- atomic, biological and chemical. It took

New digital weapons


have now been developed for the war on the Internet. But there are almost no
international conventions or supervisory authorities for these D weapons, and the
only law that applies is the survival of the fittest. Canadian media theorist Marshall McLuhan
decades before their deployment could be regulated and, at least partly, outlawed.

foresaw these developments decades ago. In 1970, he wrote, "World War III is a guerrilla information war with no
division between military and civilian participation." That's precisely the reality that spies are preparing for today.
The US Army, Navy, Marines and Air Force have already established their own cyber forces, but it is

the NSA, also

officially a military agency, that is taking the lead. It's no coincidence that the director of the NSA also serves
as the head of the US Cyber Command. The country's leading data spy, Admiral Michael Rogers, is also its chief
cyber warrior and his close to 40,000 employees are responsible for both digital spying and destructive network

From a military perspective, surveillance of the Internet is


merely "Phase 0" in the US digital war strategy. Internal NSA documents indicate
that it is the prerequisite for everything that follows. They show that the aim of the
surveillance is to detect vulnerabilities in enemy systems. Once "stealthy implants"
have been placed to infiltrate enemy systems, thus allowing "permanent accesses,"
then Phase Three has been achieved -- a phase headed by the word "dominate" in
the documents. This enables them to "control/destroy critical systems & networks at
will through pre-positioned accesses (laid in Phase 0)." Critical infrastructure is
considered by the agency to be anything that is important in keeping a society
running: energy, communications and transportation. The internal documents state
that the ultimate goal is "real time controlled escalation". One NSA presentation proclaims
that "the next major conflict will start in cyberspace." To that end, the US government
is currently undertaking a massive effort to digitally arm itself for network warfare. For the
attacks. Surveillance only 'Phase 0'

2013 secret intelligence budget, the NSA projected it would need around $1 billion in order to increase the strength
of its computer network attack operations. The budget included an increase of some $32 million for "unconventional
solutions" alone.

Back doors are key to cyber-warfare


Gellman and Nakashima 13
(Barton Gellman. Barton Gellman writes for the national staff. He has contributed to three Pulitzer
Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service. He is also a
senior fellow at the Century Foundation and visiting lecturer at Princeton's Woodrow Wilson School.

After 21 years at The Post, where he served tours as legal, military, diplomatic, and Middle East
correspondent, Gellman resigned in 2010 to concentrate on book and magazine writing. He returned
on temporary assignment in 2013 and 2014 to anchor The Post's coverage of the NSA disclosures after
receiving an archive of classified documents from Edward Snowden. Ellen Nakashima is a national
security reporter for The Washington Post. She focuses on issues relating to intelligence, technology
and civil liberties. She previously served as a Southeast Asia correspondent for the paper. She wrote
about the presidential candidacy of Al Gore and co-authored a biography of Gore, and has also covered
federal agencies, Virginia state politics and local affairs. She joined the Post in 1995. "U.S. spy
agencies mounted 231 offensive cyber-operations in 2011, documents show," Washington Post. 8-302013. https://www.washingtonpost.com/world/national-security/us-spy-agencies-mounted-231offensive-cyber-operations-in-2011-documents-show/2013/08/30/d090a6ae-119e-11e3-b4cbfd7ce041d814_story.html//ghs-kw)

The policy debate has moved so that offensive options are more prominent now, said former deputy defense secretary William J. Lynn III, who has not seen the budget document and was speaking generally. I think there's more of a case made now that offensive cyber options can be an important element in deterring certain adversaries. Of the 231 offensive operations conducted in 2011, the budget said, nearly three-quarters were against top-priority targets, which former officials say includes adversaries such as Iran, Russia, China and North Korea and activities such as nuclear proliferation. The document provided few other details about the operations. Stuxnet, a computer worm reportedly developed by the United States and Israel that destroyed Iranian nuclear centrifuges in attacks in 2009 and 2010, is often cited as the most dramatic use of a cyberweapon. Experts said no other known cyberattacks carried out by the United States match the physical damage inflicted in that case. U.S. agencies define offensive cyber-operations as activities intended to manipulate, disrupt, deny, degrade, or destroy information resident in computers or computer networks, or the computers and networks themselves, according to a presidential directive issued in October 2012. Most offensive operations have immediate effects only on data or the proper functioning of an adversary's machine: slowing its network connection, filling its screen with static or scrambling the results of basic calculations. Any of those could have powerful effects if they caused an adversary to botch the timing of an attack, lose control of a computer or miscalculate locations. U.S. intelligence services are making routine use around the world of government-built malware that differs little in function from the advanced persistent threats that U.S. officials attribute to China. The principal difference, U.S. officials told The Post, is that China steals U.S. corporate secrets for financial gain. The Department of Defense does engage in computer network exploitation, according to an e-mailed statement from an NSA spokesman, whose agency is part of the Defense Department. The department does ***not*** engage in economic espionage in any domain, including cyber. Millions of implants The administration's cyber-operations sometimes involve what one budget document calls field operations abroad, commonly with the help of CIA operatives or clandestine military forces, to physically place hardware implants or software modifications. Much more often, an implant is coded entirely in software by an NSA group called Tailored Access Operations (TAO). As its name suggests, TAO builds attack tools that are custom-fitted to their targets. The NSA unit's software engineers would rather tap into networks than individual computers because there are usually many devices on each network. Tailored Access Operations has software templates to break into common brands and models of routers, switches and firewalls from multiple product vendor lines, according to one document describing its work. The implants that TAO creates are intended to persist through software and equipment upgrades, to copy stored data, harvest communications and tunnel into other connected networks. This year TAO is working on implants that can identify select voice conversations of interest within a target network and exfiltrate select cuts, or excerpts, according to one budget document. In some cases, a single compromised device opens the door to hundreds or thousands of others. Sometimes an implant's purpose is to create a back door for future access. You pry open the window somewhere and leave it so when you come back the owner doesn't know it's unlocked, but you can get back in when you want to, said one intelligence official, who was speaking generally about the topic and was not privy to the budget. The official spoke on the condition of anonymity to discuss sensitive technology. Under U.S. cyberdoctrine, these operations are known as exploitation, not attack, but they are essential precursors both to attack and defense. By the end of this year, GENIE is projected to control at least 85,000 implants in strategically chosen machines around the world. That is quadruple the number -- 21,252 -- available in 2008, according to the U.S. intelligence budget. The NSA appears to be planning a rapid expansion of those numbers, which were limited until recently by the need for human operators to take remote control of compromised machines. Even with a staff of 1,870 people, GENIE made full use of only 8,448 of the 68,975 machines with active implants in 2011. For GENIE's next phase, according to an authoritative reference document, the NSA has brought online an automated system, code-named TURBINE, that is capable of managing potentially millions of implants for intelligence gathering and active attack. The ROC When it comes time to fight the cyberwar against the best of the NSA's global competitors, the TAO calls in its elite operators, who work at the agency's Fort Meade headquarters and in regional operations centers in Georgia, Texas, Colorado and Hawaii. The NSA's organizational chart has the main office as S321. Nearly everyone calls it the ROC, pronounced rock: the Remote Operations Center. To the NSA as a whole, the ROC is where the hackers live, said a former operator from another section who has worked closely with the exploitation teams. It's basically the one-stop shop for any kind of active operation that's not defensive. Once the hackers find a hole in an adversary's defense, [t]argeted systems are compromised electronically, typically providing access to system functions as well as data. System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals, according to a 570-page budget blueprint for what the government calls its Consolidated Cryptologic Program, which includes the NSA. Teams from the FBI, the CIA and U.S. Cyber Command work alongside the ROC, with overlapping missions and legal authorities. So do the operators from the NSA's National Threat Operations Center, whose mission is focused primarily on cyberdefense. That was Snowden's job as a Booz Allen Hamilton contractor, and it required him to learn the NSA's best hacking techniques. According to one key document, the ROC teams give Cyber Command specific target related technical and operational material (identification/recognition), tools and techniques that allow the employment of U.S. national and tactical specific computer network attack mechanisms. The intelligence community's cybermissions include defense of military and other classified computer networks against foreign attack, a task that absorbs roughly one-third of a total cyber operations budget of $1.02 billion in fiscal 2013, according to the Cryptologic Program budget. The ROC's breaking-and-entering mission, supported by the GENIE infrastructure, spends nearly twice as much: $651.7 million. Most GENIE operations aim for exploitation of foreign systems, a term defined in the intelligence budget summary as surreptitious virtual or physical access to create and sustain a presence inside targeted systems or facilities. The document adds: System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals. The NSA designs most of its own implants, but it devoted $25.1 million this year to additional covert purchases of software vulnerabilities from private malware vendors, a growing gray-market industry based largely in Europe.

2NC Link - Exports


Backdoors are inserted in US products and exported globally
Schneier indicates backdoors in networks are key to cyber-operations
Greenwald 14
(Glenn Greenwald. Glenn Greenwald is an ex-constitutional lawyer and a contributor for the Guardian,
NYT, LAT, and The Intercept. He received his BA from George Washington University and a JD from
NYU. "Glenn Greenwald: how the NSA tampers with US-made internet routers," Guardian. 5-12-2014.
http://www.theguardian.com/books/2014/may/12/glenn-greenwald-nsa-tampers-us-internet-routerssnowden//ghs-kw)
But while American companies were being warned away from supposedly untrustworthy Chinese routers, foreign organisations would have been well advised to beware of American-made ones. A June 2010 report from the head of the NSA's Access and Target Development department is shockingly explicit. The NSA routinely receives or intercepts routers, servers and other computer network devices being exported from the US before they are delivered to the international customers. The agency then implants backdoor surveillance tools, repackages the devices with a factory seal and sends them on. The NSA thus gains access to entire networks and all their users. The document gleefully observes that some "SIGINT tradecraft is very hands-on (literally!)". Eventually, the implanted device connects back to the NSA. The report continues: "In one recent case, after several months a beacon implanted through supply-chain interdiction called back to the NSA covert infrastructure. This call back provided us access to further exploit the device and survey the network." It is quite possible that Chinese firms are implanting surveillance mechanisms in their network devices. But the US is certainly doing the same.

Routers are key -- gives us access to thousands of connected devices
Gellman and Nakashima 13
(Barton Gellman. Barton Gellman writes for the national staff. He has contributed to three Pulitzer
Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service. He is also a
senior fellow at the Century Foundation and visiting lecturer at Princeton's Woodrow Wilson School.
After 21 years at The Post, where he served tours as legal, military, diplomatic, and Middle East
correspondent, Gellman resigned in 2010 to concentrate on book and magazine writing. He returned
on temporary assignment in 2013 and 2014 to anchor The Post's coverage of the NSA disclosures after
receiving an archive of classified documents from Edward Snowden. Ellen Nakashima is a national
security reporter for The Washington Post. She focuses on issues relating to intelligence, technology
and civil liberties. She previously served as a Southeast Asia correspondent for the paper. She wrote
about the presidential candidacy of Al Gore and co-authored a biography of Gore, and has also covered
federal agencies, Virginia state politics and local affairs. She joined the Post in 1995. "U.S. spy
agencies mounted 231 offensive cyber-operations in 2011, documents show," Washington Post. 8-302013. https://www.washingtonpost.com/world/national-security/us-spy-agencies-mounted-231offensive-cyber-operations-in-2011-documents-show/2013/08/30/d090a6ae-119e-11e3-b4cbfd7ce041d814_story.html//ghs-kw)

The policy debate has moved so that offensive options are more prominent now, said former deputy defense secretary William J. Lynn III, who has not seen the budget document and was speaking generally. I think there's more of a case made now that offensive cyber options can be an important element in deterring certain adversaries. Of the 231 offensive operations conducted in 2011, the budget said, nearly three-quarters were against top-priority targets, which former officials say includes adversaries such as Iran, Russia, China and North Korea and activities such as nuclear proliferation. The document provided few other details about the operations. Stuxnet, a computer worm reportedly developed by the United States and Israel that destroyed Iranian nuclear centrifuges in attacks in 2009 and 2010, is often cited as the most dramatic use of a cyberweapon. Experts said no other known cyberattacks carried out by the United States match the physical damage inflicted in that case. U.S. agencies define offensive cyber-operations as activities intended to manipulate, disrupt, deny, degrade, or destroy information resident in computers or computer networks, or the computers and networks themselves, according to a presidential directive issued in October 2012. Most offensive operations have immediate effects only on data or the proper functioning of an adversary's machine: slowing its network connection, filling its screen with static or scrambling the results of basic calculations. Any of those could have powerful effects if they caused an adversary to botch the timing of an attack, lose control of a computer or miscalculate locations. U.S. intelligence services are making routine use around the world of government-built malware that differs little in function from the advanced persistent threats that U.S. officials attribute to China. The principal difference, U.S. officials told The Post, is that China steals U.S. corporate secrets for financial gain. The Department of Defense does engage in computer network exploitation, according to an e-mailed statement from an NSA spokesman, whose agency is part of the Defense Department. The department does ***not*** engage in economic espionage in any domain, including cyber. Millions of implants The administration's cyber-operations sometimes involve what one budget document calls field operations abroad, commonly with the help of CIA operatives or clandestine military forces, to physically place hardware implants or software modifications. Much more often, an implant is coded entirely in software by an NSA group called Tailored Access Operations (TAO). As its name suggests, TAO builds attack tools that are custom-fitted to their targets. The NSA unit's software engineers would rather tap into networks than individual computers because there are usually many devices on each network. Tailored Access Operations has software templates to break into common brands and models of routers, switches and firewalls from multiple product vendor lines, according to one document describing its work. The implants that TAO creates are intended to persist through software and equipment upgrades, to copy stored data, harvest communications and tunnel into other connected networks. This year TAO is working on implants that can identify select voice conversations of interest within a target network and exfiltrate select cuts, or excerpts, according to one budget document. In some cases, a single compromised device opens the door to hundreds or thousands of others. Sometimes an implant's purpose is to create a back door for future access. You pry open the window somewhere and leave it so when you come back the owner doesn't know it's unlocked, but you can get back in when you want to, said one intelligence official, who was speaking generally about the topic and was not privy to the budget. The official spoke on the condition of anonymity to discuss sensitive technology. Under U.S. cyberdoctrine, these operations are known as exploitation, not attack, but they are essential precursors both to attack and defense. By the end of this year, GENIE is projected to control at least 85,000 implants in strategically chosen machines around the world. That is quadruple the number -- 21,252 -- available in 2008, according to the U.S. intelligence budget. The NSA appears to be planning a rapid expansion of those numbers, which were limited until recently by the need for human operators to take remote control of compromised machines. Even with a staff of 1,870 people, GENIE made full use of only 8,448 of the 68,975 machines with active implants in 2011. For GENIE's next phase, according to an authoritative reference document, the NSA has brought online an automated system, code-named TURBINE, that is capable of managing potentially millions of implants for intelligence gathering and active attack. The ROC When it comes time to fight the cyberwar against the best of the NSA's global competitors, the TAO calls in its elite operators, who work at the agency's Fort Meade headquarters and in regional operations centers in Georgia, Texas, Colorado and Hawaii. The NSA's organizational chart has the main office as S321. Nearly everyone calls it the ROC, pronounced rock: the Remote Operations Center. To the NSA as a whole, the ROC is where the hackers live, said a former operator from another section who has worked closely with the exploitation teams. It's basically the one-stop shop for any kind of active operation that's not defensive. Once the hackers find a hole in an adversary's defense, [t]argeted systems are compromised electronically, typically providing access to system functions as well as data. System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals, according to a 570-page budget blueprint for what the government calls its Consolidated Cryptologic Program, which includes the NSA. Teams from the FBI, the CIA and U.S. Cyber Command work alongside the ROC, with overlapping missions and legal authorities. So do the operators from the NSA's National Threat Operations Center, whose mission is focused primarily on cyberdefense. That was Snowden's job as a Booz Allen Hamilton contractor, and it required him to learn the NSA's best hacking techniques. According to one key document, the ROC teams give Cyber Command specific target related technical and operational material (identification/recognition), tools and techniques that allow the employment of U.S. national and tactical specific computer network attack mechanisms. The intelligence community's cybermissions include defense of military and other classified computer networks against foreign attack, a task that absorbs roughly one-third of a total cyber operations budget of $1.02 billion in fiscal 2013, according to the Cryptologic Program budget. The ROC's breaking-and-entering mission, supported by the GENIE infrastructure, spends nearly twice as much: $651.7 million. Most GENIE operations aim for exploitation of foreign systems, a term defined in the intelligence budget summary as surreptitious virtual or physical access to create and sustain a presence inside targeted systems or facilities. The document adds: System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals. The NSA designs most of its own implants, but it devoted $25.1 million this year to additional covert purchases of software vulnerabilities from private malware vendors, a growing gray-market industry based largely in Europe.

2NC Link - Zero Days


Zero-days are key to the cyber-arsenal
Cushing 14
(Cushing, Seychelle. Cushing received her MA with Distinction in Political Science and her BA in
Political Science from Simon Fraser University. She is the Manager of Strategic Initiatives and Special
Projects at the Office of the Vice-President, Research. Leveraging Information as Power: America's
Pursuit of Cyber Security, Simon Fraser University. 11-28-2014.
http://summit.sfu.ca/system/files/iritems1/14703/etd8726_SCushing.pdf//ghs-kw)
Nuclear or conventional weapons, once developed, can remain dormant yet functional until needed. In comparison, the zero-days used in cyber weapons require the US to constantly discover new vulnerabilities to maintain a deployable cyber arsenal. Holding a specific zero-day does not guarantee that the vulnerability will remain unpatched for a prolonged period of time by the targeted state. Complicating this is the fact that undetected vulnerabilities, once acquired, are rarely used immediately given the time and resources it takes to construct a cyber attack. In the time between acquisition and use, a patch for the vulnerability may be released, whether through routine patches or a specific identification of a security hole, rendering the vulnerability obsolete. To minimize this, America deploys several zero-days at once in a cyber attack to increase the odds that at least one (or more) of the vulnerabilities remains open to provide system access. 2.4. One Attack, Multiple Vulnerabilities Multiple backdoor entry points are preferable given that America cannot be absolutely certain of what vulnerabilities the target system will contain despite extensive pre-launch cyber attack testing and customization. A successful cyber attack needs a minimum of one undetected vulnerability to gain access to the target system. Each successive zero-day that works adds to the strength and sophistication of a cyber assault. As one vulnerability is patched, America can still rely on the other undetected vulnerabilities to continue its cyber strike. Incorporating multiple undetected vulnerabilities into a cyber attack reduces the need to create new cyber attacks after each zero-day fails. Stuxnet, a joint US-Israel operation, was a cyber attack designed to disrupt Iran's progress on its nuclear weapons program. The attack was designed to alter the code of Natanz's computers and industrial control systems to induce chronic fatigue, rather than destruction, of the nuclear centrifuges. The precision of Stuxnet ensured that all other control systems were ignored except for those regulating the centrifuges. What is notable about Stuxnet is its use of four zero-day exploits (of which one was allegedly purchased) in the attack. That is, to target one system, Stuxnet entered through four different backdoors. A target state aware of a specific vulnerability in its system will enact a patch upon detection and likely assume that the problem is fixed. Exploiting multiple vulnerabilities creates variations in how the attack is executed given that different backdoors alter how the attack enters the target system. One patch does not stop the cyber attack. The use of multiple zero-days thus capitalizes on a state's limited awareness of the vulnerabilities in its system. Each phase of Stuxnet was different from its previous phase which created confusion among the Iranians. Launched in 2009, Stuxnet was not discovered by the Iranians until 2010. Yet even upon the initial discovery of the attack, who the attacker was remained unclear. The failures in the Natanz centrifuges were first attributed to insider error and later to China before finally discovering the true culprits. The use of multiple undetected vulnerabilities helped to obscure the US and Israel as the actual attackers. The Stuxnet case helps illustrate the efficacy of zero-day attacks as a means of attaining political goals. Although Stuxnet did not produce immediate results in terminating Iran's nuclear program, it helped buy time for the Americans to consider other options against Iran. A nuclear Iran would not only threaten American security but possibly open a third conflict for America in the Middle East given Israel's proclivity to strike a nuclear Iran first. Stuxnet allowed the United States to delay Iran's nuclear program without resorting to kinetic action.

Zero-days are key to effective cyber-war offensive capabilities


Gjelten 13
(Gjelten, Tom. TOM GJELTEN is a correspondent for NPR. Over the years, he has reported extensively

from Europe and Latin America, including Cuba. He was reporting live from the Pentagon when it was
attacked on September 11, 2001. Subsequently, he covered the war in Afghanistan and Iraq invasion
as NPR's lead Pentagon correspondent. Gjelten also covered the first Gulf War and the wars in Croatia
and Bosnia, Nicaragua, El Salvador, Guatemala, and Colombia. From Berlin (1990-1994), he covered
Europe's political and economic transition after the fall of the Berlin Wall. Gjelten's series From Marx
to Markets, documenting Eastern Europe's transition to a market economy, earned him an Overseas
Press Club award for the Best Business or Economic Reporting in Radio or TV. His reporting from
Bosnia earned him a second Overseas Press Club Award, a George Polk Award, and a Robert F Kennedy
Journalism Award. Gjelten's books include Sarajevo Daily: A City and Its Newspaper Under Siege, which
the New York Times called a chilling portrayal of a city's slow murder. His 2008 book, Bacardi and
the Long Fight for Cuba: The Biography of a Cause, was selected as a New York Times Notable
Nonfiction Book. "First Strike: US Cyber Warriors Seize the Offensive," World Affairs Journal.
January/February 2013. http://www.worldaffairsjournal.org/article/first-strike-us-cyber-warriors-seizeoffensive//ghs-kw)

That was then. Much of the cyber talk around the Pentagon these days is about offensive operations. It is no longer enough for cyber troops to be deployed along network perimeters, desperately trying to block the constant attempts by adversaries to penetrate front lines. The US military's geek warriors are now prepared to go on the attack, armed with potent cyberweapons that can break into enemy computers with pinpoint precision. The new emphasis is evident in a program launched in October 2012 by the Defense Advanced Research Projects Agency (DARPA), the Pentagon's experimental research arm. DARPA funding enabled the invention of the Internet, stealth aircraft, GPS, and voice-recognition software, and the new program, dubbed Plan X, is equally ambitious. DARPA managers said the Plan X goal was to create revolutionary technologies for understanding, planning, and managing cyberwarfare. The US Air Force was also signaling its readiness to go into cyber attack mode, announcing in August that it was looking for ideas on how to destroy, deny, degrade, disrupt, deceive, corrupt, or usurp the adversaries [sic] ability to use the cyberspace domain for his advantage. The new interest in attacking enemies rather than simply defending against them has even spread to the business community. Like their military counterparts, cybersecurity experts in the private sector have become increasingly frustrated by their inability to stop intruders from penetrating critical computer networks to steal valuable data or even sabotage network operations. The new idea is to pursue the perpetrators back into their own networks. We're following a failed security strategy in cyber, says Steven Chabinsky, formerly the head of the FBI's cyber intelligence section and now chief risk officer at CrowdStrike, a startup company that promotes aggressive action against its clients' cyber adversaries. There's no way that we are going to win the cybersecurity effort on defense. We have to go on offense. The growing interest in offensive operations is bringing changes in the cybersecurity industry. Expertise in patching security flaws in one's own computer network is out; expertise in finding those flaws in the other guy's network is in. Among the hot jobs listed on the career page at the National Security Agency are openings for computer scientists who specialize in vulnerability discovery. Demand is growing in both government and industry circles for technologists with the skills to develop ever more sophisticated cyber tools, including malicious software -- malware -- with such destructive potential as to qualify as cyberweapons when implanted in an enemy's network. Offense is the biggest growth sector in the cyber industry right now, says Jeffrey Carr, a cybersecurity analyst and author of Inside Cyber Warfare. But have we given sufficient thought to what we are doing? Offensive operations in the cyber domain raise a host of legal, ethical, and political issues, and governments, courts, and business groups have barely begun to consider them. The move to offensive operations in cyberspace was actually under way even as Pentagon officials were still insisting their strategy was defensive. We just didn't know it. The big revelation came in June 2012, when New York Times reporter David Sanger reported that the United States and Israel were behind the development of the Stuxnet worm, which had been used to damage computer systems controlling Iran's nuclear enrichment facilities. Sanger, citing members of President Obama's national security team, said the attacks were code-named Olympic Games and constituted America's first sustained use of cyberweapons. The highly sophisticated Stuxnet worm delivered computer instructions that caused some Iranian centrifuges to spin uncontrollably and self-destruct. According to Sanger, the secret cyber attacks had begun during the presidency of George W. Bush but were accelerated on the orders of Obama. The publication of such a highly classified operation provoked a firestorm of controversy, but government officials who took part in discussions of Stuxnet have not denied the accuracy of Sanger's reporting. He nailed it, one participant told me. In the aftermath of the Stuxnet revelations, discussions about cyber war became more realistic and less theoretical. Here was a cyberweapon that had been designed and used for the same purpose and with the same effect as a kinetic weapon: like a missile or a bomb, it caused physical destruction. Security experts had been warning that a US adversary could use a cyberweapon to destroy power plants, water treatment facilities, or other critical infrastructure assets here in the United States, but the Stuxnet story showed how the American military itself could use an offensive cyberweapon against an enemy. The advantages of such a strike were obvious. A cyberweapon could take down computer networks and even destroy physical equipment without the civilian casualties that a bombing mission would entail. Used preemptively, it could keep a conflict from evolving in a more lethal direction. The targeted country would have a hard time determining where the cyber attack came from. In fact, the news that the United States had actually developed and used an offensive cyberweapon gave new significance to hints US officials had quietly dropped on previous occasions about the enticing potential of such tools. In remarks at the Brookings Institution in April 2009, for example, the then Air Force chief of staff, General Norton Schwartz, suggested that cyberweapons could be used to attack an enemy's air defense system. Traditionally, Schwartz said, we take down integrated air defenses via kinetic means. But if it were possible to interrupt radar systems or surface to air missile systems via cyber, that would be another very powerful tool in the tool kit allowing us to accomplish air missions. He added, We will develop that -- have [that] capability. A full two years before the Pentagon rolled out its defensive cyber strategy, Schwartz was clearly suggesting an offensive application. The Pentagon's reluctance in 2011 to be more transparent about its interest in offensive cyber capabilities may simply have reflected sensitivity to an ongoing dispute within the Obama administration. Howard Schmidt, the White House Cybersecurity Coordinator at the time the Department of Defense strategy was released, was steadfastly opposed to any use of the term cyber war and had no patience for those who seemed eager to get into such a conflict. But his was a losing battle. Pentagon planners had already classified cyberspace officially as a fifth domain of warfare, alongside land, air, sea, and space. As the 2011 cyber strategy noted, that designation allows DoD to organize, train, and equip for cyberspace as we do in air, land, maritime, and space to support national security interests. That statement by itself contradicted any notion that the Pentagon's interest in cyber was mainly defensive. Once the US military accepts the challenge to fight in a new domain, it aims for superiority in that domain over all its rivals, in both offensive and defensive realms. Cyber is no exception. The US Air Force budget request for 2013 included $4 billion in proposed spending to achieve cyberspace superiority, according to Air Force Secretary Michael Donley. It is hard to imagine the US military settling for any less, given the importance of electronic assets in its capabilities. Even small unit commanders go into combat equipped with laptops and video links. We're no longer just hurling mass and energy at our opponents in warfare, says John Arquilla, professor of defense analysis at the Naval Postgraduate School. Now we're using information, and the more you have, the less of the older kind of weapons you need. Access to data networks has given warfighters a huge advantage in intelligence, communication, and coordination. But their dependence on those networks also creates vulnerabilities, particularly when engaged with an enemy that has cyber capabilities of his own. Our adversaries are probing every possible entry point into the network, looking for that one possible weak spot, said General William Shelton, head of the Air Force Space Command, speaking at a CyberFutures Conference in 2012. If we don't do this right, these new data links could become one of those spots. Achieving cyber superiority in a twenty-first-century battle space is analogous to the establishment of air superiority in a traditional bombing campaign. Before strike missions begin against a set of targets, air commanders want to be sure the enemy's air defense system has been suppressed. Radar sites, antiaircraft missile batteries, enemy aircraft, and command-and-control facilities need to be destroyed before other targets are hit. Similarly, when an information-dependent combat operation is planned against an opposing military, the operational commanders may first want to attack the enemy's computer systems to defeat his ability to penetrate and disrupt the US military's information and communication networks. Indeed, operations like this have already been carried out. A former ground commander in Afghanistan, Marine Lieutenant General Richard Mills, has acknowledged using cyber attacks against his opponent while directing international forces in southwest Afghanistan in 2010. I was able to use my cyber operations against my adversary with great impact, Mills said, in comments before a military conference in August 2012. I was able to get inside his nets, infect his command-and-control, and in fact defend myself against his almost constant incursions to get inside my wire, to affect my operations. Mills was describing offensive cyber actions. This is cyber war, waged on a relatively small scale and at the tactical level, but cyber war nonetheless. And, as DARPA's Plan X reveals, the US military is currently engaged in much larger scale cyber war planning. DARPA managers want contractors to come up with ideas for mapping the digital battlefield so that commanders could know where and how an enemy has arrayed his computer networks, much as they are now able to map the location of enemy tanks, ships, and aircraft. Such visualizations would enable cyber war commanders to identify the computer targets they want to destroy and then assess the battle damage afterwards. Plan X would also support the development of new cyber war architecture. The DARPA managers envision operating systems and platforms with mission scripts built in, so that a cyber attack, once initiated, can proceed on its own in a manner similar to the auto-pilot function in modern aircraft. None of this technology exists yet, but neither did the Internet or GPS when DARPA researchers first dreamed of it. As with those innovations, the government role is to fund and facilitate, but much of the experimental and research work would be done in the private sector. A computer worm with a destructive code like the one Stuxnet carried can probably be designed only with state sponsorship, in a research lab with resources like those at the NSA. But private contractors are in a position to provide many of the tools needed for offensive cyber activity, including the software bugs that can be exploited to provide a back door into a computer's operating system. Ideally, the security flaw or vulnerability that can be exploited for this purpose will be one of which the network operator is totally unaware. Some hackers specialize in finding these vulnerabilities, and as the interest in offensive cyber operations has grown, so has the demand for their services. The world-famous hacker conference known as Defcon attracts a wide and interesting assortment of people each year to Las Vegas: creative but often antisocial hackers who identify themselves only by their screen names, hackers who have gone legit as computer security experts, law enforcement types, government spies, and a few curious academics and journalists. One can learn what's hot in the hacker world just by hanging out there. In August 2012, several attendees were seated in the Defcon cafe when a heavy-set young man in jeans, a t-shirt, and a scraggly beard strolled casually up and dropped several homemade calling cards on the table. He then moved to the next table and tossed down a few more, all without saying a word. There was no company logo or brand name on the card, just this message: Paying top dollar for 0-day and offensive technologies... The card identified the buyer as zer0daybroker and listed an e-mail address. A zero-day is the most valuable of computer vulnerabilities, one unknown to anyone but the researcher who finds it. Hackers prize zero-days because no one knows to have prepared a defense against them. The growing demand for these tools has given rise to brokers like Zer0day, who identified himself in a subsequent e-mail exchange as Zer0 Day Haxor but provided no other identifying information. As a broker, he probably did not intend to hack into a computer network himself but only to act as an intermediary, connecting sellers who have discovered system vulnerabilities with buyers who want to make use of the tools and are willing to pay a high price for them. In the past, the main market for these vulnerabilities was software firms themselves who wanted to know about flaws in their products so that they could write patches to fix them. Big companies like Google and Microsoft employ penetration testers whose job it is to find and report vulnerabilities that would allow someone to hack into their systems. In some cases, such companies have paid a bounty to freelance cyber researchers who discover a vulnerability and alert the company engineers. But the rise in offensive cyber operations has transformed the vulnerability market, and hackers these days are more inclined to sell zero-days to the highest bidder. In most cases, these are governments. The market for back-door exploits has been boosted in large part by the burgeoning demand from militaries eager to develop their cyber warfighting capabilities. The designers of the Stuxnet code cleared a path into Iranian computers through the use of four or five separate zero-day vulnerabilities, an achievement that impressed security researchers around the world. The next Stuxnet would require the use of additional vulnerabilities. If the president asks the US military to launch a cyber operation in Iran tomorrow, it's not the time to start looking for exploits, says Christopher Soghoian, a Washington-based cybersecurity researcher. They need to have the exploits ready to go. And you may not know what kind of computer your target uses until you get there. You need a whole arsenal [of vulnerabilities] ready to go in order to cover every possible configuration you may meet. Not surprisingly, the National Security Agency -- buying through defense contractors -- may well be the biggest customer in the vulnerability market, largely because it pays handsomely. The US military's dominant presence in the market means that other possible purchasers cannot match the military's price. Instead of telling Google or Mozilla about a flaw and getting a bounty for two thousand dollars, researchers will sell it to a defense contractor like Raytheon or SAIC and get a hundred thousand for it, says Soghoian, now the principal technologist in the Speech, Privacy and Technology Project at the American Civil Liberties Union and a prominent critic of the zero-day market. Those companies will then turn around and sell the vulnerability upstream to the NSA or another defense agency. They will outbid Google every time.

2NC China
Cyber capabilities are key to deterrence and defending against
China
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principal Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. "Waging Cyber War the American Way," Survival: Global Politics and
Strategy. August-September 2015. Vol. 57, no. 4, pp. 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

At the same time, the United States regards cyber war during armed conflict with a cyber-capable enemy as probable, if not inevitable. It both assumes that the computer systems on which its own forces rely to deploy, receive support and strike will be attacked, and intends to attack the computer systems that enable opposing forces to operate as well. Thus, the United States has said that it can and would conduct cyber war to support operational and contingency plans -- a euphemism for attacking computer systems that enable enemy war fighting. US military doctrine now regards non-kinetic (that is, cyber) measures as an integral aspect of US joint offensive operations. Even so, the stated purposes of the US military regarding cyber war stress protecting the ability of conventional military forces to function as they should, as well as avoiding and preventing escalation, especially to non-military targets. Apart from its preparedness to conduct counter-military cyber operations during wartime, the United States has been reticent about using its offensive capabilities. While it has not excluded conducting cyber operations to coerce hostile states or non-state actors, it has yet to brandish such a threat. Broadly speaking, US policy is to rely on the threat of retaliation to deter a form of warfare it is keen to avoid. Chinese criticism that the US retaliatory policy and capabilities will up the ante on the Internet arms race is disingenuous in that China has been energetic in forming and using capabilities for cyber operations. Notwithstanding the defensive bias in US attitudes toward cyber war, the dual missions of deterrence and preparedness for offensive operations during an armed conflict warrant maintaining superb, if not superior, offensive capabilities. Moreover, the case can be made -- and we have made it -- that the United States should have superiority in offensive capabilities in order to control escalation. The combination of significant capabilities and declared reluctance to wage cyber war raises a question that is not answered by any US official public statements: when it comes to offence, what are US missions, desired effects, target sets and restraints -- in short, what is US policy? To be clear, we do not take issue with the basic US stance of being at once wary and capable of cyber war. Nor do we think that the United States should advertise exactly when and how it would conduct offensive cyber war. However, the very fact that the United States maintains options for offensive operations implies the need for some articulation of policy. After all, the United States was broadly averse to the use of nuclear weapons during the Cold War, yet it elaborated a declaratory policy governing such use to inform adversaries, friends and world opinion, as well as to forge domestic consensus. Indeed, if the United States wants to discourage and limit cyber war internationally, while keeping its options open, it must offer an example. For that matter, the American people deserve to know what national policy on cyber war is, lest they assume it is purely defensive or just too esoteric to comprehend. Whether to set a normative example, warn potential adversaries or foster national consensus, US policy on waging cyber war should be coherent. At the same time, it must encompass three distinguishable offensive missions: wartime counter-military operations, which the United States intends to conduct; retaliatory missions, which the US must have the will and ability to conduct for reasons of deterrence; and coercive missions against hostile states, which could substitute for armed attack. Four cases serve to highlight the relevant issues and to inform the elaboration of an overall policy to guide US conduct of offensive cyber war. The first involves wartime counter-military cyber operations against a cyber-capable opponent, which may also be waging cyber war; the second involves retaliation against a cyber-capable opponent for attacking US systems other than counter-military ones; the third involves coercion of a cyber-weak opponent with little or no means to retaliate against US cyber attack; and the fourth involves coercion of a cyber-strong opponent with substantial means to retaliate against US cyber attack. Of these, the first and fourth imply a willingness to initiate cyber war. Counter-military cyber war during wartime Just as cyber war is war, armed hostilities will presumably include cyber war if the belligerents are both capable of and vulnerable to it. The reason for such certainty is that impairing opposing military forces' use of computer systems is operationally compelling. Forces with requisite technologies and skills benefit enormously from data communications and computation for command and control, intelligence, surveillance and reconnaissance (ISR), targeting, navigation, weapon guidance, battle assessment and logistics management, among other key functions. If the performance of forces is dramatically enhanced by such systems, it follows that degrading them can provide important military advantages. Moreover, allowing an enemy to use cyber war without reciprocating could mean military defeat. Thus, the United States and other advanced states are acquiring capabilities not only to use and protect computer systems, but also to disrupt those used by enemies. The intention to wage cyber war is now prevalent in Chinese planning for war with the United States and vice versa. Chinese military planners have long made known their belief that, because computer systems are essential for effective US military operations, they must be targeted. Chinese cyber capabilities may not (yet) pose a threat to US command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR) networks, which are well partitioned and protected. However, the networks that enable logistical support for US forces are inviting targets. Meant to disable US military operations, Chinese use of cyber war during an armed conflict would not be contingent on US cyber operations. Indeed, it could come early, first or even as a precursor of armed hostilities. For its part, the US military is increasingly aware not only that sophisticated adversaries like China can be expected to use cyber war to degrade the performance of US forces, but also that US forces must integrate cyber war into their capabilities and operations. Being more dependent on computer networks to enhance military performance than are its adversaries, including China, US forces have more to lose than to gain from the outbreak of cyber war during an armed conflict. This being so, would it make sense for the United States to wait and see if the enemy resorts to cyber war before doing so itself? Given US conventional military superiority, it can be assumed that any adversary that can use cyber war against US forces will do so. Moreover, waiting for the other side to launch a cyber attack could be disadvantageous insofar as US forces would be the first to suffer degraded performance. Thus, rather than waiting, there will be pressure for the United States to commence cyber attacks early, and perhaps first. Moreover, leading US military officers have strongly implied that cyber war would have a role in attacking enemy anti-access and area-denial (A2AD) capabilities irrespective of the enemy's use of cyber war. If the United States is prepared to conduct offensive cyber operations against a highly advanced opponent such as China, it stands to reason that it would do likewise against lesser opponents. In sum, offensive cyber war is becoming part and parcel of the US war-fighting doctrine. The nature of US counter-military cyber attacks during wartime should derive from the mission of gaining, or denying the opponent, operational advantage. Primary targets of the United States should mirror those of a cyber-capable adversary: ISR, command and control, navigation and guidance, transport and logistics support. Because this mission is not coercive or strategic in nature, economic and other civilian networks should not be targeted. However, to the extent that networks that enable military operations may be multipurpose, avoidance of non-military harm cannot be assured. There are no sharp firebreaks in cyber war.

China would initiate preemptive cyber strikes on the US


Freedberg 13
(Freedberg, Sydney J. Sydney J. Freedberg Jr. is the deputy editor for Breaking Defense. He graduated
summa cum laude from Harvard with an AB in History and holds an MA in Security Studies from
Georgetown University and an MPhil in European Studies from Cambridge University. During his 13
years at National Journal magazine, he wrote his first story about what became known as "homeland
security" in 1998, his first story about "military transformation" in 1999, and his first story on
"asymmetrical warfare" in 2000. Since 2004 he has conducted in-depth interviews with more than 200
veterans of Afghanistan and Iraq about their experiences, insights, and lessons-learned, writing
stories that won awards from the association of Military Reporters & Editors in 2008 and 2009, as well
as an honorable mention in 2010. "China's Fear Of US May Tempt Them To Preempt: Sinologists,"
Breaking Defense. 10-1-2013. http://breakingdefense.com/2013/10/chinas-fear-of-us-may-tempt-themto-preempt-sinologists/2///ghs-kw)

WASHINGTON: Because China believes it is much weaker than the United States, they are more likely to launch a massive preemptive strike in a crisis. Here's the other bad news: The current US concept for high-tech warfare, known as Air-Sea Battle, might escalate the conflict even further towards a limited nuclear war, says one of the top American experts on the Chinese military. [This is one in an occasional series on the crucial strategic relationship and the military capabilities of the US, its allies and China.] What US analysts call an anti-access/area denial strategy is what China calls counter-intervention and active defense, and the Chinese approach is born of a deep sense of vulnerability that dates back 200 years, China analyst Larry Wortzel said at the Institute of World Politics: The People's Liberation Army still sees themselves as an inferior force to the American military, and that's who they think their most likely enemy is. That's fine as long as it deters China from attacking its neighbors. But if deterrence fails, the Chinese are likely to go big or go home. Chinese military history - from the Korean War in 1950 to the Chinese invasion of Vietnam in 1979 to more recent, albeit vigorous but non-violent, grabs for the disputed Scarborough Shoal - suggests a preference for a sudden use of overwhelming force at a crucial point, what Clausewitz would call the enemy's center of gravity. What they do is very heavily built on preemption, Wortzel said. The problem with the striking the enemy's center of gravity is, for the United States, they see it as being in Japan, Hawaii, and the West Coast. That's very escalatory. (Students of the American military will nod sagely, of course, as we remind everyone that President George Bush made preemption a centerpiece of American strategy after the terror attacks of 2001.) Wortzel argued that the current version of US Air-Sea Battle concept is also likely to lead to escalation. China's dependent on these ballistic missiles and anti-ship missiles and satellite links, he said. Since those are almost all land-based, any attack on them involves striking the Chinese mainland, which is pretty escalatory. You don't know how they're going to react, he said. They do have nuclear missiles. They actually think we're more allergic to nuclear missiles landing on our soil than they are on their soil. They think they can withstand a limited nuclear attack, or even a big nuclear attack, and retaliate.
What War Would Look Like: So how would China's preemptive attack unfold? First would come weeks of escalating rhetoric and cyberattacks. There's no evidence the Chinese favor a bolt out of the blue without giving the adversary what they believe is a chance to back down, agreed retired Rear Adm. Michael McDevitt and Dennis Blasko, former Army defense attache in Beijing, speaking on a recent Wilson Center panel on Chinese strategy where they agreed on almost nothing else. That's not much comfort, though, considering that Imperial Japan showed clear signs they might attack and still caught the US flat-footed at Pearl Harbor. When the blow does fall, the experts believe it would be sudden. Stuxnet-style viruses, electronic jamming, and Israeli-designed Harpy radar-seeking cruise missiles (similar to the American HARM but slower and longer-ranged) would try to blind every land-based and shipborne radar. Long-range anti-aircraft missiles like the Russian-built S-300 would go for every plane currently in the air within 125 miles of China's coast, a radius that covers all of Taiwan and some of Japan. Salvos of ballistic missiles would strike every airfield within 1,250 miles. That's enough range to hit the four US airbases in Japan and South Korea - which are, after all, static targets you can look up on Google Maps - to destroy aircraft on the ground, crater the runways, and scatter the airfield with unexploded cluster bomblets to defeat repair attempts. Long-range cruise missiles launched from shore, ships, and submarines then go after naval vessels. And if the Chinese get really good and really lucky, they just might get a solid enough fix on a US Navy aircraft carrier to lob a precision-guided ballistic missile at it. But would this work? Maybe. This is fundamentally terra incognita, Heritage Foundation research fellow Dean Cheng told me. There has been no direct conventional clash between major powers since Korea in the 1950s, no large-scale use of anti-ship missiles since the Falklands in 1982, and no war ever where both sides possessed today's space, cyber, electronic warfare, and precision-guided missile capabilities. Perhaps the least obvious but most critical uncertainty in a Pacific war would be invisible. I don't think we've seen electronic warfare on a scale that we'd see in a US-China confrontation, said Cheng. I doubt very much they are behind us when it comes to electronic warfare, [and] the Chinese are training every day on cyber: all those pings, all those attacks, all those attempts to penetrate. While the US has invested heavily in jamming and spoofing over the last decade, much of the focus has been on how to disable insurgents' roadside bombs, not on how to counter a high-tech nation-state. China, however, has focused its electronic warfare and cyber attack efforts on the United States. Conceptually, China may well be ahead of us in linking the two. (F-35 supporters may well disagree with this conclusion.) Traditional radar jammers, for example, can also be used to insert viruses into the highly computerized AESA radars (active electronically scanned array) that are increasingly common in the US military. Where there has been a fundamental difference, and perhaps the Chinese are better than we are at this, is the Chinese seem to have kept cyber and electronic warfare as a single integrated thing, Cheng said. We are only now coming round to the idea that electronic warfare is linked to computer network operations. In a battle for the electromagnetic spectrum, Cheng said, the worst case is that you thought your jammers, your sensors, everything was working great, and the next thing you know missiles are penetrating [your defenses], planes are being shot out of the sky.

China/Taiwan war goes nuclear


Glaser 11
(Charles, Professor of Political Science and International Affairs at the Elliott School of International
Affairs at George Washington University, Director of the Institute for Security and Conflict Studies,
Will Chinas Rise lead to War? , Foreign Affairs March/April 2011,
http://web.clas.ufl.edu/users/zselden/coursereading2011/Glaser.pdf)

THE PROSPECTS for avoiding intense military competition and war may be good, but growth in China's power may nevertheless require some changes in U.S. foreign policy that Washington will find disagreeable--particularly regarding Taiwan. Although it lost control of Taiwan during the Chinese Civil War more than six decades ago, China still considers Taiwan to be part of its homeland, and unification remains a key political goal for Beijing. China has made clear that it will use force if Taiwan declares independence, and much of China's conventional military buildup has been dedicated to increasing its ability to coerce Taiwan and reducing the United States' ability to intervene. Because China places such high value on Taiwan and because the United States and China--whatever they might formally agree to--have such different attitudes regarding the legitimacy of the status quo, the issue poses special dangers and challenges for the U.S.-Chinese relationship, placing it in a different category than Japan or South Korea. A crisis over Taiwan could fairly easily escalate to nuclear war, because each step along the way might well seem rational to the actors involved. Current U.S. policy is designed to reduce the probability that Taiwan will declare independence and to make clear that the United States will not come to Taiwan's aid if it does. Nevertheless, the United States would find itself under pressure to protect Taiwan against any sort of attack, no matter how it originated. Given the different interests and perceptions of the various parties and the limited control Washington has over Taipei's behavior, a crisis could unfold in which the United States found itself following events rather than leading them. Such dangers have been around for decades, but ongoing improvements in China's military capabilities may make Beijing more willing to escalate a Taiwan crisis. In addition to its improved conventional capabilities, China is modernizing its nuclear forces to increase their ability to survive and retaliate following a large-scale U.S. attack. Standard deterrence theory holds that Washington's current ability to destroy most or all of China's nuclear force enhances its bargaining position. China's nuclear modernization might remove that check on Chinese action, leading Beijing to behave more boldly in future crises than it has in past ones. A U.S. attempt to preserve its ability to defend Taiwan, meanwhile, could fuel a conventional and nuclear arms race. Enhancements to U.S. offensive targeting capabilities and strategic ballistic missile defenses might be interpreted by China as a signal of malign U.S. motives, leading to further Chinese military efforts and a general poisoning of U.S.-Chinese relations.

2NC Cyber-Deterrence
Cyber-offensive strengths are key to cyber-deterrence and
minimizing damage
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principle Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. Waging Cyber War the American Way, Survival: Global Politics and
Strategy. August-September 2015. Vol. 57, No. 4, pp. 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

Even with effective C2, there is a danger that US counter-military cyber operations will infect and damage systems other than those targeted, including civilian systems, because of the technical difficulties of controlling effects, especially for systems that support multiple services. As we have previously noted in these pages, an attack that uses a replicable agent, such as a virus or worm, has substantial potential to spread, perhaps uncontrollably.19 The dangers of collateral damage on non-combatants imply not only the possibility of violating the laws of war (as they might apply to cyber war), but also of provoking escalation. While the United States would like there to be strong technical and C2 safeguards against unwanted effects and thus escalation, it is not clear that there are. It follows that US doctrine concerning the conduct of wartime counter-military offensive operations must account for these risks. This presents a dilemma, for dedicated military systems tend to be harder to access and disrupt than multipurpose or civilian ones. China's military, for example, is known for its attention to communications security, aided by its reliance on short-range and land-based (for example, fibre-optical) transmission of C4ISR. Yet, to attack less secure multipurpose systems on which the Chinese military depends for logistics is to risk collateral damage and heighten the risk of escalation. Faced with this dilemma, US policy should be to exercise care in attacking military networks that also support civilian services. The better its offensive cyber-war capabilities, the more able the United States will be to disrupt critical enemy military systems and avoid indiscriminate effects. Moreover, US offensive strength could deter enemy escalation. As we have argued before, US superiority in counter-military cyber war would have the dual advantage of delivering operational benefits by degrading enemy forces and averting a more expansive cyber war than intended. While the United States should avoid the spread of cyber war beyond military systems, it should develop and maintain an unmatched capability to conduct counter-military cyber war. This would give it operational advantages and escalation dominance. Such capabilities might enable the United States to disrupt enemy C4ISR systems used for the control and operation of nuclear forces. However, to attack such systems would risk causing the enemy to perceive that the United States was either engaged in a non-nuclear-disarming first strike or preparing for a nuclear-disarming first strike. Avoiding such a misperception requires the avoidance of such systems, even if they also support enemy non-nuclear C4ISR (as China's may do). In sum, US policy should be to create, maintain and be ready to use superior cyber-war capabilities for counter-military operations during armed conflict. Such an approach would deny even the most capable of adversaries, China included, an advantage by resorting to cyber war in an armed conflict. The paramount goal of the United States should be to retain its military advantage in the age of cyber war - a tall order, but a crucial one for US interests.

2NC Russia
Deterrence solves cyber-war and Russian aggression
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principle Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. Waging Cyber War the American Way, Survival: Global Politics and
Strategy. August-September 2015. Vol. 57, No. 4, pp. 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

Retaliation: While the United States should be ready to conduct cyber attacks against military forces in an armed conflict, it should in general otherwise try to avoid and prevent cyber war. (Possible exceptions to this posture of avoidance are taken up later in the cases concerning coercion.) In keeping with its commitment to an open, secure, interoperable and reliable internet that enables prosperity, public safety, and the free flow of commerce and ideas, the United States should seek to minimise the danger of unrestricted cyber war, in which critical economic, governmental and societal systems and services are disrupted.20 Given how difficult it is to protect such systems, the United States must rely to a heavy extent on deterrence and thus the threat of retaliation. To this end, the US Defense Department has stated that a would-be attacker could suffer unacceptable costs if it launches a cyber attack on the United States.21 While such a warning is worth issuing, it raises the question of how these unacceptable costs could be defined and levied. Short of disclosing specific targets and methods, which we do not advocate, the United States could strengthen both the deterrence it seeks and the norms it favours by indicating what actions might constitute retaliation. This is especially important because the most vulnerable targets of cyber retaliation are computer networks that serve civilian life, starting with the internet. By definition, cyber retaliation that extends beyond military capabilities, as required for strong deterrence, might be considered indiscriminate. Whether it is also disproportionate depends in part on the enemy attack that precipitated it. We can posit, for purposes of analysis, that an enemy attack would be aimed at causing severe disruptions of such economic and societal functions as financial services, power-grid management, transport systems, telecommunications services, media and government services, along with the expected military and intelligence functions. In considering how the United States should retaliate, the distinction between the population and the state of the attacker is useful. The United States would hold the latter, not the former, culpable, and thus the rightful object of retaliation. This would suggest targeting propaganda and other societal-control systems; government financial systems; state access to banks; political and economic elites on which the state depends; industries on which the state depends, especially state-owned enterprises; and internal security forces and functions. To judge how effective such a retaliation strategy could be, consider the case of Russia. The Russian state is both sprawling and centralised: within Russia's economy and society, it is pervasive, heavy-handed and exploitative; power is concentrated in the Kremlin; and elites of all sorts are beholden to it. Although the Russian state is well entrenched and not vulnerable to being overthrown, it is porous and exposed, especially in cyberspace. Even if the computer systems of the innermost circle of Russian state decision-making may be inaccessible, there are many important systems that are not. Insofar as those who control the Russian state are more concerned about their own well-being than that of the masses, targeting their apparatus would cause acute apprehension. Of course, the more important a computer system is to the state, the less accessible it is likely to be. Still, even if Russia were to launch indiscriminate cyber attacks on the US economy and society, the United States might get more bang for its bytes by retaliating against systems that support Russian state power. Of course, US cyber targeting could also include the systems on which Russian leaders rely to direct military and other security forces, which are the ultimate means of state power and control. Likewise, Russian military and intelligence systems would be fair game for retaliation. At the same time, it would be vital to observe the stricture against disabling nuclear C2 systems, lest the Kremlin perceive that a US strategic strike of some sort was in the works. With this exception, the Russian state's cyber vulnerabilities should be exploited as much as possible. The United States could thus not only meet the standard of unacceptable costs on which deterrence depends, but also gain escalation control by giving Russia's leaders a sense of their vulnerability. In addition to preventing further escalation, this US targeting strategy would meet, more or less, normative standards of discrimination and proportionality.

And the cyber threat is real: multiple countries and terrorists are acquiring capabilities, increasing the risk of nuclear war and collapse of agriculture and the power grid
Habiger, 2k10
(Eugene, Retired Air Force General, Cyberwarfare and Cyberterrorism, The Cyber Security
Institute, p. 11-19)
However, there are reasons to believe that what is going on now amounts to a fundamental shift as opposed to business as usual. Today's network exploitation or information operation trespasses possess a number of characteristics that suggest that the line between espionage and conflict has been, or is close to being, crossed. (What that suggests for the proper response is a different matter.) First, the number of cyberattacks we are facing is growing significantly. Andrew Palowitch, a former CIA official now consulting with the US Strategic Command (STRATCOM), which oversees the Defense Department's Joint Task Force-Global Network Operations, recently told a meeting of experts that the Defense Department has experienced almost 80,000 computer attacks, and some number of these assaults have actually reduced the military's operational capabilities.20 Second, the nature of these attacks is starting to shift from penetration attempts aimed at gathering intelligence (cyber spying) to offensive efforts aimed at taking down systems (cyberattacks). Palowitch put this in stark terms last November, We are currently in a cyberwar and war is going on today.21 Third, these recent attacks need to be taken in a broader strategic context. Both Russia and China have stepped up their offensive efforts and taken a much more aggressive cyberwarfare posture. The Chinese have developed an openly discussed cyberwar strategy aimed at achieving electronic dominance over the U.S. and its allies by 2050. In 2007 the Department of Defense reported that for the first time China has developed first strike viruses, marking a major shift from prior investments in defensive measures.22 And in the intervening period China has launched a series of offensive cyber operations against U.S. government and private sector networks and infrastructure. In 2007, Gen. James Cartwright, the former head of STRATCOM and now the Vice Chairman of the Joint Chiefs of Staff, told the US-China Economic and Security Review Commission that China's ability to launch denial of service attacks to overwhelm an IT system is of particular concern.23 Russia also has already begun to wage offensive cyberwar. At the outset of the recent hostilities with Georgia, Russian assets launched a series of cyberattacks against the Georgian government and its critical infrastructure systems, including media, banking and transportation sites.24 In 2007, cyberattacks that many experts attribute, directly or indirectly, to Russia shut down the Estonian government's IT systems. Fourth, the current geopolitical context must also be factored into any effort to gauge the degree of threat of cyberwar. The start of the new Obama Administration has begun to help reduce tensions between the United States and other nations. And, the new administration has taken initial steps to improve bilateral relations specifically with both China and Russia. However, it must be said that over the last few years the posture of both the Chinese and Russian governments toward America has clearly become more assertive, and at times even aggressive. Some commentators have talked about the prospects of a cyber Pearl Harbor, and the pattern of Chinese and Russian behavior to date gives reason for concern along these lines: both nations have offensive cyberwarfare strategies in place; both nations have taken the cyber equivalent of building up their forces; both nations now regularly probe our cyber defenses looking for gaps to be exploited; both nations have begun taking actions that cross the line from cyberespionage to cyberaggression; and, our bilateral relations with both nations are increasingly fractious and complicated by areas of marked, direct competition. Clearly, there are sharp differences between current U.S. relations with these two nations and relations between the US and Japan just prior to World War II. However, from a strategic defense perspective, there are enough warning signs to warrant preparation.
In addition to the threat of cyberwar, the limited resources required to carry out even a large scale cyberattack also makes likely the potential for a significant cyberterror attack against the United States. However, the lack of a long list of specific incidences of cyberterrorism should provide no comfort. There is strong evidence to suggest that al Qaeda has the ability to conduct cyberterror attacks against the United States and its allies. Al Qaeda and other terrorist organizations are extremely active in cyberspace, using these technologies to communicate among themselves and others, carry out logistics, recruit members, and wage information warfare. For example, al Qaeda leaders used email to communicate with the 9/11 terrorists and the 9/11 terrorists used the Internet to make travel plans and book flights. Osama bin Laden and other al Qaeda members routinely post videos and other messages to online sites to communicate. Moreover, there is evidence of efforts that al Qaeda and other terrorist organizations are actively developing cyberterrorism capabilities and seeking to carry out cyberterrorist attacks. For example, the Washington Post has reported that U.S. investigators have found evidence in the logs that mark a browser's path through the Internet that al Qaeda operators spent time on sites that offer software and programming instructions for the digital switches that run power, water, transport and communications grids. In some interrogations . . . al Qaeda prisoners have described intentions, in general terms, to use those tools.25 Similarly, a 2002 CIA report on the cyberterror threat to a member of the Senate stated that al Qaeda and Hezbollah have become "more adept at using the internet and computer technologies."26 The FBI has issued bulletins stating that, U.S. law enforcement and intelligence agencies have received indications that Al Qaeda members have sought information on Supervisory Control And Data Acquisition (SCADA) systems available on multiple SCADA-related web sites.27 In addition a number of jihadist websites, such as 7hj.7hj.com, teach computer attack and hacking skills in the service of Islam.28 While al Qaeda may lack the cyberattack capability of nations like Russia and China, there is every reason to believe its operatives, and those of its ilk, are as capable as the cyber criminals and hackers who routinely effect great harm on the world's digital infrastructure generally and American assets specifically. In fact, perhaps, the most troubling indication of the level of the cyberterrorist threat is the countless, serious non-terrorist cyberattacks routinely carried out by criminals, hackers, disgruntled insiders, crime syndicates and the like. If run-of-the-mill criminals and hackers can threaten power grids, hack vital military networks, steal vast sums of money, take down a city's traffic lights, compromise the Federal Aviation Administration's air traffic control systems, among other attacks, it is overwhelmingly likely that terrorists can carry out similar, if not more malicious attacks. Moreover, even if the world's terrorists are unable to breed these skills, they can certainly buy them. There are untold numbers of cyber mercenaries around the world - sophisticated hackers with advanced training who would be willing to offer their services for the right price. Finally, given the nature of our understanding of cyber threats, there is always the possibility that we have already been the victim of a cyberterrorist attack, or such an attack has already been set but not yet effectuated, and we don't know it yet.
Instead, a well-designed cyberattack has the capacity to cause widespread chaos, sow societal unrest, undermine national governments, spread paralyzing fear and anxiety, and create a state of utter turmoil, all without taking a single life. A sophisticated cyberattack could throw a nation's banking and finance system into chaos causing markets to crash, prompting runs on banks, degrading confidence in markets, perhaps even putting the nation's currency in play and making the government look helpless and hapless. In today's difficult economy, imagine how Americans would react if vast sums of money were taken from their accounts and their supporting financial records were destroyed. A truly nefarious cyberattacker could carry out an attack in such a way (akin to Robin Hood) as to engender populist support and deepen rifts within our society, thereby making efforts to restore the system all the more difficult. A modestly advanced enemy could use a cyberattack to shut down (if not physically damage) one or more regional power grids. An entire region could be cast into total darkness, power dependent systems could be shut down. An attack on one or more regional power grids could also cause cascading effects that could jeopardize our entire national grid. When word leaks that the blackout was caused by a cyberattack, the specter of a foreign enemy capable of sending the entire nation into darkness would only increase the fear, turmoil and unrest. While the finance and energy sectors are considered prime targets for a cyberattack, an attack on any of the 17 delineated critical infrastructure sectors could have a major impact on the United States. For example, our healthcare system is already technologically driven and the Obama Administration's e-health efforts will only increase that dependency. A cyberattack on the U.S. e-health infrastructure could send our healthcare system into chaos and put countless lives at risk. Imagine if emergency room physicians and surgeons were suddenly no longer able to access vital patient information. A cyberattack on our nation's water systems could likewise cause widespread disruption. An attack on the control systems for one or more dams could put entire communities at risk of being inundated, and could create ripple effects across the water, agriculture, and energy sectors. Similar water control system attacks could be used to at least temporarily deny water to otherwise arid regions, impacting everything from the quality of life in these areas to agriculture. In 2007, the U.S. Cyber Consequences Unit determined that the destruction from a single wave of cyberattacks on critical infrastructures could exceed $700 billion, which would be the rough equivalent of 50 Katrina-esque hurricanes hitting the United States all at the same time.29 Similarly, one IT security source has estimated that the impact of a single day cyberwar attack that focused on and disrupted U.S. credit and debit card transactions would be approximately $35 billion.30 Another way to gauge the potential for harm is in comparison to other similar non-cyberattack infrastructure failures. For example, the August 2003 regional power grid blackout is estimated to have cost the U.S. economy up to $10 billion, or roughly .1 percent of the nation's GDP.31 That said, a cyberattack of the exact same magnitude would most certainly have a much larger impact. The origin of the 2003 blackout was almost immediately disclosed as an atypical system failure having nothing to do with terrorism. This made the event both less threatening and likely a single time occurrence. Had it been disclosed that the event was the result of an attack that could readily be repeated the impacts would likely have grown substantially, if not exponentially.
Additionally, a cyberattack could also be used to disrupt our nation's defenses or distract our national leaders in advance of a more traditional conventional or strategic attack. Many military leaders actually believe that such a disruptive cyber pre-offensive is the most effective use of offensive cyber capabilities. This is, in fact, the way Russia utilized cyberattackers - whether government assets, government-directed/coordinated assets, or allied cyber irregulars - in advance of the invasion of Georgia. Widespread distributed denial of service (DDOS) attacks were launched on the Georgian government's IT systems. Roughly a day later Russian armor rolled into Georgian territory. The cyberattacks were used to prepare the battlefield; they denied the Georgian government a critical communications tool isolating it from its citizens and degrading its command and control capabilities precisely at the time of attack. In this way, these attacks were the functional equivalent of conventional air and/or missile strikes on a nation's communications infrastructure.32 One interesting element of the Georgian cyberattacks has been generally overlooked: On July 20th, weeks before the August cyberattack, the website of Georgian President Mikheil Saakashvili was overwhelmed by a more narrowly focused, but technologically similar DDOS attack.33 This should be particularly chilling to American national security experts as our systems undergo the same sorts of focused, probing attacks on a constant basis. The ability of an enemy to use a cyberattack to counter our offensive capabilities or soften our defenses for a wider offensive against the United States is much more than mere speculation. In fact, in Iraq it is already happening. Iraq insurgents are now using off-the-shelf software (costing just $26) to hack U.S. drones (costing $4.5 million each), allowing them to intercept the video feed from these drones.34 By hacking these drones the insurgents have succeeded in greatly reducing one of our most valuable sources of real-time intelligence and situational awareness. If our enemies in Iraq are capable of such an effective cyberattack against one of our more sophisticated systems, consider what a more technologically advanced enemy could do. At the strategic level, in 2008, as the United States Central Command was leading wars in both Iraq and Afghanistan, a cyber intruder compromised the security of the Command and sat within its IT systems, monitoring everything the Command was doing.35 This time the attacker simply gathered vast amounts of intelligence. However, it is clear that the attacker could have used this access to wage cyberwar - altering information, disrupting the flow of information, destroying information, taking down systems - against the United States forces already at war. Similarly, during 2003 as the United States prepared for and began the War in Iraq, the IT networks of the Department of Defense were hacked 294 times.36 By August of 2004, with America at war, these ongoing attacks compelled then-Deputy Secretary of Defense Paul Wolfowitz to write in a memo that, "Recent exploits have reduced operational capabilities on our networks."37 This wasn't the first time that our national security IT infrastructure was penetrated immediately in advance of a U.S. military option.38 In February of 1998 the Solar Sunrise attacks systematically compromised a series of Department of Defense networks. What is often overlooked is that these attacks occurred during the ramp up period ahead of potential military action against Iraq. The attackers were able to obtain vast amounts of sensitive information - information that would have certainly been of value to an enemy's military leaders. There is no way to prove that these actions were purposefully launched with the specific intent to distract American military assets or degrade our capabilities. However, such ambiguities - the inability to specifically attribute actions and motives to actors - are the very nature of cyberspace. Perhaps, these repeated patterns of behavior were mere coincidence, or perhaps they weren't. The potential that an enemy might use a cyberattack to soften physical defenses, increase the gravity of harms from kinetic attacks, or both, significantly increases the potential harms from a cyberattack. Consider the gravity of the threat and risk if an enemy, rightly or wrongly, believed that it could use a cyberattack to degrade our strategic weapons capabilities. Such an enemy might be convinced that it could win a war - conventional or even nuclear - against the United States. The effect of this would be to undermine our deterrence-based defenses, making us significantly more at risk of a major war.

And we control probability and magnitude: it causes extinction


Bostrom, 2k2
(Nick Bostrom, Ph.D. and Professor of Philosophy at Oxford University, March 2002, Journal of
Evolution and Technology, Existential Risks: Analyzing Human Extinction Scenarios and
Related Hazards)

A much greater existential risk emerged with the build-up of nuclear arsenals in the US and the USSR. An all-out nuclear war was a possibility with both a substantial probability and with consequences that might have been persistent enough to qualify as global and terminal. There was a real worry among those best acquainted with the information available at the time that a nuclear Armageddon would occur and that it might annihilate our species or permanently destroy human civilization. Russia and the US retain large nuclear arsenals that could be used in a future confrontation, either accidentally or deliberately. There is also a risk that other states may one day build up large nuclear arsenals. Note however that a smaller nuclear exchange, between India and Pakistan for instance, is not an existential risk, since it would not destroy or thwart humankind's potential permanently. Such a war might however be a local terminal risk for the cities most likely to be targeted. Unfortunately, we shall see that nuclear Armageddon and comet or asteroid strikes are mere preludes to the existential risks that we will encounter in the 21st century.

2NC T/ Case
Cyber-deterrence turns terrorism, war, prolif, and human rights
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principle Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. Waging Cyber War the American Way, Survival: Global Politics and
Strategy. August-September 2015. Vol. 57, No. 4, pp. 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

Given that retaliation and counter-military cyber war require copious offensive capabilities, questions arise about whether these means could and should also be used to coerce hostile states into complying with US demands without requiring the use of armed force. Examples include pressuring a state to cease international aggression, intimidating behaviour or support for terrorists; or to abandon acquisition of weapons of mass destruction; or to end domestic human-rights violations. If, as some argue, it is getting harder, costlier and riskier for the United States to use conventional military force for such ends, threatening or conducting cyber war may seem to be an attractive alternative.25 Of course, equating cyber war with war suggests that conducting or threatening it to impose America's will is an idea not to be treated lightly. Whereas counter-military cyber war presupposes a state of armed conflict, and retaliation presupposes that the United States has suffered a cyber attack, coercion (as meant here) presupposes neither a state of armed conflict nor an enemy attack. This means, in essence, the United States would threaten to start a cyber war outside of an armed conflict - something US policy has yet to address. While the United States has intimated that it would conduct cyber war during an armed conflict and would retaliate if deterrence failed, it is silent about using or threatening cyber war as an instrument of coercion. Such reticence fits with the general US aversion to this form of warfare, as well as a possible preference to carry out cyber attacks without attribution or admission. Notwithstanding US reticence, the use of cyber war for coercion can be more attractive than the use of conventional force: it can be conducted without regard to geography, without threatening death and physical destruction, and with no risk of American casualties. While the United States has other non-military options, such as economic sanctions and supporting regime opponents, none is a substitute for cyber war. Moreover, in the case of an adversary with little or no ability to return fire in cyberspace, the United States might have an even greater asymmetric advantage than it does with its conventional military capabilities.

China Tech DA

CX Questions
Customers are shifting to foreign products now - why does the plan reverse that trend?

1NC
NSA spying shifts tech dominance to China, but it's fragile; reversing the trend now kills China
Li and McElveen 13
(Cheng Li; Ryan Mcelveen. Cheng Li received a M.A. in Asian studies from the University of California,
Berkeley and a Ph.D. in political science from Princeton University. He is director of the John L.
Thornton China Center and a senior fellow in the Foreign Policy program at Brookings. He is also a
director of the National Committee on U.S.-China Relations. Li focuses on the transformation of
political leaders, generational change and technological development in China. "NSA Revelations Have
Irreparably Hurt U.S. Corporations in China," Brookings Institution. 12-12-2013.
http://www.brookings.edu/research/opinions/2013/12/12-nsa-revelations-hurt-corporations-china-limcelveen//ghs-kw)

For the Obama administration, Snowden's timing could not have been worse. The first story about the NSA appeared in The Guardian on June 5. When Obama and Xi met in California two days later, the United States had lost all credibility on the cyber security issue. Instead of providing Obama with the perfect opportunity to confront China about its years of intellectual property theft from U.S. firms, the Sunnylands meeting forced Obama to resort to a defensive posture. Reflecting on how the tables had turned, the media reported that President Xi chose to stay off-site at a nearby Hyatt hotel out of fear of eavesdropping. After the Sunnylands summit, the Chinese government turned to official media to launch a public campaign against U.S. technology firms operating in China through its de-Cisco (qu Sike hua) movement. By targeting Cisco, the U.S. networking company that had helped many local Chinese governments develop and improve their IT infrastructures beginning in the mid-1990s, the Chinese government struck at the very core of U.S.-China technological and economic collaboration. The movement began with the publication of an issue of China Economic Weekly titled He's Watching You that singled out eight U.S. firms as guardian warriors who had infiltrated the Chinese market: Apple, Cisco, Google, IBM, Intel, Microsoft, Oracle and Qualcomm. Cisco, however, was designated as the most horrible of these warriors because of its pervasive reach into China's financial and governmental sectors. For these U.S. technology firms, China is a vital source of business that represents a fast-growing slice of the global technology market. After the Chinese official media began disparaging the guardian warriors in June, the sales of those companies have fallen precipitously. With the release of its third quarter earnings in November, Cisco reported that orders from China fell 18 percent from the same period a year earlier and projected that overall revenue would fall 8 to 10 percent as a result, according to Reuters. IBM reported that its revenue from the Chinese market fell 22 percent, which resulted in a 4 percent drop in overall profit. Similarly, Microsoft has said that China had become its weakest market. However, smaller U.S. technology firms working in China have not seen the same slowdown in business. Juniper Networks, a networking rival to Cisco, and EMC Corp, a storage system maker, both saw increased business in the third quarter. As the Chinese continue to shun the guardian warriors, they may turn to similar but smaller U.S. firms until domestic Chinese firms are ready to assume their role. In the meantime, trying to completely de-Cisco would be too costly for China, as Cisco's network infrastructure has become too deeply embedded around the country. Chinese technology firms have greatly benefited in the aftermath of the Snowden revelations. For example, the share price of China National Software has increased 250 percent since June. In addition, the Chinese government continues to push for faster development of its technology industry, in which it has invested since the early 1990s, by funding the development of supercomputers and satellite navigation systems. Still, China's current investment in cyber security cannot compare with that of the United States. The U.S. government spends $6.5 billion annually on cyber security, whereas China spends $400 million, according to NetentSec CEO Yuan Shengang. But that will not be the case for long. The Chinese government's investment in both cyber espionage and cyber security will continue to increase, and that investment will overwhelmingly benefit Chinese technology corporations. China's reliance on the eight American guardian warrior corporations will diminish as its domestic firms develop commensurate capabilities. Bolstering China's cyber capabilities may emerge as one of the goals of China's National Security Committee, which was formed after the Third Plenary Meeting of the 18th Party Congress in November. Modeled on the U.S. National Security Council and led by President Xi Jinping, the committee was established to centralize coordination and quicken response time, although it is not yet clear how much of its efforts will be focused domestically or internationally. The Third Plenum also brought further reform and opening of China's economy, including encouraging more competition in the private sector. The Chinese leadership continues to solicit foreign investment, as evidenced by the newly established Shanghai Free Trade Zone. However, there is no doubt that investments by foreign technology companies are less welcome than investments from other sectors because of the Snowden revelations.

The AFF reclaims US tech leadership from China


Castro and McQuinn 15
(Castro, Daniel and McQuinn, Alan. Information Technology and Innovation Foundation. The
Information Technology and Innovation Foundation (ITIF) is a Washington, D.C.-based think tank at the
cutting edge of designing innovation strategies and technology policies to create economic
opportunities and improve quality of life in the United States and around the world. Founded in 2006,
ITIF is a 501(c) 3 nonprofit, non-partisan organization that documents the beneficial role technology
plays in our lives and provides pragmatic ideas for improving technology-driven productivity, boosting
competitiveness, and meeting todays global challenges through innovation. Daniel Castro is the vice
president of the Information Technology and Innovation Foundation. His research interests include
health IT, data privacy, e-commerce, e-government, electronic voting, information security, and
accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability
Office (GAO) where he audited IT security and management controls at various government agencies.
He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security
Technology and Management from Carnegie Mellon University. Alan McQuinn is a research assistant
with the Information Technology and Innovation Foundation. Prior to joining ITIF, Mr. McQuinn was a
telecommunications fellow for Congresswoman Anna Eshoo and an intern for the Federal
Communications Commission in the Office of Legislative Affairs. He got his B.S. in Political
Communications and Public Relations from the University of Texas at Austin. Beyond the USA
Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness, ITIF. June 2015.
http://www2.itif.org/2015-beyond-usa-freedom-act.pdf//ghs-kw)

CONCLUSION: When historians write about this period in U.S. history, it could very well be that one of the themes will be how the United States lost its global technology leadership to other nations. And clearly one of the factors they would point to is the long-standing privileging of U.S. national security interests over U.S. industrial and commercial interests when it comes to U.S. foreign policy. This has occurred over the last few years as the U.S. government has done relatively little to address the rising commercial challenge to U.S. technology companies, all the while putting intelligence gathering first and foremost. Indeed, policy decisions by the U.S. intelligence community have reverberated throughout the global economy. If the U.S. tech industry is to remain the leader in the global marketplace, then the U.S. government will need to set a new course that balances economic interests with national security interests. The cost of inaction is not only short-term economic losses for U.S. companies, but a wave of protectionist policies that will systematically weaken U.S. technology competitiveness in years to come, with impacts on economic growth, jobs, trade balance, and national security through a weakened industrial base. Only by taking decisive steps to reform its digital surveillance activities will the U.S. government enable its tech industry to effectively compete in the global market.

Growth is slowing now; innovation and tech are key to sustain CCP legitimacy
Ebner 14
(Julia Ebner. Julia Ebner received her MSc in International Relations and Affairs and her MSc in Political
Economy, Development Economics, and Natural Resources from Peking University. She was a
researcher at the European Institute of Asia Studies. "Entrepreneurs: Chinas Next Growth Engine?,"
Diplomat. 8-7-2014. http://thediplomat.com/2014/08/entrepreneurs-chinas-next-growth-engine///ghskw)

Should China want to remain an international economic superpower, it will need to substitute its current growth model - one largely based on abundant, cheap labor - with a different comparative advantage that can lay the foundation for a new, more sustainable growth strategy. Chinese policymakers are hoping now that an emerging entrepreneurship may fit that bill, with start-ups and family-run enterprises potentially becoming a major driver of sustainable growth and thus replacing the country's current economic model. In 2014, international conferences on private entrepreneurship and innovation were organized all across China: The China Council for the Promotion of International Trade organized its first annual Global Innovation Economic Congress, while numerous innovation-related conferences were held at well-known Chinese universities such as Tsinghua University, Jilin University and Wuhan University.
New Growth Model Needed: Although China still ranks among the fastest growing economies in the world, the country's growth rates have decreased notably over the past few years. From the 1990s until the 2008 financial crisis, China's GDP growth was consistently in the double digits with only a brief interruption following the Asian financial crisis of 1997. Despite a relatively quick recovery after the global financial crisis, declining export rates resulting from the economic distress of China's main trading partners have left their mark on the Chinese economy. Today's GDP growth of 7.8 percent is just half the level recorded immediately before the 2008 crisis, according to the latest data provided by the World Bank. This recent slowdown in China's economic growth has naturally been a source of concern for the government. A continuation of the country's phenomenal economic growth is needed to maintain both social stability and the Communist Party's legitimacy. Sustainable economic growth has thus been identified as one of China's key challenges for the coming decade. That challenge is complicated by demographic trends, which are set to have a strongly negative impact on the Chinese economy within the next decade. Researchers anticipate that as a consequence of the country's one-child policy, introduced in 1977, China will soon experience a sharp decline of its working-age population, leading to a substantial labor force bottleneck. A labor shortage is likely to mean climbing wages, threatening China's cheap labor edge. The challenge is well described in a recent article published by the International Monetary Fund.
Replacing the Cheap Labor Strategy: Entrepreneurship is widely recognized as an important engine for economic growth: It contributes positively to economic development by fuelling job markets through the creation of new employment opportunities, by stimulating technological change through increased levels of innovation, and by enhancing the market environment through an intensification of market competition. Entrepreneurship and innovation have the potential to halt the contraction in China's economic growth and to replace the country's unsustainable comparative advantage of cheap labor over the long term. As former Chinese President Hu Jintao stressed in 2006, if China can transform its current growth strategy into one based on innovation and entrepreneurship, it could sustain its growth rates and secure a key role in the international world order. Indeed, increasing levels of entrepreneurship in the Chinese private sector are likely to lead to technological innovation and productivity increases. This could prove particularly useful in offsetting the workforce bottleneck created by demographic trends. Greater innovation would also make China more competitive and less dependent on the knowledge and technology of traditional Western trading partners such as the EU and the U.S.

Economic growth is key to prevent CCP collapse and lashout


Friedberg 10, Professor of Politics and International Affairs Princeton,
Asia Expert CFR (Aaron, Implications of the Financial Crisis for the US-China Rivalry, Survival, Volume 52, Issue 4, August, p. 31-54)

Despite its magnitude, Beijing's stimulus programme was insufficient to forestall a sizeable spike in unemployment. The regime acknowledges that upwards of 20 million migrant workers lost their jobs in the first year of the crisis, with many returning to their villages, and 7m recent college graduates are reportedly on the streets in search of work.9 Not surprisingly, tough times have been accompanied by increased social turmoil. Even before the crisis hit, the number of so-called 'mass incidents' (such as riots or strikes) reported each year in China had been rising. Perhaps because it feared that the steep upward trend might be unnerving to foreign investors, Beijing stopped publishing aggregate, national statistics in 2005.10 Nevertheless, there is ample, if fragmentary, evidence that things got worse as the economy slowed. In Beijing, for example, salary cuts, layoffs, factory closures and the failure of business owners to pay back wages resulted in an almost 100% increase in the number of labour disputes brought before the courts.11 Since the early days of the current crisis, the regime has clearly been bracing itself for trouble. Thus, at the start of 2009, an official news-agency story candidly warned Chinese readers that the country was, 'without a doubt entering a peak period of mass incidents'.12 In anticipation of an expected increase in unrest, the regime for the first time summoned all 3,080 county-level police chiefs to the capital to learn the latest riot-control tactics, and over 200 intermediate and lower-level judges were also called in for special training.13 At least for the moment, the Chinese Communist Party (CCP) appears to be weathering the storm. But if in the next several years the economy slumps again or simply fails to return to its previous pace, Beijing's troubles will mount. The regime probably has enough repressive capacity to cope with a good deal more turbulence than it has thus far encountered, but a protracted crisis could eventually pose a challenge to the solidarity of the party's leadership and thus to its continued grip on political power. Sinologist Minxin Pei points out that the greatest danger to CCP rule comes not from below but from above. Rising societal discontent 'might be sufficient to tempt some members of the elite to exploit the situation to their own political advantage' using 'populist appeals to weaken their rivals and, in the process, open[ing] up divisions within the party's seemingly unified upper ranks'.14 If this happens, all bets will be off and a very wide range of outcomes, from a democratic transition to a bloody civil war, will suddenly become plausible. Precisely because it is aware of this danger, the regime has been very careful to keep whatever differences exist over how to deal with the current crisis within bounds and out of view. If there are significant rifts they could become apparent in the run-up to the pending change in leadership scheduled for 2012. Short of causing the regime to unravel, a sustained economic crisis could induce it to abandon its current, cautious policy of avoiding conflict with other countries while patiently accumulating all the elements of 'comprehensive national power'. If they believe that their backs are to the wall, China's leaders might even be tempted to lash out, perhaps provoking a confrontation with a foreign power in the hopes of rallying domestic support and deflecting public attention from their day-to-day troubles. Beijing might also choose to implement a policy of 'military Keynesianism', further accelerating its already ambitious plans for military construction in the hopes of pumping up aggregate demand and resuscitating a sagging domestic economy.15 In sum, despite its impressive initial performance, Beijing is by no means on solid ground. The reverberations from the 2008-09 financial crisis may yet shake the regime to its foundations, and could induce it to behave in unexpected, and perhaps unexpectedly aggressive, ways.

Chinese lashout goes nuclear


Epoch Times 4
(The Epoch Times, Renxing San, 8/4/2004, 8/4, http://english.epochtimes.com/news/5-84/30931.html//ghs-kw)

Since the Party's life is above all else, it would not be surprising if the CCP resorts to the use of biological, chemical, and nuclear weapons in its attempt to extend its life. The CCP, which disregards human life, would not hesitate to kill two hundred million Americans, along with seven or eight hundred million Chinese, to achieve its ends. These speeches let the public see the CCP for what it really is. With evil filling its every cell the CCP intends to wage a war against humankind in its desperate attempt to cling to life. That is the main theme of the speeches. This theme is murderous and utterly evil. In China we have seen beggars who coerced people to give them money by threatening to stab themselves with knives or pierce their throats with long nails. But we have never, until now, seen such a gangster who would use biological, chemical, and nuclear weapons to threaten the world, that all will die together with him. This bloody confession has confirmed the CCP's nature: that of a monstrous murderer who has killed 80 million Chinese people and who now plans to hold one billion people hostage and gamble with their lives.

2NC O/V
Disad outweighs and turns the AFF: NSA backdoors are causing foreign customers to switch to Chinese tech now, but the plan reverses that by closing backdoors and reclaiming US tech leadership. That kills Chinese growth and results in a loss of CCP legitimacy, which causes CCP lashout and extinction:
<insert o/w and t/ args>

2NC UQ
Extend uniqueness -- the perception of NSA backdoors incentivizes the Chinese government and foreign customers to shift to Chinese tech, which boosts Chinese tech. US company foreign sales have been falling fast -- that's Li and McElveen.
NSA spying boosts Chinese tech firms
Kan 13
(Kan, Michael. Michael Kan covers IT, telecommunications, and the Internet in China for the IDG News
Service. "NSA spying scandal accelerating China's push to favor local tech vendors," PCWorld. 12-32013. http://www.pcworld.com/article/2068900/nsa-spying-scandal-accelerating-chinas-push-to-favorlocal-tech-vendors.html//ghs-kw)

While China's demand for electronics continues to soar, the tech services market may be shrinking for U.S. enterprise vendors. Security concerns over U.S. secret surveillance are giving the Chinese government and local companies more reason to trust domestic vendors, according to industry experts. The country has always tried to support its homegrown tech industry, but lately it is increasingly favoring local brands over foreign competition. Starting this year, the nation's government tenders have required IT suppliers to source more products from local Chinese firms, said an executive at a U.S.-based storage supplier that sells to China. In some cases, the tenders have required 50 percent or more of the equipment to come from domestic brands, said the executive, who requested anonymity. Recent leaks by former U.S. National Security Agency contractor, Edward Snowden, about the U.S.'s secret spying program aren't helping the matter. I think in general China wants to favor local brands; they feel their technology is getting better, the executive said. Snowden has just caused this to accelerate incrementally. Last month, other U.S. enterprise vendors including Cisco and Qualcomm said the U.S. spying scandal has put strains on their China business. Cisco reported its revenue from the country fell 18 percent year-over-year in the last fiscal quarter. The Chinese government has yet to release an official document telling companies to stay away from U.S. vendors, said the manager of a large data center, who has knowledge of such developments. But state-owned telecom operators have already stopped orders for certain U.S. equipment to power their networks, he added. Instead, the operators are relying on Chinese vendors such as Huawei Technologies, to supply their telecommunications equipment. It will be hard for certain networking equipment made in the U.S. to enter the Chinese market, the manager said. It's hard for them (U.S. vendors) to get approval, to get certification from the related government departments. Other companies, especially banks, are concerned that buying enterprise gear from U.S. vendors may lead to scrutiny from the central government, said Bryan Wang, an analyst with Forrester Research. The NSA issue has been having an impact, but it hasn't been black and white, he added. In the future, China could create new regulations on where certain state industries should source their technology from, a possibility some CIOs are considering when making IT purchases, Wang said. The obstacles facing U.S. enterprise vendors come at a time when China's own homegrown companies are expanding in the enterprise market. Huawei Technologies, a major vendor for networking equipment, this August came out with a new networking switch that will put the company in closer competition with Cisco. Lenovo and ZTE are also targeting the enterprise market with products targeted at government, and closing the technology gap with their foreign rivals, Wang said. Overall in the longer-term, the environment is positive for local vendors. We definitely see them taking market share from multinational firms in China, he added. Chinese vendors are also expanding outside the country and targeting the U.S. market. But last year Huawei and ZTE saw a push back from U.S. lawmakers concerned with the two companies' alleged ties to the Chinese government. A Congressional panel eventually advised that U.S. firms buy networking gear from other vendors, calling Huawei and ZTE a security threat.

Europe is shifting to China now


Ranger 15
(Steve Ranger. "Rise of China tech, internet surveillance
revelations form background to CeBIT show," ZDNet. 3-17-2015. http://www.zdnet.com/article/rise-of-china-tech-internet-surveillance-revelations-form-background-to-cebit-show///ghs-kw)
As well as showcasing new devices, from tablets to robotic sculptors and drones, this year's CeBIT technology show in Hannover reflects a gradual but important shift taking place in the European technology world. Whereas in previous years US companies would have taken centre stage, this year the emphasis is on China, both as a creator of technology and as a huge potential market. "German business values China, not just as our most important trade partner outside of Europe, but also as a partner in developing sophisticated technologies," said Angela Merkel as she opened the show. "Especially in the digital economy, German and Chinese companies have core strengths ... and that's why cooperation is a natural choice," she said. Chinese vice premier Ma Kai also attended the show, which featured a keynote from Alibaba founder Jack Ma. China is CeBIT's 'partner country' this year, with over 600 Chinese companies - including Huawei, Xiaomi, ZTE, and Neusoft - presenting their innovations at the show. The UK is also keen on further developing a historically close relationship: the China-Britain Business Council is in Hannover to help UK firms set up meetings with Chinese companies, and to provide support and advice to UK companies interested in doing business in China. "China is mounting the biggest CeBIT partner country showcase ever. Attendees will clearly see that Chinese companies are up there with the biggest and best of the global IT industry," said a spokesman for CeBIT. Some of this activity is a result of the increasingly sophisticated output of Chinese tech companies who are looking for new markets for their products. Firms that have found it hard to make headway in the US, such as Huawei, have been focusing their efforts on Europe instead. European tech companies are equally keen to access the rapidly growing Chinese market. Revelations about mass interception of communications by the US National Security Agency (including allegations that spies had even tapped Angela Merkel's phone) have not helped US-European relations, either. So it's perhaps significant that an interview with NSA contractor-turned-whistleblower Edward Snowden is closing the Hannover show.

2NC UQ: US Failing Now


US tech falling behind other countries
Kevin Ashton 06/2015 [the co-founder and former executive director of
the MIT Auto-ID Center, coined the term Internet of Things. His book How
to Fly a Horse: The Secret History of Creation, Invention, and Discovery was
published by Doubleday earlier this year] "America last?," The Agenda,
http://www.politico.com/agenda/story/2015/06/kevin-ashton-internet-ofthings-in-the-us-000102
And, while they were not mentioning it, some key indicators began swinging away from the U.S. In 2005, China's high-tech exports exceeded America's for the first time. In 2009, just after Wen Jiabao spoke about the Internet of Things, Germany's high-tech exports exceeded America's as well. Today, Germany produces five times more high tech per capita than the United States. Singapore and Korea's high-tech exporters are also far more productive than America's and, according to the most recent data, are close to pushing the U.S. down to fifth place in the world's high-tech economy. And, as the most recent data are for 2013, that may have happened already. This decline will surprise many Americans, including many American policymakers and pundits, who assume U.S. leadership simply transfers from one tech revolution to the next. After all, that next revolution, the Internet of Things, was born in America, so perhaps it seems natural that America will lead. Many U.S. commentators spin a myth that America is No. 1 in high tech, then extend it to claims that Europe is lagging because of excessive government regulation, and hints that Asians are not innovators and entrepreneurs, but mere imitators with cheap labor. This is jingoistic nonsense that could not be more wrong. Not only does Germany, a leader of the European Union, lead the U.S. in high tech, but EU member states fund CERN, the European Organization for Nuclear Research, which invented the World Wide Web and built the Large Hadron Collider, likely to be a source of several centuries of high-tech innovation. (U.S. government intervention killed America's equivalent particle physics program, the Superconducting Super Collider, in 1993 -- an early symptom of declining federal investment in basic research.) Asia, the alleged imitator, is anything but. Apple's iPhone, for example, so often held up as the epitome of American innovation, looked a lot like a Korean phone, the LG KE850, which was revealed and released before Apple's product. Most of the technology in the iPhone was invented in, and is exported by, Asian countries.

2NC Link
Extend the link -- the AFF stops creation of backdoors and perpetuates the perception that US tech is safe, which means the US regains customers and tech leadership from China -- that's Castro and McQuinn.
If the US loses its tech dominance, Chinese and Indian
innovation will quickly replace it
Fannin 13 (Rebecca Fannin, 7-12-2013, forbes magazine contributor "China Still Likely
To Take Over Tech Leadership If And When Silicon Valley Slips," Forbes,
http://www.forbes.com/sites/rebeccafannin/2013/07/12/china-still-likely-to-take-over-techleadership-if-and-when-silicon-valley-slips)

Will Silicon Valley continue to maintain its market-leading position for technology innovation? It's a question that's often pondered and debated, especially in the Valley, which has the most to lose if the emerging markets of China or India take over leadership. KPMG took a look at this question and other trends in its annual Technology Innovation Survey, and found that the center of gravity may not be shifting quite so fast to the East as once predicted. The KPMG survey of 811 technology executives globally found that one-third believe the Valley will likely lose its tech trophy to an overseas market within just four years. That percentage might seem high, but it compares with nearly half (44 percent) in last year's survey. It's a notable improvement for the Valley, as the U.S. economy and tech sector pick up. Which country will lead in disruptive breakthroughs? Here, the U.S. again solidifies its long-standing reputation as the world's tech giant while China has slipped in stature from a year ago, according to the survey. In last year's poll, the U.S. and China were tied for the top spot. But today, some 37 percent predict that the U.S. shows the most promise for tech disruptions, little surprise considering Google's strong showing in the survey as top company innovator in the world with its Google glass and driver-less cars. Meanwhile, about one-quarter pick China, which is progressing from a reputation for just copying to also innovating or micro-innovating. India, with a heritage of leadership in outsourcing, a large talent pool of engineers, ample mentoring from networking groups such as TiE, and a vibrant mobile communications market, ranked right behind the U.S. and China two years in a row. Even though China's rank slid in this year's tech innovation survey, its Silicon Dragon tech economy is still regarded as the leading challenger and most likely to replace the Valley, fueled by the market's huge, fast-growing and towering brands such as Tencent, Baidu and Alibaba, and a growing footprint overseas. KPMG partner Egidio Zarrella notes that China is innovating at an impressive speed, driven by domestic consumption for local brands that are unique to the market. China will innovate for China's sake, he observes, adding that with improved research and development capabilities, China will bridge the gap in expanding globally. For another appraisal of China's tech innovation prowess, see Forbes post detailing how Mary Meeker's annual trends report singles out the market's merits, including the fact that China leads the world for the most Internet and mobile communications users and has a tech-savvy consumer class that embraces new technologies. Besides China, it's India that shines in the KPMG survey. India scores as the second-most likely country to topple the U.S. for tech leadership. And, significantly, this emerging tiger nation ranks first on an index that measures each country's confidence in its own tech innovation abilities. Based on ten factors, India rates highest on talent, mentoring, and customer adoption of new technologies. The U.S. came in third on the confidence index, while Israel's Silicon Wadi ranked second. Israel was deemed strong in disruptive technologies, talent and technology infrastructure. The U.S. was judged strongest in tech infrastructure, access to alliances and partnerships, talent, and technology breakthroughs, and weakest in educational system and government incentives. Those weaknesses for the U.S. are points that should be underscored in America's tech clusters and in the nation's capital as future tech leadership unfolds. A second part of the comprehensive survey covering tech sectors pinpointed cloud computing and mobile communications as hardly a fad but here to stay at least for the next three years as the most disruptive technologies. Both were highlighted in the 2012 report as well. In a change from last year, however, big data and biometrics (face, voice and hand gestures that are digitally read) were identified as top sectors that will see big breakthroughs. It's a brave new tech world.

2NC Perception Link


The AFF restores trust in internet tech
Danielle Kehl et al 14, Senior Policy Analyst at New America's Open
Technology Institute. Kevin Bankston is a Policy Director at OTI, Robyn
Greene is a Policy Counsel at OTI, Robert Morgus is a Research Associate at
OTI, Surveillance Costs: The NSA's Impact on the Economy, Internet
Freedom & Cybersecurity, July 2014, pg 40-1
The U.S. government should not require or request that new surveillance capabilities or security
vulnerabilities be built into communications technologies and services, even if these are intended only to facilitate

lawful surveillance. There is a great deal of evidence that backdoors fundamentally weaken the security of
hardware and software, regardless of whether only the NSA purportedly knows about said vulnerabilities, as some
of the documents suggest. A policy statement from the Internet Engineering Task Force in 2000 emphasized that adding a requirement for wiretapping will make affected protocol designs considerably more complex. Experience has shown that complexity almost inevitably jeopardizes the security of communications. 355 More recently, a May 2013 paper from the Center for Democracy and Technology on the risks of wiretap modifications to endpoints concludes that deployment of an intercept capability in communications services, systems and applications poses serious security risks. 356 The authors add that on balance mandating that endpoint software vendors build intercept functionality into their products will be much more costly to personal, economic and governmental security overall than the risks associated with not being able to wiretap all communications. 357 While NSA programs such as SIGINT Enabling -- much like proposals from domestic law enforcement agencies to update the Communications Assistance for Law Enforcement Act (CALEA) to require digital wiretapping capabilities in modern Internet-based communications services 358 -- may aim to promote national security and law enforcement by ensuring that federal agencies have the ability to intercept Internet communications, they do so at a huge cost to online security overall. Because of the associated security risks, the U.S. government should not mandate or request the creation of surveillance backdoors in products, whether through legislation, court order, or the leveraging of industry relationships to convince companies to voluntarily insert vulnerabilities. As Bellovin et al. explain, complying with these types of requirements would also hinder innovation and impose a tax on software development in addition to creating a whole new class of vulnerabilities in hardware and software that undermines the overall security of the products. 359 An amendment offered to the NDAA for Fiscal Year 2015 (H.R. 4435) by Representatives Zoe Lofgren (D-CA) and Rush Holt (D-NJ) would have prohibited inserting these kinds of vulnerabilities outright. 360 The Lofgren-Holt proposal aimed to prevent the funding of any intelligence agency, intelligence program, or intelligence related activity that mandates or requests that a device manufacturer, software developer, or standards organization build in a backdoor to circumvent the encryption or privacy protections of its products, unless there is statutory authority to make such a mandate or request. 361 Although that measure was not adopted as part of the NDAA, a similar amendment sponsored by Lofgren along with Representatives Jim Sensenbrenner (D-WI) and Thomas Massie (R-KY), did make it into the House-approved version of the NDAA -- with the support of Internet companies and privacy organizations 362 -- passing on an overwhelming vote of 293 to 123. 363 Like Representative Grayson's amendment on NSA's consultations with NIST around encryption, it remains to be seen whether this amendment will end up in the final appropriations bill that the President signs. Nonetheless, these legislative efforts are a heartening sign and are consistent with recommendations from the President's Review Group that the U.S. government should not attempt to deliberately weaken the security of commercial encryption products. Such mandated vulnerabilities, whether required under statute or by court order or inserted simply by request, unduly threaten innovation in secure Internet technologies while introducing security flaws that may be exploited by a variety of bad actors. A clear policy against such vulnerability mandates is necessary to restore international trust in U.S. companies and technologies.

Policies such as the Secure Data Act are perceived as strengthening security
Castro and McQuinn 15
(Castro, Daniel and McQuinn, Alan. Information Technology and Innovation Foundation. The
Information Technology and Innovation Foundation (ITIF) is a Washington, D.C.-based think tank at the
cutting edge of designing innovation strategies and technology policies to create economic
opportunities and improve quality of life in the United States and around the world. Founded in 2006,
ITIF is a 501(c) 3 nonprofit, non-partisan organization that documents the beneficial role technology
plays in our lives and provides pragmatic ideas for improving technology-driven productivity, boosting
competitiveness, and meeting todays global challenges through innovation. Daniel Castro is the vice
president of the Information Technology and Innovation Foundation. His research interests include
health IT, data privacy, e-commerce, e-government, electronic voting, information security, and
accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability
Office (GAO) where he audited IT security and management controls at various government agencies.

He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security
Technology and Management from Carnegie Mellon University. Alan McQuinn is a research assistant
with the Information Technology and Innovation Foundation. Prior to joining ITIF, Mr. McQuinn was a
telecommunications fellow for Congresswoman Anna Eshoo and an intern for the Federal
Communications Commission in the Office of Legislative Affairs. He got his B.S. in Political
Communications and Public Relations from the University of Texas at Austin. Beyond the USA
Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness, ITIF. June 2015.
http://www2.itif.org/2015-beyond-usa-freedom-act.pdf//ghs-kw)

Second, the U.S. government should draw a clear line in the sand and declare that the policy of the U.S. government is to strengthen not weaken information security. The U.S. Congress should pass legislation, such as the Secure Data Act introduced by Sen. Wyden (D-OR), banning any government efforts to introduce backdoors in software or weaken encryption.43 In the short term, President Obama, or his successor, should sign an executive order formalizing this policy as well. In addition, when U.S. government agencies discover vulnerabilities in software or hardware products, they should responsibly notify these companies in a timely manner so that the companies can fix these flaws. The best way to protect U.S. citizens from digital threats is to promote strong cybersecurity practices in the private sector.

2NC Chinese Markets Link


Domestic markets are key to Chinese tech -- plan steals Chinese market share
Lohr 12/2
(Steve Lohr. "In 2015, Technology Shifts Accelerate and China
Rules, IDC Predicts," NYT. 12-2-2014.
http://bits.blogs.nytimes.com/2014/12/02/in-2015-technologyshifts-accelerate-and-china-rules-idc-predicts///ghs-kw)
Beyond the detail, a couple of larger themes stand out. First is China. Most of the reporting and commentary recently on the Chinese economy has been about its slowing growth and challenges. In information technology, it's just the opposite, Frank Gens, IDC's chief analyst, said in an interview. China has a roaring domestic market in technology. In 2015, IDC estimates that nearly 500 million smartphones will be sold in China, three times the number sold in the United States and about one third of global sales. Roughly 85 percent of the smartphones sold in China will be made by its domestic producers like Lenovo, Xiaomi, Huawei, ZTE and Coolpad. The rising prowess of China's homegrown smartphone makers will make it tougher on outsiders, as Samsung's slowing growth and profits recently reflect. More than 680 million people in China will be online next year, or 2.5 times the number in the United States. And the China numbers are poised to grow further, helped by its national initiative, the Broadband China Project,

intended to give 95 percent of the country's urban population access to high-speed broadband networks. In all,

China's spending on information and communications technology will be more than


$465 billion in 2015, a growth rate of 11 percent. The expansion of the China tech
market will account for 43 percent of tech-sector growth worldwide.

The Chinese market is key to Chinese tech growth


Mozur 1/28
(Paul Mozur. Reporter for the NYT. "New Rules in China Upset Western Tech Companies," New York
Times. 1-28-2015. http://www.nytimes.com/2015/01/29/technology/in-china-new-cybersecurity-rulesperturb-western-tech-companies.html//ghs-kw)

Mr. Yao said 90 percent of high-end servers and mainframes in China were still produced by multinationals. Still, Chinese companies are catching up at the lower end. For all enterprise hardware, local brands represented 21.3 percent revenue share in 2010 in P.R.C. market and we expect in 2014 that number will reach 43.1 percent, he said, using the abbreviation for the People's Republic of China. That's a huge jump.

Chinese tech is key to the global industry


Lohr 12/2
(Steve Lohr. "In 2015, Technology Shifts Accelerate and China
Rules, IDC Predicts," NYT. 12-2-2014.
http://bits.blogs.nytimes.com/2014/12/02/in-2015-technologyshifts-accelerate-and-china-rules-idc-predicts///ghs-kw)

Beyond the detail, a couple of larger themes stand out. First is China. Most of the reporting and commentary recently on the Chinese economy has been about its slowing growth and challenges. In information technology, it's just the opposite, Frank Gens, IDC's chief analyst, said in an interview. China has a roaring domestic market in technology. In 2015, IDC estimates that nearly 500 million smartphones will be sold in China, three times the number sold in the United States and about one third of global sales. Roughly 85 percent of the smartphones sold in China will be made by its domestic producers like Lenovo, Xiaomi, Huawei, ZTE and Coolpad. The rising prowess of China's homegrown smartphone makers will make it tougher on outsiders, as Samsung's slowing growth and profits recently reflect. More than 680 million people in China will be online next year, or 2.5 times the number in the United States. And the China numbers are poised to grow further, helped by its national initiative, the Broadband China Project, intended to give 95 percent of the country's urban population access to high-speed broadband networks. In all, China's spending on information and communications technology will be more than $465 billion in 2015, a growth rate of 11 percent. The expansion of the China tech market will account for 43 percent of tech-sector growth worldwide.

2NC Tech K2 China Growth


Tech is key to Chinese growth
Xinhua 7/24
(Xinhua. Major Chinese news agency. "Industrial profits decline while high-tech sector shines in China," WCT. 7-24-2015. http://www.wantchinatimes.com/news-subclass-cnt.aspx?id=20150328000036&cid=1102//ghs-kw)

Driven by the country's restructuring efforts amid the economic "new normal" of slow but quality growth, China's high-tech industry flourished with the value-added output of the high-tech sector growing 12.3% year-on-year in 2014. The high-tech industry accounted for 10.6% of the country's overall industrial value-added output in 2014, which rose 7% from 2013 to 22.8 trillion yuan (US$3.71 trillion). The fast expansion of the high-tech and modern service industries shows China's economy is advancing to the "middle and high end," said Xie Hongguang, deputy chief of the NBS. China should work toward greater investment in "soft infrastructure" -- like innovation -- instead of "hard infrastructure" to climb the global value chain, said Zhang Monan, an expert with the China Center for International Economic Exchanges. Indeed, boosting innovation has been put at the top of the government's agenda as China has pledged to boost the implementation of the "Made in China 2025" strategy, which will upgrade the manufacturing sector and help the country achieve a medium-high level of economic growth.

China transitioning to tech-based economy


van Wyk 10
(Barry van Wyk. "Upstart: China's emergence in technology and innovation," The Beijing Axis. First published: May 27, 2010. Last updated: June 3, 2010.)

Significant progress has already been achieved with the MLP, and it is not hard to identify signs
of China's rapidly improving innovative abilities. GERD increased to 1.54 per cent in 2008 from 0.57 per cent in 1995. Occurring at a time when its GDP was growing exceptionally fast, China's GERD now
ranks behind only the US and Japan. The number of triadic patents (granted in all three of the major

patent offices in the US, Japan and Europe) granted to China remains relatively small, reaching 433 in 2005
(compared to 652 for Sweden and 3,158 for Korea), yet Chinese patent applications are increasing rapidly.
Chinese patent applications to the World Intellectual Property Office (WIPO), for example, increased by 44 per
cent in 2005 and by a further 57 per cent in 2006. From a total of about 20,000 in 1998, China's output
of scientific papers has increased fourfold to about 112,000 as of 2008, moving China to second
place in the global rankings, behind only the US. In the period 2004 to 2008, China produced about
400,000 papers, with the major focus areas being material science, chemistry, physics, mathematics and
engineering, but new fields like biological and medical science also gaining prominence.

China transitioning now


Trends in China's Transition toward a Knowledge Economy. Authors: Adam Segal, Ira A. Lipman Senior Fellow for Counterterrorism and National Security Studies, and Ernest J. Wilson III. January/February 2006. Asian Survey. http://www.cfr.org/publication/9924/trends_in_chinas_transition_toward_a_knowledge_economy.html
During the past decade, China has arguably placed more importance on reforming and modernizing
its information and communication technology (ICT) sector than any other developing country in the
world. Under former Premier Zhu Rongji, the Chinese leadership was strongly committed to making ICT
central to its national goals -- from transforming Chinese society at home to pursuing its ambitions as a world
economic and political power. In one of his final speeches, delivered at the first session of the 10th National
People's Congress in 2003, Zhu implored his successors to energetically promote information
technology (IT) applications and use IT to propel and accelerate industrialization so that the
Chinese Communist Party (CCP) can continue to build a well-off society.1

2NC Global Econ I/L


China economic crash goes global -- outweighs the US and disproves resiliency empirics
Pesek 14
(Writer for Bloomberg, an edited economic publication What to Fear If China Crashes, Bloomberg
View, http://www.bloombergview.com/articles/2014-07-16/what-to-fear-if-china-crashes)

Few moments in modern financial history were scarier than the week of Sept. 15, 2008, when first Lehman Brothers and then American International Group collapsed. Who could forget the cratering stock markets, panicky bailout negotiations, rampant foreclosures, depressing job losses and decimated retirement accounts -- not to mention the discouraging recovery since then? Yet a Chinese crash might make 2008 look like a garden party. As the risks of one increase, it's worth exploring how it might look. After all, China is now the world's biggest trading nation, the second-biggest economy and holder of some $4 trillion of foreign-currency reserves. If China does experience a true credit crisis, it would be felt around the world. "The example of how the global financial crisis began in one poorly-understood financial market and spread dramatically from there illustrates the capacity for misjudging contagion risk," Adam Slater wrote in a July 14 Oxford Economics report. Lehman and AIG, remember, were just two financial firms out of dozens. Opaque dealings and off-balance-sheet investment vehicles made it virtually impossible even for the managers of those companies to understand their vulnerabilities -- and those of the broader financial system. The term "shadow banking system" soon became shorthand for potential instability and contagion risk in world markets. Well, China is that and more. China surpassed

Japan in 2011 in gross domestic product and it's gaining on the U.S. Some World Bank researchers even think China
is already on the verge of becoming No. 1 (I'm skeptical). China's world-trade weighting has doubled in the last
decade. But the real explosion has been in the financial sector. Since 2008, Chinese stock valuations surged from
$1.8 trillion to $3.8 trillion and bank-balance sheets and the money supply jumped accordingly. China's broad
measure of money has surged by an incredible $12.5 trillion since 2008 to roughly match the U.S.'s monetary stock.
This enormous money buildup fed untold amounts of private-sector debt along with public-sector institutions. Its
scale, speed and opacity are fueling genuine concerns about a bad-loan meltdown in an economy that's 2 1/2 times
bigger than Germany's. If that happens, at a minimum it would torch China's property markets and could take down
systemically important parts of Hong Kong's banking system. The reverberations probably wouldn't stop there,
however, and would hit resource-dependent Australia, batter trade-driven economies Japan, Singapore, South Korea
and Taiwan and whack prices of everything from oil and steel to gold and corn. "China's importance for the world economy and the rapid growth of its financial system, mean that there are widespread concerns that a financial crisis in China would also turn into a global crisis," says
London-based Slater. "A bad asset problem on this scale would dwarf that seen in the major emerging financial
crises seen in Russia and Argentina in 1998 and 2001, and also be more severe than the Japanese bad loan problem
of the 1990s." Such risks belie President Xi Jinping's insistence that China's financial reform process is a domestic
affair, subject neither to input nor scrutiny by the rest of the world. That's not the case. Just like the Chinese
pollution that darkens Asian skies and contributes to climate change, China's financial vulnerability is a global
problem. U.S. President Barack Obama made that clear enough in a May interview with National Public Radio. "We welcome China's peaceful rise," he said. "In many ways, it would be a bigger national security problem for us if China started falling apart at the seams." China's ascent obviously preoccupies the White House as it thwarts U.S. foreign-policy objectives, taunts Japan and other nations with territorial claims in the Pacific and casts aspersions on America's moral leadership. But China's frailty has to be on the minds of U.S. policy makers, too. The potential for things careening out of control in China are real. What worries bears such as Patrick Chovanec of Silvercrest Asset Management in New York, is China's unaltered obsession with building
such as Patrick Chovanec of Silvercrest Asset Management in New York, is Chinas unaltered obsession with building
the equivalent of new Manhattans almost overnight even as the nation's financial system shows signs of buckling.
As policy makers in Beijing generate even more credit to keep bubbles from bursting, the shadow banking system
continues to grow. The longer China delays its reckoning, the worse it might be for China -- and perhaps the rest of
us.

CCP collapse causes the second Great Depression


BHANDARI. 10.
Maya. Head of Emerging Markets Analysis, Lombard Street Research. If the
Chinese Bubble Bursts THE INTERNATIONAL ECONOMY.
http://www.international-economy.com/TIE_F10_ChinaBubbleSymp.pdf
The latest financial crisis proved the central role of China in driving global economic outcomes. China is the chief overseas surplus country corresponding to the U.S. deficit, and it was excess ex ante Chinese savings which prompted ex post U.S. dis-saving. The massive ensuing build-up of debt triggered a Great Recession almost as bad as the Great Depression. This causal direction, from excess saving to excess spending, is confirmed by low global real interest rates through much of the Goldilocks period. Had over-borrowing been the cause rather than effect, then real interest rates would have been bid up to attract the required capital. A prospective hard landing in China might thus be expected to have serious global implications. The Chinese economy did slow sharply over the last eighteen months, but only briefly, as large-scale behind-the-scenes stimulus meant that it quickly returned to overheating. Given its 9-10 percent "trend" growth rate, and 30 percent import ratio, China is nearly twice as powerful a global growth locomotive as the United States, based on its implied import gain. So while the surrounding export hubs, whose growth prospects are a "second derivative" of what transpires in China, would suffer most directly from Chinese slowing, the knock to global growth would be significant. Voracious Chinese demand has also been a crucial driver of global commodity prices, particularly metals and oil, so they too may face a hard landing if Chinese demand dries up.

CCP collapse deals a massive deflationary shock to the world.


ZHAO. 10.
Chen. Chief Global Strategist and Managing Editor for Global Investment
Strategy, BCA Research Group. If the Chinese Bubble Bursts THE
INTERNATIONAL ECONOMY. http://www.internationaleconomy.com/TIE_F10_ChinaBubbleSymp.pdf
At the onset, I believe the odds of a China asset bubble bursting are very low. It is difficult to argue that Chinese asset markets, particularly real estate, are indeed already in a "bubble." Property prices in tier two and tier three cities are actually quite cheap, but for purposes of discussion, there is always the danger that asset values could get massively inflated over the next few years. If so, a crash would be inevitable. In fact, China experienced a devastating real estate meltdown and "growth recession" in 1993-94, when then-premier Zhu Rongji initiated a credit crackdown to rein in spreading inflation and real estate speculation. Property prices in major cities dropped by over 40 percent and private sector GDP growth dropped to 3 percent from double-digit levels. Non-performing loans soared to 30 percent of total banking sector assets. It took more than seven years for the government to clean up the financial mess and recapitalize the banking system. If another episode of a bursting asset bubble were to happen in China, the damage to the banking sector could be rather severe. History has repeatedly shown that credit inflation begets asset bubbles and, almost by definition, a bursting asset bubble always leads to a banking crisis and severe credit contraction. In China's case, bank credit is the lifeline for large state-owned companies, and a credit crunch could choke off growth of these enterprises quickly. The big difference between today's situation and the early 1990s, however, is that the Chinese authorities have accumulated vast reserves. China also runs a huge current account surplus. In the early 1990s, China's reserves had dwindled to almost nothing and the current account was in massive deficit, as a real estate meltdown led to a collapse in the Chinese currency in 1992-93. In other words, Beijing today has a lot of resources at its disposal to stimulate the economy or to recapitalize the banking system, whenever necessary. Therefore, the impact of a bursting bubble on growth could be very sharp and even severe, but it would be short-lived because of support from public sector spending. A bursting China bubble would also be felt acutely in commodity prices. The commodity story has been built around the China story. Naturally, a bursting China bubble would deal a devastating blow to the commodities as well as commodity producers such as Latin America, Australia, and Canada, among others. Asia as a whole, and Japan in particular, would also be acutely affected by a "growth recession" in China. The economic integration between China and the rest of Asia is well-documented but it is important to note that there has been virtually no domestic spending in Japan in recent years and the country's economic growth has been leveraged almost entirely on exports to China. A bursting China bubble could seriously impair Japan's economic and asset market performance. Finally, a bursting China bubble would be a massive deflationary shock to the world economy. With China in growth recession, global saving excesses could surge and world aggregate demand would be vastly deficient. Bond yields could move to new lows and stocks would drop, probably precipitously -- in short, investors would face very bleak and frightening prospects.

2NC US Econ I/L


Chinese growth turns the case --- strong Chinese technological
power forms linkages with US companies --- drives growth of
US companies
NRC 10 National Research Council The Dragon and the Elephant: Understanding the Development of
Innovation Capacity in China and India: Summary of a Conference www.nap.edu/openbook.php?
record_id=12873&page=13

Wadhwa found in his surveys that companies go offshore for reasons of cost and where the
markets are. Meanwhile, Asian immigrants are driving enterprise growth in the United States. Twenty-five
percent of technology and engineering firms launched in the last decade and 52% of Silicon Valley startups
had immigrant founders. Indian immigrants accounted for one-quarter of these. Among America's new immigrant entrepreneurs, more than 74 percent have a master's or a PhD degree. Yet the backlog of U.S.
immigration applications puts this stream of talent in limbo. One million skilled immigrants are
waiting for the annual quota of 120,000 visas, with caps of 8,400 per country. This is causing a reverse

brain drain from the United States back to countries of origin, the majority to India and China.
This endangers U.S. innovation and economic growth. There is a high likelihood, however, that
returning skilled talent will create new linkages to U.S. companies , as they are
doing within General Electric, IBM, and other companies. Jai Menon of IBM Corporation began his
survey of IBM's view of global talent recruitment by suggesting that IBM pursues growth of its operations
as a global entity. There are 372,000 IBMers in 172 countries; 123,000 of these are in the Asia-Pacific region.
Eighty percent of the firms R&D activity is still based in the United States. IBM supports open standards
development and networked business models to facilitate global collaboration. Three factors drive the firms
decisions on staff placement and location of recruitment -- economics, skills and environment. IBM India has
grown its staff tenfold in five years; its $6 billion investment in three years represents a tripling of resources in
people, infrastructure and capital. Increasingly, as Vivek Wadhwa suggested, people get degrees in the United
States and return to India for their first jobs. IBM follows a comparable approach in China, with
10,000+ IBM employees involved in R&D, services and sales. In 2006, for the first time the number
of service workers overtook the number of agricultural laborers worldwide. Thus the needs of a service
economy comprise an issue looming for world leaders.

CCP collapse hurts US economy


Karabell 13
(Zachary. American author, historian, money manager and economist. Karabell is President of River
Twice Research, where he analyzes economic and political trends. He is also a Senior Advisor for
Business for Social Responsibility. Previously, he was Executive Vice President, Head of Marketing and
Chief Economist at Fred Alger Management, a New York-based investment firm, and President of Fred
Alger and Company, as well as Portfolio Manager of the China-US Growth Fund, which won both a
Lipper Award for top performance and a 5-star designation from Morningstar, Inc.. He was also
Executive Vice President of Alger's Spectra Funds, a no-load family of mutual funds that launched the
$30 million Spectra Green Fund, which was based on the idea that profit and sustainability are linked.
At Alger, he oversaw the creation, launch and marketing of several funds, led corporate strategy for
acquisitions, and represented the firm at public forums and in the media. Educated at Columbia,
Oxford, and Harvard, where he received his Ph.D., he is the author of several books. The U.S. can't afford a Chinese economic collapse. Reuters. http://blogs.reuters.com/edgy-optimist/2013/03/07/theu-s-cant-afford-a-chinese-economic-collapse/)
Is China about to collapse? That question has been front and center in the past weeks as the country completes its
leadership transition and after the exposure of its various real estate bubbles during a widely watched 60 Minutes
exposé this past weekend. Concerns about soaring property prices throughout China are hardly new, but they have
been given added weight by the government itself. Recognizing that a rapid implosion of the property market would
disrupt economic growth, the central government recently announced far-reaching measures designed to dent the
rampant speculation. Higher down payments, limiting the purchases of investment properties, and a capital gains
tax on real estate transactions designed to make flipping properties less lucrative were included. These measures,
in conjunction with the new governments announcing more modest growth targets of 7.5 percent a year, sent
Chinese equities plunging and led to a slew of commentary in the United States saying China would be the next
shoe to drop in the global system. Yet there is more here than simple alarm over the viability of Chinas economic
growth. There is the not-so-veiled undercurrent of rooting against China. It is difficult to find someone who explicitly
wants it to collapse, but the tone of much of the discourse suggests bloodlust. Given that China largely escaped the
crises that so afflicted the United States and the eurozone, the desire to see it stumble may be understandable. No

one really likes a global winner if that winner isn't you. The need to see China fail verges on jingoism. Americans
distrust the Chinese model, find that its business practices verge on the immoral and illegal, that its reporting and
accounting standards are sub-par at best and that its system is one of crony capitalism run by crony communists.
On Wall Street, the presumption usually seems to be that any Chinese company is a ponzi scheme masquerading as
a viable business. In various conversations and debates, I have rarely heard China's economic model mentioned without disdain. Take, as just one example, Gordon Chang in Forbes: Beijing's technocrats can postpone a

reckoning, but they have not repealed the laws of economics. There will be a crash. The consequences of a Chinese collapse, however, would be severe for the United States and for the world. There could be no major Chinese contraction without a concomitant contraction in the United States. That would mean sharply curtailed Chinese purchases of U.S. Treasury bonds, far less revenue for companies like General Motors, Nike, KFC and Apple that have robust business in China (Apple made $6.83 billion in the fourth quarter of 2012, up from $4.08 billion a year prior), and far fewer Chinese imports of high-end goods from American and Asian companies. It would also mean a collapse of Chinese imports of materials such as copper, which would in turn harm economic growth in emerging countries that continue to be a prime market for American, Asian and European goods. China is now the world's second-largest economy, and property booms have been one aspect of its growth. Individual Chinese cannot invest outside of the country, and the limited options of China's stock exchanges and almost nonexistent bond market mean that if
you are middle class and want to do more than keep your money in cash or low-yielding bank accounts, you buy
either luxury goods or apartments. That has meant a series of property bubbles over the past decade and a series
of measures by state and local officials to contain them. These recent measures are hardly the first, and they are
not likely to be the last. The past 10 years have seen wild swings in property prices, and as recently as 2011 the
government took steps to cool them; the number of transactions plummeted and prices slumped in hot markets like
Shanghai as much as 30, 40 and even 50 percent. You could go back year by year in the 2000s and see similar
bubbles forming and popping, as the government reacted to sharp run-ups with restrictions and then eased them
when the pendulum threatened to swing too far. China has had a series of property bubbles and a series of property
busts. It has also had massive urbanization that in time has absorbed the excess supply generated by massive
development. Today much of that supply is priced far above what workers flooding into Chinas cities can afford. But
that has always been true, and that housing has in time been purchased and used by Chinese families who are
moving up the income spectrum, much as U.S. suburbs evolved in the second half of the 20th century. More to the
point, all property bubbles are not created equal. The housing bubbles in the United States and Spain, for instance,
would never have been so disruptive without the massive amount of debt and the financial instruments and
derivatives based on them. A bursting housing bubble absent those would have been a hit to growth but not a
systemic crisis. In China, most buyers pay cash, and there is no derivative market around mortgages (at most
there's a small shadow market). Yes, there are all sorts of unofficial transactions with high-interest loans, but even
there, the consequences of busts are not the same as they were in the United States and Europe in recent years.
Two issues converge whenever China is discussed in the United States: fear of the next global crisis, and distrust
and dislike of the country. Concern is fine; we should always be attentive to possible risks. But Chinas property
bubbles are an unlikely risk, because of the absence of derivatives and because the central government is clearly
alert to the markets behavior. Suspicion and antipathy, however, are not constructive. They speak to the ongoing
difficulty China poses to Americans sense of global economic dominance and to the belief in the superiority of freemarket capitalism to Chinas state-managed capitalism. The U.S. system may prove to be more resilient over time;

Its success does not require Chinas failure, nor will


Chinas success invalidate the American model. For our own self-interest we should
be rooting for their efforts, and not jingoistically wishing for them to fail.
it has certainly proven successful to date.

2NC Impact UQ
Latest data show the Chinese economy is growing now -- ignore stock market claims, which don't accurately reflect economic fundamentals
Miller and Charney 7/15
(Miller, Leland R. and Charney, Craig. Mr. Miller is president and Mr. Charney is research director of
China Beige Book International, a private economic survey. China's Economy Is Recovering, Wall Street Journal, 7/15/2015. http://www.wsj.com/articles/chinas-economy-is-recovering-1436979092//ghs-kw)

China released second-quarter statistics Wednesday that showed the economy growing at 7%, the same real rate as the first quarter but with stronger nominal growth. That result, higher than expected and coming just after a stock-market panic, surprised some commentators and even aroused suspicion that the government cooked the numbers for political reasons. While official data is indeed unreliable, our firm's latest research confirms that the Chinese economy is improving after several disappointing quarters -- just not for the reasons given by Beijing. The China Beige Book (CBB), a private survey of more than 2,000 Chinese firms each quarter, frequently anticipates the official story. We documented the 2012 property rebound, the 2013 interbank credit crunch and the 2014 slowdown in capital expenditure before any of them showed up in official statistics. The modest but broad-based improvement in the Chinese economy that we tracked in the second quarter may seem at odds with the headlines of carnage in the country's financial markets. But stock prices in China have almost nothing to do with the economy's fundamentals. Our data show sales revenue, capital expenditure, new domestic orders, hiring, wages and profits were all better in the second quarter, making the improvement unmistakable -- albeit not outstanding in any one category. In the labor market, both employment and wage growth strengthened, and prospects for hiring look stable. This is not new: Our data have shown the labor market remarkably steady over the past year, despite the economy's overall deceleration. Inflation data are also a reason for optimism. Along with wages, input costs and sales prices grew faster in the second quarter. The rate is still slower than a year ago, but at least this is a break from the previously unstoppable tide of price deterioration. While it is just one quarter, our data suggest deflation may have peaked. With the explosive stock market run-up occupying all but the final weeks of the quarter, it might seem reasonable to conclude that this rally was the impetus behind the better results. Not so. Of all our indicators, capital expenditure should have responded most positively to a boom in equities prices, but the uptick was barely noticeable. The strength of the second-quarter performance is instead found in widespread expanding sales volumes, which firms were able to accomplish without sacrificing profit margins. The fact that stronger sales, rather than greater investment, was the driving force this quarter is itself an encouraging sign in light of China's longstanding problem of excess investment and inadequate consumption. These gains also track across sectors, highlighted by a welcome resurgence in both property and retail. Property saw its strongest results in 18 months, buoyed by stronger commercial and residential realty as well as transportation construction. Six of our eight regions were better than last quarter, led by the Southwest and North. The results were also an improvement over the second quarter of last year, if somewhat less so, with residential construction the sector's major remaining black eye. Retailers, meanwhile, reported a second consecutive quarter of improvement, both on-quarter and on-year, with growth accelerating. For the first time in 18 months, the retail sector also had faster growth than manufacturing, underscoring the danger of treating manufacturing as the bellwether for the economy.

China's economy is stabilizing now but it's fragile


AFP and Reuters 7/15

(Agence France-Presse and Reuters on Deutsche Welle. "China beats expectations on economic
growth," DW. 07-15-2015. http://www.dw.com/en/china-beats-expectations-on-economic-growth/a18584453//ghs-kw)
Slowing growth in key areas like foreign trade, state investment and domestic demand had prompted economists to predict a year-on-year GDP increase of just under 7 percent for the April-June quarter. The figure, released by the National Bureau of Statistics (NBS) on Wednesday, matched first-quarter growth in China exactly. The government has officially set 7 percent as its target for GDP growth this year. "We are aware that the domestic and external economic conditions are still complicated, the global economic recovery is slow and tortuous and the foundation for the stabilization of China's economy needs to be further consolidated," NBS spokesman Sheng Laiyun told reporters. However, "the major indicators of the second quarter showed that the growth was stabilized and ready to pick up, the economy developed with positive changes and the vitality of the economic development was strengthened," Sheng added. Industrial output, including production at factories, workshops and mines also rose by 6.8 percent in June compared to 6.1 percent in May, the NBS said. Tough transition, stock market fluctuating: The robust growth comes despite a difficult economic year for China. Figures released on Monday showed a dip in foreign trade in the first half of the year - with exports up slightly but imports well down. Public investment, for years the driver of double-digit percentage growth in China, is down as the government seeks to rely more on consumer demand - itself slow to pick up. In recent weeks, the Shanghai stock market has been falling sharply, albeit after a huge boom in months leading up to the crash.

Surveys prove China is experiencing growth now


Reuters 6/23
(Reuters. "Chinas Economy Appears to Be Stabilizing, Reports Show," International New York Times. 623-2015. http://www.nytimes.com/2015/06/24/business/international/chinas-economy-appears-to-bestabilizing-reports-show.html//ghs-kw)

SHANGHAI -- China's factory activity showed signs of stabilizing in June, with two nongovernment surveys suggesting that the economy might be regaining some momentum, while many analysts expected further policy support to ensure a more sure-footed recovery. The preliminary purchasing managers index for China published by HSBC and compiled by Markit, a data analysis firm, edged up to 49.6 in June. It was the survey's highest level in three months but still below the 50 mark, which would have pointed to an expansion. The final reading for May was 49.2. The pickup in new orders -- which returned to positive territory at 50.3 in June -- was driven by a strong rise in the new export orders subcomponent, suggesting that foreign demand may finally be turning a corner, Capital Economics analysts wrote in a research note. Today's P.M.I. reading reinforces our view that the economy has started to find its footing. But companies stepped up layoffs, the survey showed, shedding jobs at the fastest pace in more than six years. Annabel Fiddes, an economist at Markit, said: Manufacturers continued to cut staff. This suggests companies have relatively muted growth expectations. She said that she expected Beijing to step up their efforts to stimulate growth and job creation. A much rosier picture was painted by a separate survey, a quarterly report by China Beige Book International, a data analysis firm, describing a broad-based recovery in the second quarter, led primarily by China's interior provinces. Among major sectors, two developments stand out: a welcome resurgence in retail -- which saw rising revenue growth despite a slip in prices -- and a broad-based rebound in property, said the report's authors, Leland Miller and Craig Charney. Manufacturing, services, real estate, agriculture and mining all had year-on-year and quarterly gains, they said.

2NC US Heg I/L


Chinese growth is key to US hegemony

Yiwei 07 Wang Yiwei, Center for American Studies @ Fudan University, China's Rise: An Unlikely Pillar of US Hegemony, Harvard International Review, Volume 29, Issue 1, Spring 2007, pp. 60-63.

Chinas rise is taking place in this context. That is to say, Chinese development is merely one facet of Asian
and developing states economic progress in general. Historically, the United States has provided the
dominant development paradigm for the world. But today, China has come up with development strategies
that are different from that of any other nation-state in history and are a consequence of the global migration
of industry along comparative advantage lines. Presently, the movement of light industry and consumer goods
production from advanced industrialized countries to China is nearly complete, but heavy industry is only
beginning to move. Developed countries dependence on China will be far more pronounced

following this movement. As global production migrates to China and other developing
countries, a feedback loop will emerge and indeed is already beginning to emerge. Where
globalization was once an engine fueled by Western muscle and steered by Western policy,
there is now more gas in the tank but there are also more hands on the steering wheel. In the
past, developing countries were often in a position only to respond to globalization, but now, developed
countries must respond as well. Previously the United States believed that globalization was synonymous with
Americanization, but todays world has witnessed a United States that is feeling the influence of the world as
well. In the past, a sneeze on Wall Street was followed by a downturn in world markets. But in February 2007,
Chinese stocks fell sharply and Wall Street responded with its steepest decline in several years. In this way,
the whirlpool of globalization is no longer spinning in one direction. Rather, it is generating feedback
mechanisms and is widening into an ellipse with two focal points: one located in the United States, the
historical leader of the developed world, and one in the China, the strongest country in the new developing
world power bloc. Combating Regionalization It is important to extend the discussion beyond platitudes
regarding US decline or the rise of China and the invective-laden debate over threats and security issues
that arises from these. We must step out of a narrowly national mindset and reconsider what Chinese
development means for the United States. One of the consequences of globalization has been that

countries such as China, which depend on exporting to US markets, have accumulated large
dollar reserves. This has been unavoidable for these countries, as they must purchase dollars in order to
keep the dollar strong and thus avoid massive losses. Thus, the United States is bound to bear a trade
deficit, and moreover, this deficit is inextricably tied to the dollars hegemony in todays
markets. The artificially high dollar and the US economy at large depend in a very real sense
on Chinas investment in the dollar. Low US inflation and interest rates similarly depend on
the thousands of Made in China labels distributed across the United States. As Paul Krugman
wrote in The New York Times, the situation is comparable to one in which the American sells the house but
the money to buy the house comes from China. Former US treasury secretary Lawrence Summers even
affirms that China and the United States may be in a kind of imprudent balance of financial terror. Today,
the US trade deficit with China is US$200 billion. China holds over US$1 trillion in foreign exchange reserves
and US$350 billion in US bonds. Together, the Chinese and US economies account for half of global economic
growth. Thus, a fantastic situation has arisen: Chinas rise is actually supporting US hegemony. Taking US

hegemony and Western preeminence as the starting point, many have concluded that the
rise of China presents a threat. The premise of this logic is that the international system predicated on
US hegemony and Western preeminence would be destabilized by the rise of a second major power. But this
view is inconsistent with the phenomenon of one-way globalization. The so-called process of
one-way globalization can more truly be called Westernization. Todays globalization is still in
large part driven by the West, inasmuch as it is tinged by Western unilateralism and entails the
dissemination of essentially Western standards and ideology. For example, Coca Cola has become a
Chinese cultural icon, Louis Vuitton stores crowd high-end shopping districts in Shanghai, and, as gender
equality progresses, Chinese women look to Western women for inspiration. In contrast, Haier,
the best-known Chinese brand in the United States, is still relatively unknown, and Wang Fei, who is widely
regarded in China as the pop star who was able to make it in the United States, has less name-recognition
there than a first-round American Idol cut.

2NC Growth Impacts


Chinese growth prevents global economic collapse, war over
Taiwan and CCP collapse
Lewis 08 [Dan, Research Director Economic Research Council, The
Nightmare of a Chinese Economic Collapse, World Finance, 5/13,
http://www.worldfinance.com/news/home/finalbell/article117.html]
In 2001, Gordon Chang authored a global bestseller "The Coming Collapse of China." To suggest that the worlds
largest nation of 1.3 billion people is on the brink of collapse is understandably for many, a deeply unnerving
theme. And many seasoned China Hands rejected Changs thesis outright. In a very real sense, they were of
course right. China's expansion has continued over the last six years without a hitch. After
notching up a staggering 10.7 percent growth last year, it is now the 4th largest economy in the world with a
nominal GDP of $2.68trn. Yet there are two Chinas that concern us here; the 800 million who live in the cities,
coastal and southern regions and the 500 million who live in the countryside and are mainly engaged in agriculture.
The latter which we in the West hear very little about are still very poor and much less happy. Their poverty and
misery do not necessarily spell an impending cataclysm after all, that is how they have always have been. But it
does illustrate the inequity of Chinese monetary policy. For many years, the Chinese yen has been held at an
artificially low value to boost manufacturing exports. This has clearly worked for one side of the economy, but not
for the purchasing power of consumers and the rural poor, some of who are getting even poorer. The central reason
for this has been the inability of Chinese monetary policy to adequately support both Chinas. Meanwhile, rural

unrest in China is on the rise fuelled not only by an accelerating income gap with the
coastal cities, but by an oft-reported appropriation of their land for little or no compensation
by the state. According to Professor David B. Smith, one of the Citys most accurate and respected economists in
recent years, potentially far more serious though is the impact that Chinese monetary policy could have on many
Western nations such as the UK. Quite simply, Chinas undervalued currency has enabled Western governments to
maintain artificially strong currencies, reduce inflation and keep interest rates lower than they might otherwise be.
We should therefore be very worried about how vulnerable Western economic growth is to an upward revaluation of
the Chinese yuan. Should that revaluation happen to appease Chinas rural poor, at a stroke, the dollar, sterling and
the euro would quickly depreciate, rates in those currencies would have to rise substantially and the yield on
government bonds would follow suit. This would add greatly to the debt servicing cost of budget deficits in the USA,
the UK and much of euro land. A reduction in demand for imported Chinese goods would quickly entail a decline in
China's economic growth rate. That is alarming. It has been calculated that to keep China's society stable - ie to manage the transition from a rural to an urban society without devastating unemployment - the minimum growth rate is 7.2 percent. Anything less than that and unemployment will rise and the massive shift in population from the country to the cities becomes unsustainable. This is when real discontent with communist party rule becomes vocal and hard to ignore. It doesn't end there. That will at best bring a global recession. The crucial point is that communist authoritarian states have at least had some success in keeping a lid on ethnic tensions so far. But when multi-ethnic communist countries fall apart from economic stress and the implosion of central power, history suggests that they don't become successful democracies overnight. Far from it. There's a very real chance that China might go the way of Yugoslavia or the Soviet Union - chaos, civil unrest and internecine war. In the very worst case scenario, a Chinese government might seek to maintain national cohesion by going to war with Taiwan, whom America is pledged to defend.

Chinese economic growth prevents global nuclear war


Kaminski 7 (Antoni Z., Professor Institute of Political Studies, World
Order: The Mechanics of Threats (Central European Perspective), Polish
Quarterly of International Affairs, 1, p. 58)
As already argued, the economic advance of China has taken place with relatively few corresponding changes in the
political system, although the operation of political and economic institutions has seen some major changes. Still,
tools are missing that would allow the establishment of political and legal foundations for the modern economy, or they are too weak. The tools are efficient public administration, the rule of law, clearly defined ownership rights, efficient banking system, etc. For these reasons, many experts fear an economic crisis in China. Considering the importance of the state for the development of the global economy, the crisis would have serious global repercussions. Its political ramifications could be no less dramatic owing to the special position the military occupies in the Chinese political system, and the existence of many potential vexed issues in East Asia (disputes over islands in the China Sea and the Pacific). A potential hotbed of conflict is also Taiwan's status. Economic recession and the related destabilization of internal policies could lead to a political, or even military crisis. The likelihood of the global escalation of the conflict is high, as the interests of Russia, China, Japan, Australia and, first and foremost, the US clash in the region.

China's economic rise is good --- they're on the brink of collapse --- causes CCP instability and lashout --- also tubes the global economy, US primacy, and Sino relations
Mead 9 Walter Russell Mead, Henry A. Kissinger Senior Fellow in U.S.
Foreign Policy at the Council on Foreign Relations, Only Makes You
Stronger, The New Republic, 2/4/9, http://www.tnr.com/story_print.html?
id=571cbbb9-2887-4d81-8542-92e83915f5f8
The greatest danger both to U.S.-China relations and to American power itself is
probably not that China will rise too far, too fast; it is that the current crisis might end
China's growth miracle. In the worst-case scenario, the turmoil in the international economy will plunge
China into a major economic downturn. The Chinese financial system will implode
as loans to both state and private enterprises go bad. Millions or even tens of millions of Chinese will be
unemployed in a country without an effective social safety net. The collapse of asset
bubbles in the stock and property markets will wipe out the savings of a generation of the Chinese
middle class. The political consequences could include dangerous unrest--and a
bitter climate of anti-foreign feeling that blames others for China's woes. (Think of

Weimar Germany, when both Nazi and communist politicians blamed the West for Germany's economic
travails.) Worse, instability could lead to a vicious cycle, as nervous investors moved their
money out of the country, further slowing growth and, in turn, fomenting ever-greater
bitterness. Thanks to a generation of rapid economic growth, China has so far been able to manage
the stresses and conflicts of modernization and change; nobody knows what will happen if
the growth stops.

Growth decline threatens CCP rule---they'll start diversionary wars in response
Shirk 7 Susan L. Shirk is an expert on Chinese politics and former Deputy
Assistant Secretary of State during the Clinton administration. She was in the
Bureau of East Asia and Pacific Affairs (People's Republic of China, Taiwan,
Hong Kong and Mongolia). She is currently a professor at the Graduate
School of International Relations and Pacific Studies at the University of
California, San Diego. She is also a Senior Director of Albright Stonebridge
Group, a global strategy firm, where she assists clients with issues related to
East Asia. China: Fragile Superpower, Book

By sustaining high rates of economic growth, China's leaders create new jobs and limit the number of unemployed workers who might go to the barricades. Binding the public to the Party through nationalism also helps preempt opposition. The trick is to find a foreign policy approach that can achieve both these vital objectives simultaneously. How long can it last? Viewed objectively, China's communist regime looks surprisingly resilient. It may be capable of surviving for years to come so long as the economy continues to grow and create jobs. Survey research in Beijing shows widespread support (over 80 percent) for the political system as a whole linked to sentiments of nationalism and acceptance of the CCP's argument about stability first.97 Without making any fundamental changes in the CCP-dominated political system - leaders from time to time have toyed with reform ideas such as local elections but in each instance have backed away for fear of losing control - the Party has bought itself time. As scholar Pei Minxin notes, the ability of communist regimes to use their patronage and coercion to hold on to power gives them little incentive to give up any of that power by introducing gradual democratization from above. Typically, only when communist systems implode do their political fundamentals change.98 As China's leaders well know, the greatest political risk lying ahead of them is the possibility of an economic crash that throws millions of workers out of their jobs or sends millions of depositors to withdraw their savings from the shaky banking system. A massive environmental or public health disaster also could trigger regime collapse, especially if people's lives are endangered by a media cover-up imposed by Party authorities. Nationwide rebellion becomes a real possibility when large numbers of people are upset about the same issue at the same time. Another dangerous scenario is a domestic or international crisis in which the CCP leaders feel compelled to lash out against Japan, Taiwan, or the United States because from their point of view not lashing out might endanger Party rule.

Chinese Growth Key to Military Restraint on Taiwan - Decline of Economic Influence Causes China to Resort to Military Aggression
Lampton, 3 (David, Director Chinese Studies, Nixon Center, FDCH, 3/18)
The Chinese realize that power has different faces--military, economic, and
normative (ideological) power. Right now, China is finding that in the era of
globalization, economic power (and potential economic power) is the form of power it
has in greatest abundance and which it can use most effectively. As long as economic
influence continues to be effective for Beijing, as it now seems to be in dealing with Taiwan,
for example, China is unlikely to resort to military intimidation as its chief foreign policy
instrument.

Decline causes lashout - nationalists target the US and Taiwan


Friedberg, professor of IR at Princeton, 2011 (July/August, Aaron L., professor of
politics and international affairs at the Woodrow Wilson School at Princeton University, Hegemony with Chinese
Characteristics, The National Interest, lexis)

Such fears of aggression are heightened by an awareness that anxiety over a lack of legitimacy at home can cause nondemocratic governments to try to deflect popular frustration and discontent toward external enemies. Some Western observers worry, for example, that if China's economy falters its rulers will try to blame foreigners and even manufacture crises with Taiwan, Japan or the United States in order to rally their people and redirect the population's anger. Whatever Beijing's intent, such confrontations could easily spiral out of control. Democratic leaders are hardly immune to the temptation of foreign adventures. However, because the stakes for them are so much lower (being voted out of office rather than being overthrown and imprisoned, or worse), they are less likely to take extreme risks to retain their hold on power.

2NC China-India War Impact


Economic collapse will crush party legitimacy and ignite social instability
Li 9 (Cheng, Dir. of Research, John L. Thornton China Center, China's Team of Rivals, Brookings Foundation Article series, March, http://www.brookings.edu/articles/2009/03_china_li.aspx)
The two dozen senior politicians who walk the halls of Zhongnanhai, the compound of the Chinese Communist
Party's leadership in Beijing, are worried. What was inconceivable a year ago now threatens their rule: an economy in freefall. Exports, critical to China's searing economic growth, have plunged. Thousands of factories and businesses, especially those in the prosperous coastal regions, have closed. In the last six months of 2008, 10 million workers, plus 1 million new college graduates, joined the already gigantic ranks of the country's unemployed. During the same period, the Chinese stock market lost 65 percent of its value, equivalent to $3 trillion. The crisis, President Hu Jintao said recently, is a test of our ability to control a complex situation, and also a test of our party's governing ability. With this rapid downturn, the Chinese Communist Party suddenly looks vulnerable. Since Deng Xiaoping initiated economic reforms three decades ago, the party's legitimacy has relied upon its ability to keep the economy running at breakneck pace. If China is no longer able to maintain a high
growth rate or provide jobs for its ever growing labor force, massive public dissatisfaction
and social unrest could erupt. No one realizes this possibility more than the handful of people who steer
Chinas massive economy. Double-digit growth has sheltered them through a SARS epidemic, massive earthquakes,
and contamination scandals. Now, the crucial question is whether they are equipped to handle an economic crisis of this magnitude and survive the political challenges it will bring. This year marks the
60th anniversary of the Peoples Republic, and the ruling party is no longer led by one strongman, like
Mao Zedong or Deng Xiaoping. Instead, the Politburo and its Standing Committee, Chinas most powerful
body, are run by two informal coalitions that compete against each other for power, influence, and control
over policy. Competition in the Communist Party is, of course, nothing new. But the jockeying today is no
longer a zero-sum game in which a winner takes all. It is worth remembering that when Jiang Zemin
handed the reins to his successor, Hu Jintao, in 2002, it marked the first time in the republic's history that the transfer of power didn't involve bloodshed or purges. What's more, Hu was not a protégé of Jiang's; they belonged to competing factions. To borrow a phrase popular in Washington these days, post-Deng China has been run by a team of rivals. This internal competition was enshrined as party practice a little more than a year ago. In October 2007, President Hu surprised many China watchers by abandoning the party's
normally straightforward succession procedure and designating not one but two heirs apparent. The Central
Committee named Xi Jinping and Li Keqiang - two very different leaders in their early 50s - to the
nine-member Politburo Standing Committee, where the rulers of China are groomed. The future roles of these two
men, who will essentially share power after the next party congress meets in 2012, have since been refined: Xi will
be the candidate to succeed the president, and Li will succeed Premier Wen Jiabao. The two rising stars
share little in terms of family background, political association, leadership skills, and policy orientation. But they are
each heavily involved in shaping economic policy - and they are expected to lead the two competing coalitions that will be relied upon to craft China's political and economic trajectory in the
next decade and beyond.

Regime collapse causes China-India war


Cohen 02 (Stephen, Senior Fellow Brookings Institution, Nuclear Weapons and Nuclear War in South Asia: An
Unknowable Future, May, http://www.brookings.edu/dybdocroot/views/speeches/cohens20020501.pdf)

A similar argument may be made with respect to China. China is a country that has had its share of upheavals in
the past. While there is no expectation today of renewed internal turmoil, it is important to remember that closed
authoritarian societies are subject to deep crisis in moments of sudden change. The breakup of the Soviet Union

and Yugoslavia, and the turmoil that has ravaged many members of the former communist bloc are examples of
what could happen to China. A severe economic crisis, rebellions in Tibet and Xinjiang, a reborn democracy
movement and a party torn by factions could be the ingredients of an unstable situation. A vulnerable Chinese
leadership determined to bolster its shaky position by an aggressive policy toward India or the United States or both
might become involved in a major crisis with India, perhaps engage in nuclear saber-rattling. That would encourage
India to adopt a stronger nuclear posture, possibly with American assistance.

Causes nuclear use

Jonathan S. Landay, National Security and Intelligence Correspondent, 2K [Top Administration Officials Warn
Stakes for U.S. Are High in Asian Conflicts, Knight Ridder/Tribune News Service, March 10, p. Lexis]
Few if any experts think China and Taiwan, North Korea and South Korea, or India and Pakistan

are spoiling to fight. But even a minor miscalculation by any of them could destabilize Asia,
jolt the global economy and even start a nuclear war. India, Pakistan and China all have
nuclear weapons, and North Korea may have a few, too. Asia lacks the kinds of organizations,
negotiations and diplomatic relationships that helped keep an uneasy peace for five decades
in Cold War Europe. Nowhere else on Earth are the stakes as high and relationships so fragile, said Bates Gill,
director of northeast Asian policy studies at the Brookings Institution, a Washington think tank. We see the
convergence of great power interest overlaid with lingering confrontations with no institutionalized security
mechanism in place. There are elements for potential disaster. In an effort to cool the regions tempers, President
Clinton, Defense Secretary William S. Cohen and National Security Adviser Samuel R. Berger all will hopscotch
Asias capitals this month. For America, the stakes could hardly be higher. There are 100,000 U.S. troops in Asia
committed to defending Taiwan, Japan and South Korea, and the United States would instantly become embroiled if
Beijing moved against Taiwan or North Korea attacked South Korea. While Washington has no defense

commitments to either India or Pakistan, a conflict between the two could end the global
taboo against using nuclear weapons and demolish the already shaky international nonproliferation regime. In addition, globalization has made a stable Asia - with its massive markets, cheap labor, exports and resources - indispensable to the U.S. economy. Numerous U.S. firms and millions of American
jobs depend on trade with Asia that totaled $600 billion last year, according to the Commerce Department.

2NC Bioweapons Impact


The CCP would lash out for power, and they would use
bioweapons
Renxin 05 Renxin, Journalist, 8-3-2K5 (San, CCP Gambles Insanely to Avoid Death, Epoch Times,
www.theepochtimes.com/news/5-8-3/30931.html)

Since the Partys life is above all else, it would not be surprising if the CCP resorts
to the use of biological, chemical, and nuclear weapons in its attempt to postpone
its life. The CCP, that disregards human life, would not hesitate to kill two hundred million
Americans, coupled with seven or eight hundred million Chinese, to achieve its
ends. The speech, free of all disguises, lets the public see the CCP for what it really is: with evil filling its every
cell, the CCP intends to fight all of mankind in its desperate attempt to cling to life. And

that is the theme of the speech. The theme is murderous and utterly evil. We did witness in China beggars who
demanded money from people by threatening to stab themselves with knives or prick their throats on long nails.
But we have never, until now, seen a rogue who blackmails the world to die with it by wielding biological, chemical,
and nuclear weapons. Anyhow, the bloody confession affirmed the CCPs bloodiness: a monstrous murderer, who
has killed 80 million Chinese people, now plans to hold one billion people hostage and gamble with their lives. As
the CCP is known to be a clique with a closed system, it is extraordinary for it to reveal its top secret on its own.
One might ask: what is the CCPs purpose to make public its gambling plan on its deathbed? The answer is: the
speech would have the effect of killing three birds with one stone. Its intentions are the following: Expressing the
CCPs resolve that it not be buried by either heaven or earth (direct quote from the speech). But then, isnt the
CCP opposed to the universe if it claims not to be buried by heaven and earth? Feeling the urgent need to harden
its image as a soft egg in the face of the Nine Commentaries. Preparing publicity for its final battle with mankind by
threatening war and trumpeting violence. So, strictly speaking, what the CCP has leaked out is more of an attempt
to clutch at straws to save its life rather than to launch a trial balloon. Of course, the way the speech was
presented had been carefully prepared. It did not have a usual opening or ending, and the audience, time, place,
and background related to the speech were all kept unidentified. One may speculate or imagine as one may, but
never verify. The aim was obviously to create a mysterious setting. In short, the speech came out as something
one finds difficult to tell whether it is false or true.

Outweighs and causes extinction


Ochs 2 Past president of the Aberdeen Proving Ground Superfund Citizens Coalition, Member of the Depleted Uranium Task force of the Military Toxics Project, and member of the Chemical Weapons Working Group [Richard Ochs,
June 9, 2002, Biological Weapons Must Be Abolished Immediately,
http://www.freefromterror.net/other_articles/abolish.html]

Of all the weapons of mass destruction, the genetically engineered biological weapons, many without a known cure or vaccine, are an extreme danger to the continued survival of life on earth. Any perceived military value or deterrence pales in comparison to the great risk these weapons pose just sitting in vials in laboratories. While a nuclear winter, resulting from a massive exchange of nuclear weapons, could also kill off most of life on earth and severely compromise the health of future generations, they are easier to control. Biological weapons, on the other hand, can get out of control very easily, as the recent anthrax attacks has demonstrated. There is no way to guarantee the security of these doomsday weapons because very tiny amounts can be stolen or accidentally released and then grow or be grown to horrendous proportions. The Black Death of the Middle Ages would be small in comparison to the potential damage bioweapons could cause. Abolition of chemical weapons is less of a
priority because, while they can also kill millions of people outright, their persistence in the environment would be
less than nuclear or biological agents or more localized. Hence, chemical weapons would have a lesser effect on
future generations of innocent people and the natural environment. Like the Holocaust, once a localized chemical
extermination is over, it is over. With nuclear and biological weapons, the killing will probably never end.
Radioactive elements last tens of thousands of years and will keep causing cancers virtually forever. Potentially worse than that, bio-engineered agents by the hundreds with no known cure could wreck even greater calamity on the human race than could persistent radiation. AIDS and ebola viruses
are just a small example of recently emerging plagues with no known cure or vaccine. Can we imagine hundreds of
such plagues? HUMAN EXTINCTION IS NOW POSSIBLE. Ironically, the Bush administration has just
changed the U.S. nuclear doctrine to allow nuclear retaliation against threats upon allies by conventional weapons.
The past doctrine allowed such use only as a last resort when our nations survival was at stake. Will the new policy
also allow easier use of US bioweapons? How slippery is this slope?

2NC AT Collapse Good


Reject their collapse good arguments - they're racist and incoherent - Chinese collapse decimates the U.S. for several reasons
Karabell, 13 PhD @ Harvard, President of River Twice Research
Zachary, The U.S. can't afford a Chinese economic collapse, The Edgy Optimist, a Reuters blog run by Karabell, March 7, http://blogs.reuters.com/edgy-optimist/2013/03/07/the-u-s-cant-afford-a-chinese-economic-collapse/ --BR
Is China about to collapse? That question has been front and center in the past weeks as
the country completes its leadership transition and after the exposure of its various real estate bubbles during a widely watched 60 Minutes exposé this past weekend. Concerns about soaring property prices throughout China are hardly new, but they have been given added weight by the government itself.

Recognizing that a rapid implosion of the property market would disrupt economic growth, the central government
recently announced far-reaching measures designed to dent the rampant speculation. Higher down payments,
limiting the purchases of investment properties, and a capital gains tax on real estate transactions designed to
make flipping properties less lucrative were included. These measures, in conjunction with the new government's announcing more modest growth targets of 7.5 percent a year, sent Chinese equities plunging and led to a slew of commentary in the United States saying China would be the next shoe to drop in the global system. Yet there is
more here than simple alarm over the viability of Chinas economic growth. There is
the not-so-veiled undercurrent of rooting against China. It is difficult to find
someone who explicitly wants it to collapse, but the tone of much of the discourse
suggests bloodlust. Given that China largely escaped the crises that so afflicted the
United States and the eurozone, the desire to see it stumble may be
understandable. No one really likes a global winner if that winner isnt you. The
need to see China fail verges on jingoism. Americans distrust the Chinese
model, find that its business practices verge on the immoral and illegal, that its
reporting and accounting standards are sub-par at best and that its system is one of
crony capitalism run by crony communists. On Wall Street, the presumption usually
seems to be that any Chinese company is a ponzi scheme masquerading as a viable
business. In various conversations and debates, I have rarely heard Chinas
economic model mentioned without disdain. Take, as just one example, Gordon
Chang in Forbes: Beijings technocrats can postpone a reckoning, but they have not
repealed the laws of economics. There will be a crash. The consequences of a
Chinese collapse, however, would be severe for the United States and for the
world. There could be no major Chinese contraction without a concomitant
contraction in the United States. That would mean sharply curtailed Chinese
purchases of U.S. Treasury bonds, far less revenue for companies like General
Motors, Nike, KFC and Apple that have robust business in China (Apple made $6.83 billion in
the fourth quarter of 2012, up from $4.08 billion a year prior), and far fewer Chinese imports of high-end goods from American and Asian companies. It would also mean a collapse of
Chinese imports of materials such as copper, which would in turn harm economic
growth in emerging countries that continue to be a prime market for American,
Asian and European goods. China is now the world's second-largest economy, and property booms have

been one aspect of its growth. Individual Chinese cannot invest outside of the country, and the limited options of
Chinas stock exchanges and almost nonexistent bond market mean that if you are middle class and want to do
more than keep your money in cash or low-yielding bank accounts, you buy either luxury goods or apartments. That
has meant a series of property bubbles over the past decade and a series of measures by state and local officials to
contain them. These recent measures are hardly the first, and they are not likely to be the last. The past 10 years
have seen wild swings in property prices, and as recently as 2011 the government took steps to cool them; the
number of transactions plummeted and prices slumped in hot markets like Shanghai as much as 30, 40 and even
50 percent. You could go back year by year in the 2000s and see similar bubbles forming and popping, as the

government reacted to sharp run-ups with restrictions and then eased them when the pendulum threatened to
swing too far. China has had a series of property bubbles and a series of property busts. It has also had massive
urbanization that in time has absorbed the excess supply generated by massive development. Today much of that
supply is priced far above what workers flooding into Chinas cities can afford. But that has always been true, and
that housing has in time been purchased and used by Chinese families who are moving up the income spectrum,
much as U.S. suburbs evolved in the second half of the 20th century. More to the point, all property bubbles are not
created equal. The housing bubbles in the United States and Spain, for instance, would never had been so
disruptive without the massive amount of debt and the financial instruments and derivatives based on them. A
bursting housing bubble absent those would have been a hit to growth but not a systemic crisis. In China, most
buyers pay cash, and there is no derivative market around mortgages (at most theres a small shadow market). Yes,
there are all sorts of unofficial transactions with high-interest loans, but even there, the consequences of busts are not the same as they were in the United States and Europe in recent years. Two issues converge


whenever China is discussed in the United States: fear of the next global crisis, and
distrust and dislike of the country. Concern is fine; we should always be attentive to
possible risks. But Chinas property bubbles are an unlikely risk, because of the absence of derivatives and
because the central government is clearly alert to the markets behavior. Suspicion and antipathy,
however, are not constructive. They speak to the ongoing difficulty China poses to
Americans sense of global economic dominance and to the belief in the superiority
of free-market capitalism to Chinas state-managed capitalism. The U.S. system
may prove to be more resilient over time; it has certainly proven successful to date.
Its success does not require Chinas failure, nor will Chinas success invalidate
the American model. For our own self-interest we should be rooting for their
efforts, and not jingoistically wishing for them to fail.

2NC AT Collapse Inevitable


Status quo isn't sufficient to trigger collapse because the US is
lagging behind
Forbes, 7/9/2014
US Finance/Economics News Report Service
(John Kerry In Beijing: Four Good Reasons Why The Chinese View American
Leaders As Empty
Suits, http://www.forbes.com/sites/eamonnfingleton/2014/07/09/john-kerry-in-beijing-four-good-reasons-why-the-chinese-treat-american-leaders-as-jackasses/)
2. American policymakers have procrastinated in meeting the Chinese
challenge because they have constantly - for more than a decade now -
been misled by siren American voices predicting an imminent Chinese
financial collapse. China is a big economy and large financial collapses are
not inconceivable. But even the most disastrous such collapse would be
unlikely to stop the Chinese export drive in its tracks. American policymakers
have failed to pay sufficient attention to the central objective of Chinese
policy, which is to take over from the United States, Japan and Germany as
the worlds premier source of advanced manufactured products.

Consensus exists and the best markers point to a slow decline,


and the worst markers make sense in the context of China
Huang, 2/11, a senior associate in the Carnegie Asia Program, where his
research focuses on Chinas economic development and its impact on Asia
and the global economy (Yukon, Do Not Fear a Chinese Property Bubble,
Carnegie Endowment for International Peace,
http://carnegieendowment.org/2014/02/11/do-not-fear-chinese-property-bubble/h0oz)
Yet when analysts drill into the balance sheets of borrowers and banks, they find
little evidence of impending disaster. Government debt ratios are not high by global
standards and are backed by valuable assets at the local level. Household debt is a fraction
of what it is in the west, and it is supported by savings and rising incomes. The profits and cash
positions of most firms for which data are available have not deteriorated significantly
while sovereign guarantees cushion the more vulnerable state enterprises. The consensus, therefore, is
that Chinas debt situation has weakened but is manageable. Why are the views from

detailed sector analysis so different from the red flags signalled by the broader macro debt indicators? The answer
lies in the role that land values play in shaping these trends. Take the two most pressing concerns: rising debt
levels as a share of gross domestic product and weakening links between credit expansion and GDP growth. The
first relates to the surge in the ratio of total credit to GDP by about 50-60 percentage points over the past five
years, which is viewed as a strong predictor of an impending crash. Fitch, a rating agency, is among those who see
this as the fallout from irresponsible shadow-banking which is being channelled into property development, creating
a bubble. The second concern is that the credit impulse to growth has diminished, meaning that more and more
credit is needed to generate the same amount of GDP, which reduces prospects for future deleveraging. Linking
these two concerns is the price of land including related mark-ups levied by officials and developers. But its
significance is not well understood because Chinas property market emerged only in the late 1990s, when the
decision was made to privatise housing. A functioning resale market only began to form around the middle of the
last decade. That is why the large stimulus programme in response to the Asia financial crisis more than a decade
ago did not manifest itself in a property price surge, whereas the 2008-9 stimulus did. Over the past decade, no
other factor has been as important as rising property values in influencing growth patterns and perceptions of
financial risks. The weakening impact of credit on growth is largely explained by the divergence between fixed asset
investment (FAI) and gross fixed capital formation (GFCF). Both are measures of investment. FAI measures
investment in physical assets including land while GFCF measures investment in new equipment and structures,

excluding the value of land and existing assets. This latter feeds directly into GDP, while only a portion of FAI shows
up in GDP accounts. Until recently, the difference between the two measures did not matter in interpreting
economic trends: both were increasing at the same rate and reached about 35 per cent of GDP by 2002-03. Since
then, however, they have diverged and GFCF now stands at 45 per cent of GDP while the share of FAI has jumped to
70 per cent. Overall credit levels have increased in line with the rapid growth in FAI rather than the more modest
growth in GFCF. Most of the difference between the ratios is explained by rising asset prices. Thus a large share of
the surge in credit is financing property related transactions which explains why the growth impact of credit has declined. Is the increase in property and underlying land prices sustainable, or is it a bubble? Part of the explanation is unique to China. Land in China is an asset whose market value went largely

unrecognised when it was totally controlled by the State. Once a private property market was created, the process
of discovering lands intrinsic value began, but establishing such values takes time in a rapidly changing economy.

The Wharton/NUS/Tsinghua Land Price Index indicates that from 2004-2012, land prices have increased approximately fourfold nationally, with more dramatic increases in major cities such as
Beijing balanced by modest rises in secondary cities. Although this may seem excessive, such growth rates are
similar to what happened in Russia after it privatised its housing stock. Once the economy stabilised, housing prices

in Moscow increased six fold in just six years. Could investors have overshot the mark in China? Possibly, but the land values should be high given China's large population, its
shortage of plots that are suitable for construction and its rapid economic growth. Nationally, the ratio of incomes to
housing prices has improved and is now comparable to the levels found in Australia, Taiwan and the UK. In Beijing

and Shanghai prices are similar to or lower than Delhi, Singapore and Hong Kong. Much of the recent surge in the credit to GDP ratio is actually evidence of financial deepening rather than financial instability as China moves toward more market-based asset values. If so, the higher credit ratios are fully consistent with the less alarming impressions that come from scrutiny of sector specific financial indicators.

2NC AT Stocks
China's stock market is loosely tied to its economy - structural factors are fine and stock declines don't accurately reflect growth
Rapoza 7/9
(Kenneth Rapoza. Contributing Editor at Forbes. "Don't Mistake China's Stock Market For China's
Economy," Forbes. 7-9-2015. http://www.forbes.com/sites/kenrapoza/2015/07/09/dont-mistake-chinasstock-market-for-chinas-economy///ghs-kw)

Chinas A-share market is rebounding, but whether or not it has hit bottom is beside
the point. What matters is this: the equity market in China is a more or less a
gambling den dominated by retail investors who make their investment
decisions based on what they read in investor newsletters. Its a herd
mentality. And more importantly, their trading habits do not reflect
economic fundamentals. The countrys stock market plays a smaller role in its
economy than the U.S. stock market does in ours, and has fewer linkages to the rest
of the economy, says Bill Adams, PNC Financials senior international economist in Pittsburgh. The fact
that the two are unhinged limits the potential for Chinas equity correction or a
bubble to trigger widespread economic distress. The recent 25% decline in the
Deutsche X-Trackers China A-Shares (ASHR) fund, partially recuperated on Thursday, is not a
signal of an impending Chinese recession. PNCs baseline forecast for Chinese
real GDP growth in 2015 remains unchanged at 6.8% despite the correction , a
correction which has been heralded by the bears as the beginning of the end for Chinas capitalist experiment.

Chinas economy, like its market, is transforming. China is moving away from being a
low-cost producer and exporter, to becoming a consumer driven society. It wants to
professionalize its financial services sector, and build a green-tech economy to help
eliminate its pollution problems. Its slowly opening its capital account and taking
steps to reforming its financial markets. There will be errors and surprises, and anyone who thinks
otherwise will be disappointed. Over the last four weeks, the Chinese government misplayed its
hand when it decided to use tools for the economy mainly an interest rate
reduction and reserve ratio requirement cuts for banks in an effort to provide the market with more
liquidity. It worked for a little while, and recent moves to change rules on margin, and even utilize a circuit-breaker
mechanism to temporarily delist fast-tanking companies from the mainland stock market, might have worked if the Greece crisis didn't pull the plug on global risk. The timing was terrible. And it pushed people into panic selling, turning China into the biggest financial market headline this side of Athens. For better or for worse, Beijing now has no choice but to go all-in to defend equities, some investors told FORBES. But China's real economy is doing much better than the Shanghai and Shenzhen exchanges suggest. According to China Beige Book, the Chinese economy actually recovered last quarter. Markets are focusing on equities and PMI indicators from the state and HSBC as a gauge, but it should become clear in the coming weeks that China's stock market is not a reflection of the fundamentals. The Good, The Bad and

the Ugly To get a more detailed picture of what is driving Chinas growth slowdown, it is necessary to look at a
broader array of economic and financial indicators. The epicenter of Chinas problems are the industrial and
property sectors. Shares of the Shanghai Construction Group, one of the largest developers listed on the Shanghai
stock exchange, is down 42.6% in the past four weeks, two times worse than the Shanghai Composite Index. China
Railway Group is down 33%, also an underperformer. Growth in real industrial output has declined from 14% in mid-2011 to 5.9% in April, growth in fixed-asset investment declined 50% over the same period and electricity
consumption by primary and secondary industries is in decline. Chinas trade with the outside world is also falling,
though this data does not always match up with other countries trade figures. Real estate is in decline as Beijing
has put the breaks on its housing bubble. Only the east coast cities are still seeing price increases, but construction
is not booming in Shanghai anymore. The two main components of that have prevented a deeper downturn in
activity are private spending on services, particularly financial services, and government-led increases in
transportation infrastructure like road and rail. Retail sales, especially e-commerce sales that have benefited the
likes of Alibaba and Tencent, both of which have outperformed the index, have been growing faster than the overall
economy. Electricity consumption in the services sector is expanding strongly. Growth in household incomes is

outpacing GDP growth. China has begun the necessary rebalancing towards a more sustainable, consumption-led
growth model, says Jeremy Lawson, chief economist at Standard Life Investments in the U.K. He warns that its
still too early to claim success. Since 2011, developed markets led by the S&P 500 have performed better than
China, but for one reason and one reason only: The central banks of Europe, the U.K., Japan and of course the U.S.
have bought up assets in unprecedented volumes using printed money, or outright buying securities like the Feds
purchase of bonds and mortgage backed securities. Why bemoan Chinas state intervention when central bank
intervention has been what kept southern Europe afloat, and the U.S. stock market on fire since March 2009?

Companies in China are still making money. I think people have no clue on China,
says Jan Dehn, head of research at Ashmore in London, a $70 billion emerging market fund manager with money at

They dont see the big picture. And they forget it is still
an emerging market. The Chinese make mistakes and will continue to make
mistakes like all governments. However, they will learn from their mistakes. The
magnitude of most problems are not such that they lead to systematic meltdown.
Each time the market freaks out, value often deep value starts to
emerge. Long term, these volatile episodes are mere blips . They will not change the course
of internationalization and maturing of the market, Dehn told FORBES. China is still building markets . It
has a large environmental problem that will bode well for green tech firms like BYD. Its middle class is not
shrinking. Its billionaires are growing in numbers. They are reforming all the time.

And in the long term, China is going to win. Markets are impatient and love a good drama. But investing is not a
soap opera. Its not Keeping up with the Kardashians youre buying, youre buying the worlds No. 2 economy, the
biggest commodity consumer in the world, and home to 1.4 billion people, many of which have been steadily
earning more than ever. China's transition will cause temporary weakness in growth and volatility, maybe even crazy volatility. But you have to break eggs to make an omelette, says Dehn. Why The Stock Market Correction Won't Hurt China: The Chinese equity correction is healthy and unlikely to have major adverse real economy consequences for several reasons: First, China's A-shares are still up 79% over the past 12 months. A reversal of fortunes was a shoo-in to occur. Second, Chinese banks are basically not involved in providing leverage and show no signs of stress. The total leverage in Chinese financial markets is about four trillion yuan ($600 billion). Stock market leverage is concentrated in the

informal sector with trust funds and brokerages accounting for a little over half of the leverage. Margin financing
via brokerages is down from 2.4 trillion yuan to 1.9 trillion yuan and let's not forget that Chinese GDP is about 70 trillion yuan. Third, there is very little evidence that the moves in the stock market will
have a major impact on the real economy and consumption via portfolio loss. Stocks
comprise only 15% of total wealth. Official sector institutions are large holders of
stocks and their spending is under control of the government. As for the retail
investor, they spend far less of their wealth than other countries. China has a 49%
savings rate. Even if they lost half of it, they would be saving more than Americans,
the highly indebted consumer society the world loves to love. During the rally over the past
twelve months, the stock market bubble did not trigger a boost in consumption
indicating that higher equity gains didnt impact spending habits too much. The
Chinese stock market is only 5% of total social financing in China. Stock markets
only finance 2% of Chinese fixed asset investment. Only 1% of company loans have
been put up with stocks as collateral, so the impact on corporate activity is going to
be limited. The rapid rally and the violent correction illustrate the challenges of capital account liberalization,

the need for a long-term institutional investor base, index inclusion and deeper financial markets, including foreign
institutional investors, Dehn says. The A-shares correction is likely to encourage deeper financial reforms, not a
reversal.

Plan Flaw

1NCs

1NC CT
Counterplan text: The United States federal government
should neither mandate the creation of surveillance backdoors
in products nor request privacy keys and should terminate
current backdoors created either by government mandates or
government requested keys.
Three arguments here:
1. A. Mandate means to make required
Merriam-Websters Dictionary of Law 96
(Merriam-Websters Dictionary of Law, 1996,
http://dictionary.findlaw.com/definition/mandate.html//ghs-kw)

mandate n [Latin mandatum , from neuter of mandatus , past participle of mandare to entrust, enjoin,

probably irregularly from manus hand + -dere to put] 1 a : a formal communication from a reviewing court
notifying the court below of its judgment and directing the lower court to act accordingly b : mandamus 2
in the civil law of Louisiana : an act by which a person gives another person the power to transact for him
or her one or several affairs 3 a : an authoritative command : a clear authorization or direction [the of the
full faith and credit clause "National Law Journal "] b : the authorization to act given by a constituency to
its elected representative vt mandated mandating : to make mandatory or required
[the Pennsylvania Constitution s a criminal defendant's right to confrontation "National Law Journal "]

B. Circumvention: companies including those under PRISM


agree to provide data because the government pays them
Timberg and Gellman 13
(Timberg, Craig and Gellman, Barton. Reporters for the Washington Post, citing government
budgets and internal documents. NSA paying U.S. companies for access to communications
networks, Washington Post. 8/29/2013. https://www.washingtonpost.com/world/national-security/nsa-paying-us-companies-for-access-to-communications-networks/2013/08/29/5641a4b6-10c2-11e3-bdf6-e4fc677d94a1_story.html//ghs-kw)

The National Security Agency is paying hundreds of millions of dollars a year


to U.S. companies for clandestine access to their communications networks,
filtering vast traffic flows for foreign targets in a process that also sweeps in large volumes of American telephone calls, e-mails and instant messages. The bulk of the spending, detailed in a multi-volume intelligence budget obtained by The Washington Post, goes to participants in a Corporate Partner Access Project for major U.S. telecommunications providers. The documents open an important window into surveillance operations

on U.S. territory that have been the subject of debate since they were revealed by The Post and Britains Guardian
newspaper in June. New details of the corporate-partner project, which falls under the NSA's Special Source Operations, confirm that the agency taps into high volume circuit and packet-switched networks, according to the spending blueprint for fiscal 2013. The program was expected to cost $278 million in

the current fiscal year, down nearly one-third from its peak of $394 million in 2011. Voluntary cooperation from the
backbone providers of global communications dates to the 1970s under the cover name BLARNEY, according to
documents provided by former NSA contractor Edward Snowden. These relationships long predate the PRISM program
disclosed in June, under which American technology companies hand over customer data after receiving orders from the
Foreign Intelligence Surveillance Court. In briefing slides, the NSA described BLARNEY and three other corporate projects
OAKSTAR, FAIRVIEW and STORMBREW under the heading of passive or upstream collection. They capture data as
they move across fiber-optic cables and the gateways that direct global communications traffic. The documents offer a rare view of a secret
surveillance economy in which government officials set financial terms for programs capable of peering into the lives of
almost anyone who uses a phone, computer or other device connected to the Internet. Although the companies are required to comply with lawful surveillance orders, privacy advocates say the multimillion-dollar payments could create a profit motive to offer more than the required assistance. It turns surveillance into a revenue stream, and that's not the way it's supposed to work, said Marc Rotenberg, executive director of the Electronic Privacy Information
Center, a Washington-based research and advocacy group. The fact that the government is paying money to telephone
companies to turn over information that they are compelled to turn over is very troubling. Verizon, AT&T and other major
telecommunications companies declined to comment for this article, although several industry officials noted that
government surveillance laws explicitly call for companies to receive reasonable reimbursement for their costs. Previous
news reports have made clear that companies frequently seek such payments , but never before
has their overall scale been disclosed. The budget documents do not list individual companies, although they do break

down spending among several NSA programs, listed by their code names. There is no record in the documents obtained
by The Post of money set aside to pay technology companies that provide information to the NSAs PRISM program. That
program is the source of 91 percent of the 250 million Internet communications collected through Section 702 of the FISA
Amendments Act, which authorizes PRISM and the upstream programs, according to an 2011 opinion and order by the

companies that provide information to


PRISM, including Apple, Facebook and Google, say they take no payments from the government when they comply
with national security requests. Others say they do take payments in some circumstances. The Guardian
reported last week that the NSA had covered millions of dollars in costs that some technology
companies incurred to comply with government demands for information. Telecommunications
companies generally do charge to comply with surveillance requests, which come from state, local and federal law
Foreign Intelligence Surveillance Court. Several of the

enforcement officials as well as intelligence agencies. Former telecommunications executive Paul Kouroupas, a security

companies welcome the


revenue and enter into contracts in which the government makes
higher payments than otherwise available to firms receiving reimbursement for complying with surveillance
orders. These contractual payments , he said, could cover the cost of buying and installing new
equipment, along with a reasonable profit. These voluntary agreements simplify the
governments access to surveillance, he said. It certainly lubricates the
[surveillance] infrastructure, Kouroupas said. He declined to say whether Global Crossing, which
officer who worked at Global Crossing for 12 years, said that some

operated a fiber-optic network spanning several continents and was bought by Level 3 Communications in 2011, had such
a contract. A spokesman for Level 3 Communications declined to comment.

2. Plan flaw: the plan mandates that we stop surveilling backdoors, request public encryption keys, and close existing backdoors; that guts solvency because the government can still create backdoors with encryption keys (see the illustrative sketch after this list).
3. Presumption: we dont mandate back doors in the status
quo, all their ev is in the context of a bill that would
require backdoors in the future, so the AFF does nothing
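Quick illustration of the plan-flaw argument (my own sketch, not from any aff or neg ev; it assumes Python's third-party cryptography package and made-up variable names): once an agency holds a copy of a key, it can read the traffic with zero code-level backdoors, so closing backdoors while collecting keys changes nothing.

# Minimal sketch: key disclosure is functionally equivalent to a backdoor.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # the company's symmetric key
escrowed_copy = key              # hypothetical copy handed over to an agency

ciphertext = Fernet(key).encrypt(b"user message")

# No flawed cipher, no altered code path -- possession of the key alone
# lets the holder recover the plaintext.
assert Fernet(escrowed_copy).decrypt(ciphertext) == b"user message"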

1NC KQ
The Secure Data Act of 2015 states that no agency may
mandate backdoors
Secure Data Act of 2015
(Wyden, Ron. Senator, D-OR. S. 135, known as the Secure Data Act of 2015, introduced in Congress
1/8/2015. https://www.congress.gov/bill/114th-congress/senate-bill/135/text//ghs-kw)
SEC. 2. PROHIBITION ON DATA SECURITY VULNERABILITY MANDATES. (a) In General.--Except as provided in subsection (b), no agency may mandate that a manufacturer, developer, or seller of covered products design or alter the security functions in its product or service to allow the surveillance of any user of such product or service, or to allow the physical search of such product, by any agency.

Mandate means to make required


Merriam-Websters Dictionary of Law 96
(Merriam-Websters Dictionary of Law, 1996,
http://dictionary.findlaw.com/definition/mandate.html//ghs-kw)

mandate n [Latin mandatum , from neuter of mandatus , past participle of mandare to entrust, enjoin, probably

irregularly from manus hand + -dere to put] 1 a : a formal communication from a reviewing court notifying the court
below of its judgment and directing the lower court to act accordingly b : mandamus 2 in the civil law of Louisiana :
an act by which a person gives another person the power to transact for him or her one or several affairs 3 a : an
authoritative command : a clear authorization or direction [the of the full faith and credit clause "National Law
Journal "] b : the authorization to act given by a constituency to its elected representative vt mandated
mandating : to make mandatory or required [the Pennsylvania Constitution s a criminal
defendant's right to confrontation "National Law Journal "]

Circumvention: companies including those under PRISM agree to provide data because the government pays them
Timberg and Gellman 13
(Timberg, Craig and Gellman, Barton. Reporters for the Washington Post, citing government budgets
and internal documents. NSA paying U.S. companies for access to communications networks,
Washington Post. 8/29/2013. https://www.washingtonpost.com/world/national-security/nsa-paying-us-companies-for-access-to-communications-networks/2013/08/29/5641a4b6-10c2-11e3-bdf6-e4fc677d94a1_story.html//ghs-kw)

The National Security Agency is paying hundreds of millions of dollars a year to U.S.
companies for clandestine access to their communications networks , filtering vast traffic
flows for foreign targets in a process that also sweeps in large volumes of American telephone calls, e-mails and instant messages.

The bulk of the spending, detailed in a multi-volume intelligence budget obtained by The Washington Post, goes to
participants in a Corporate Partner Access Project for major U.S. telecommunications providers. The
documents open an important window into surveillance operations on U.S. territory that have been the subject of debate since they
were revealed by The Post and Britains Guardian newspaper in June. New details of the corporate-partner project, which falls under
the NSAs Special Source Operations, confirm that

the agency taps into high volume circuit and

packet-switched networks,

according to the spending blueprint for fiscal 2013. The program was expected to cost
$278 million in the current fiscal year, down nearly one-third from its peak of $394 million in 2011. Voluntary cooperation from the
backbone providers of global communications dates to the 1970s under the cover name BLARNEY, according to documents
provided by former NSA contractor Edward Snowden. These relationships long predate the PRISM program disclosed in June, under
which American technology companies hand over customer data after receiving orders from the Foreign Intelligence Surveillance
Court. In briefing slides, the NSA described BLARNEY and three other corporate projects OAKSTAR, FAIRVIEW and STORMBREW
under the heading of passive or upstream collection. They capture data as they move across fiber-optic cables and the
gateways that direct global communications traffic. The documents offer a rare view of a secret surveillance economy in which government officials set financial terms for
programs capable of peering into the lives of almost anyone who uses a phone, computer or other device connected to the Internet.

Although the companies are required to comply with lawful surveillance orders, privacy advocates say the multimillion-dollar payments could create a profit motive to offer more than the required assistance. It turns surveillance into a revenue stream, and thats not the way its supposed to work, said Marc Rotenberg, executive director of the Electronic Privacy Information Center, a Washington-based research and advocacy group. The fact that the government is paying money to telephone companies to turn over
information that they are compelled to turn over is very troubling. Verizon, AT&T and other major telecommunications companies
declined to comment for this article, although several industry officials noted that government surveillance laws explicitly call for
companies to receive reasonable reimbursement for their costs. Previous news reports have made clear that

companies

frequently seek such payments, but never before has their overall scale been disclosed. The budget documents do
not list individual companies, although they do break down spending among several NSA programs, listed by their code names.
There is no record in the documents obtained by The Post of money set aside to pay technology companies that provide information
to the NSAs PRISM program. That program is the source of 91 percent of the 250 million Internet communications collected through
Section 702 of the FISA Amendments Act, which authorizes PRISM and the upstream programs, according to an 2011 opinion and
order by the Foreign Intelligence Surveillance Court. Several of the

companies that provide information

to PRISM, including Apple, Facebook and Google, say they take no payments from the government when they comply with
national security requests. Others say they do take payments in some circumstances. The Guardian reported last
week that the NSA had covered millions of dollars in costs that some technology companies incurred to
comply with government demands for information. Telecommunications companies generally do charge to comply
with surveillance requests, which come from state, local and federal law enforcement officials as well as intelligence agencies.
Former telecommunications executive Paul Kouroupas, a security officer who worked at Global Crossing for 12 years, said that some

companies welcome the revenue and enter into contracts in which the
government makes higher payments than otherwise available to firms receiving reimbursement for
complying with surveillance orders. These contractual payments, he said, could cover the cost of buying and
installing new equipment, along with a reasonable profit. These voluntary agreements simplify
the governments access to surveillance, he said. It certainly lubricates the
[surveillance] infrastructure, Kouroupas said. He declined to say whether Global Crossing, which operated a
fiber-optic network spanning several continents and was bought by Level 3 Communications in 2011, had such a contract. A
spokesman for Level 3 Communications declined to comment.

2NC

2NC Mandate
Mandate is an order or requirement
The People's Law Dictionary 02
(Hill, Gerald and Kathleen. Gerald Hill holds a J.D. from Hastings College of the Law of the University of
California. He was Executive Director of the California Governor's Housing Commission, has drafted
legislation, taught at Golden Gate University Law School, served as an arbitrator and pro tem judge,
edited and co-authored Housing in California, was an elected trustee of a public hospital, and has
testified before Congressional committees. Kathleen Hill holds an M.A. in political psychology from
California State University, Sonoma. She was also a Fellow in Public Affairs with the prestigious Coro
Foundation, earned a Certificat from the Sorbonne in Paris, France, headed the Peace Corps Speakers'
Bureau in Washington, D.C., worked in the White House for President Kennedy, and was Executive
Coordinator of the 25th Anniversary of the United Nations. Kathleen has served on a Grand Jury,
chaired two city commissions and has developed programs for the Institute of Governmental Studies
of the University of California. The Peoples Law Dictionary, 2002.
http://dictionary.law.com/Default.aspx?selected=1204//ghs-kw)

mandate n. 1) any mandatory order or requirement under statute, regulation, or by a


public agency. 2) order of an appeals court to a lower court (usually the original trial court in the case) to

comply with an appeals court's ruling, such as holding a new trial, dismissing the case or releasing a prisoner whose
conviction has been overturned. 3) same as the writ of mandamus, which orders a public official or public body to
comply with the law.

2NC Circumvention
NSA enters into mutually agreed upon contracts for back doors
Reuters 13
(Menn, Joseph. Exclusive: Secret contract tied NSA and security industry pioneer, Reuters.
12/20/2013. http://www.reuters.com/article/2013/12/21/us-usa-security-rsa-idUSBRE9BJ1C220131221//ghs-kw)
As a key part of a campaign to embed encryption software that it could crack into widely used computer products,

the U.S. National Security Agency arranged a secret $10 million contract with RSA,
one of the most influential firms in the computer security industry , Reuters has learned.
Documents leaked by former NSA contractor Edward Snowden show that the NSA created and
promulgated a flawed formula for generating random numbers to create a
"back door" in encryption products, the New York Times reported in September. Reuters later
reported that RSA became the most important distributor of that formula by rolling it
into a software tool called Bsafe that is used to enhance security in personal computers and many other
products. Undisclosed until now was that RSA received $10 million in a deal that set
the NSA formula as the preferred, or default, method for number generation in the BSafe
software, according to two sources familiar with the contract. Although that sum might seem paltry, it
represented more than a third of the revenue that the relevant division at
RSA had taken in during the entire previous year, securities filings show. The earlier
disclosures of RSA's entanglement with the NSA already had shocked some in the close-knit world of computer
security experts. The company had a long history of championing privacy and security, and
it played a leading role in blocking a 1990s effort by the NSA to require a special
chip to enable spying on a wide range of computer and communications products.

RSA, now a subsidiary of computer storage giant EMC Corp, urged customers to stop using the NSA formula after
the Snowden disclosures revealed its weakness. RSA and EMC declined to answer questions for this story, but RSA
said in a statement: "RSA always acts in the best interest of its customers and under no circumstances does RSA
design or enable any back doors in our products. Decisions about the features and functionality of RSA products are
our own." The NSA declined to comment. The RSA deal shows one way the NSA carried out what Snowden's
documents describe as a key strategy for enhancing surveillance: the systematic erosion of security tools.

NSA

documents

released in recent months called for using "commercial relationships" to


advance that goal, but did not name any security companies as collaborators. The NSA came under attack this
week in a landmark report from a White House panel appointed to review U.S. surveillance policy. The panel noted
that "encryption is an essential basis for trust on the Internet," and called for a halt to any NSA efforts to undermine

it. Most of the dozen current and former RSA employees interviewed said that the company erred in agreeing to such a contract, and many cited RSA's corporate evolution away from pure cryptography products as one of the reasons it occurred.
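Side note for extensions on this card: the mechanism Reuters describes (a weakened random number generator shipped as the library default) can be sketched in a few lines. Everything below is hypothetical and purely illustrative; it is not the BSafe API, and the "weak" generator is a stand-in, not the actual Dual EC construction.

# Toy sketch of the "default DRBG" problem. All names are hypothetical.
import os

def strong_random_bytes(n: int) -> bytes:
    return os.urandom(n)  # OS-provided CSPRNG

def weak_random_bytes(n: int) -> bytes:
    # Stand-in for a generator with a trapdoor: anyone who knows the secret
    # relationship can predict future output from past output.
    return bytes((i * 37) % 256 for i in range(n))  # trivially predictable

# If the vendor ships the weak generator as the library default...
def generate_session_key(nbytes: int = 16, rng=weak_random_bytes) -> bytes:
    return rng(nbytes)

default_key = generate_session_key()                          # what most callers get
explicit_key = generate_session_key(rng=strong_random_bytes)  # opt-in only

The point is that the compromise lives in the default selection rather than in any single call site, which is why paying RSA to make the flawed formula the default mattered more to the NSA than merely publishing it.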

Case

Economy Adv

Notes
This advantage makes NO sense. Venezia ev doesnt say Internet would
collapse, just that thered be a bunch of identity theft, etc. This has no
bearing on backdoors effects on physical infrastructure
30 second explainer: backdoors collapse the internet (not true), internet k2
the economy b/c new industries and faster growth, econ collapse = ext b/c
Harris and Burrows

CX Questions
Venezia doesnt say the internet would be eliminated, just that data would be decrypted and that there would be mass identity theft. Wheres the ev into Internet collapse?

1NC Internet Not k2 Econ


No reason why backdoors would collapse the Internet:
1. No internal link: Venezia doesnt say the Internet would literally collapse, just that it would essentially be destroyed, meaning that thered be a bunch of identity theft; that has nothing to do with the physical infrastructure of the internet collapsing.
2. Quals: their evidence just comes from a blogger; its highly unlikely that someone using a backdoor would destroy the entire Internet.
The Internets positive influence is overhyped; the washing machine has done more for our economy
Dave Masko, award-winning foreign correspondent and photojournalist, 7-16-2015, "Internets Impact Exaggerated, Washing Machine Does More," Read Wave, http://www.readwave.com/internet-s-impact-exaggerated-washing-machine-does-more_s85070

Story and photo by Dave Masko EUGENE, Oregon - Data collected by the Pew Internet & American Life Project finds the Internet being hyped as something great when ITs not. Still, there are many University of Oregon students admitting to spending lots of time surfing the Internet because they are lonely or bored. Are you kidding, if I had a girlfriend and in love I would hardly spend my
weekends surfing the Net for what for just more mind fxxx you know; its just a way to think you are something
when you are just another loser; lost and lonely online, admits senior Brian Kelleher. In turn, Kelleher points to a
recent lecture he viewed online when researching an economics assignment. The lecture by Nobel Prize nominee
Ha-Joon Chang really opened my eyes to the power of Internet branding and marketing over the past 25 years the

Professor Chang stated that the Internets


impact is vastly exaggerated; while showing how the washing machine has done
way more for our society and various cultures worldwide than the Internet because
not everyone on Earth today uses views online knowledge as something of value.
Net has existed, Kelleher explained. Basically,

Frankly, Professor Chang opened my eyes as to why so many of us are blind when it comes to Internet hype being
just more bullxxxx. Meanwhile, this interview with Kelleher took place a few months ago when this university
senior was single. After letting go of my Internet addiction, I met a like-minded student named Carol who got me
outside in the real world when not in class. You know what, I feel really alive again thanks to Carols view that we
unplug from the machine we dont really need the Net. Its great to be offline and loving life again. Carol and I even
go to the laundry and use our favorite new machine the washing machine, joked Kelleher who is pictured walking
around campus with Carol on a bright and beautiful April 30, 2015 spring day in Eugene. In fact, Kelleher said
Professor Changs lecture raised eyebrows here at the University of Oregons famed Wearable Computing Lab that
was founded in 1995 at the dawn of the so-called information-era. While Kelleher thinks digital-age fans here on
campus view the Internet as revolutionizing just about everything, they took pause when Professor Chang a
famed University of Cambridge, England, economist presented interesting views on why the washing machine
helps more people worldwide than the Net ever could. Professor Chang argues that the Internets revolutionary is
pretty harmless, noting that, Instead

of reading a paper, we now read the news online.


Instead of buying books at a store, we buy them on-line. Whats so revolutionary?
The Internet has mainly affected our leisure life. In short, the washing machine has
allowed women to get into the labor market so that we have nearly doubled the
work force. Moreover, Professor Chang questions all the hype about the good stuff the Internet is doing for the

poor. Charities are now working to give people in poor countries access to the Internet. But shouldnt we spend
that money on providing health clinics and safe water, writes Professor Chang. While the digital revolution has
helped make the shift from traditional industry, the clothes washer technology also has been revolutionary because
it reduces the drudgery of scrubbing and rubbing clothing, the professor added. Professor Chang is viewed as one of

the foremost thinkers on new economics and development. His economic textbook, 23 Things They Dont Tell You About Capitalism also details his interest in how the washing machine is way more revolutionary than the Internet. Thanks to washing machine technology, women

started having fewer children, gained more bargaining power in their relationships
and enjoyed a higher status. This liberation of women has done more for democracy
than the Internet, states Professor Changs lecture. The washing machine is a symbol of a fundamental
change in how we look at women. It has changed society more than the Internet. As one of the top economics
professors in the world, Professor Chang likes to challenge his students to looking at things in a different way. For
instance, he notes that people like you and me have no memory of spending two hours a day washing our clothes
in cold water. This is part 8 for an occasional series titled: Tech: Hooked Into Machine, that is being offered to
book and website publishers. DAVE MASKO is award-winning foreign correspondent and photojournalist who has
published prolifically in top print newspapers and magazines online. He has reported on vital issues worldwide over
the past 40 years. He accepts freelance work. Contact him at dpmasko@msn.com.

1NC Collapse Inev


Collapse is inevitable: peak capacity
RT 15
(RT, Capacity crunch: Internet could collapse by 2023, researchers warn. 05-05-2015.
http://www.rt.com/news/255329-internet-capacity-collapse-researchers///ghs-kw)

The internet could face an imminent capacity crunch as soon as in eight years ,
should it fail to provide faster data, UK scientists say. The cables and fiber optics that deliver
the data to users will have reached their limit by 2023. Optical cables are transparent
strands the thickness of a human hair: the data is transformed into light, and is sent down the fiber, and then turns back into information.

We are starting to reach the point in the research lab where we


can't get any more data into a single optical fiber. The deployment to market is
about six to eight years behind the research lab - so within eight years that will be
it, we can't get any more data in, Professor Andrew Ellis, of Aston University in Birmingham, told the
Daily Mail. Demand is increasingly catching up. It is growing again and again, and it is
harder and harder to keep ahead. Unless we come forward with really radical ideas,
we are going to see costs dramatically increase, he added. Internet companies could set up
additional cables, but that would see price tags for web usage soar. Researchers warn we could end up
with an internet that switches on and off all the time, or be forced to pay far more than we do now. That is a completely different business model. I think a conversation is needed with the British public as to
whether or not they are prepared to switch that business model in exchange for more capacity, Ellis warned.

Plus, there is another issue: that of electricity needed to cope with the skyrocketing
demand. That is quite a huge problem. If we have multiple fibers to keep up, we
are going to run out of energy in about 15 years, Professor Ellis said. Some 16 percent of the
power in the UK is consumed via the internet already, and the amount is doubling every four years. Globally, it
is responsible for about two percent of power usage.
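Back-of-the-envelope check on the Ellis energy claim (illustrative arithmetic only; the 16 percent share and the four-year doubling time come from the card above, nothing else is from the article):

# If the internet's share of UK electricity is ~16% and doubles every four
# years (per the card), how long until it would exceed 100% of supply?
share = 0.16
doubling_time_years = 4

years = 0
while share < 1.0:
    share *= 2
    years += doubling_time_years

print(years)  # 12 -- in the same ballpark as the ~15-year figure Ellis gives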

1NC No Collapse
No Internet collapseself-improvements
Dvorak 07
(John C. Dvorak. John Dvorak is a columnist for PCMag.com and the host of the weekly TV video
podcast CrankyGeeks. His work is licensed around the world. Previously a columnist for Forbes, Forbes
Digital, PC World, Barrons, MacUser, PC/Computing, Smart Business and other magazines and
newspapers. Former editor and consulting editor for Infoworld. Has appeared in the New York Times,
LA Times, Philadelphia Enquirer, SF Examiner, Vancouver Sun. Was on the start-up team for CNet TV as
well as ZDTV. At ZDTV (and TechTV) was host of Silicon Spin for four years doing 1000 live and live-totape TV shows. Also was on public radio for 8 years. Written over 4000 articles and columns as well as
authoring or co-authoring 14 books. 2004 Award winner of the American Business Editors Association's
national gold award for best online column of 2003. That was followed up by an unprecedented second
national gold award from the ABEA in 2005, again for the best online column (for 2004). Won the Silver
National Award for best magazine column in 2006. "Will the Internet Collapse?," PCMAG. 5- 1-2007.
http://www.pcmag.com/article2/0%2c2817%2c2124376%2c00.asp//ghs-kw)

When is the Internet going to collapse? The answer is NEVER. The Internet is amazing for no other
reason than that it hasn't simply collapsed, never to be rebooted. Over a decade ago, many pundits
were predicting an all-out catastrophic failure, and back then the load was nothing
compared with what it is today. So how much more can this network take? Let's look at
the basic changes that have occurred since the Net became chat-worthy around 1990. First of all, only a few people
were on the Net back in 1990, since it was essentially a carrier for e-mail (spam free!), newsgroups, gopher, and
FTP. These capabilities remain. But the e-mail load has grown to phenomenal proportions and become burdened
with megatons of spam. In one year, the amount of spam can exceed a decade's worth, say 1990 to 2000, of all

Internet traffic. It's actually the astonishing overall growth of the Internet that is amazing. In 1990, the total U.S. backbone throughput of the Internet was 1 terabyte, and in 1991 it doubled to 2TB. Throughput

continued to double until 1996, when it jumped to 1,500TB. After that huge jump, it returned to doubling, reaching

80,000 to 140,000TB in 2002. This ridiculous growth rate has continued as more and more services are added to the burden. The jump in 1996 is attributable to the one-two punch of the

universal popularization of the Web and the introduction of the MP3 standard and subsequent music file sharing.
More recently, the emergence of inane video clips (YouTube and the rest) as universal entertainment has continued

to slam the Net with overhead, as has large video file sharing via BitTorrent and other systems. Then VoIP came along, and IPTV is next. All the while, e-mail numbers are in the trillions of messages, and spam has never been more plentiful and bloated. Add blogging, vlogging, and twittering and it just gets worse. According to some expensive studies, the growth rate has begun to slow down to something like 50 percent per year. But that's growth on top of huge numbers. Petabytes. So when does this thing just grind to a halt or blow up? To date, we have to admit that the structure of the Net is robust, to say the least. This is impressive, considering the fact that experts were predicting a collapse in the 1990s. Robust or not, this Internet is a transportation system. It transports data. All transportation systems eventually need upgrading, repair, basic changes, or reinvention. But what needs to be done here? This, to me, has come to be the big
question. Does anything at all need to be done, or do we run it into the ground and then fix it later? Is this like a
jalopy leaking oil and water about to blow, or an organic perpetual-motion machine that fixes itself somehow? Many

believe that the Net has never collapsed because it does tend to fix itself. A decade ago we were going to run out of IP addresses - remember? It righted itself, with rotating addresses and subnets. Many of the Net's improvements are self-improvements. Only spam, viruses, and spyware represent incurable diseases that could kill the organism. I have to conclude that the worst-case scenario for the Net is an outage here or there, if anywhere. After all, the phone system, a more machine-intensive system, never really imploded after years and years of growth, did it? While it has outages, it's actually more reliable than the power grid it sits on. Why should the Internet be any different now that it is essentially run by phone companies who know how to keep networks up? And let's be real here. The Net is being improved daily, with newer routers and better gear being constantly hot-swapped all over the world. This is not the same Internet we had in 1990, nor is it what we had in 2000. While phone companies seem

to enjoy nickel-and-diming their customers to death with various petty scams and charges, they could easily charge
one flat fee and spend their efforts on quality-of-service issues and improving overall network speed and
throughput.

1NC Econ =/= War


International norms maintain economic stability
***Zero empirical data supports their theory; the only financial crisis of the new liberal order experienced zero uptick in violence or challenges to the central factions governed by the US that check inter-state violence; they have no theoretical foundation for proving causality.
Barnett, 9 senior managing director of Enterra Solutions LLC (Thomas, The
New Rules: Security Remains Stable Amid Financial Crisis, 25 August 2009,
http://www.aprodex.com/the-new-rules--security-remains-stable-amidfinancial-crisis-398-bl.aspx)
When the global financial crisis struck roughly a year ago, the blogosphere was ablaze
with all sorts of scary predictions of, and commentary regarding, ensuing conflict and wars -- a
rerun of the Great Depression leading to world war, as it were. Now, as global economic news brightens and
recovery -- surprisingly led by China and emerging markets -- is the talk of the day, it's interesting to look back over

the past year and realize how globalization's first truly worldwide recession has had virtually no impact whatsoever on the international security landscape. None of the more than three-dozen ongoing conflicts listed by GlobalSecurity.org can be clearly attributed to the global recession. Indeed, the last new entry (civil conflict between Hamas and Fatah in the Palestine) predates the economic crisis by a year, and three quarters of the chronic struggles began in the last century. Ditto for the 15 low-intensity conflicts listed by Wikipedia (where the latest entry is the Mexican "drug war" begun in 2006). Certainly, the Russia-Georgia conflict last August was specifically
timed, but by most accounts the opening ceremony of the Beijing Olympics was the most important external trigger
(followed by the U.S. presidential campaign) for that sudden spike in an almost two-decade long struggle between

Georgia and its two breakaway regions. Looking over the various databases, then, we see a most familiar picture: the usual mix of civil conflicts, insurgencies, and liberation-themed terrorist movements. Besides the recent Russia-Georgia dust-up, the only two potential state-on-state wars (North v. South Korea, Israel v. Iran) are both tied to one side acquiring a nuclear weapon capacity -- a process wholly unrelated to global economic trends. And with the

United States effectively tied down by its two ongoing major interventions (Iraq and Afghanistan-bleeding-into-

Pakistan), our involvement elsewhere around the planet has been quite modest, both leading up to and following the onset of the economic crisis: e.g., the usual counter-drug efforts in Latin

America, the usual military exercises with allies across Asia, mixing it up with pirates off Somalia's coast).
Everywhere else we find serious instability we pretty much let it burn, occasionally pressing the Chinese -unsuccessfully -- to do something. Our new Africa Command, for example, hasn't led us to anything beyond
advising and training local forces. So, to sum up: No significant uptick in mass violence or unrest
(remember the smattering of urban riots last year in places like Greece, Moldova and Latvia?); The usual
frequency maintained in civil conflicts (in all the usual places); Not a single state-on-state war directly caused (and
no great-power-on-great-power crises even triggered); No

great improvement or disruption in great-

power cooperation regarding the emergence of new nuclear powers (despite all that diplomacy); A
modest scaling back of international policing efforts by the system's acknowledged Leviathan power (inevitable
given the strain); and No

serious efforts by any rising great power to challenge that


Leviathan or supplant its role. (The worst things we can cite are Moscow's occasional deployments of strategic
assets to the Western hemisphere and its weak efforts to outbid the United States on basing rights in Kyrgyzstan;
but the best include China and India stepping up their aid and investments in Afghanistan and Iraq.) Sure, we've
finally seen global defense spending surpass the previous world record set in the late 1980s, but even that's likely
to wane given the stress on public budgets created by all this unprecedented "stimulus" spending. If anything, the

friendly cooperation on such stimulus packaging was the most notable greatpower dynamic caused by the crisis. Can we say that the world has suffered a distinct shift to
political radicalism as a result of the economic crisis? Indeed, no. The world's major economies remain
governed by center-left or center-right political factions that remain decidedly friendly to

both markets and trade. In the short run, there were attempts across the board to insulate economies from
immediate damage (in effect, as much protectionism as allowed under current trade rules), but there was no great
slide into "trade wars." Instead, the World Trade Organization is functioning as it was designed to function, and
regional efforts toward free-trade agreements have not slowed. Can we say Islamic radicalism was inflamed by the
economic crisis? If it was, that shift was clearly overwhelmed by the Islamic world's growing disenchantment with
the brutality displayed by violent extremist groups such as al-Qaida. And looking forward, austere economic times
are just as likely to breed connecting evangelicalism as disconnecting fundamentalism. At the end of the day, the
economic crisis did not prove to be sufficiently frightening to provoke major economies into establishing global
regulatory schemes, even as it has sparked a spirited -- and much needed, as I argued last week -- discussion of the
continuing viability of the U.S. dollar as the world's primary reserve currency. Naturally, plenty of experts and
pundits have attached great significance to this debate, seeing in it the beginning of "economic warfare" and the
like between "fading" America and "rising" China. And yet, in a world of globally integrated production chains and
interconnected financial markets, such "diverging interests" hardly constitute signposts for wars up ahead. Frankly, I
don't welcome a world in which America's fiscal profligacy goes undisciplined, so bring it on -- please! Add it all up

and it's fair to say that this global financial crisis has proven the great resilience of America's post-World War II international liberal trade order.

2NC Econ =/= War


Aggregate data proves interstate violence doesnt result from
economic decline
Drezner, 12 --- The Fletcher School of Law and Diplomacy at Tufts University
(October 2012, Daniel W., The Irony of Global Economic Governance: The
System Worked,
www.globaleconomicgovernance.org/wp-content/uploads/IR-ColloquiumMT12-Week-5_The-Irony-of-Global-Economic-Governance.pdf)
The final outcome addresses a dog that hasnt barked: the effect of the Great Recession on cross-border conflict and violence. During the initial stages of the crisis, multiple analysts asserted that the financial crisis would lead states to increase their use of force as a tool for staying in power.37 Whether through greater internal repression, diversionary wars, arms races, or a ratcheting up of great power conflict, there were genuine concerns that the global economic downturn would lead to an increase in conflict. Violence in the Middle East, border disputes in the

South China Sea, and even the disruptions of the Occupy movement fuel impressions of surge in global public
disorder.

The aggregate data suggests otherwise, however. The Institute for Economics and
Peace has constructed a Global Peace Index annually since 2007. A key conclusion
they draw from the 2012 report is that The average level of peacefulness in 2012 is
approximately the same as it was in 2007.38 Interstate violence in particular has
declined since the start of the financial crisis as have military expenditures in most sampled
countries. Other studies confirm that the Great Recession has not triggered any
increase in violent conflict; the secular decline in violence that started with the end of the Cold War has
not been reversed.39 Rogers Brubaker concludes, the crisis has not to date generated the
surge in protectionist nationalism or ethnic exclusion that might have been expected.40
None of these data suggest that the global economy is operating swimmingly. Growth remains unbalanced and
fragile, and has clearly slowed in 2012. Transnational capital flows remain depressed compared to pre-crisis levels,
primarily due to a drying up of cross-border interbank lending in Europe. Currency volatility remains an ongoing
concern. Compared to the aftermath of other postwar recessions, growth in output, investment, and employment in
the developed world have all lagged behind. But the Great Recession is not like other postwar recessions in either

scope or kind; expecting a standard V-shaped recovery was unreasonable. One financial analyst characterized the post-2008 global economy as in a state of contained depression.41 The key word is contained, however. Given the severity, reach and depth of the 2008 financial crisis, the proper comparison is with Great Depression. And by that standard, the outcome variables look impressive. As Carmen Reinhart and Kenneth Rogoff

concluded in This Time is Different: that its macroeconomic outcome has been only the most severe global
recession since World War II and not even worse must be regarded as fortunate.42

Most rigorous historical analysis proves


Miller, 2K economist, adjunct professor in the University of Ottawas
Faculty of Administration, consultant on international development issues,
former Executive Director and Senior Economist at the World Bank, (Morris,
Poverty as a cause of wars?, Winter, Interdisciplinary Science Reviews, Vol.
25, Iss. 4, p. Proquest)
Perhaps one should ask, as some scholars do, whether it is not poverty as such but some
dramatic event or sequence of such events leading to the exacerbation of poverty that is
the factor that contributes in a significant way to the denouement of war. This calls for
addressing the question: do wars spring from a popular reaction to an economic

crisis that exacerbates poverty and/or from a heightened awareness of the poor
of the wide and growing disparities in wealth and incomes that diminishes their

tolerance to poverty? It seems reasonable to believe that a powerful "shock" factor


might act as a catalyst for a violent reaction on the part of the people or on the part of
the political leadership. The leadership, finding that this sudden adverse economic

and social impact destabilizing, would possibly be tempted to seek a diversion by


finding or, if need be, fabricating an enemy and setting in train the process
leading to war. There would not appear to be any merit in this hypothesis
according to a study undertaken by Minxin Pei and Ariel Adesnik of the Carnegie
Endowment for International Peace. After studying 93 episodes of economic crisis
in 22 countries in Latin America and Asia in the years since World War II they
concluded that Much of the conventional wisdom about the political impact of
economic crises may be wrong ..The severity of economic crisis - as measured
in terms of inflation and negative growth bore no relationship to the collapse of
regimes ... (or, in democratic states, rarely) to an outbreak of violence ... In the
cases of dictatorships and semi-democracies, the ruling elites responded to crises
by increasing repression (thereby using one form of violence to abort another.)

Innovation Adv

Notes
This advantage is even worse than the previous. How that is possible I have
no idea. Zylberberg says backdoors result in centralized information flows,
meaning that information flows inwards towards the NSA. Crowe is in the
context of the internet of things and says that inefficient flows of information
use energy, Also Tyler is a BLOGGER for the Motley Fool, which makes him
qualified to talk about investments but not the technical aspects of the
Internet. He also talks about oversupplying energy to the grid, which is
probably an industrial application and NOT a commercial one, which probably
means they dont have an internal link into energy companies actually
developing better alternatives. Tyler also talks about things like Increased
communication between everything -- engines, appliances, generators,
automobiles -- allows for instant feedback for more efficient travel routes,
optimized fertilizer and water consumption to reduce deforestation, real-time
monitoring of electricity consumption and instant feedback to generators,
and fully integrated heating, cooling, and lighting systems that can adjust for
human occupancy. There are lots of projections and estimates related to
carbon emissions and climate change, but the one that has emerged as the
standard bearer is the amount of carbon emissions which the squo probably
resolves. This reflects a fundamental misunderstanding of what the HELL
backdoors actually are, which is just government access into company
servers, NOT mandating rerouting ALL internet traffic. Its an embarrassment
to DDI. Read the patents advantage CP for this. Sorry Im grouchy at 3:30AM.
30 second explainer: backdoors kill innovation b/c centralized information
flows, that kills innovation b/c no end-to-end encryption, innovation is k2
solve warming b/c we oversupply the grid and better communications means
energy efficiency and less CO2, warming = ext b/c Roberts

CX Questions
Personally Id be very tempted to give them all of CX to explain all their
warrants and tell a coherent story but plz dont do that.

1NC Warming =/= Extinction


This advantage is highly unlikely: the minimal effect that backdoors have on innovation means that the risk of the impact is tiny. AND, backdoors have existed for a while, so the impact shouldve happened by now if the link story was true.
AND, Tyler says we just don't always have the adequate information to make the most efficient decision, which means even in a world of innovation, we still oversupply the grid, which triggers your internal link.
No impact to warming
IBD 5/13 (5/13/2014, Investors Business Daily, Obama Climate Report: Apocalypse Not, Factiva, JMP)
Climate: Not since Jimmy Carter falsely spooked Americans about overpopulation, the
world running out of food, water and energy, and worsening pollution, has a
president been so filled with doom and gloom as this one. Last week's White
House report on climate change was a primal scream to alarm Americans into
action to save the earth from a literal meltdown . Maybe we should call President
Obama the Fearmonger in Chief. While scientists can argue until the cows come home about what

will happen in the future with the planet's climate, we do have scientific records on what's already happened.
Obama moans that the devastation from climate change is already here as more severe weather events threaten to imperil our very survival. But, according to the government's own records which presumably the White House can get severe weather events are no more likely now than they were 50 or 100 years ago and the losses of lives and property are much less devastating. Here is what government data reports and top scientists tell us about extreme climate conditions: Hurricanes: The century-long trend in Hurricanes is slightly down, not up. According to the National Hurricane Center, in 2013, "There were no major hurricanes in the North Atlantic Basin for the first time since 1994. And the number of hurricanes this year was the lowest since 1982."
Basin for the first time since 1994. And the number of hurricanes this year was the lowest since 1982."
According to Dr. Ryan Maue at Weather Bell Analytics, "We are currently in the longest period since the Civil War

The
National Oceanic and Atmospheric Administration says there has been no change in severe
tornado activity. "There has been little trend in the frequency of the stronger tornadoes over the past 55
Era without a major hurricane strike in the U.S. (i.e., category 3, 4 or 5)" Tornadoes: Don't worry, Kansas.

years." Extreme heat and cold temperatures: NOAA's U.S. Climate Extremes Index of unusually hot or cold
temperatures finds that over the last 10 years, five years have been below the historical mean and five above
the mean. Severe drought/extreme moisture: While higher than average portions of the country were
subjected to extreme drought/moisture in the last few years, the 1930's, 40's and 50's were more extreme in
this regard. In fact, over the last 10 years, four years have been below the average and six above the average.

Cyclones: Maue reports: "the global frequency of tropical cyclones has reached a historical low." Floods: Roger Pielke Jr., past chairman of the American Meteorological Society Committee on Weather Forecasting and Analysis, reports, "floods have not increased in the U.S. in frequency or

intensity since at least 1950. Flood losses as a percentage of U.S. GDP have dropped by about 75% since 1940."

Warming: Even NOAA admits a "lack of significant warming at the Earth's surface in the past decade" and a pause "in global warming observed since 2000." Specifically, NOAA last year stated, "since the turn of the century, however, the change in Earth's global mean surface temperature has been close to zero." Pielke sums up: "There is no evidence that disasters are getting worse because of climate change. ... It is misleading, and just plain incorrect, to claim that disasters associated with hurricanes, tornadoes, floods or droughts have increased on climate time scales either in the U.S. or globally." One

big change between today and 100 years ago is that humans are much more capable of dealing with hurricanes
and earthquakes and other acts of God. Homes and buildings are better built to withstand severe storms and
alert systems are much more accurate to warn people of the coming storms. As a result, globally, weatherrelated losses have actually decreased by about 25% as a proportion of GDP since 1990. The liberal hubris is
that government can do anything to change the earth's climate or prevent the next big hurricane, earthquake or
monsoon. These are the people in Washington who can't run a website, can't deliver the mail and can't balance

a budget. But they are going to prevent droughts and forest fires. The President's doomsday claims last week served mostly to undermine the alarmists' case for radical action on climate change. Truth always seems to be the first casualty in this debate. This is the tactic of tyrants. Americans are wise to be wary about giving up our basic freedoms and lowering our standard of living to combat an exaggerated crisis.

1NC No Warming
Their models are wrong
Ridley 14 --- author of The Rational Optimist, a columnist for the Times
(London) and a member of the House of Lords (6/19/14, Matt, Junk Science
Week: IPCC commissioned models to see if global warming would reach
dangerous levels this century. Consensus is no,
http://business.financialpost.com/2014/06/19/ipcc-climate-change-warming/?
utm_source=Daily+Carbon+Briefing&utm_campaign=6c73d70ec9DAILY_BRIEFING&utm_medium=email&utm_term=0_876aab4fd76c73d70ec9-303421281)
Even if you pile crazy assumption upon crazy assumption, you cannot even
manage to make climate change cause minor damage. The debate over
climate change is horribly polarized. From the way it is conducted, you would think that only two
positions are possible: that the whole thing is a hoax or that catastrophe is inevitable. In fact there is room for lots

that man-made climate change is


real but not likely to do much harm, let alone prove to be the greatest crisis
facing humankind this century. After more than 25 years reporting and commenting on this topic for
of intermediate positions, including the view I hold, which is

various media organizations, and having started out alarmed, thats where I have ended up. But it is not just I that

share it with a very large international organization, sponsored by the


United Nations and supported by virtually all the worlds governments: the
Intergovernmental Panel on Climate Change (IPCC) itself. The IPCC commissioned four
hold this view. I

different models of what might happen to the world economy, society and technology in the 21st century and what
each would mean for the climate, given a certain assumption about the atmospheres sensitivity to carbon

Three of the models show a moderate, slow and mild warming, the hottest of
which leaves the planet just 2 degrees Centigrade warmer than today in 2081-2100 .
The coolest comes out just 0.8 degrees warmer. Now two degrees is the threshold at
which warming starts to turn dangerous, according to the scientific consensus. That
is to say, in three of the four scenarios considered by the IPCC, by the time my
childrens children are elderly, the earth will still not have experienced any harmful
warming, let alone catastrophe. But what about the fourth scenario? This is known
as RCP8.5, and it produces 3.5 degrees of warming in 2081-2100. Curious to know what
assumptions lay behind this model, I decided to look up the original papers describing the creation of this scenario .
Frankly, I was gobsmacked. It is a world that is very, very implausible. For a start,
this is a world of continuously increasing global population so that there are 12 billion on

the planet. This is more than a billion more than the United Nations expects, and flies in the face of the fact that the
world population growth rate has been falling for 50 years and is on course to reach zero i.e., stable population

in around 2070. More people mean more emissions. Second, the world is assumed in the RCP8.5 scenario to be burning an astonishing 10 times as much coal as today, producing 50% of

its primary energy from coal, compared with about 30% today. Indeed, because oil is assumed to have become
scarce, a lot of liquid fuel would then be derived from coal. Nuclear and renewable technologies contribute little,
because of a slow pace of innovation and hence fossil fuel technologies continue to dominate the primary energy

portfolio over the entire time horizon of the RCP8.5 scenario. Energy efficiency has improved very little. These are highly unlikely assumptions. With abundant natural gas displacing coal on a huge scale in the United States today, with the price of solar power plummeting, with nuclear power experiencing a revival, with gigantic methane-hydrate gas resources being

discovered on the seabed, with energy efficiency rocketing upwards, and with population growth rates continuing to
fall fast in virtually every country in the world, the one thing we can say about RCP8.5 is that it is very, very

implausible. Notice, however, that even so, it is not a world of catastrophic pain. The per capita income of the average human being in 2100 is three times what it is now. Poverty would be history. So its hardly Armageddon. But theres an even more startling fact. We now have many different studies of climate sensitivity based on observational data and they

all converge on the conclusion that it is much lower than assumed by the IPCC in these models. It has to be,

otherwise global temperatures would have risen much faster than they have over the past 50 years. As Ross
McKitrick noted on this page earlier this week,

temperatures have not risen at all now for

more than 17 years. With these much more realistic estimates of sensitivity (known as transient climate
response), even RCP8.5 cannot produce dangerous warming. It manages just 2.1C of warming by 2081-2100.

That is to say, even if you pile crazy assumption upon crazy assumption till you
have an edifice of vanishingly small probability, you cannot even manage to make
climate change cause minor damage in the time of our grandchildren, let alone
catastrophe. Thats not me saying this its the IPCC itself. But what strikes me as
truly fascinating about these scenarios is that they tell us that globalization,
innovation and economic growth are unambiguously good for the environment . At the
other end of the scale from RCP8.5 is a much more cheerful scenario called RCP2.6. In this happy world,
climate change is not a problem at all in 2100, because carbon dioxide emissions
have plummeted thanks to the rapid development of cheap nuclear and solar, plus
a surge in energy efficiency. The RCP2.6 world is much, much richer. The average person has an income
about 15 times todays in real terms, so that most people are far richer than Americans are today. And it achieves
this by free trade, massive globalization, and lots of investment in new technology. All the things the green

movement keeps saying it opposes because they will wreck the planet. The answer to climate change is, and always has been, innovation. To worry now in 2014 about a very small, highly implausible set of circumstances in 2100 that just might, if climate sensitivity is much higher than the evidence suggests, produce a marginal damage to the world economy, makes no sense. Think of all the innovation that happened between 1914 and 2000. Do we really think there will be less in this century? As for how to deal with that small risk, well there are several possible options. You could encourage innovation and trade. You could put a modest but growing tax on carbon to nudge innovators in the right direction. You could offer prizes for low-carbon technologies. All of these might make a little sense. But the one thing you should not do is pour public subsidy into

supporting old-fashioned existing technologies that produce more carbon dioxide per unit of energy even than coal
(bio-energy), or into ones that produce expensive energy (existing solar), or that have very low energy density and

so require huge areas of land (wind). The IPCC produced two reports last year. One said that the cost of climate change is likely to be less than 2% of GDP by the end of this century. The other said that the cost of decarbonizing the world economy with renewable energy is likely to be 4% of GDP. Why do something that you know will do more harm than good?

1NC Warming Inev


Even if they win that warming is real and caused by CO2,
warming is inevitable
Skuce 4/19 a recently-retired geophysical consultant living in British
Columbia. He has a BSc in geology from Sheffield University and an MSc in
geophysics from the University of Leeds. His work experience includes a
period at the British Geological Survey in Edinburgh and work for a variety of
oil companies based in Calgary, Vienna and Quito (Andrew, Global Warming:
Not Reversible, But Stoppable, Skeptical Science, 2014,
http://www.skepticalscience.com/global-warming-not-reversible-butstoppable.html)
Bringing human emissions to a dead stop , as shown by the red lines in Figure 1, is not a realistic
option. This would put the entire world, all seven billion of us, into a new dark age and the
human suffering would be unimaginable. For this reason, most climate models dont
even consider it as a viable scenario and, if they run the model at all, it is as a
"what-if". Even cutting back emissions severely enough to stabilize CO2
concentrations at a fixed level, as shown in the blue lines in Figure 1, would still require massive
and rapid reductions in fossil fuel use . But, even this reduction would not be
enough to stop future warming. For example, holding concentration levels steady
at 380 ppm would lead to temperatures rising an additional 0.5 degrees C over
the next two hundred years. This effect is often referred to as warming in the
pipeline: extra warming that we cant do anything to avoid . The most important
distinction to grasp, though, is that the inertia is not inherent in the physics and
chemistry of the planets climate system, but rather in our inability to change our
behaviour rapidly enough. Figure 2 shows the average lifetimes of the equipment and infrastructure that we rely
upon in the modern world. Cars last us up to 20 years; pipelines up to 50; coal-fired plants 60; our buildings and urban

infrastructure a century. It takes time to change our ways, unless we discard working vehicles, power plants and buildings and immediately replace them with electric cars, renewable energy plants and new, energy-efficient buildings. Warming in the pipeline is not, therefore, a very good metaphor to describe the natural climate system, if we could stop emissions, the warming would stop. However, when it comes to the decisions we are making to build new, carbon-intensive infrastructure, such as the Keystone XL pipeline, the expression is quite literally true.

1NC Adaptation
Animals and plants will adapt and thrive; our evidence assumes rapid change
Contescu 12 Professor Emeritus of Geology and Geography at Roosevelt
University, Ph.D. (Lorin, "600 MILION YEARS OF CLIMATE CHANGE; A
CRITIQUE OF THE ANTHROPOGENIC GLOBAL WARMING HYPOTESIS FROM A
TIME-SPACE PERSPECTIVE, Geo-Eco-Marina, 2012, Issue 18, pgs. 5-25, peer
reviewed, Proquest)

The other side of the coin shows that climate warming has also important favorable effects, mostly on plants and indirectly on animals that feed on the plants. Studies concluded that the most feared doubling of the atmospheric CO2 will increase the productivity of herbs by 30%-50% and of trees by 50%-80%. Many plants will grow faster and healthier during a warmer climate (Idso et al., 2003), and produce more offsprings. It also appears that plants can survive quite well when climatic conditions change, even when change is rapid. For instance, cold-adapted trees can still grow to maturity (though slower) even 100-150 km north of their natural range, and they also grow as well as much as 1,000 km south of their southern boundaries. Shifting climate boundaries will also generate competition among species of grasses and trees, leading to the selection of those most adaptable to changing conditions. The conclusion that can be drawn from the above considerations is that both the vegetal and animal kingdoms are far more resilient and adaptable even for relatively quickly environmental modifications. If a species becomes extinct, a biological niche becoming thus empty, it will be quickly occupied by another species better adapted to the new ecological conditions, as bio-ecological history of the planet has demonstrated time and again.

1NC CO2 =/=Key


Even if they win an impact, CO2 doesn't increase warming: peer-reviewed studies prove it and the IPCC is wrong
Ballonoff 14 Economist, a former utility rate regulator in Kansas and
Illinois, writer for the Cato Institute (Paul, A Fresh Look at Climate Change,
Winter, AN INTERDISCIPLINARY JOURNAL OF PUBLIC POLICY ANALYSIS, The
Cato Journal, Volume 34, Number 1,
http://object.cato.org/sites/cato.org/files/serials/files/catojournal/2014/2/cato34n1issuelow.pdf#page=119)

The foundation of the modern climate change discussion is the accurate observation
that human activity has significantly increased the atmospheric concentration of
CO2, and that such activity is continuing (Tans 2009). Increased CO2 concentration, especially when
amplified by predicted feedback effects thus also is assumed to predict increasing global average
atmospheric temperature. Depending on the degree of warming expected, other serious and mainly
undesired effects are predicted. As The Economist (2013a) observed, the average global temperature
did rise on average over the previous century . Following a 25-year cooling trend post-World War II,
temperatures increased at an especially strong rate in the quarter century ending in 1997. The trend of that
warming period, the correlation with increased CO2, and the fact of human activity
causing that CO2 increase apparently supported use of projection models extending
that trend to future years. Such projections were the basis for the UN's 1997 IPCC analysis on which much current policy is based. It is thus at least ironic that 1997 was also the last year in which such measured global average temperature increase took place.

One of the key features of the IPCC forecast, and greenhouse effect forecasts generally, is the expected feedback loops. One of those is that the presumed drier and hotter conditions on the ground would cause expanded desertification and deforestation. A distinct kind of greenhouse effect is also predicted from increased CO2 concentration, namely, the aerial fertilization effect, which is that plants grow better in an atmosphere of higher CO2. Many analysts, such as the IPCC, clearly thought the greater effect would be from heating, not plant growth. One must assume this was an intentional judgment, as the IPCC was aware of the CO2 aerial fertilization effect from its 1995 Second Assessment Report, which contained empirical evidence of increased greening in enhanced CO2 environments (Reilly 2002: 19). In contrast, climate analysts such as those with the Cato Center for the Study of Science have argued since 1999 that atmospheric temperature is much less sensitive to increased concentration of CO2 (Michaels 1999b). While in fact heating has not occurred as the IPCC forecasted, greatly increased global biomass is indeed demonstrated. Well documented evidence shows that concurrently with the increased CO2 levels, extensive, large, and continuing increase in biomass is taking place globally, reducing deserts, turning grasslands to savannas, savannas to forests, and expanding existing forests (Idso 2012). That survey covered 400 peer-reviewed empirical studies, many of which included surveys of dozens to hundreds of sources. Comprehensive study of global and regional relative greening and browning using NOAA data showed that shorter-term trends in specific locations may reflect either greening or browning, and also noted that the rapid pace of greening of the Sahel is due in part to the end of the drought in that region. Nevertheless, in nearly all regions and globally, the overall effect in recent decades is decidedly toward greening (de Jong et al. 2012). This result is also the opposite of what the IPCC expected. Global greening in response to increased CO2 concentrations was clearly predicted by a controlled experiment of the U.S. Water Conservation Laboratory conducted from 1987 through 2005 (Idso 1991).1 In that study, half of a group of genetically identical trees were grown in natural conditions and the other half in the same conditions but in an atmosphere of enhanced CO2 concentration. By 1991 the Agricultural Research Service (ARS) reported that the trees in the enhanced CO2 environment contained more than 2.8 times more sequestered carbon than the natural environment trees (i.e., were 2.8 times larger). By 2005, when the experiment was ended, the total additional growth of the enhanced CO2 trees was 85 percent more than that of the natural-condition trees, both in woody mass and in fruit. One reason for expanded growth even into dry environments is a seldom remarked propensity that CO2 induced growth due to aerial fertilization also greatly increases a plant's efficiency of use of water. The ARS further documented this effect in a 2011 study, citing the extensive literature demonstrating that enhanced CO2 environments impact growth through improved plant water relations (Prior et al. 2011). Similar results, both as to aerial fertilization effect and increased efficiency of water use, were found by the joint study of the USDA and the U.S. Department of Energy on the effects of CO2 on agricultural production in the United States (Reilly 2002). In that study, the effect of forecasted increased CO2 concentration, together with the increased warming forecasted, was shown to cause up to 80 percent increases in agricultural productivity, and decreased use of water since the growth would occur faster and with more efficient water use by plants. While different crops were forecasted to respond differently, most crops were positively affected, with a range from 10 percent reduction in yield up to 80 percent increase. Even considering the complex interactions with market conditions, the overall effect was certainly found to be favorable. Using demonstrated experimental data, the 1991 ARS study also predicted effects of further or even greatly enhanced atmospheric CO2 concentrations, such as from the expected large increase that might come (and subsequently did come and is continuing) especially from developing and newly industrializing countries. Comparing demonstrated warming to that date to the evidence, the ARS study concluded: "If past is prologue to the future, how much more CO2 induced warming is likely to occur? Very little. . . . The warming yet to be faced cannot be much more than what has already occurred. . . . A doubling of current emissions, for example, would lead to an atmospheric CO2 content on the order of 700 ppm, which would probably be climatically acceptable, but only if the earth's forests are not decimated in the meantime" [Idso 1991: 964-65]. The 1991 study noted that expanded forested areas would allow even greater atmospheric CO2 concentrations. To assure the measured results were accurate and a reasonable basis on which to infer the effect of global-scale CO2 concentration, the ARS also published results of eight additional distinct empirical studies of natural processes, each of which independently verified that the measured results found by direct experiment were a reasonable basis for such extrapolation (Idso 1998). The effects were recently further verified by models whose results were compared to empirical data on Australian and other arid regions. Modeling water use by plants in enhanced CO2 environment, the study predicted the effect on plant growth in dry regions and verified the result empirically compared to actual measurements over a 30-year period (AGU 2013). The data verified the prediction both in the direction and in the quantity of effect observed: Enhanced CO2 improves water use by plants and reduces, not increases, dry regions by making them greener. Thus, evidence to date implies that the view that global temperature is far less sensitive to CO2 than many fear is likely correct. Simultaneously, demonstrated experimental evidence on plant growth predicted exactly what the now extensive empirical literature shows: Enhanced CO2 is associated with greatly increased biomass production, even in dry climates. The extent of increased CO2 sequestration both in soil and in biomass associated with increased atmospheric concentration has also been documented (Pan et al. 2011). Those results, while not what the IPCC predicted, do not imply we should have no concerns about climate policy.

2NC CO2 =/= Key


Empirics go negative: data shows the temperature rise comes before CO2 spikes
Contescu 12 Professor Emeritus of Geology and Geography at Roosevelt
University, Ph.D. (Lorin, "600 MILION YEARS OF CLIMATE CHANGE; A
CRITIQUE OF THE ANTHROPOGENIC GLOBAL WARMING HYPOTESIS FROM A
TIME-SPACE PERSPECTIVE, Geo-Eco-Marina, 2012, Issue 18, pgs. 5-25, peer
reviewed, Proquest)
Most unsettling is the fact that data show quite clearly that during glacial-interglacial intervals the rise in temperature has preceded the increase in
atmospheric CO2 and not the other way around (Lee Ray, 1993; Solomon, 2008). Indeed, the
analysis of Antarctica ice cores determined that temperatures over the continent
started to rise centuries (more precisely some 800 years) before the atmospheric CO2 levels
begun to increase.

Cyber Crime Adv

Notes
At least there's a coherent arg? Nuclear terror attack ev doesn't indicate where the attack would occur; don't let them be shifty about this.
30-second explainer: backdoors cause organized crime because they'd be exploited, that funds organized crime, then some random nuke terror card that doesn't have a coherent internal link with the AFF, then retaliation and extinction.

CX Questions
Zaitseva doesn't actually talk about organized crime in Russia; what are the scenarios for nuke terror, and what's the internal link from Russian crime to nuke terror?
Where would the US retaliate, and how would we attribute a nuclear attack?
Ayson ev assumes US retaliation in a world in which US-Russia and US-China are already exchanging military threats; where's the ev that this is happening in the status quo?

1NC No Impact
No impact to backdoors, and there are already solutions to backdoors, per their own evidence
Kohn 14
(Cindy, writer for the Electronic Frontier Foundation, 9-26-14, Nine Epic
Failures of Regulating Cryptography,
https://www.eff.org/deeplinks/2014/09/nine-epic-failures-regulatingcryptography, BC)

For those who weren't following digital civil liberties issues in 1995, or for those who have forgotten, here's a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago: It will create security risks. Don't take our word for it. Computer security expert Steven Bellovin has explained some of the problems. First, it's hard to secure communications properly even between two parties. Cryptography with a back door adds a third party, requiring a more complex protocol, and as Bellovin puts it: "Many previous attempts to add such features have resulted in new, easily exploited security flaws rather than better law enforcement access." It doesn't end there. Bellovin notes: Complexity in the protocols isn't the only problem; protocols require computer programs to implement them, and more complex code generally creates more exploitable bugs. In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called 'lawful intercept' mechanisms in the switch, that is, the features designed to permit the police to wiretap calls easily, was abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister's. This attack would not have been possible if the vendor hadn't written the lawful intercept code. More recently, as security researcher Susan Landau explains, "an IBM researcher found that a Cisco wiretapping architecture designed to accommodate law-enforcement requirements, a system already in use by major carriers, had numerous security holes in its design. This would have made it easy to break into the communications network and surreptitiously wiretap private communications." The same is true for Google, which had its "compliance" technologies hacked by China. This isn't just a problem for you and me and millions of companies that need secure communications. What will the government itself use for secure communications? The FBI and other government agencies currently use many commercial products, the same ones they want to force to have a back door. How will the FBI stop people from un-backdooring their deployments? Or does the government plan to stop using commercial communications technologies altogether? It won't stop the bad guys. Users who want strong encryption will be able to get it from Germany, Finland, Israel, and many other places in the world where it's offered for sale and for free. In 1996, the National Research Council did a study called "Cryptography's Role in Securing the Information Society," nicknamed CRISIS. Here's what they said: Products using unescrowed encryption are in use today by millions of users, and such products are available from many difficult-to-censor Internet sites abroad. Users could pre-encrypt their data, using whatever means were available, before their data were accepted by an escrowed encryption device or system. Users could store their data on remote computers, accessible through the click of a mouse but otherwise unknown to anyone but the data owner; such practices could occur quite legally even with a ban on the use of unescrowed encryption. Knowledge of strong encryption techniques is available from official U.S. government publications and other sources worldwide, and experts understanding how to use such knowledge might well be in high demand from criminal elements. CRISIS Report at 303. None of that has changed. And of course, more encryption technology is more readily available today than it was in 1996. So unless the government wants to mandate that you are forbidden to run anything that is not U.S. government approved on your devices, they won't stop bad guys from getting access to strong encryption. It will harm innovation. In order to ensure that no "untappable" technology exists, we'll likely see a technology mandate and a draconian regulatory framework. The implications of this for America's leadership in innovation are dire. Could Mark Zuckerberg have built Facebook in his dorm room if he'd had to build in surveillance capabilities before launch in order to avoid government fines? Would Skype have ever happened if it had been forced to include an artificial bottleneck to allow government easy access to all of your peer-to-peer communications? This has especially serious implications for the open source community and small innovators. Some open source developers have already taken a stand against building back doors into software. It will harm US business. If, thanks to this proposal, US businesses cannot innovate and cannot offer truly secure products, we're just handing business over to foreign companies who don't have such limitations. Nokia, Siemens, and Ericsson would all be happy to take a heaping share of the communications technology business from US companies. And it's not just telecom carriers and VOIP providers at risk. Many game consoles that people can use to play over the Internet, such as the Xbox, allow gamers to chat with each other while they play. They'd have to be tappable, too.

1NC Cyber Inev


Cybersecurity vulnerabilities are inevitable
Corn 7/13
(Corn, Geoffrey S. * Presidential Research Professor of Law, South Texas College of Law; Lieutenant
Colonel (Retired), U.S. Army Judge Advocate Generals Corps. Prior to joining the faculty at South
Texas, Professor Corn served in a variety of military assignments, including as the Armys Senior Law
of War Advisor, Supervisory Defense Counsel for the Western United States, Chief of International Law
for U.S. Army Europe, and as a Tactical Intelligence Officer in Panama. Averting the Inherent Dangers
of 'Going Dark': Why Congress Must Require a Locked Front Door to Encrypted Data, SSRN. 07-13-2015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes//ghs-kw)
Like CALEA, a statutory obligation along the lines proposed herein will inevitably trigger criticisms and generate concerns. One obvious criticism is that the creation of an escrow key or the maintenance
of a duplicate key by a manufacturer would introduce an unacceptable risk of
compromise for the device. This argument presupposes that the risk is significant, that
the costs of its exploitation are large, and that the benefit is not worth the risk. Yet
manufacturers, product developers, service providers and users
constantly introduce such risks. Nearly every feature or bit of code added
to a device introduces a risk, some greater than others. The vulnerabilities
that have been introduced to computers by software such as Flash, ActiveX
controls, Java, and web browsers are well documented .51 The ubiquitous SQL
database, while extremely effective at helping web designers create effective data
driven websites, is notorious for its vulnerability to SQL injection attacks. 52 The
adding of microphones to electronic devices opened the door to aural interceptions.
Similarly, the introduction of cameras has resulted in unauthorized video surveillance
of users. Consumers accept all of these risks, however, since we, as individual users
and as a society, have concluded that they are worth the cost. Some will inevitably
argue that no new possible vulnerabilities should be introduced into devices to allow
the government to execute reasonable, and therefore lawful, searches for unique and
otherwise unavailable evidence. However, this argument implicitly asserts that
there is no, or insignificant, value to society of such a feature. And herein lies the
Achilles heel to opponents of mandated front-door access: the conclusion is entirely at odds with
the inherent balance between individual liberty and collective security central to the
Fourth Amendment itself. Nor should lawmakers be deluded into believing that the
currently existing vulnerabilities that we live with on a daily basis are less significant
in scope than the possibility of obtaining complete access to the encrypted contents
of a device. Various malware variants that are so widespread as to be almost
omnipresent in our online community achieve just such access through what would
seem like minor cracks in the defense of systems. 53 One example is the Zeus
malware strain, which has been tied to the unlawful online theft of hundreds of
millions of dollars from U.S. companies and citizens and gives its operator complete
access to and control over any computer it infects .54 It can be installed on a machine through
the simple mistake of viewing an infected website or email, or clicking on an otherwise innocuous link.55 The
malware is designed to not only bypass malware detection software, but to
deactivate the software's ability to detect it.56 Zeus and the many other variants of malware that

are freely available to purchasers on dark-net websites and forums are responsible for the theft of funds from
countless online bank accounts (the credentials having been stolen by the malwares key-logger features), the theft
of credit card information, and innumerable personal identifiers.57

2NC Cyber Inev


Security issues are inevitable
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-andgoing-dark-part-ii-debate-merits//ghs-kw)
On Thursday, I described the surprisingly warm reception FBI Director James Comey got in the Senate this week
with his warning that the FBI was "going dark" because of end-to-end encryption. In this post, I want to take on the merits of the renewed encryption debate, which seem to me complicated and multi-faceted and not all pushing in the same direction. Let me start by breaking the encryption debate into two distinct sets of questions: One is the conceptual question of whether a world of end-to-end strong encryption is an attractive idea. The other is whether, assuming it is not an attractive idea and that one wants to ensure that authorities retain the ability to intercept decrypted signal, an extraordinary access scheme is technically possible without eroding other essential security and privacy objectives. These questions often get mashed together, both because tech companies are keen to market themselves as the defenders of their users' privacy interests and because of the libertarian ethos of the tech community more generally. But the questions are not the same, and it's worth considering them separately. Consider the conceptual question first. Would it be a good idea to have a world-wide communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all attackers? That is, if we could snap our fingers and make all

device-to-device communications perfectly secure against interception from the Chinese, from hackers, from the
FSB but also from the FBI even wielding lawful process, would that be desirable? Or, in the alternative, do we want
to create an internet as secure as possible from everyone except government investigators exercising their legal
authorities with the understanding that other countries may do the same? Conceptually speaking, I am with Comey on this question, and the matter does not seem to me an especially close call. The belief in principle in creating a giant world-wide network on which surveillance is technically impossible is really an argument for the creation of the world's largest ungoverned space. I understand why techno-anarchists find this idea so appealing. I can't imagine for a moment, however, why anyone else would. Consider the comparable argument in physical space: the creation of a city in which authorities are entirely dependent on citizen reporting of bad conduct but have no direct visibility onto what happens on the streets and no ability to conduct search warrants (even with court orders) or to patrol parks or street corners. Would you want to live in that city? The idea that ungoverned spaces really suck is not controversial when you're talking about Yemen or Somalia. I see nothing more attractive about the creation of a worldwide architecture in which it is technically impossible to intercept and read ISIS communications with followers or to follow child predators into chatrooms where they go after kids. The trouble is that this conceptual position does not answer the entirety of the policy question before us. The reason is that the case against preserving some form of law enforcement access to decrypted signal is not only a conceptual embrace of the technological obsolescence of surveillance. It is also a series of arguments about the costs, including the security costs, of maintaining the capacity to decrypt captured signal. Consider the report issued this past week by a group of

computer security experts (including Lawfare contributing editors Bruce Schneier and Susan Landau), entitled "Keys
Under Doormats: Mandating Insecurity By Requiring Government Access to All Data and Communications." The
report does not make an in-principle argument or a conceptual argument against extraordinary access. It argues,
rather, that the effort to build such a system risks eroding cybersecurity in ways far more important than the
problems it would solve. The authors, to summarize, make three claims in support of the broad claim that any

exceptional access system would "pose . . . grave security risks [and] imperil innovation." What are those "grave security risks"? "[P]roviding exceptional access to communications would force a U-turn from the best practices now being deployed to make the Internet more secure. These practices include forward secrecy where decryption keys are deleted immediately after use, so that stealing the encryption key used by a communications server would not compromise earlier or later communications. A related technique, authenticated encryption, uses the same temporary key to guarantee confidentiality and to verify that the message has not been forged or tampered with." "[B]uilding in exceptional access would substantially increase system complexity" and "complexity is the enemy of security." Adding code to systems increases that system's attack surface,

and a certain number of additional vulnerabilities come with every marginal increase in system complexity. So by
requiring a potentially complicated new system to be developed and implemented, we'd be effectively guaranteeing more vulnerabilities for malicious actors to hit. "[E]xceptional access would create concentrated targets that could attract bad actors." If we require tech companies to retain some means of accessing user communications, those keys have to be stored somewhere, and that storage then becomes an unusually high-stakes target for malicious attack. Their theft then compromises, as did the OPM hack, large numbers of users. The strong implication of the report is that these issues are not resolvable,
though the report never quite says that. But at a minimum, the authors raise a series of important questions about
whether such a system would, in practice, create an insecure internet in general, rather than one whose general security has the technical capacity to make security exceptions to comply with the law. There is some reason, in my view, to suspect that the picture may not be quite as stark as the computer scientists make it seem. After all, the big tech companies increase the complexity of their software products all the time, and they generally regard the increased attack surface of the software they create as a result as a mitigatable problem. Similarly, there are lots of high-value intelligence targets that we have to secure and would have big security implications if we could not do so successfully. And when it really counts, that task is not hopeless. Google and Apple and Facebook are not without tools in the cybersecurity department.

The real question, in my view, is whether a system of the sort Comey imagines could be built in a fashion in which the security gain it would provide would exceed the heightened
security risks the extraordinary access would involve. As Herb Lin puts it in his excellent, and
admirably brief, Senate testimony the other day, this is ultimately a question without an answer in the absence of a
lot of new research. "One side says [the] access [Comey is seeking] inevitably weakens the security of a system and
will eventually be compromised by a bad guy; the other side says it doesnt weaken security and wont be
compromised. Neither side can prove its case, and we see a theological clash of absolutes." Only when someone
actually does the research and development and tries actually to produce a system that meets Comey's criteria are
we going to find out whether it's doable or not. And therein lies the rub, and the real meat of the policy problem, in
my view: Who's going to do this research? Who's going to conduct the sustained investment in trying to imagine a
system that secures communications except from government when and only government has a warrant to
intercept those communications? The assumption of the computer scientists in their report is that the burden of
that research lies with the government. "Absent a concrete technical proposal," they write, "and without answers to
the questions raised in this report, legislators should reject out of hand any proposal to return to the failed
cryptography control policy of the 1990s." Indeed, their most central recommendation is that the burden of
development is on Comey. "Our strong recommendation is that anyone proposing regulations should first present
concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses
and for hidden costs." In his testimony, Herb supports this call, though he acknowledges that it is not the inevitable
route: the government has not yet provided any specifics, arguing that private vendors should do it. At the same
time, the vendors won't do it, because [their] customers aren't demanding such features. Indeed, many customers
would see such features as a reason to avoid a given vendor. Without specifics, there will be no progress. I believe
the government is afraid that any specific proposal will be subject to enormous criticism, and that's true, but the
government is the party that wants . . . access, and rather than running away from such criticism, it should
embrace any resulting criticism as an opportunity to improve upon its initial designs." Herb might also have
mentioned that lots of people in the academic tech community who would be natural candidates to help develop
such an access system are much more interested in developing encryption systems to keep the feds out than to
under any circumstances, let them in. The tech community has spent a lot more time and energy arguing against
the plausibility and desireability of implementing what Comey is seeking than it has spent in trying to develop
systems that deliver it while mitigating the risks such a system might pose. For both industry and the tech
communities, more broadly, this is government's problem, not their problem. Yet reviving the Clipper Chip model
in which government develops a fully-formed system and then puts it out publicly for the community to shoot down
is clearly not what Comey has in mind. He is talking in very different language: the language of performance
requirements. He wants to leave the development task to Silicon Valley to figure out how to implement

government's requirements. He wants to describe what he needs, decrypted signal when he has a warrant, and leave the companies to figure out how to deliver it while still providing secure communications in other circumstances to their customers. The advantage to this approach is that it potentially lets a thousand flowers bloom. Each company might do it differently. They would compete to provide the most security consistent with the performance standard. They could learn from each other. And government would not be in the position of developing and promoting specific algorithms. It wouldn't even need to know how the task was being done.

1NC Crime Inev


Organized crime inevitable--No jurisdiction, weak states, trade
offs, too adaptable
Dr. Phil Williams is Professor of International Security in the Graduate School of Public and International
Affairs at the University of Pittsburgh 8-18-2006
http://www.oup.com/uk/orc/bin/9780199289783/baylis_chap09.pdf
There are several reasons for this. First, in spite of growing international cooperation among national law
enforcement agencies, law enforcement remains a national activity confined to a single territorial
jurisdiction, while organized crime is transnational in scope. In effect, law enforcement still
continues to operate in a bordered world, whereas organized crime operates in a borderless
world. Second, although the United States placed a high priority on denying safe haven or sanctuary to
international criminals, many states have limited capacity to enforce laws against organized

crime. Consequently, transnational criminal organizations are able to operate from safe


havens, using a mix of corruption and violence to perpetuate the weakness of the states from which
they operate. Nowhere is this more evident than in Mexico, where a war for control of routes and markets on the
northern border has led to violence spilling over into the United States. Third, all too often attacking
transnational criminal organizations has been subordinated to other goals and objectives . In
spite of the emphasis on attacking smuggling and smugglers, for example, this is not something which has
been allowed to interfere with global trade. In effect, reaping the benefits of globalization, tacitly at least, has been
deemed more important than combating transnational organized crime. Not surprisingly, therefore, as Moises Naim
has pointed out, there is simply nothing in the cards that points to an imminent reversal of fortune for the myriads
of networks active in illicit trade. It is even difficult to find evidence of substantial progress in reversing or
even just containing the growth of these illicit markets (2005: 221). Fourth, both transnational criminal
organizations and the illicit markets in which they operate are highly adaptable. Law
enforcement success against a particular organization, for example, tends simply to offer opportunities for its rivals
to fill the gap. Moreover, the ability of organizations to move from one illicit product to another
makes them even more difficult to combat. In recent years, for example, Burmese warlords have moved
from opium to methamphetamine production and have become major suppliers to Asian markets for the drug.

2NC Crime Inev


Organized crime inevitable--Globalization
Dr. Phil Williams is Professor of International Security in the Graduate School of Public and International
Affairs at the University of Pittsburgh 8-18-2006
http://www.oup.com/uk/orc/bin/9780199289783/baylis_chap09.pdf

Globalization has

had paradoxical consequences for both transnational organized crime and


international terrorism, acting as both motivator and facilitator. This is not entirely surprising. Although
globalization has had many beneficial consequences, it has losers as well as winnersand the pain for the losers
can be enormous. Indeed, globalization has had a disruptive impact on patterns of employment,

on traditional cultures, and on the capacity of states to deal with problems facing citizens
within their jurisdictions, as well as problems that span multiple jurisdictions . In some instances,
globalization has created massive economic dislocation that has pushed people from the legal economy to the
illegal. In other cases, globalization has been seen as merely a cover for Western and especially United States
cultural and economic dominationdomination that has created enough resentment to help fuel what has become
the global jihad movement. At the same time, globalization has acted as a facilitator for a whole set of
illicit activities ranging from drugs and arms trafficking to the use of large-scale violence against innocent
civilians. Many observers assumed that in the post-cold war world, democracy, peace, stability and order could
easily be exported from the advanced post-industrialized states to areas of conflict and instability (Singer and
Wildavsky 1993). In fact the opposite has occurred. Al-Qaeda was able to attack the United States homeland while
based in Afghanistan, thereby illustrating what Robert Keohane described as the transformation of geography from
a barrier to a connector (2002: 275). Indeed, one of the most important characteristics of a globalized world is that
the interconnections among different parts of the world are dense, communication is cheap and easy, and
transportation and transmission, whether of disease, crime, or violence, are impossible to stop. Transnational
networks link businessmen, families, scientists, and scholars; they also link members of terrorist networks and
criminal organizations. In some cases, networks are successfully integrated into the host societies. In other
instances, however, migrants find themselves in what Castells called zones of social exclusion (1998: 72). Muslim
immigration from North Africa and Pakistan to Western Europe, for example, has resulted in marginalization and
alienation that were evident in the widespread riots in France in the late months of 2005 and that have also helped
to fuel radical Islamic terrorism in Western Europe. Moreover, for second and third generation immigrants who have
limited opportunities in the licit economy, the illegal economy and either petty crime or organized crime can appear
as an attractive alternative. Ethnic networks of this kind can provide both cover and recruitment opportunities for
transnational criminal and terrorist organizations. In effect, therefore, globalization has acted as a force
multiplier for both criminal and terrorist organizations, providing them with new resources and

new opportunities.

Organized crime inevitable--Organization advantage


Dr. Phil Williams is Professor of International Security in the Graduate School of Public and International
Affairs at the University of Pittsburgh 8-18-2006
http://www.oup.com/uk/orc/bin/9780199289783/baylis_chap09.pdf
The bottom line on all this is that, even though the United States has developed clear strategies for
combating both organized crime and terrorism, the implementation of these strategies is clearly hindered
by the dominance of governmental structures that were well-suited to the cold war against a slow,
bureaucratic, ponderous adversary but are singularly ill-suited to combating agile transnational
adversaries. In the final analysis, fighting terrorism and transnational organized crime is not only about
strategy, it is also about appropriate organizational structures to implement strategy. And in that
respect, terrorists and criminals have the advantage. The result is that the efforts of the United States
and the international community to combat both crime and terrorism are unlikely to meet with
unqualified success.

Organized crime inevitable--Too agile, bureaucratic inefficiency


Dr. Phil Williams is Professor of International Security in the Graduate School of Public and International

Affairs at the University of Pittsburgh 8-18-2006
http://www.oup.com/uk/orc/bin/9780199289783/baylis_chap09.pdf


In many respects, the threats posed to the United States and more broadly to the international
community of states by transnational organized crime and terrorism can be understood as an
important manifestation of the new phase in world politics in which some of the key interactions are
between the state system and what James Rosenau (1990) termed the multi-centric system,
composed of sovereignty-free actors . In this connection, it is notable that the first serious challenge
to United States hegemony in the post-cold war world came not from another state but from a
terrorist network. Moreover, both criminals and terrorists have certain advantages over states:
they are agile, distributed, highly dynamic organizations with a capacity to morph or

transform themselves when under pressure. States in contrast are slow, clumsy,
hierarchical, and bureaucratic and, although they have the capacity to bring lots of resources to

bear on a problem, can rarely do this with speed and efficiency. As discussed above, in the United
States war on terror, the strategy for the war of ideas was very slow to develop, not least because of
inter-agency differences. The same has been true in the effort to combat terrorist finances. As the
Government Accountability Office (2005) has noted, the U.S. government lacks an integrated strategy
to coordinate the delivery of counter-terrorism financing training and technical assistance to countries
vulnerable to terrorist financing. Specifically, the effort does not have key stakeholder acceptance of
roles and procedures, a strategic alignment of resources with needs, or a process to measure
performance. Differences of perspective and approach between the Departments of State
and Treasury have also seriously bedevilled the effort to enable weak states, one of the keys to
the multilateral component of the administrations strategy to combat terrorism. Similar problems
have been evident in efforts to combat organized crime and drug trafficking. A striking example
is the counter-drug intelligence architecture for the United States which has the Crime and Narcotics
Center at CIA looking at the international dimension of drug trafficking, the National Drug Intelligence
Center responsible for domestic aspects of the problem, the Treasurys Financial Crimes Enforcement
Network focusing on money laundering, and the El Paso Intelligence Center responsible for tactical
intelligence. Although this architecture provides clear roles and responsibilities, it also creates
bureaucratic seams in the effort to understand and assess what is clearly a seamless process of drug
trafficking and money laundering across borders. Although good information exchanges can ease this
problem, the architecture is far from optimal.

1NC No Retaliation
Domestic and international opposition block retaliation.
Bremmer 4
(Ian, President Eurasia Group and Senior Fellow World
Policy Institute, New Statesman, 9-13, Lexis)
This time, the public response would move much more quickly from shock to anger; debate over how America should respond would begin immediately.

Yet it is difficult to imagine how the Bush administration could focus its response on an external enemy. Should the US send 50,000 troops to the Afghan-Pakistani border to intensify the hunt for Osama Bin Laden and "step up" efforts to attack

the heart of al-Qaeda? Many would wonder if that wasn't what the administration pledged to do after the attacks three years ago. The president would
face intensified criticism from those who have argued all along that Iraq was a distraction from "the real war on terror". And what if a significant number of

the terrorists responsible for the pre-election attack were again Saudis? The Bush administration could hardly take military action against the Saudi government at a time when crude-oil prices are already more than $45 a barrel and global supply is stretched to the limit. While the Saudi royal family might support a co-ordinated attack against

terrorist camps, real or imagined, near the Yemeni border - where recent searches for al-Qaeda have concentrated - that would seem like a trivial,
insufficient retaliation for an attack on the US mainland. Remember how the Republicans criticised Bill Clinton's administration for ineffectually "bouncing
the rubble" in Afghanistan after the al-Qaeda attacks on the US embassies in Kenya and Tanzania in the 1990s. So what kind of response might be
credible? Washington's concerns about Iran are rising. The 9/11 commission report noted evidence of co-operation between Iran and al-Qaeda operatives,
if not direct Iranian advance knowledge of the 9/11 hijacking plot. Over the past few weeks, US officials have been more explicit, too, in declaring Iran's
nuclear programme "unacceptable". However, in the absence of an official Iranian claim of responsibility for this hypothetical terrorist attack, the

domestic opposition to such a war and the international outcry it would


provoke would make quick action against Iran unthinkable. In short, a decisive response from
Bush could not be external. It would have to be domestic. Instead of Donald Rumsfeld, the defence secretary, leading a war effort abroad, Tom Ridge, the
homeland security secretary, and John Ashcroft, the attorney general, would pursue an anti-terror campaign at home. Forced to use legal tools more
controversial than those provided by the Patriot Act,

Americans would experience stepped-up domestic surveillance and border controls, much tighter security in public places and the detention of a large number of suspects. Many Americans would undoubtedly support such moves. But concern for civil liberties and personal freedom would ensure that the government would have nowhere near the public support it enjoyed for the invasion of Afghanistan.

2NC No Retaliation
Obama won't retaliate against a terrorist attack
Crowley 10
(Michael, Senior Editor New Republic, Obama and Nuclear
Deterrence, The New Republic, 1-5,
http://www.tnr.com/node/72263)
As the story notes, some experts don't place much weight on how our publicly-stated doctrine emerges because they don't expect foreign nations to take it literally. And the reality is that any decisions about using nukes will certainly be case-by-case. But I'd still like to see some wider

discussion of the underlying questions, which are among the most consequential that policymakers can consider. The questions are particularly vexing

when it comes to terrorist groups and rogue states. Would we, for instance, actually nuke Pyongyang if it sold a weapon to terrorists who used it in America? That implied threat seems to exist, but I actually doubt that a President Obama--or any president, for that matter--would go through with it.

No escalation: studies show the public won't support military intervention in the name of terrorism
Huddy et al 05 (Leonie, Department of Political Science SUNY at Stony Brook Amer. Journal
Poli. Sci., Vol 49, no 3)

The findings from this study lend further insight into the future trajectory of support for antiterrorism measures in
the United States when we consider the potential effects of anxiety. Security threats in this and other studies
increase support for military action (Jentleson 1992; Jentleson and Britton 1998; Herrmann, Tetlock, and Visser

1999). But anxious respondents were less supportive of belligerent military action against terrorists, suggesting an important source of opposition to military intervention. In the aftermath of 9/11, several factors were consistently related to heightened levels of anxiety and related psychological reactions, including living close to

the attack sites (Galea et al. 2002; Piotrkowski and Brannen 2002; Silver et al. 2002), and knowing someone who
was hurt or killed in the attacks (in this study). It is difficult to say what might happen if the United States were

attacked again in the near future. Based on our results, it is plausible that a future threat or actual attack would broaden the number of individuals directly affected by terrorism and concomitantly raise levels of anxiety. This could, in turn, lower support for overseas military action. In contrast, in the absence of any additional attacks directed at a different geographic region,

levels of anxiety are likely to decline slowly over time (we observed a slow decline in this study), weakening
opposition to future overseas military action. Since our conclusions are based on analysis of reactions to a single

event in a country that has rarely felt the effects of foreign terrorism, we should consider whether they can be generalized to reactions to other terrorist incidents or to reactions under conditions of sustained terrorist action. Our answer is a tentative yes, although there is no

conclusive evidence on this point as yet. Some of our findings corroborate evidence from Israel, a country that has
prolonged experience with terrorism. For example, Israeli researchers find that perceived risk leads to increased
vilification of a threatening group and support for belligerent action (Arian 1989; Bar-Tal and Labin 2001). There is
also evidence that Israelis experienced fear during the Gulf War, especially in Tel Aviv where scud missiles were
aimed (Arian and Gordon 1993). What is missing, however, is any evidence that anxiety tends to undercut support
for belligerent antiterrorism measures under conditions of sustained threat. For the most part, Israeli research has
not examined the distinct political effects of anxiety.

1NC No Nuclear Terror


No chance of nuclear terror attack---too tough to execute
John Mueller and Mark G. Stewart 12, Senior Research Scientist at the
Mershon Center for International Security Studies and Adjunct Professor in
the Department of Political Science, both at Ohio State University, and Senior
Fellow at the Cato Institute AND Australian Research Council Professorial
Fellow and Professor and Director at the Centre for Infrastructure
Performance and Reliability at the University of Newcastle, "The Terrorism
Delusion," Summer, International Security, Vol. 37, No. 1,
politicalscience.osu.edu/faculty/jmueller//absisfin.pdf
In 2009, the U.S. Department of Homeland Security (DHS) issued a lengthy report on protecting the homeland. Key
to achieving such an objective should be a careful assessment of the character, capacities, and desires of potential
terrorists targeting that homeland. Although the report contains a section dealing with what its authors call the
nature of the terrorist adversary, the section devotes only two sentences to assessing that nature: The number
and high profile of international and domestic terrorist attacks and disrupted plots during the last two decades

underscore the determination and persistence of terrorist organizations. Terrorists have proven to be relentless, patient, opportunistic, and flexible, learning from experience and modifying tactics and targets to exploit perceived vulnerabilities and avoid observed strengths.8 This description may apply to some terrorists somewhere, including at least a few of those involved in the September 11 attacks. Yet, it scarcely describes the vast majority of those individuals picked up on terrorism charges in the United States since those attacks. The inability of the DHS to consider this fact even parenthetically in its fleeting discussion is not only amazing but perhaps delusional in its single-minded preoccupation with the extreme. In sharp contrast, the authors of the case

studies, with remarkably few exceptions, describe their subjects with such words as incompetent, ineffective,
unintelligent, idiotic, ignorant, inadequate, unorganized, misguided, muddled, amateurish, dopey, unrealistic,
moronic, irrational, and foolish.9 And in nearly all of the cases where an operative from the police or from the
Federal Bureau of Investigation was at work (almost half of the total), the most appropriate descriptor would be

gullible. In all, as Shikha Dalmia has put it, would-be terrorists need to be radicalized enough to die for their cause; Westernized enough to move around without raising red flags; ingenious enough to exploit loopholes in the security apparatus; meticulous enough to attend to the myriad logistical details that could torpedo the operation; self-sufficient enough to make all the preparations without enlisting outsiders who might give them away; disciplined enough to maintain complete secrecy; and, above all, psychologically tough enough to keep functioning at a high level without cracking in the face of their own impending death.10 The case studies examined in this article certainly do not abound with people with such characteristics. In the eleven years since the September 11 attacks, no terrorist has been able to detonate even a primitive bomb in the United States, and except for the four explosions in the London transportation system in 2005, neither has any in the United Kingdom. Indeed, the only method by which Islamist terrorists have managed to kill anyone in the United States since September 11 has been with gunfire, inflicting a total of perhaps

sixteen deaths over the period (cases 4, 26, 32).11 This limited capacity is impressive because, at one time, small-scale terrorists in the United States were quite successful in setting off bombs. Noting that the scale of the September 11 attacks has tended to obliterate America's memory of pre-9/11 terrorism, Brian Jenkins reminds us
(and we clearly do need reminding) that the 1970s witnessed sixty to seventy terrorist incidents, mostly bombings,
on U.S. soil every year.12

The situation seems scarcely different in Europe and other

Western locales. Michael Kenney, who has interviewed dozens of government officials and intelligence
agents and analyzed court documents, has found that, in sharp contrast with the boilerplate characterizations
favored by the DHS and with the imperatives listed by Dalmia,

Islamist militants in those locations are

operationally unsophisticated, short on know-how, prone to making


mistakes, poor at planning, and limited in their capacity to learn .13 Another

study documents the difficulties of network coordination that continually


threaten the terrorists operational unity, trust, cohesion, and ability to act
collectively.14 In addition, although some of the plotters in the cases targeting the
United States harbored visions of toppling large buildings, destroying airports, setting off dirty bombs, or
bringing down the Brooklyn Bridge (cases 2, 8, 12, 19, 23, 30, 42), all

were nothing more than wild

fantasies, far beyond the plotters capacities however much they may have

been encouraged in some instances by FBI operatives. Indeed, in many of the cases,
target selection is effectively a random process, lacking guile and careful
planning. Often, it seems, targets have been chosen almost capriciously and
simply for their convenience. For example, a would-be bomber targeted a mall in Rockford, Illinois,
because it was nearby (case 21). Terrorist plotters in Los Angeles in 2005 drew up a list of targets that were all
within a 20-mile radius of their shared apartment, some of which did not even exist (case 15). In Norway, a neo-Nazi terrorist on his way to bomb a synagogue took a tram going the wrong way and dynamited a mosque
instead.15

2NC No Nuclear Terror


Terrorists aren't pursuing nuclear attacks
Wolfe 12 Alan Wolfe is Professor of Political Science at Boston College. He
is also a Senior Fellow with the World Policy Institute at the New School
University in New York. A contributing editor of The New Republic, The
Wilson Quarterly, Commonwealth Magazine, and In Character, Professor
Wolfe writes often for those publications as well as for Commonweal, The
New York Times, Harper's, The Atlantic Monthly, The Washington Post, and
other magazines and newspapers. March 27, 2012, "Fixated by Nuclear
Terror or Just Paranoia?" http://www.hlswatch.com/2012/03/27/fixatedby-nuclear-terror-or-just-paranoia-2/
If one were to read the most recent unclassified report to Congress on the acquisition of technology relating to weapons of mass destruction and advanced conventional munitions, it does have a section on CBRN terrorism (note, not WMD terrorism). The intelligence community has a very toned down statement that says several terrorist groups probably remain interested in [CBRN] capabilities, but not necessarily in all four of those capabilities, mostly focusing on low-level chemicals and toxins. They're talking about terrorists getting industrial chemicals and making ricin toxin, not nuclear weapons. And yes, Ms. Squassoni, it is primarily al Qaeda that the U.S. government worries about, no one else. The trend of worldwide terrorism continues to remain in the realm of conventional attacks. In 2010, there were more than 11,500 terrorist attacks, affecting about 50,000 victims including almost 13,200 deaths. None of them were caused by CBRN hazards. Of the

11,000 terrorist attacks in 2009, none were caused by CBRN hazards. Of the 11,800 terrorist attacks in 2008,
none were caused by CBRN hazards.

No successful detonation
Schneidmiller 9 (Chris, Experts Debate Threat of Nuclear, Biological Terrorism, 13 January 2009,
http://www.globalsecuritynewswire.org/gsn/nw_20090113_7105.php)

There is an "almost vanishingly small" likelihood that terrorists


would ever be able to acquire and detonate a nuclear weapon, one expert said
here yesterday (see GSN, Dec. 2, 2008). In even the most likely scenario of nuclear terrorism, there are 20
barriers between extremists and a successful nuclear strike on
a major city, said John Mueller, a political science professor at Ohio State University.
The process itself is seemingly straightforward but exceedingly difficult -- buy or steal
highly enriched uranium, manufacture a weapon, take the bomb to the
target site and blow itup. Meanwhile, variables strewn across the path to an attack would increase the complexity of
the effort, Mueller argued. Terrorists would have to bribe officials in a state nuclear
program to acquire the material, while avoiding a sting by authorities or a scam by the sellers. The
material itself could also turn out to be bad. "Once the purloined material is purloined, [police are] going to be chasing after you.
They are also going to put on a high reward, extremely high reward, on getting the weapon back or getting the fissile material
back," Mueller said during a panel discussion at a two-day Cato Institute conference on counterterrorism issues facing the incoming
Obama administration. Smuggling the material out of a country would mean relying on criminals who "are very good at extortion"

terrorists would then have to


find scientists and engineers willing to giveup their normal lives to
and might have to be killed to avoid a double-cross, Mueller said. The

manufacture a bomb, which would require an expensive and sophisticated machine shop. Finally, further technological expertise
would be needed to sneak the weapon across national borders to its destination point and conduct a successful detonation, Mueller
said. Every obstacle is "difficult but not impossible" to overcome, Mueller said, putting the chance of success at no less than one in

likelihood of successfully passing through each


obstacle, in sequence, would be roughly one in 3 1/2 billion, he
three for each. The

said, but for argument's sake dropped it to 3 1/2 million. "It's a total gamble. This is a very expensive and difficult thing to do," said
Mueller, who addresses the issue at greater length in an upcoming book, Atomic Obsession. "So unlike buying a ticket to the
lottery ... you're basically putting everything, including your life, at stake for a gamble that's maybe one in 3 1/2 million or 3 1/2

Other scenarios are even less probable, Mueller said. A


nuclear-armed state is "exceedingly unlikely" to hand a
weapon to a terrorist group, he argued: "States just simply won't give it to
somebody they can't control." Terrorists are also not likely
tobe able to steala whole weapon, Mueller asserted, dismissingthe idea of
"loose nukes." Even Pakistan, which today is perhaps the nation of greatest concern regarding nuclear security, keeps
billion."

its bombs in two segments that are stored at different locations, he said (see GSN, Jan. 12). Fear of an "extremely improbable event"
such as nuclear terrorism produces support for a wide range of homeland security activities, Mueller said. He argued that there has
been a major and costly overreaction to the terrorism threat -- noting that the Sept. 11 attacks helped to precipitate the invasion of
Iraq, which has led to far more deaths than the original event. Panel moderator Benjamin Friedman, a research fellow at the Cato

academic and governmental discussions of acts of nuclear or


biological terrorism have tended to focus on "worst-case
assumptions about terrorists' ability to use these weapons to kill us." There is need for consideration for what is
probable rather than simply what is possible, he said. Friedman took issue withthe finding late last
year of an experts' report that an act of WMD terrorism would "more likely than not" occurin
the next half decade unless the international community takes greater action. "I would say that the
report, if you read it, actually offers no analysis to justify that claim, which
seems to have been made to change policy by generating
alarm in headlines." One panel speaker offered a partial rebuttal to Mueller's presentation. Jim Walsh,
principal research scientist for the Security Studies Program at the Massachusetts Institute of
Technology, said he agreed that nations would almost certainly not give
anuclear weapon to a nonstate group, that most terrorist organizations have
no interest in seeking out the bomb, and that it would be
difficult to build a weaponor use one that has been stolen.
Institute, said
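Analyst note (my own arithmetic, not part of the card): a quick check on where Mueller's headline number comes from. With 20 barriers and a generous 1-in-3 chance of clearing each one, the compound probability of clearing all of them in sequence is

\[
\left(\tfrac{1}{3}\right)^{20} \;=\; \frac{1}{3^{20}} \;=\; \frac{1}{3{,}486{,}784{,}401} \;\approx\; \text{1 in 3.5 billion},
\]

which matches the card's "one in 3 1/2 billion" figure; Mueller then rounds it down to roughly 1 in 3.5 million purely for argument's sake.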

Nukes not attractive to terrorists


Mueller 11 (John, IR Professor at Ohio State, PhD in pol sci from UCLA, The
Truth about Al Qaeda, http://www.foreignaffairs.com/articles/68012/johnmueller/the-truth-about-al-qaeda?page=show, August 2, 2011)
As a misguided Turkish proverb holds, "If your enemy be an ant, imagine him to be an elephant." The new information unearthed in Osama bin Laden's hideout in Abbottabad, Pakistan, suggests that the United States has been doing so for a full decade. Whatever al Qaeda's threatening rhetoric and occasional nuclear fantasies, its potential as a menace, particularly as an atomic one, has been much inflated. The public has now endured a decade of dire warnings about the imminence of a terrorist atomic attack. In 2004, the former CIA spook Michael Scheuer proclaimed on television's 60 Minutes that it was "probably a near thing," and in 2007, the physicist Richard Garwin assessed the likelihood of a nuclear explosion in an American or a European city by terrorism or other means in the next ten years to be 87 percent. By 2008, Defense Secretary Robert Gates mused that what keeps every senior government leader awake at night is "the thought of a terrorist ending up with a weapon of mass destruction, especially nuclear." Few, it seems, found much solace in the fact that an al Qaeda computer seized in Afghanistan in 2001 indicated that the group's budget for research on weapons of mass destruction (almost all of it focused on primitive chemical weapons work) was some $2,000 to $4,000. In the wake of the killing of Osama bin Laden, officials now have more al Qaeda computers, which reportedly contain a wealth of information about the workings of the organization in the intervening decade. A multi-agency task force has completed its assessment, and according to first reports, it has found that al Qaeda members have primarily been engaged in dodging drone strikes and complaining about how cash-strapped they are. Some reports suggest they've also been looking at quite a bit of pornography. The full story is not out yet, but it seems breathtakingly unlikely that the miserable little group has had the time or inclination, let alone the money, to set up and staff a uranium-seizing operation, as well as a fancy, super-high-tech facility to fabricate a bomb. It is a process that requires trusting corrupted foreign collaborators and other criminals, obtaining and transporting highly guarded material, setting up a machine shop staffed with top scientists and technicians, and rolling the heavy, cumbersome, and untested finished product into position to be detonated by a skilled crew, all the while attracting no attention from outsiders.

The documents also reveal that after fleeing Afghanistan, bin Laden maintained what one member of the task force calls an "obsession" with attacking the United States again, even though 9/11 was in many ways a disaster for the group. It led to a worldwide loss of support, a major attack on it and on its Taliban hosts, and a decade of furious and dedicated harassment. And indeed, bin Laden did repeatedly and publicly threaten an attack on the United States. He assured Americans in 2002 that "the youth of Islam are preparing things that will fill your hearts with fear"; and in 2006, he declared that his group had been able "to breach your security measures" and that "operations are under preparation, and you will see them on your own ground once they are finished." Al Qaeda's animated spokesman, Adam Gadahn, proclaimed in 2004 that "the streets of America shall run red with blood" and that "the next wave of attacks may come at any moment." The obsessive desire notwithstanding, such fulminations have clearly lacked substance. Although hundreds of millions of people enter the United States legally every year, and countless others illegally, no true al Qaeda cell has been found in the country since 9/11 and exceedingly few people have been uncovered who even have any sort of "link" to the organization. The closest effort at an al Qaeda operation within the country was a decidedly nonnuclear one by an Afghan-American, Najibullah Zazi, in 2009. Outraged at the U.S.-led war on his home country, Zazi attempted to join the Taliban but was persuaded by al Qaeda operatives in Pakistan to set off some bombs in the United States instead. Under surveillance from the start, he was soon arrested, and, however "radicalized," he has been talking to investigators ever since, turning traitor to his former colleagues. Whatever training Zazi received was inadequate; he repeatedly and desperately sought further instruction from his overseas instructors by phone. At one point, he purchased bomb material with a stolen credit card, guaranteeing that the purchase would attract attention and that security video recordings would be scrutinized. Apparently, his handlers were so strapped that they could not even advance him a bit of cash to purchase some hydrogen peroxide for making a bomb. For al Qaeda, then, the operation was a failure in every way -- except for the ego boost it got by inspiring the usual dire litany about the group's supposedly existential challenge to the United States, to the civilized world, to the modern state system. Indeed, no Muslim extremist has succeeded in detonating even a simple bomb in the United States in the last ten years, and except for the attacks on the London Underground in 2005, neither has any in the United Kingdom. It seems wildly unlikely that al Qaeda is remotely ready to go nuclear. Outside of war zones, the amount of killing carried out by al Qaeda and al Qaeda linkees, maybes, and wannabes throughout the entire world since 9/11 stands at perhaps a few hundred per year. That's a few hundred too many, of course, but it scarcely presents an existential, or elephantine, threat. And the likelihood that an American will be killed by a terrorist of any ilk stands at one in 3.5 million per year, even with 9/11 included.

Internet Freedom Adv

Notes
30 second explainer: encryption k2 human rights k2 demopromo k2 solve
Diamond 95 (sorry, it's 4:30 AM and I'm too tired to write more)

CX Questions

1NC Backdoors Inev


Note: more ev under foreign backdoor CP
The UK will inevitably require backdoors, killing Internet freedom - their evidence
Venezia 7-13
Paul Venezia, system and network architect, and senior contributing editor at
InfoWorld, where he writes analysis, reviews and The Deep End blog,
Encryption with backdoors is worse than useless - it's dangerous,
InfoWorld, 7/13/15,
http://www.infoworld.com/article/2946064/encryption/encryption-with-forcedbackdoors-is-worse-than-useless-its-dangerous.html, 7/14/15 AV
On the other side of the pond, U.K. Prime Minister David Cameron has said he wants to either ban strong encryption or require backdoors to be placed into any encryption code to allow law enforcement to decrypt any data at any time. The fact that these officials
are even having this discussion is a bald demonstration that they do not understand encryption or how critical it is
for modern life. They're missing a key point: The moment you force any form of encryption to contain a backdoor,
that form of encryption is rendered useless. If a backdoor exists, it will be exploited by criminals. This is not a
supposition, but a certainty. It's not an American judge that we're worried about. It's the criminals looking for
exploits. We use strong encryption every single day. We use it on our banking sites, shopping sites, and social
media sites. We protect our credit card information with encryption. We encrypt our databases containing sensitive
information (or at least we should). Our economy relies on strong encryption to move money around in industries
large and small. Many high-visibility sites, such as Twitter, Google, Reddit, and YouTube, default to SSL/TLS
encryption now. When there were bugs in the libraries that support this type of encryption, the IT world moved
heaven and earth to patch them and eliminate the vulnerability. Security pros were sweating bullets for the hours,
days, and in some cases weeks between the hour Heartbleed was revealed and the hour they could finally get their
systems patched -- and now politicians with no grasp of the ramifications want to introduce a fixed vulnerability into
these frameworks. They are threatening the very foundations of not only Internet commerce, but the health and
security of the global economy. Put simply, if backdoors are required in encryption methods, the Internet would
essentially be destroyed, and billions of people would be put at risk for identity theft, bank and credit card fraud,
and any number of other horrible outcomes. Those of us who know how the security sausage is made are appalled
that this is a point of discussion at any level, much less nationally on two continents. It's abhorrent to consider. The
general idea coming from these camps is that terrorists use encryption to communicate. Thus, if there are
backdoors, then law enforcement can eavesdrop on those communications. Leaving aside the massive
vulnerabilities that would be introduced on everyone else, it's clear that the terrorists could very easily modify their
communications to evade those types of encryption or set up alternative communication methods. We would be
creating holes in the protection used for trillions of transactions, all for naught. Citizens of a city do not give the
police the keys to their houses. We do not register our bank account passwords with the FBI. We do not knowingly
or specifically allow law enforcement to listen and record our phone calls and Internet communications (though that
hasn't seemed to matter). We should definitely not crack the foundation of secure Internet communications with a backdoor that will only be exploited by criminals or the very terrorists that we're supposedly trying to thwart.
Remember, if the government can lose an enormous cache of extraordinarily sensitive, deeply personal information
on millions of its own employees, one can only wonder what horrors would be visited upon us if it somehow
succeeded in destroying encryption as well.

2NC Backdoors Inev


Other countries will inevitably build backdoors - their evidence
Dimitri, Data Journalist at the Correspondent (Netherlands) Think piece:
How to protect privacy and security? Global Conference on CyberSpace
2015 16 - 17 April 2015 The Hague, The Netherlands
https://www.gccs2015.com/sites/default/files/documents/How%20to
%20protect%20privacy%20and%20security%20in%20the%20crypto
%20wars.pdf
Unsound economics The second argument is one of economics. Backdoors can stifle innovation. Even until very
recently, communications were a matter for a few big companies, often state-owned. The architecture of their
systems changed slowly, so it was relatively cheap and easy to build a wiretapping facility into them. Today
thousands of start-ups handle communications in one form or another. And with each new feature these companies
provide, the architecture of the systems changes. It would be a big burden for these companies if they had to
ensure that governments can always intercept and decrypt their traffic. Backdoors require centralised information
flows, but the most exciting innovations are moving in the opposite direction, i.e. towards decentralised services.
More and more web services are using peer-to-peer technology through which computers talk directly to one
another, without a central point of control. File storage services as well as payment processing and communications
services are now being built in this decentralised fashion. It's extremely difficult to wiretap these services. And if you were to force companies to make such wiretapping possible, it would become impossible for these services to continue to exist. A government that imposes backdoors on its tech companies also risks harming their export opportunities. For instance, Huawei -- the Chinese manufacturer of phones, routers and other network equipment -- is unable to gain market access in the US because of fears of Chinese backdoors built into its hardware. US companies, especially cloud storage providers, have lost overseas customers due to fears that the NSA or other agencies could access client data. Unilateral demands for backdoors could put companies in a tight spot. Or, as researcher Julian Sanchez of the libertarian Cato Institute says: "An iPhone that Apple can't unlock when American cops come knocking for good reasons is also an iPhone they can't unlock when the Chinese government comes knocking for bad ones."

1NC Cyber Inev


Cybersecurity vulnerabilities are inevitable
Corn 7/13
(Corn, Geoffrey S. * Presidential Research Professor of Law, South Texas College of Law; Lieutenant
Colonel (Retired), U.S. Army Judge Advocate Generals Corps. Prior to joining the faculty at South
Texas, Professor Corn served in a variety of military assignments, including as the Armys Senior Law
of War Advisor, Supervisory Defense Counsel for the Western United States, Chief of International Law
for U.S. Army Europe, and as a Tactical Intelligence Officer in Panama. Averting the Inherent Dangers
of 'Going Dark': Why Congress Must Require a Locked Front Door to Encrypted Data, SSRN. 07-132015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes//ghs-kw)
Like CALEA, a statutory obligation along the lines proposed herein will inevitably trigger criticisms and generate concerns. One obvious criticism is that the creation of an escrow key or the maintenance of a duplicate key by a manufacturer would introduce an unacceptable risk of compromise for the device. This argument presupposes that the risk is significant, that the costs of its exploitation are large, and that the benefit is not worth the risk. Yet manufacturers, product developers, service providers and users constantly introduce such risks. Nearly every feature or bit of code added to a device introduces a risk, some greater than others. The vulnerabilities that have been introduced to computers by software such as Flash, ActiveX controls, Java, and web browsers are well documented.51 The ubiquitous SQL database, while extremely effective at helping web designers create effective data driven websites, is notorious for its vulnerability to SQL injection attacks.52 The adding of microphones to electronic devices opened the door to aural interceptions. Similarly, the introduction of cameras has resulted in unauthorized video surveillance of users. Consumers accept all of these risks, however, since we, as individual users and as a society, have concluded that they are worth the cost. Some will inevitably argue that no new possible vulnerabilities should be introduced into devices to allow the government to execute reasonable, and therefore lawful, searches for unique and otherwise unavailable evidence. However, this argument implicitly asserts that there is no, or insignificant, value to society of such a feature. And herein lies the Achilles heel to opponents of mandated front-door access: the conclusion is entirely at odds with the inherent balance between individual liberty and collective security central to the Fourth Amendment itself. Nor should lawmakers be deluded into believing that the currently existing vulnerabilities that we live with on a daily basis are less significant in scope than the possibility of obtaining complete access to the encrypted contents of a device. Various malware variants that are so widespread as to be almost omnipresent in our online community achieve just such access through what would seem like minor cracks in the defense of systems.53 One example is the Zeus malware strain, which has been tied to the unlawful online theft of hundreds of millions of dollars from U.S. companies and citizens and gives its operator complete access to and control over any computer it infects.54 It can be installed on a machine through the simple mistake of viewing an infected website or email, or clicking on an otherwise innocuous link.55 The malware is designed to not only bypass malware detection software, but to deactivate the software's ability to detect it.56 Zeus and the many other variants of malware that are freely available to purchasers on dark-net websites and forums are responsible for the theft of funds from countless online bank accounts (the credentials having been stolen by the malware's key-logger features), the theft of credit card information, and innumerable personal identifiers.57

2NC Cyber Inev


Security issues are inevitable
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-andgoing-dark-part-ii-debate-merits//ghs-kw)
On Thursday, I described the surprisingly warm reception FBI Director James Comey got in the Senate this week
with his warning that the FBI was "going dark" because of end-to-end encryption . In this post,
I want to take on the merits of the renewed encryption debate, which seem to me complicated and multi-faceted

and not all pushing in the same direction. Let me start by breaking the encryption debate into two distinct sets of questions: One is the conceptual question of whether a world of end-to-end strong encryption is an attractive idea. The other is whether -- assuming it is not an attractive idea and that one wants to ensure that authorities retain the ability to intercept decrypted signal -- an extraordinary access scheme is technically possible without eroding other essential security and privacy objectives. These questions often get mashed together, both because tech companies are keen to market themselves as the defenders of their users' privacy interests and because of the libertarian ethos of the tech community more generally. But the questions are not the same, and it's worth considering them separately. Consider the conceptual question first. Would it be a good idea to have a world-wide communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all attackers? That is, if we could snap our fingers and make all

device-to-device communications perfectly secure against interception from the Chinese, from hackers, from the
FSB but also from the FBI even wielding lawful process, would that be desirable? Or, in the alternative, do we want
to create an internet as secure as possible from everyone except government investigators exercising their legal
authorities with the understanding that other countries may do the same? Conceptually speaking, I am with Comey on this question -- and the matter does not seem to me an especially close call. The belief in principle in creating a giant world-wide network on which surveillance is technically impossible is really an argument for the creation of the world's largest ungoverned space. I understand why techno-anarchists find this idea so appealing. I can't imagine for a moment, however, why anyone else would. Consider the comparable argument in physical space: the creation of a city in which authorities are entirely dependent on citizen reporting of bad conduct but have no direct visibility onto what happens on the streets and no ability to conduct search warrants (even with court orders) or to patrol parks or street corners. Would you want to live in that city? The idea that ungoverned spaces really suck is not controversial when you're talking about Yemen or Somalia. I see nothing more attractive about the creation of a worldwide architecture in which it is technically impossible to intercept and read ISIS communications with followers or to follow child predators into chatrooms where they go after kids. The trouble is that this conceptual position does not answer the entirety of the policy

question before us. The reason is that the case against preserving some form of law enforcement access to decrypted signal is not only a conceptual embrace of the technological obsolescence of surveillance. It is also a series of arguments about the costs -- including the security costs -- of maintaining the capacity to decrypt captured signal. Consider the report issued this past week by a group of

computer security experts (including Lawfare contributing editors Bruce Schneier and Susan Landau), entitled "Keys
Under Doormats: Mandating Insecurity By Requiring Government Access to All Data and Communications." The
report does not make an in-principle argument or a conceptual argument against extraordinary access. It argues,
rather, that the effort to build such a system risks eroding cybersecurity in ways far more important than the
problems it would solve. The authors, to summarize, make three claims in support of the broad claim that any exceptional access system would "pose . . . grave security risks [and] imperil innovation." What are those "grave security risks"? "[P]roviding exceptional access to communications would force a U-turn from the best practices now being deployed to make the Internet more secure. These practices include forward secrecy where decryption keys are deleted immediately after use, so that stealing the encryption key used by a communications server would not compromise earlier or later communications. A related technique, authenticated encryption, uses the same temporary key to guarantee confidentiality and to verify that the message has not been forged or tampered with." "[B]uilding in exceptional access would substantially increase system complexity" and "complexity is the enemy of security." Adding code to systems increases that system's attack surface, and a certain number of additional vulnerabilities come with every marginal increase in system complexity. So by requiring a potentially complicated new system to be developed and implemented, we'd be effectively guaranteeing more vulnerabilities for malicious actors to hit. "[E]xceptional access would create concentrated targets that could attract bad actors." If we require tech companies to retain some

means of accessing user communications, those keys have to stored somewhere, and that storage then becomes
an unusually high-stakes target for malicious attack. Their theft then compromises, as did the OPM hack, large
numbers of users. The strong implication of the report is that these issues are not resolvable,
though the report never quite says that. But at a minimum, the authors raise a series of important questions about
whether such a system would, in practice, create an insecure internet in general -- rather than one whose general security has the technical capacity to make security exceptions to comply with the law. There is some reason, in my view, to suspect that the picture may not be quite as stark as the computer scientists make it seem. After all, the big tech companies increase the complexity of their software products all the time, and they generally regard the increased attack surface of the software they create as a result as a mitigatable problem. Similarly, there are lots of high-value intelligence targets that we have to secure and would have big security implications if we could not do so successfully. And when it really counts,

that task is not hopeless. Google and Apple and Facebook are not without tools in the cybersecurity department.

The real question, in my view, is whether a system of the sort Comey imagines could be built in
fashion in which the security gain it would provide would exceed the heightened
security risks the extraordinary access would involve. As Herb Lin puts it in his excellent, and
admirably brief, Senate testimony the other day, this is ultimately a question without an answer in the absence of a
lot of new research. "One side says [the] access [Comey is seeking] inevitably weakens the security of a system and
will eventually be compromised by a bad guy; the other side says it doesn't weaken security and won't be
compromised. Neither side can prove its case, and we see a theological clash of absolutes." Only when someone
actually does the research and development and tries actually to produce a system that meets Comey's criteria are
we going to find out whether it's doable or not. And therein lies the rub, and the real meat of the policy problem, in
my view: Who's going to do this research? Who's going to conduct the sustained investment in trying to imagine a
system that secures communications except from government when and only government has a warrant to
intercept those communications? The assumption of the computer scientists in their report is that the burden of
that research lies with the government. "Absent a concrete technical proposal," they write, "and without answers to
the questions raised in this report, legislators should reject out of hand any proposal to return to the failed
cryptography control policy of the 1990s." Indeed, their most central recommendation is that the burden of
development is on Comey. "Our strong recommendation is that anyone proposing regulations should first present
concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses
and for hidden costs." In his testimony, Herb supports this call, though he acknowledges that it is not the inevitable
route: the government has not yet provided any specifics, arguing that private vendors should do it. At the same
time, the vendors won't do it, because [their] customers aren't demanding such features. Indeed, many customers
would see such features as a reason to avoid a given vendor. Without specifics, there will be no progress. I believe
the government is afraid that any specific proposal will be subject to enormous criticism -- and that's true -- but the
government is the party that wants . . . access, and rather than running away from such criticism, it should
embrace any resulting criticism as an opportunity to improve upon its initial designs." Herb might also have
mentioned that lots of people in the academic tech community who would be natural candidates to help develop
such an access system are much more interested in developing encryption systems to keep the feds out than to
-- under any circumstances -- let them in. The tech community has spent a lot more time and energy arguing against the plausibility and desirability of implementing what Comey is seeking than it has spent in trying to develop
systems that deliver it while mitigating the risks such a system might pose. For both industry and the tech
communities, more broadly, this is government's problem, not their problem. Yet reviving the Clipper Chip model
in which government develops a fully-formed system and then puts it out publicly for the community to shoot down
is clearly not what Comey has in mind. He is talking in very different language: the language of performance
requirements. He wants to leave the development task to Silicon Valley to figure out how to implement government's requirements. He wants to describe what he needs -- decrypted signal when he has a warrant -- and leave the companies to figure out how to deliver it while still providing secure communications in other circumstances to their customers. The advantage to this approach is that it potentially lets a thousand flowers bloom. Each company might do it differently. They would compete to provide the most security consistent with the performance standard. They could learn from each other. And government would not be in the position of developing and promoting specific algorithms. It wouldn't even need to know how the task was being done.

1NC Alt Cause


Their evidence concedes there are alt causes to a decline in
democracy, specifically the strengthening of non-democratic
nations, which the US cannot reverse
Chenoweth & Stephan 2015
Erica Chenoweth, political scientist at the University of Denver, & Maria J.
Stephan, Senior Policy Fellow at the U.S. Institute of Peace, Senior Fellow at
the Atlantic Council. 7-7-2015, "How Can States and Non-State Actors
Respond to Authoritarian Resurgence?," Political Violence @ a Glance,
http://politicalviolenceataglance.org/2015/07/07/how-can-states-and-nonstate-actors-respond-to-authoritarian-resurgence/
Chenoweth: Why is authoritarianism making a comeback? Stephan: There's obviously no single answer to this. But part of the answer is that democracy is losing its allure in parts of the world. When people don't see the economic and governance benefits of democratic transitions, they lose hope. Then there's the compelling "stability first" argument. Regimes around the world, including China and Russia, have readily cited the chaos of the Arab Spring to justify heavy-handed policies and consolidating their grip on power. The color revolutions that toppled autocratic regimes in Serbia, Georgia, and Ukraine inspired similar dictatorial retrenchment. There is nothing new about authoritarian regimes

adapting to changing circumstances. Their resilience is reinforced by a combination of violent and non-coercive
measures. But authoritarian paranoia seems to have grown more piqued over the past decade. Regimes have
figured out that people power endangers their grip on power and they are cracking down. There's no better evidence of the effectiveness of civil resistance than the measures that governments take to suppress it -- something you detail in your chapter from my new book. Finally, and importantly, democracy in this country and
elsewhere has taken a hit lately. Authoritarian regimes mockingly cite images of torture, mass surveillance, and the
catering to the radical fringes happening in the US political system to refute pressures to democratize themselves.
The financial crisis here and in Europe did not inspire much confidence in democracy and we are seeing political
extremism on the rise in places like Greece and Hungary. Here in the US we need to get our own house in order if
we hope to inspire confidence in democracy abroad.

Alt cause: Economic development key to democracy promotion


Drake et al 00 (William J. Drake was a Senior Associate and the Director of
the Project on the Information Revolution and World Politics at the Carnegie
Endowment for International Peace. Shanthi Kalathil specializes in the
political impact of information and communication technology (ICT). Her
research focuses on the impact of ICT in authoritarian regimes, the global
digital divide, and security issues in the information age. Taylor Boas is a
Project Associate with the Project on the Information Revolution and World
Politics. Dictatorships in the Digital Age: Some Considerations on the
Internet in China and Cuba
<http://carnegieendowment.org/2000/10/23/dictatorships-in-digital-agesome-considerations-on-internet-in-china-and-cuba/4e9e>) NM
The Economy. Economic development and the growth of a middle class may be

important contributors to democratization. Internet-based electronic commerce is


set to boom in parts of the developing world (most notably Asia and Latin America)
and will provide many new opportunities for individual entrepreneurs, small
businesses, larger internationally-oriented companies, and consumers. The

resulting invigoration of national economies could help to foster pro-democracy attitudes, e.g., by increasing demands for transparency,

accountability, and "good government" and an end to "crony capitalist"


practices that are out of synch with the ethos of the global Internet economy.

Alternatively, in some cases even Internet-oriented businesspeople and consumers


may prefer to go along with an undemocratic regime than to rock the boat. Hence, it
would be worth attempting to gauge the impact of Internet-based economic activity
on the broad tenor of national political cultures, as well as on the attitudes and
political demands of relevant individuals, firms, trade associations, etc.

1NC No Solvency
Aff insufficient - their author says more action than the plan is
necessary to solve internet freedom
Donahoe, 14,
Eileen Donahoe, director of global affairs at Human Rights Watch. Donahoe
previously served as the first US Ambassador to the United Nations Human
Rights Council, "Human Rights in the Digital Age", Just Security, 12-23-2014,
http://justsecurity.org/18651/human-rights-digital-age/
1. Create a Special Rapporteur Mandate on the Right to Privacy at the UN Human Rights Council

The first practical step to take in protecting human rights in the digital realm is to
generate global support for the creation of a special rapporteur (essentially an international human rights law
expert) for the right to privacy at the UN Human Rights Council in Geneva at its next session in March. The creation
of such a mandate would follow directly from an invitation in the UN General Assembly (UNGA) resolution on The
Right to Privacy in the Digital Age that passed by consensus on Dec. 18 in New York, under the leadership of Brazil
and Germany. The core idea is simple: when everything you say or do can be tracked and intercepted, it has a
chilling effect on what you feel free to say, where you feel free to go, and with whom you choose to meet. These
concerns go to the heart of the work of human rights activists and defenders around the world. The consensus
UNGA text expressed growing global concern about the human rights costs and consequences of unchecked mass
surveillance, including the erosion of fundamental freedoms of expression, assembly and association. The resolution
invited Human Rights Council members in Geneva to take up the challenge of protecting privacy in the digital
context by considering the creation of a special procedure mandate holder to address these global concerns.
Ideally, this mandate holder would be dedicated to fleshing out the implications of digital communications
technology for the right to privacy, and help articulate how to adhere to the rule of law and ensure protection of
human rights and fundamental freedoms in the digital environment. The international community must stand behind the creation of this urgently needed international mandate.

2. Contribute to Development of Multi-Stakeholder Internet Governance

A second practical step that can be taken to reinforce human rights promotion in the digital context would be to support further development of the multi-stakeholder approach to Internet governance that prevails today, rather than allow retrenchment toward a multilateral, state based Westphalian model of governance. The Internet itself has

in many ways been a boon to the exercise of rights, but also has contributed to the larger trend of distribution of
power away from governments to non-state actors. The Internet, which emerged through the collaboration of
technologists and various other stakeholders, operates through global, trans-boundary connectivity, and does not
depend on geographic borders. In effect, the Internet challenged the nation-state system that lies at the heart of
the UN structure, the so-called Westphalian model. While individuals have been empowered through global
connectivity and the free flow of information across borders, the territory-based nation-state system of governance
has been tested. In response, some governments, notably China, increasingly endorse a concept of Internet
sovereignty, whereby each national government has sovereign control over all aspects of Internet infrastructure,
data, content and governance within its borders. This approach would in effect be an effort to Westphalianize the
global Internet, and to resist the global trend toward a distributed, decentralized multi-stakeholder model of Internet
governance. To meet this challenge, the multi-stakeholder model for Internet governance must be protected and
strengthened. A basic concept underlying this model is that governments alone are not best positioned to make
technical or policy decisions about the Internet single-handedly. The Internet evolved through collaboration and
decision-making by many non-governmental actors, and the functionality of the open interoperable Internet
depends on continued inclusion of many stakeholders in Internet governance processes, most notably
technologists. On the human rights policy front, civil society organizations dedicated to protection and promotion of
human rights are best placed, and must have a seat at the table alongside governments, technologists, the private
sector and others, in creating Internet governance mechanisms that prioritize global human rights in the digital realm.

3. Reinforce the Conceptualization of Human Rights Protection as a National Security Priority

Finally, we need to solidify the international understanding that protection of

human rights and adherence to the rule of law in the digital realm are essential to the protection of national and
global security, rather than antithetical to it. All too often in the post-Snowden context, national security interests
are presented in binary opposition to freedom and privacy consideration, as though there is only a zero-sum
relationship between human rights and national security. In reality, human rights protection has been an essential
pillar of the global security architecture since the founding of the United Nations immediately after World War II.
Recent failures to adequately protect human rights and adhere to the rule of law in the digital realm has been
deeply undermining of some crucial aspects of long-term national and global security. One of the most troubling
aspects of the mass surveillance programs disclosed by Edward Snowden was the extent to which digital security
for individual users, for data, and for networks, has been undermined in the name of protecting of national

security. This is both ironic and tragic, given that digital security is now at the heart of national security -- whether
protecting critical infrastructure, confidential information, or sensitive data. Practices, such as surreptitiously
tapping into networks, requiring back doors to encrypted services and weakening global encryption standards will
directly undermine national and global security, as well as human rights. Meanwhile targeted malware and crafted
digital attacks on human rights activists have become the modus operandi of repressive governments motivated to
undermine human rights work. Civil society actors increasingly face an onslaught of persistent computer espionage
attacks from governments and other political actors like cyber militias, just as businesses and governments do. So
while our notions of privacy are evolving along with social media and data-capturing technology, we also need to
recognize that it's not just privacy that is affected by the digitization of everything. The exercise of all
fundamental freedoms is undermined when governments utilize new capacities that flow from digitization without
regard for human rights. Furthermore, by engaging in tactics that undermine digital security for individuals, for
networks and for data, governments trigger and further inspire a hackers race to the bottom. Practices that
undermine digital security will be learned and followed by other governments and non-state actors, and ultimately
undermine security for critical infrastructure, as well as individuals users everywhere. Strengthening digital security
for individual users, for data, for networks, and for critical infrastructure must be seen as the national and global
security priority that it is. Conclusion We are at a critical moment for protection of human rights in the digital
context. All global players whose actions impact the enjoyment of human rights, especially governments who claim
to be champions of human rights, must lead in the reaffirmation of the international human rights framework as a
central pillar for security, development and freedom in the 21st century digital environment.

1NC No Authoritarianism Solvency


Internet freedom is just as likely to be used to crush dissent
Siegel 11 (Lee Siegel, a columnist and editor at large for The New York
Observer, is the author of Against the Machine: How the Web Is Reshaping
Culture and Commerce and Why It Matters. The Net Delusion and the
Egypt Crisis, February 4, 2011,
http://artsbeat.blogs.nytimes.com/2011/02/04/the-net-delusion-and-theegypt-crisis)
Morozov takes the ideas of what he calls cyber-utopians and shows how
reality perverts them in one political situation after another. In Iran, the
regime used the internet to crush the internet-driven protests in June 2009.
In Russia, neofascists use the internet to organize pogroms. And on and on.
Morozov has written hundreds of pages to make the point that technology is
amoral and cuts many different ways. Just as radio can bolster democracy or
as in Rwanda incite genocide, so the internet can help foment a
revolution but can also help crush it. This seems obvious, yet it has often
been entirely lost as grand claims are made for the internet's positive,
liberating qualities. And suddenly here are Tunisia and, even more
dramatically, Egypt, simultaneously proving and refuting Morozov's
argument. In both cases, social networking allowed truths that had been
whispered to be widely broadcast and commented upon. In Tunisia and Egypt
and now across the Arab world Facebook and Twitter have made people
feel less alone in their rage at the governments that stifle their lives. There is
nothing more politically emboldening than to feel, all at once, that what you
have experienced as personal bitterness is actually an objective condition, a
universal affliction in your society that therefore can be universally opposed.
Yet at the same time, the Egyptian government shut off the internet, which
is an effective way of using the internet. And according to one Egyptian
blogger, misinformation is being spread through Facebook as it was in Iran
just as real information was shared by anti-government protesters. This is
the dark side of internet freedom that Morozov is warning against. It is
the freedom to wantonly crush the forces of freedom. All this should not
surprise anyone. It seems that, just as with every other type of technology of
communication, the internet is not a solution to human conflict but an
amplifier for all aspects of a conflict. As you read about pro-government
agitators charging into crowds of protesters on horseback and camel, you
realize that nothing has changed in our new internet age. The human
situation is the same as it always was, except that it is the same in a newer
and more intense way. Decades from now, we will no doubt be celebrating a
spanking new technology that promises to liberate us from the internet. And
the argument joined by Morozov will occur once again.

2NC No Authoritarianism Solvency


Mobilization and Internet access are not correlated - other factors are more important
Kuebler 11 (Johanne Kuebler, contributor to the CyberOrient journal, Vol. 5,
Iss. 1, 2011, Overcoming the Digital Divide: The Internet and Political
Mobilization in Egypt and Tunisia, http://www.cyberorient.net/article.do?
articleId=6212)
The assumption that the uncensored accessibility of the Internet encourages
the struggle for democracy has to be differentiated. At first sight, the case
studies seem to confirm the statement, since Egypt, featuring a usually
uncensored access to the Internet, has witnessed mass mobilisations
organised over the Internet while Tunisia had not. However, the mere
availability of freely accessible Internet is not a sufficient condition insofar as
mobilisations in Egypt took place when a relative small portion of the
population had Internet access and, on the other hand, mobilisation
witnessed a decline between 2005 and 2008 although the number of Internet
users rose during the same period. As there is no direct correlation
between increased Internet use and political action organised through
this medium, we have to assume a more complex relationship. A successful
social movement seems to need more than a virtual space of debate to be
successful, although such a space can be an important complementary
factor in opening windows and expanding the realm of what can be said in
public. A political movement revolves around a core of key actors, and
"netizens" qualify for this task. The Internet also features a variety of tools
that facilitate the organisation of events. However, to be successful, social
movements need more than a well-organised campaign. In Egypt, we
witnessed an important interaction between print and online media, between
the representatives of a relative elitist medium and the traditional, more
accessible print media. A social movement needs to provide frames
resonating with grievances of the public coupled with periods of increased
public attention to politics in order to create opportunity structures. To
further transport their message and to attract supporters, a reflection of the
struggle of the movement with the government in the "classical" media such
as newspapers and television channels is necessary to give the movement
momentum outside the Internet context.

1NC No I/L
No evidence that the internet actually spurs democratization
Aday et al. 10 (Sean Aday is an associate professor of media and public
affairs and international affairs at The George Washington University, and
director of the Institute for Public Diplomacy and Global Communication.
Henry Farrell is an associate professor of political science at The George
Washington University. Marc Lynch is an associate professor of political
science and international affairs at The George Washington University and
director of the Institute for Middle East Studies. John Sides is an assistant
professor of political science at The George Washington University. John Kelly
is the founder and lead scientist at Morningside Analytics and an affiliate of
the Berkman Center for Internet and Society at Harvard University. Ethan
Zuckerman is senior researcher at the Berkman Center for Internet and
Society at Harvard University and also part of the team building Global
Voices, a group of international bloggers bridging cultural and linguistic
differences through weblogs. August 2010, BLOGS AND BULLETS: new
media in contentious politics, http://www.usip.org/files/resources/pw65.pdf)
New media, such as blogs, Twitter, Facebook, and YouTube, have played a
major role in episodes of contentious political action. They are often
described as important tools for activists seeking to replace authoritarian
regimes and to promote freedom and democracy, and they have been
lauded for their democratizing potential. Despite the prominence of Twitter
revolutions, color revolutions, and the like in public debate, policymakers
and scholars know very little about whether and how new media affect
contentious politics. Journalistic accounts are inevitably based on
anecdotes rather than rigorously designed research. Although data on
new media have been sketchy, new tools are emerging that measure linkage
patterns and content as well as track memes across media outlets and thus
might offer fresh insights into new media. The impact of new media can be
better understood through a framework that considers five levels of analysis:
individual transformation, intergroup relations, collective action, regime
policies, and external attention. New media have the potential to change
how citizens think or act, mitigate or exacerbate group conflict, facilitate
collective action, spur a backlash among regimes, and garner international
attention toward a given country. Evidence from the protests after the
Iranian presidential election in June 2009 suggests the utility of examining
the role of new media at each of these five levels. Although there is reason to
believe the Iranian case exposes the potential benefits of new media, other
evidence -- such as the Iranian regime's use of the same social network tools
to harass, identify, and imprison protesters -- suggests that, like any media,
the Internet is not a magic bullet. At best, it may be a rusty
bullet. Indeed, it is plausible that traditional media sources were equally if
not more important. Scholars and policymakers should adopt a more
nuanced view of new media's role in democratization and social change, one
that recognizes that new media can have both positive and negative effects.

Introduction In January 2010, U.S. Secretary of State Hillary Clinton


articulated a powerful vision of the Internet as promoting freedom and global
political transformation and rewriting the rules of political engagement and
action. Her vision resembles that of others who argue that new media
technologies facilitate participatory politics and mass mobilization, help
promote democracy and free markets, and create new kinds of global
citizens. Some observers have even suggested that Twitter's creators should
receive the Nobel Peace Prize for their role in the 2009 Iranian protests.1 But
not everyone has such sanguine views. Clinton herself was careful to note
when sharing her vision that new media were not an unmitigated blessing.
Pessimists argue that these technologies may actually exacerbate conflict,
as exemplified in Kenya, the Czech Republic, and Uganda, and help
authoritarian regimes monitor and police their citizens. 2 They argue that
new media encourage self-segregation and polarization as people seek out
only information that reinforces their prior beliefs, offering ever more
opportunities for the spread of hate, misinformation, and prejudice.3 Some
skeptics question whether new media have significant effects at all. Perhaps
they are simply a tool used by those who would protest in any event or a
trendy hook for those seeking to tell political stories. Do new media have
real consequences for contentious politics -- and in which direction?4 The
sobering answer is that, fundamentally, no one knows. To this point, little
research has sought to estimate the causal effects of new media in a
methodologically rigorous fashion, or to gather the rich data needed to
establish causal influence. Without rigorous research designs or rich
data, partisans of all viewpoints turn to anecdotal evidence and
intuition

1NC Collapse Inev


Can't solve - US allies destroy i-freedom signal
Hanson 10/25/12, Nonresident Fellow, Foreign Policy, Brookings
http://www.brookings.edu/research/reports/2012/10/25-ediplomacy-hansoninternet-freedom
Another challenge is dealing with close partners and allies who undermine
internet freedom. In August 2011, in the midst of the Arab uprisings, the UK
experienced a different connection technology infused movement, the
London Riots. On August 11, in the heat of the crisis, Prime Minister Cameron
told the House of Commons: "Free flow of information can be used for good. But it can also be used for ill. So we are working with the police, the intelligence services and industry to look at whether it would be right to stop people communicating via these websites and services when we know they are plotting violence, disorder and criminality." This policy had far-reaching implications. As recently as January 2011, then President of Egypt, Hosni Mubarak, ordered the shut-down of Egypt's largest ISPs and the cell phone network, a move the United States had heavily criticized. Now the UK was contemplating the same move and threatening to create a rationale for authoritarian governments everywhere to shut down communications networks when they threatened "violence, disorder and criminality." Other allies like Australia are also pursuing restrictive internet policies. As OpenNet reported it: "Australia maintains some of the most restrictive Internet policies of any Western country." When these allies pursue policies so clearly at
odds with the U.S. internet freedom agenda, several difficulties arise. It
undermines the U.S. position that an open and free internet is something
free societies naturally want. It also gives repressive authoritarian
governments an excuse for their own monitoring and filtering activities. To an
extent, U.S. internet freedom policy responds even-handedly to this
challenge because the vast bulk of its grants are for open source
circumvention tools that can be just as readily used by someone in London
as Beijing, but so far, the United States has been much more discreet about
criticising the restrictive policies of allies than authoritarian states.

2NC Collapse Inev

Collapse of Internet freedom inevitable
VARA 14 [Vauhini Vara, the former business editor of newyorker.com, lives in San Francisco and is a business and technology correspondent for the New Yorker. "The World Cracks Down on the Internet," 12-4-14, http://www.newyorker.com/tech/elements/world-cracks-internet, msm]
In September of last year, Chinese authorities announced an unorthodox standard to help them decide whether to punish people for posting online comments that are "false, defamatory, or otherwise harmful": Was a message popular enough to attract five hundred reposts or five thousand views? It was a striking example of how sophisticated the Chinese government has become, in recent years, in restricting Internet communication, going well beyond crude measures like restricting access to particular Web sites or censoring online comments that use certain keywords. Madeline Earp, a research analyst at Freedom House, the Washington-based nongovernmental organization, suggested a phrase to describe the approach: "strategic, timely censorship." She told me, "It's about allowing a surprising amount of open discussion, as long as you're not the kind of person who can really use that discussion to organize people." On Thursday, Freedom House published its fifth annual report on Internet freedom around the world. As in years past, China is again near the bottom of the rankings, which include sixty-five countries. Only Syria and Iran got worse scores, while Iceland and Estonia fared the best. (The report was funded partly by the Dutch Ministry of Foreign Affairs, the United States Department of State, Google, and Yahoo, but Freedom House described the report as its sole responsibility and said that it doesn't necessarily represent its funders' views.) China's place in the rankings won't come as a surprise to many people. The notable part is that the report suggests that, when it comes to Internet freedom, the rest of the world is gradually becoming more like China and less like Iceland. The researchers found that Internet freedom declined in thirty-six of the sixty-five countries they studied, continuing a trajectory they have noticed since they began publishing the reports in 2010. Earp, who wrote the China section, said that authoritarian regimes might even be explicitly looking at China as a model in policing Internet communication. (Last year, she co-authored a report on the topic for the Committee to Protect Journalists.) China isn't alone in its influence, of course. The report's authors even said that some countries are using the U.S. National Security Agency's widespread surveillance, which came to light following disclosures by the whistle-blower Edward Snowden, as an excuse to augment their own monitoring capabilities. Often, the surveillance comes with little or no oversight, they said, and is directed at human-rights activists and political opponents. China, the U.S., and their copycats aren't the only offenders, of course. In fact, interestingly, the United States was the sixth-best country for Internet freedom, after Germany, though this may say as much about the poor state of Web freedom in other places as it does about protections for U.S. Internet users. Among the other countries, this was a particularly bad year for Russia and Turkey, which registered the sharpest declines in Internet freedom from the previous year. In Turkey, over the past several years, the government has increased censorship, targeted online journalists and social-media users for assault and prosecution, allowed state agencies to block content, and charged more people for expressing themselves online, the report noted, not to mention temporarily shutting down access to YouTube and Twitter. As Jenna Krajeski wrote in a post about Turkey's Twitter ban, Prime Minister Recep Tayyip Erdogan vowed in March, "We'll eradicate Twitter. I don't care what the international community says. They will see the power of the Turkish Republic." A month later, Russian President Vladimir Putin, not to be outdone by Erdogan, famously called the Internet a "C.I.A. project," as Masha Lipman wrote in a post about Russia's recent Internet controls. Since Putin took office again in 2012, the report found, the government has enacted laws to block online content, prosecuted people for their Internet activity, and surveilled information and communication technologies. Among changes in other countries, the report said that the governments of Uzbekistan and Nigeria had passed laws requiring cybercafes to keep logs of their customers, and that the Vietnamese government began requiring international Internet companies to keep at least one server in Vietnam. What's behind the decline in Internet freedom throughout the world? There could be several reasons for it, but the most obvious one is also somewhat mundane: especially in countries where people are just beginning to go online in large numbers, governments that restrict freedom offline, particularly authoritarian regimes, are only beginning to do the same online, too. What's more, governments that had been using strategies like blocking certain Web sites to try to control the Internet are now realizing that those approaches don't actually do much to keep their citizens from seeing content that the governments would prefer to keep hidden. So they're turning to their legal systems, enacting new laws that restrict how people can use the Internet and other technologies. "There is definitely a sense that the Internet offered this real alternative to traditional media, and then government started playing catch-up a little bit," Earp told me. "If a regime has developed laws and practices over time that limit what the traditional media can do, there's that moment of recognition: How can we apply what we learned in the traditional-media world online?" There were a couple of hopeful signs for Internet activists during the year. India, where authorities relaxed restrictions that had been imposed in 2013 to help quell rioting, saw the biggest improvement in its Internet-freedom score. Brazil, too, notched a big gain after lawmakers approved a bill known as the Marco Civil da Internet, which protects net neutrality and online privacy. But, despite those developments, the report's authors didn't seem particularly upbeat. "There might be some cautious optimism there, but I do not want to overstate that because, since we started tracking this, it's been a continuous decline, unfortunately," Sanja Kelly, the project director for the report, told me. Perhaps the surprising aspect of Freedom House's findings isn't that the Internet is becoming less free; it's that it has taken this long for it to happen.

Governments will inevitably oppose internet freedom – attempts to oppose it exacerbate the problem
Utah Post 1-3 ["2014 MARKED THE DECLINE IN INTERNET FREEDOM," 1-3-15, http://www.utahpeoplespost.com/2015/01/2014-marked-decline-internetfreedom/, msm]
Last year marked a decline in internet freedom in numerous countries, as indicated by a report released by the Freedom House. The study analyzed 65 countries in terms of user access to internet and laws governing the World Wide Web. The report shows that web freedom has corroded for the fourth back to back year. The document highlights administrative endeavors to ban applications and tech advances by putting cutoff points on content, sites filters and infringement of clients' rights by peeping in their online log. The report also warns that 2015's dares in terms of web freedom will increase as Russia and Turkey plan to increase controls on foreign-based internet organizations. Many countries already put major American internet businesses into odd circumstances. Among them: Twitter, Facebook and Google, who were challenged by problematic regulations. Overlooking these laws has led to their services being hindered. For instance, Google's engineers retreated from Russia while China blocked Gmail, after the company refused to give the national governments access to its servers. This Wednesday, Vladimir Putin, Russian President, approved the law obliging organizations to store Russian clients' information on servers located on Russian grounds. But only a few countries approve of this new legislation. As a result it is expected that the law will spur some international debates not long from now. Most of tech experts believe that pieces of legislation and other state measures will not be able to actually stop information from rolling on the internet. For instance, a year ago Russian powers asked Facebook to shut down a page setup against the government, advancing anti-government protests. Despite the fact that Facebook consented to the request and erased the page, which had 10 million supporters, different replica pages were immediately set up. The Turkish government was also slammed by internet power when it attempted to stop the spread of leaked documents on Twitter in March. Recep Tayyip Erdogan's government at the time requested the shutdown of Twitter inside Turkey after the organization declined to erase the posts revealing information about government authorities accused of corruption. The result of the government action was that while Twitter was blocked, Turkish users started to evade the ban. Comparable demands were registered in nations like China, Pakistan, and so forth. According to a popular Russian blogger, Anton Nosik, governments are delusional to think they can remove an article or video footage from the web when materials can easily be duplicated and posted somewhere else. Most Internet users militate for a free and limitless system, where individuals are permitted to openly navigate whatever they want. Governments, on the other hand, are not really fans of this idea. Tech analysts say it is likely to see an increase in clashes between internet surfers and authorities in various countries throughout 2015.

As internet use increases, internet freedom will inevitably decrease – it's zero-sum
Kelly and Cook 11 [Sanja Kelly, managing editor, and Sarah Cook, assistant editor, at Freedom House produced "Freedom on the Net: A Global Assessment of Internet and Digital Media," a 2011 report. "Internet freedom," 4-17-11, http://www.sfgate.com/opinion/openforum/article/Internet-freedom-declining-as-use-grows2375021.php, msm]
Indeed, as more people use the Internet to freely communicate and obtain information, governments have ratcheted up efforts to control it. Today, more than 2 billion people have access to the Internet, a number that has more than doubled in the past five years. Deepening Internet penetration is particularly evident in the developing world, where declining subscription costs, government investments in infrastructure, and the rise of mobile technology has allowed the number of users to nearly triple since 2006. In order to better understand the diverse, rapidly evolving threats to Internet freedom, Freedom House, a Washington, D.C., NGO that conducts research on political freedom, has undertaken an analysis - the first of its kind - of the ways in which governments in 37 key countries create obstacles to Internet access, limit digital content and violate users' rights. What we found was that Internet freedom in a range of countries, both democratic and authoritarian, is declining. Emboldened governments and their sympathizers are increasingly using technical attacks to disrupt political activists' online networks, eavesdrop on their communications and debilitate their websites. Such attacks were reported in at least 12 countries, ranging from China to Russia, Tunisia to Burma, Iran to Vietnam. In Belarus, at the height of controversial elections, the authorities created mirror versions of opposition websites, diverting users to the new ones, where deliberately false information on the times and locations of protests were posted. In Tunisia, in the run-up to the January 2011 uprising that drove the regime from power, the authorities regularly broke into the e-mail, Facebook and blogging accounts of opposition and human rights activists, either deleting specific material or simply collecting intelligence about their plans. Governments around the world increasingly are establishing mechanisms to block what they deem to be undesirable information. In many cases, the restrictions apply to content involving illegal gambling, child pornography, copyright infringement or the incitement of hatred or violence. However, a large number of governments are also engaging in deliberate efforts to block access to information related to politics, social issues and human rights. In Thailand, tens of thousands of websites critical of the monarchy have been blocked. In China - in addition to blocking dissident websites - user discussions and blog postings revealing tainted-milk products, pollution or torture are deleted. Centralized government control over a country's connection to international Internet traffic also emerged as one significant threat to online free expression. In one-third of the states examined, authorities have exploited their control over infrastructure to limit access to politically and socially controversial content or, in extreme cases, cut off access to the Internet entirely, as Hosni Mubarak's government did in Egypt during the height of the protests there. Until recently, the conventional assumption has been that Internet freedom would inexorably improve, given the technology's diffuse and open structure. But this assumption was premature. Our findings should serve as an early warning sign to defenders of free expression.

1NC Squo Solves

Squo solves – their evidence concedes that we're already funding groups to fight for Internet freedom
Kehl, 2015
Danielle Kehl is a senior policy analyst at New America's Open Technology Institute, BA cum laude Yale. 6-17-2015, "Doomed To Repeat History? Lessons From The Crypto Wars Of The 1990s," New America, https://www.newamerica.org/oti/doomed-to-repeat-history-lessons-from-thecrypto-wars-of-the-1990s/
Strong encryption has become an integral tool in the protection of privacy and the promotion of free expression online. The end of the Crypto Wars ushered in an age where the security and privacy protections afforded by the use of strong encryption also help promote free expression. As the American Civil Liberties Union recently explained in a submission to the UN Human Rights Council, encryption and anonymity are the modern safeguards for free expression. Without them, online communications are effectively unprotected as they traverse the Internet, vulnerable to interception and review in bulk. Encryption makes mass surveillance significantly more costly.187 The human rights benefits of strong encryption have undoubtedly become more evident since the end of the Crypto Wars. Support for strong encryption has become an integral part of American foreign policy related to Internet freedom, and since 2010, the U.S. government has built up a successful policy and programming agenda based on promoting an open and free Internet.188 These efforts include providing over $120 million in funding for groups working to advance Internet freedom, much of which specifically funds circumvention tools that rely on strong encryption, which makes Internet censorship significantly harder, as part of the underlying technology.189 Similarly, a June 2015 report by David Kaye, the UN Special Rapporteur for Freedom of Expression and Opinion, found that "Encryption and anonymity provide individuals and groups with a zone of privacy online to hold opinions and exercise freedom of expression without arbitrary and unlawful interference or attacks."190 The report goes on to urge all states to protect and promote the use of strong encryption, and not to restrict it in any way. Over the past fifteen years, a virtuous cycle between strong encryption, economic growth, and support for free expression online has evolved. Some experts have dubbed this phenomenon "collateral freedom," which refers to the fact that, "When crucial business activity is inseparable from Internet freedom, the prospects for Internet freedom improve."191 Free expression and support for human rights have certainly benefited from the rapid expansion of encryption in the past two decades.

1NC No Impact
No democracy impact.
Rosato, 03 – Sebastian, Ph.D. candidate, Political Science Department, UChicago, American Political Science Review, November, http://journals.cambridge.org/download.php?file=%2FPSR%2FPSR97_04%2FS0003055403000893a.pdf&code=97d5513385df289000828a47df480146, "The Flawed Logic of Democratic Peace Theory," ADM

Democratic peace theory is probably the most powerful liberal contribution to the debate on the causes of war and peace. In this paper I examine the causal logics that underpin the theory to determine whether they offer compelling explanations for the finding of mutual democratic pacifism. I find that they do not. Democracies do not reliably externalize their domestic norms of conflict resolution and do not trust or respect one another when their interests clash. Moreover, elected leaders are not especially accountable to peace loving publics or pacific interest groups, democracies are not particularly slow to mobilize or incapable of surprise attack, and open political competition does not guarantee that a democracy will reveal private information about its level of resolve thereby avoiding conflict. Since the evidence suggests that the logics do not operate as stipulated by the theory's proponents, there are good reasons to believe that while there is certainly peace among democracies, it may not be caused by the democratic nature of those states. Democratic peace theory, the claim that democracies rarely fight one another because they share common norms of live-and-let-live and domestic institutions that constrain the recourse to war, is probably the most powerful liberal contribution to the debate on the causes of war and peace.1 If the theory is correct, it has important implications for both the study and the practice of international politics. Within the academy it undermines both the realist claim that states are condemned to exist in a constant state of security competition and its assertion that the structure of the international system, rather than state type, should be central to our understanding of state behavior. In practical terms democratic peace theory provides the intellectual justification for the belief that spreading democracy abroad will perform the dual task of enhancing American national security and promoting world peace. In this article I offer an assessment of democratic peace theory. Specifically, I examine the causal logics that underpin the theory to determine whether they offer compelling explanations for why democracies do not fight one another. A theory is comprised of a hypothesis stipulating an association between an independent and a dependent variable and a causal logic that explains the connection between those two variables. To test a theory fully, we should determine whether there is support for the hypothesis, that is, whether there is a correlation between the independent and the dependent variables and whether there is a causal relationship between them.2 An evaluation of democratic peace theory, then, rests on answering two questions. First, do the data support the claim that democracies rarely fight each other? Second, is there a compelling explanation for why this should be the case? Democratic peace theorists have discovered a powerful empirical generalization: Democracies rarely go to war or engage in militarized disputes with one another. Although there have been several attempts to challenge these findings (e.g., Farber and Gowa 1997; Layne 1994; Spiro 1994), the correlations remain robust (e.g., Maoz 1998; Oneal and Russett 1999; Ray 1995; Russett 1993; Weart 1998). Nevertheless, some scholars argue that while there is certainly peace among democracies, it may be caused by factors other than the democratic nature of those states (Farber and Gowa 1997; Gartzke 1998; Layne 1994). Farber and Gowa (1997), for example, suggest that the Cold War largely explains the democratic peace finding. In essence, they are raising doubts about whether there is a convincing causal logic that explains how democracies interact with each other in ways that lead to peace. To resolve this debate, we must take the next step in the testing process: determining the persuasiveness of the various causal logics offered by democratic peace theorists.

1NC No Dem Peace Theory

Democracy doesn't solve war – their ev is based on flawed studies
Henderson 2
(Errol Henderson, Assistant Professor, Dept. of Political Science at the University of Florida, 2002, Democracy and War: The End of an Illusion?)

The replication and extension of Oneal and Russett (1997), which is one of the most important studies on the DPP, showed that democracies are not significantly less likely to fight each other. The results demonstrate that Oneal and Russett's (1997) findings in support of the DPP are not robust and that joint democracy does not reduce the probability of international conflict for pairs of states during the postwar era. Simple and straightforward modifications of Oneal and Russett's (1997) research design generate these dramatically contradictory results. Specifically, by teasing out the separate impact of democracy and political distance (or political dissimilarity) and by not coding cases of ongoing disputes as new cases of conflict, it became clear that there is no significant relationship between joint democracy and the likelihood of international war or militarized interstate dispute (MID) for states during the postwar era. These findings suggest that the post-Cold War strategy of democratic enlargement, which is aimed at ensuring peace by enlarging the community of democratic states, is quite a thin reed on which to rest a state's foreign policy, much less the hope for international peace. The results indicate that democracies are more war-prone than non-democracies (whether democracy is coded dichotomously or continuously) and that democracies are more likely to initiate interstate wars. The findings are obtained from analyses that control for a host of political, economic, and cultural factors that have been implicated in the onset of interstate war, and focus explicitly on state level factors instead of simply inferring state level processes from dyadic level observations as was done in earlier studies (e.g., Oneal and Russett, 1997; Oneal and Ray, 1997). The results imply that democratic enlargement is more likely to increase the probability of war for states since democracies are more likely to become involved in, and to initiate, interstate wars.

2NC No Dem Peace Theory

Democratic peace theory is flawed
Layne 7
Christopher, Professor @ TX A&M, American Empire: A Debate, pg. 94

Wilsonian ideology drives the American Empire because its proponents posit that the United States must use its military power to extend democracy abroad. Here, the ideology of Empire rests on assumptions that are not supported by the facts. One reason the architects of Empire champion democracy promotion is because they believe in the so-called democratic peace theory, which holds that democratic states do not fight other democracies. Or as President George W. Bush put it with his customary eloquence, "democracies don't war; democracies are peaceful."136 The democratic peace theory is probably the most overhyped and undersupported "theory" ever to be concocted by American academics. In fact, it is not a theory at all. Rather it is a theology that suits the conceits of Wilsonian true believers, especially the neoconservatives who have been advocating American Empire since the early 1990s. As serious scholars have shown, however, the historical record does not support the democratic peace theory.131 On the contrary, it shows that democracies do not act differently toward other democracies than they do toward nondemocratic states. When important national interests are at stake, democracies not only have threatened to use force against other democracies, but, in fact, democracies have gone to war with other democracies.

Democracy doesn't prevent war
Goldstein, 11
(Joshua, is professor emeritus of international relations at American University and author of Winning the War on War: The Decline of Armed Conflict Worldwide, Sept/Oct 2011, "Think Again: War. World peace could be closer than you think," Foreign Policy)
"A More Democratic World Will Be a More Peaceful One." Not necessarily. The well-worn observation that real democracies almost never fight each other is historically correct, but it's also true that democracies have always been perfectly willing to fight nondemocracies. In fact, democracy can heighten conflict by amplifying ethnic and nationalist forces, pushing leaders to appease belligerent sentiment in order to stay in power. Thomas Paine and Immanuel Kant both believed that selfish autocrats caused wars, whereas the common people, who bear the costs, would be loath to fight. But try telling that to the leaders of authoritarian China, who are struggling to hold in check, not inflame, a popular undercurrent of nationalism against Japanese and American historical enemies. Public opinion in tentatively democratic Egypt is far more hostile toward Israel than the authoritarian government of Hosni Mubarak ever was (though being hostile and actually going to war are quite different things). Why then do democracies limit their wars to non-democracies rather than fight each other? Nobody really knows. As the University of Chicago's Charles Lipson once quipped about the notion of a democratic peace, "We know it works in practice. Now we have to see if it works in theory!" The best explanation is that of political scientists Bruce Russett and John Oneal, who argue that three elements -- democracy, economic interdependence (especially trade), and the growth of international organizations -- are mutually supportive of each other and of peace within the community of democratic countries. Democratic leaders, then, see themselves as having less to lose in going to war with autocracies.

Democratic Peace is a myth – the United States is the world's leading democracy and engages in many wars.
Ostrowski 02, (James Ostrowski is a lawyer and a libertarian author. "The Myth of Democratic Peace." http://www.lewrockwell.com/1970/01/james-ostrowski/the-myth-ofdemocratic-peace/)

We are led to believe that democracy and peace are inextricably linked; that democracy leads to and causes peace; and that peace cannot be achieved in the absence of democracy. Woodrow Wilson was one of the earliest and strongest proponents of this view. He said in his "war message" on April 2, 1917: "A steadfast concert for peace can never be maintained except by a partnership of democratic nations. No autocratic government could be trusted to keep faith within it or observe its covenants. It must be a league of honour, a partnership of opinion. Intrigue would eat its vitals away; the plottings of inner circles who could plan what they would and render account to no one would be a corruption seated at its very heart. Only free peoples can hold their purpose and their honour steady to a common end and prefer the interests of mankind to any narrow interest of their own." Spencer R. Weart alleges that democracies rarely if ever go to war with each other. Even if this is true, it distorts reality and makes people far too sanguine about democracy's ability to deliver the world's greatest need today: peace. In reality, the main threat to world peace today is not war between two nation-states, but (1) nuclear arms proliferation; (2) terrorism; and (3) ethnic and religious conflict within states. As this paper was being written, India, the world's largest democracy, appeared to be itching to start a war with Pakistan, bringing the world closer to nuclear war than it has been for many years. The United States, the world's leading democracy, is waging war in Afghanistan, which war relates to the second and third threats noted above: terrorism and ethnic/religious conflict. If the terrorists are to be believed (and why would they lie?), they struck at the United States on September 11th because of its democratically-induced interventions into ethnic/religious disputes in their parts of the world. As I shall argue below, democracy is implicated in all three major threats to world peace and others as well. The vaunted political machinery of democracy has failed to deliver on its promises. The United States, the quintessential democracy, was directly or indirectly involved in most of the major wars in the 20th Century. On September 11, 2001, the 350-year experiment with the modern nation-state ended in failure. A radical re-thinking of the relationship between the individual and the collective, society and state is urgently required. Our lives depend on it. We must seriously question whether the primitive and ungainly political technology of democracy can possibly keep the peace in tomorrow's world. Thus, a thorough reconsideration of the relationship between democracy and peace is essential. This paper makes a beginning in that direction.

1NC Democracy Bad

Democracy causes war – much more recent evidence
Lebow 11
(http://www.dartmouth.edu/~nedlebow/aggresive_democracies.pdf, Aggressive Democracies)//A.V.
Aggressive Democracies Richard Ned Lebow1 abstract Democracies are the most aggressive regime type measured in terms of war initiation. Since 1945, the United States has also been the world's most aggressive state by this measure. This finding prompts the question of whether the aggressiveness of democracies, and the United States in particular, is due to regime type or other factors. I make the case for the latter. My argument has implications for the Democratic Peace thesis and the unfortunate tendency of some of its advocates to use its claims for policy guidance. The Democratic Peace research programme is based on the putative empirical finding that democracies do not fight other democracies. It has generated a large literature around the validity of this finding and about the reasons why democracies do not initiate wars against democratic opponents. In this paper, I do not engage these controversies directly, but rather look at the record of democracies as war initiators in the post-World War II period. They turn out to be the most aggressive regime type measured by war initiation. The United States, which claims to be the world's leading democracy, is also the world's most aggressive state by this measure. Below, I first document this set of claims using a data set that Benjamin Valentino and I constructed. Next, I speculate about some of the reasons why the United States has been such an aggressive state in the post-war era. In particular, I am interested in the extent to which this aggressiveness is due to democratic governance or other, more idiosyncratic factors. I am inclined to make the case for the latter. This argument has implications for the Democratic Peace thesis and the unfortunate tendency of some of its advocates to use its claims for policy guidance. Richard Ned Lebow, Aggressive Democracies, St Antony's International Review 6, no. 2 (2011): 120-133. The United States and War Initiation The more meaningful peer group comparison for the United States is with the countries of Western Europe, Japan, the Old Commonwealth (Canada, Australia, and New Zealand), and certain Latin American states. This is because these are all fellow democracies that, like the United States, are relatively well-established, relatively liberal, relatively wealthy (on a per capita income basis), and, unlike Israel and India, relatively geo-politically secure and relatively lacking in severe religious and ethnic tension. Here the United States is clearly an outlier, as only two of these countries initiated wars (France and Britain against Egypt in 1956). Britain was also a partner of the United States in the 1991 Gulf War and the 2003 invasion of Iraq. The United States differs from all these countries in several important ways. In A Cultural Theory of International Relations, I describe it as a parvenu power. These are states that are late entrants into the arena where they can compete for standing and do so with greater intensity than other states. Moreover, due to the ideational legacy left by their parvenu status, such states may continue to behave like this for a considerable time after achieving great power status. They devote a higher percentage of their national income to military forces and pursue more aggressive foreign policies. Examples include Sweden under Gustavus Adolphus, Prussia and Russia in the eighteenth centuries, and Japan and the United States in the late nineteenth and twentieth centuries.5 Unlike other parvenu powers, the constraints on the United States were more internal than external. Congress, not other powers, kept American presidents from playing a more active role in European affairs in the 1920s and 1930s and forced a withdrawal from Indochina in the 1970s. The United States was never spurned or humiliated by other powers, but some American presidents and their advisers did feel humiliated by the constraints imposed upon them domestically. They frequently sought to commit the country to activist policies through membership in international institutions that involved long-term obligations (for example, the imf and nato), executive actions (for example, the 1940 destroyer deal, intervention in the Korean War, and sending Marines to Lebanon in 1958), and congressional resolutions secured on the basis of false or misleading information (the Gulf of Tonkin and Iraq War resolutions). Ironically, concern for credibility promoted ill-considered and open-ended commitments like Vietnam and Iraq that later led to public opposition and the congressional constraints that subsequent American presidents considered detrimental to presidential credibility. Instead of prompting a reassessment of national security strategy, these setbacks appear to have strengthened the commitment of at least some presidents and their advisers to breaking free of these constraints and asserting leadership in the world, thus ushering in a new cycle of overextension, failure, and renewed constraints. The United States is unique in other ways. It is by far and away the most powerful economy in the world. At the end of World War II, it accounted for 46 per cent of the world's gross domestic product (gdp) and today represents a still-impressive 21 per cent.6 Prodigious wealth allows the United States to spend an extraordinary percentage of its gdp on its armed forces in comparison to other countries. In the aftermath of the Cold War, most countries cut back on military spending, but us spending has increased. In 2003, the United States spent $417 billion on defence, 47 per cent of the world total.7 In 2008, it spent 41 per cent of its national budget on the military and the cost of past wars, which accounted for almost 50 per cent of world defence spending. In absolute terms, this was twice the total of Japan, Russia, the United Kingdom, Germany, and China combined. Not surprisingly, the United States is the only state with global military reach.8 Democratic and Republican administrations alike have held that extraordinary levels of military expenditure will sustain, if not increase, the standing and influence that traditionally comes with military dominance. It is intended to make the United States, in the words of former Secretary of State Madeleine Albright, the indispensable nation, the only power capable of enforcing global order.9 An equally important point is that possession of such military instruments encourages policymakers to formulate maximalist objectives. Such goals are, by definition, more difficult to achieve by diplomacy, pushing the United States into eyeball-to-eyeball confrontations where the use of force becomes a possibility. us defence expenditure also reflects the political power of the military-industrial complex. Defence spending has encouraged the dependence of numerous companies on the government and helped bring others into being. In 1991, at the end of the Cold War, twelve million people, roughly ten per cent of the us workforce, were directly or indirectly dependent upon defence dollars. The number has not changed significantly since. Having such a large impact on the economy gives defence contractors enormous political clout.10 Those who land major weapons projects are careful to subcontract production across the country, often offering a part of the production process to companies in every state. This gives the contractors enormous political leverage in Congress, often

2NC Democracy Bad


Democracies start more wars – statistical analysis proves

Henderson 2 (Errol Henderson, Assistant Professor, Dept. of Political Science at the


University of Florida, 2002, Democracy and War: The End of an Illusion?, p. 146)
Are Democracies More Peaceful than Nondemocracies with Respect to Interstate Wars? The results indicate that democracies are more war-prone than non-democracies (whether democracy is coded dichotomously or continuously) and that democracies are more likely to initiate interstate wars. The findings are obtained from analyses that control for a host of political, economic, and cultural factors that have been implicated in the onset of interstate war, and focus explicitly on state level factors instead of simply inferring state level processes from dyadic level observations as was done in earlier studies (e.g., Oneal and Russett, 1997; Oneal and Ray, 1997). The results imply that democratic enlargement is more likely to increase the probability of war for states since democracies are more likely to become involved in, and to initiate, interstate wars.

Democracy leads to wars against non-democracies.

Daase 6 (Christopher, Chair in International Organisation, University of Frankfurt,


Democratic Wars, pg. 77)
In what follows, I will focus on three reasons why democracies might be peaceful to each other, but abrasive or even bellicose towards non-democracies. The first reason is an institutional one: domestic institutions dampen conflicts among democracies but aggravate conflicts between democracies and non-democracies. The second reason is a normative one: shared social values and political ideals prevent wars between democracies but make wars between democracies and non-democracies more likely and savage. The third reason is a structural one: the search for safety encourages democracies to create security communities by renouncing violence among themselves but demands assertiveness against outsiders and the willingness to use military means if enlargement of that community cannot be achieved peacefully. To illustrate this, I will draw mainly on the United States as an example following a Tocquevillean tradition, but knowing that not all democracies behave in the same way or that the US is the only war-fighting democracy. It is clear that the hypotheses are first conjectures and that more case studies and quantitative tests are needed to reach more general conclusions.

Democratic governments engage in diversionary wars to influence elections.

Daase 6 (Christopher, Chair in International Organisation, University of Frankfurt,


Democratic Wars, pg. 77)
However, there is a contradictory effect as well. Democratic governments are
tempted to use military violence prior to elections if their public esteem is in decline
and if they must fear not being re-elected (Ostrom and Job, 1986; Russett, 1990; Mintz and Russett, 1992; Mintz and Geva, 1993). In doing so, they count on the 'rally round the flag' effect, which is usually of short duration but long enough to make the public forget economic misery or governmental misbehaviour in order to influence tight election results in favour of the incumbent. This diversionary
influence tight elections results in favour of the incumbent . This diversionary
effect of warfare is especially attractive to democracies since they have no other
means at their disposal to diffuse discontent or suppress internal conflict.
Therefore, the use of military force for diversionary purposes is generally 'a
pathology of democratic systems' (Gelpi, 1997, p. 280).

Critical Infrastructure (Zero Days)


Adv

Notes
30 second explainer: zero-day vulnerabilities (software flaws the vendor doesn't know about yet, so defenders have had zero days to patch them) put nuclear power plants at risk, cyber-terror causes nuke meltdowns, extinction, retaliation, nuke war, yadayadayada

CX Questions

1NC Cyber Inev

Cybersecurity vulnerabilities are inevitable
Corn 7/13
(Corn, Geoffrey S. * Presidential Research Professor of Law, South Texas College of Law; Lieutenant Colonel (Retired), U.S. Army Judge Advocate General's Corps. Prior to joining the faculty at South Texas, Professor Corn served in a variety of military assignments, including as the Army's Senior Law of War Advisor, Supervisory Defense Counsel for the Western United States, Chief of International Law for U.S. Army Europe, and as a Tactical Intelligence Officer in Panama. "Averting the Inherent Dangers of 'Going Dark': Why Congress Must Require a Locked Front Door to Encrypted Data," SSRN. 07-13-2015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes//ghs-kw)
Like CALEA, a statutory obligation along the lines proposed herein will inevitably trigger criticisms and generate concerns. One obvious criticism is that the creation of an escrow key or the maintenance of a duplicate key by a manufacturer would introduce an unacceptable risk of compromise for the device. This argument presupposes that the risk is significant, that the costs of its exploitation are large, and that the benefit is not worth the risk. Yet manufacturers, product developers, service providers and users constantly introduce such risks. Nearly every feature or bit of code added to a device introduces a risk, some greater than others. The vulnerabilities that have been introduced to computers by software such as Flash, ActiveX controls, Java, and web browsers are well documented.51 The ubiquitous SQL database, while extremely effective at helping web designers create effective data driven websites, is notorious for its vulnerability to SQL injection attacks.52 The adding of microphones to electronic devices opened the door to aural interceptions. Similarly, the introduction of cameras has resulted in unauthorized video surveillance of users. Consumers accept all of these risks, however, since we, as individual users and as a society, have concluded that they are worth the cost. Some will inevitably argue that no new possible vulnerabilities should be introduced into devices to allow the government to execute reasonable, and therefore lawful, searches for unique and otherwise unavailable evidence. However, this argument implicitly asserts that there is no, or insignificant, value to society of such a feature. And herein lies the Achilles heel to opponents of mandated front-door access: the conclusion is entirely at odds with the inherent balance between individual liberty and collective security central to the Fourth Amendment itself. Nor should lawmakers be deluded into believing that the currently existing vulnerabilities that we live with on a daily basis are less significant in scope than the possibility of obtaining complete access to the encrypted contents of a device. Various malware variants that are so widespread as to be almost omnipresent in our online community achieve just such access through what would seem like minor cracks in the defense of systems.53 One example is the Zeus malware strain, which has been tied to the unlawful online theft of hundreds of millions of dollars from U.S. companies and citizens and gives its operator complete access to and control over any computer it infects.54 It can be installed on a machine through the simple mistake of viewing an infected website or email, or clicking on an otherwise innocuous link.55 The malware is designed to not only bypass malware detection software, but to deactivate the software's ability to detect it.56 Zeus and the many other variants of malware that are freely available to purchasers on dark-net websites and forums are responsible for the theft of funds from countless online bank accounts (the credentials having been stolen by the malware's key-logger features), the theft of credit card information, and innumerable personal identifiers.57

2NC Cyber Inev

Security issues are inevitable
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance Studies at the Brookings Institution. He is the author of several books and a member of the Hoover Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II: The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-ii-debate-merits//ghs-kw)
On Thursday, I described the surprisingly warm reception FBI Director James Comey got in the Senate this week with his warning that the FBI was "going dark" because of end-to-end encryption. In this post, I want to take on the merits of the renewed encryption debate, which seem to me complicated and multi-faceted and not all pushing in the same direction. Let me start by breaking the encryption debate into two distinct sets of questions: One is the conceptual question of whether a world of end-to-end strong encryption is an attractive idea. The other is whether, assuming it is not an attractive idea and that one wants to ensure that authorities retain the ability to intercept decrypted signal, an extraordinary access scheme is technically possible without eroding other essential security and privacy objectives. These questions often get mashed together, both because tech companies are keen to market themselves as the defenders of their users' privacy interests and because of the libertarian ethos of the tech community more generally. But the questions are not the same, and it's worth considering them separately. Consider the conceptual question first. Would it be a good idea to have a world-wide communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all attackers? That is, if we could snap our fingers and make all device-to-device communications perfectly secure against interception from the Chinese, from hackers, from the FSB but also from the FBI even wielding lawful process, would that be desirable? Or, in the alternative, do we want to create an internet as secure as possible from everyone except government investigators exercising their legal authorities with the understanding that other countries may do the same? Conceptually speaking, I am with Comey on this question, and the matter does not seem to me an especially close call. The belief in principle in creating a giant world-wide network on which surveillance is technically impossible is really an argument for the creation of the world's largest ungoverned space. I understand why techno-anarchists find this idea so appealing. I can't imagine for a moment, however, why anyone else would. Consider the comparable argument in physical space: the creation of a city in which authorities are entirely dependent on citizen reporting of bad conduct but have no direct visibility onto what happens on the streets and no ability to conduct search warrants (even with court orders) or to patrol parks or street corners. Would you want to live in that city? The idea that ungoverned spaces really suck is not controversial when you're talking about Yemen or Somalia. I see nothing more attractive about the creation of a worldwide architecture in which it is technically impossible to intercept and read ISIS communications with followers or to follow child predators into chatrooms where they go after kids. The trouble is that this conceptual position does not answer the entirety of the policy question before us. The reason is that the case against preserving some form of law enforcement access to decrypted signal is not only a conceptual embrace of the technological obsolescence of surveillance. It is also a series of arguments about the costs, including the security costs, of maintaining the capacity to decrypt captured signal. Consider the report issued this past week by a group of computer security experts (including Lawfare contributing editors Bruce Schneier and Susan Landau), entitled "Keys Under Doormats: Mandating Insecurity By Requiring Government Access to All Data and Communications." The report does not make an in-principle argument or a conceptual argument against extraordinary access. It argues, rather, that the effort to build such a system risks eroding cybersecurity in ways far more important than the problems it would solve. The authors, to summarize, make three claims in support of the broad claim that any exceptional access system would "pose . . . grave security risks [and] imperil innovation." What are those "grave security risks"? "[P]roviding exceptional access to communications would force a U-turn from the best practices now being deployed to make the Internet more secure. These practices include forward secrecy, where decryption keys are deleted immediately after use, so that stealing the encryption key used by a communications server would not compromise earlier or later communications. A related technique, authenticated encryption, uses the same temporary key to guarantee confidentiality and to verify that the message has not been forged or tampered with." "[B]uilding in exceptional access would substantially increase system complexity" and "complexity is the enemy of security." Adding code to systems increases that system's attack surface, and a certain number of additional vulnerabilities come with every marginal increase in system complexity. So by requiring a potentially complicated new system to be developed and implemented, we'd be effectively guaranteeing more vulnerabilities for malicious actors to hit. "[E]xceptional access would create concentrated targets that could attract bad actors." If we require tech companies to retain some means of accessing user communications, those keys have to be stored somewhere, and that storage then becomes an unusually high-stakes target for malicious attack. Their theft then compromises, as did the OPM hack, large numbers of users. The strong implication of the report is that these issues are not resolvable, though the report never quite says that. But at a minimum, the authors raise a series of important questions about whether such a system would, in practice, create an insecure internet in general, rather than one whose general security has the technical capacity to make security exceptions to comply with the law. There is some reason, in my view, to suspect that the picture may not be quite as stark as the computer scientists make it seem. After all, the big tech companies increase the complexity of their software products all the time, and they generally regard the increased attack surface of the software they create as a result as a mitigatable problem. Similarly, there are lots of high-value intelligence targets that we have to secure and would have big security implications if we could not do so successfully. And when it really counts, that task is not hopeless. Google and Apple and Facebook are not without tools in the cybersecurity department. The real question, in my view, is whether a system of the sort Comey imagines could be built in a fashion in which the security gain it would provide would exceed the heightened security risks the extraordinary access would involve. As Herb Lin puts it in his excellent, and admirably brief, Senate testimony the other day, this is ultimately a question without an answer in the absence of a lot of new research. "One side says [the] access [Comey is seeking] inevitably weakens the security of a system and will eventually be compromised by a bad guy; the other side says it doesn't weaken security and won't be compromised. Neither side can prove its case, and we see a theological clash of absolutes." Only when someone actually does the research and development and tries actually to produce a system that meets Comey's criteria are we going to find out whether it's doable or not. And therein lies the rub, and the real meat of the policy problem, in my view: Who's going to do this research? Who's going to conduct the sustained investment in trying to imagine a system that secures communications except from government when and only government has a warrant to intercept those communications? The assumption of the computer scientists in their report is that the burden of that research lies with the government. "Absent a concrete technical proposal," they write, "and without answers to the questions raised in this report, legislators should reject out of hand any proposal to return to the failed cryptography control policy of the 1990s." Indeed, their most central recommendation is that the burden of development is on Comey. "Our strong recommendation is that anyone proposing regulations should first present concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses and for hidden costs." In his testimony, Herb supports this call, though he acknowledges that it is not the inevitable route: "the government has not yet provided any specifics, arguing that private vendors should do it. At the same time, the vendors won't do it, because [their] customers aren't demanding such features. Indeed, many customers would see such features as a reason to avoid a given vendor. Without specifics, there will be no progress. I believe the government is afraid that any specific proposal will be subject to enormous criticism, and that's true, but the government is the party that wants . . . access, and rather than running away from such criticism, it should embrace any resulting criticism as an opportunity to improve upon its initial designs." Herb might also have mentioned that lots of people in the academic tech community who would be natural candidates to help develop such an access system are much more interested in developing encryption systems to keep the feds out than to, under any circumstances, let them in. The tech community has spent a lot more time and energy arguing against the plausibility and desirability of implementing what Comey is seeking than it has spent in trying to develop systems that deliver it while mitigating the risks such a system might pose. For both industry and the tech communities, more broadly, this is government's problem, not their problem. Yet reviving the Clipper Chip model, in which government develops a fully-formed system and then puts it out publicly for the community to shoot down, is clearly not what Comey has in mind. He is talking in very different language: the language of performance requirements. He wants to leave the development task to Silicon Valley to figure out how to implement government's requirements. He wants to describe what he needs, decrypted signal when he has a warrant, and leave the companies to figure out how to deliver it while still providing secure communications in other circumstances to their customers. The advantage to this approach is that it potentially lets a thousand flowers bloom. Each company might do it differently. They would compete to provide the most security consistent with the performance standard. They could learn from each other. And government would not be in the position of developing and promoting specific algorithms. It wouldn't even need to know how the task was being done.


1NC No Meltdowns Impact


No impact to or risk of nuclear meltdowns their evidence
Cappiello 3/29/11 national environmental reporter for The Associated
Press, masters degrees in earth and environmental science and journalism
from Columbia University (Dina, Long Blackouts Pose Risk To U.S. Nuclear
Reactors Huffington Post,
http://www.huffingtonpost.com/2011/03/29/blackout-risk-us-nuclearreactors_n_841869.html)//IS
A 2003 federal analysis looking at how to estimate the risk of containment failure said that should power be knocked out by an earthquake or tornado it "would be unlikely that power will be recovered in the time frame to prevent core meltdown." In Japan, it was a one-two punch: first the earthquake, then the tsunami. Tokyo Electric Power Co., the operator of the crippled plant, found other ways to cool
the reactor core and so far avert a full-scale meltdown without electricity. "Clearly the coping duration is an issue on
the table now," said Biff Bradley, director of risk assessment for the Nuclear Energy Institute. "The industry and the
Nuclear Regulatory Commission will have to go back in light of what we just observed and rethink station blackout
duration." David Lochbaum, a former plant engineer and nuclear safety director at the advocacy group Union of
Concerned Scientists, put it another way: "Japan

shows what happens when you play beat-the-clock and lose." Lochbaum plans to use the Japan disaster to press lawmakers and the nuclear power industry to do more when it comes to coping with prolonged blackouts, such as having temporary generators on site that can recharge batteries. A complete loss of electrical power, generally speaking, poses a


major problem for a nuclear power plant because the reactor core must be kept
cool, and back-up cooling systems -- mostly pumps that replenish the core with water -- require massive amounts of power to work. Without the electrical grid, or
diesel generators, batteries can be used for a time, but they will not last long with
the power demands. And when the batteries die, the systems that control and
monitor the plant can also go dark, making it difficult to ascertain water levels and
the condition of the core. One variable not considered in the NRC risk assessments
of severe blackouts was cooling water in spent fuel pools, where rods once used in
the reactor are placed. With limited resources, the commission decided to focus its
analysis on the reactor fuel, which has the potential to release more radiation . An
analysis of individual plant risks released in 2003 by the NRC shows that
for 39 of the 104 nuclear reactors, the risk of core damage from a blackout
was greater than 1 in 100,000. At 45 other plants the risk is greater than 1
in 1 million, the threshold NRC is using to determine which severe accidents should be evaluated in its latest

analysis. The Beaver Valley Power Station, Unit 1, in Pennsylvania had the greatest risk of core melt 6.5 in
100,000, according to the analysis. But that risk may have been reduced in subsequent years as NRC regulations
required plants to do more to cope with blackouts. Todd Schneider, a spokesman for FirstEnergy Nuclear Operating
Co., which runs Beaver Creek, told the AP that batteries on site would last less than a week. In 1988, eight years
after labeling blackouts "an unresolved safety issue," the NRC required nuclear power plants to improve the
reliability of their diesel generators, have more backup generators on site, and better train personnel to restore
power. These steps would allow them to keep the core cool for four to eight hours if they lost all electrical power. By
contrast, the newest generation of nuclear power plant, which is still awaiting approval, can last 72 hours without
taking any action, and a minimum of seven days if water is supplied by other means to cooling pools. Despite the
added safety measures, a 1997 report found that blackouts the loss of on-site and off-site electrical power
remained "a dominant contributor to the risk of core melt at some plants." The events of Sept. 11, 2001, further
solidified that nuclear reactors might have to keep the core cool for a longer period without power. After 9/11, the commission issued regulations requiring that plants have portable power supplies for relief


valves and be able to manually operate an emergency reactor cooling
system when batteries go out. The NRC says these steps, and others, have
reduced the risk of core melt from station blackouts from the current fleet
of nuclear plants. For instance, preliminary results of the latest analysis of the risks to the Peach Bottom
plant show that any release caused by a blackout there would be far less rapid
and would release less radiation than previously thought, even without any actions being taken. With more time, people can be evacuated. The NRC
says improved computer models, coupled with up-to-date information about the plant, resulted in the rosier
outlook. "When you simplify, you always err towards the worst possible circumstance," Scott Burnell, a spokesman

The latest work shows that


"even in situations where everything is broken and you can't do anything
else, these events take a long time to play out," he said. "Even when you get
to releasing into environment, much less of it is released than actually
thought." Exelon Corp., the operator of the Peach Bottom plant, referred all detailed questions about its
for the Nuclear Regulatory Commission, said of the earlier studies.

preparedness and the risk analysis back to the NRC. In a news release issued earlier this month, the company,
which operates 10 nuclear power plants, said "all

Exelon nuclear plants are able to safely shut


down and keep the fuel cooled even without electricity from the grid ." Other people,

looking at the crisis unfolding in Japan, aren't so sure. In the worst-case scenario, the NRC's 1990 risk assessment predicted that a core melt at Peach Bottom could begin in one hour if electrical power on- and off-site were lost, the diesel generators -- the main back-up source of power for the pumps that keep the core cool with water -- failed to work and other mitigating steps weren't taken. "It is not a question that those things are definitely effective in this kind of scenario," said Richard Denning, a professor of nuclear engineering at Ohio

State University, referring to the steps NRC has taken to prevent incidents. Denning had done work as a contractor
on severe accident analyses for the NRC since 1975. He retired from Battelle Memorial Institute in 1995. "They
certainly could have made all the difference in this particular case," he said, referring to Japan. "That's assuming
you have stored these things in a place that would not have been swept away by tsunami."
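A rough way to see why "coping duration" is the operative variable in the blackout scenario above is a back-of-the-envelope timeline. Every number in the sketch below is invented for illustration -- none of it is data for Peach Bottom, Beaver Valley, or any real plant -- but it shows the comparison the card is making between how long station batteries last and how long power can stay out.

# Toy coping-time arithmetic -- every number here is hypothetical, not plant data.
battery_capacity_kwh = 2_000   # assumed usable DC battery capacity
dc_load_kw = 250               # assumed load for instrumentation and valve control
coping_hours = battery_capacity_kwh / dc_load_kw   # 8 hours with these made-up numbers

grid_restore_hours = 72        # assumed time to restore offsite power after a severe event

if grid_restore_hours > coping_hours:
    print(f"Batteries last ~{coping_hours:.0f} h, power back in {grid_restore_hours} h: monitoring goes dark first")
else:
    print("Offsite power returns before the batteries are exhausted")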

1NC No Cyber
Their impacts are all hype -- no cyberattack
Walt 10 Stephen M. Walt 10 is the Robert and Rene Belfer Professor of
international relations at Harvard University "Is the cyber threat overblown?"
March 30
walt.foreignpolicy.com/posts/2010/03/30/is_the_cyber_threat_overblown
Am I the only person -- well, besides Glenn Greenwald and Kevin Poulson -- who thinks the "cyber-warfare" business may be overblown? It's clear the U.S. national security establishment is paying a lot more attention to the issue, and colleagues of mine -- including some pretty serious and level-headed people -- are increasingly worried by the danger of some sort of "cyber-Katrina." I don't dismiss it entirely, but this sure looks to me like a classic opportunity for threat-inflation. Mind you, I'm not saying that there aren't a lot of

shenanigans going on in cyber-space, or that various forms of cyber-warfare don't have military potential. So I'm not arguing for complete head-in-the-sand complacency. But here's what makes me worry that the threat is being overstated. First, the whole
issue is highly esoteric -- you really need to know a great deal about computer networks, software, encryption, etc., to know how
serious the danger might be. Unfortunately, details about a number of the alleged incidents that are being
invoked to demonstrate the risk of a "cyber-Katrina," or a cyber-9/11, remain classified, which makes it
hard for us lay-persons to gauge just how serious the problem really was or is. Moreover, even when we
hear about computers being penetrated by hackers, or parts of the internet crashing, etc., its hard to
know how much valuable information was stolen or how much actual damage was done .
And as with other specialized areas of technology and/or military affairs, a lot of the experts have a clear vested
interest in hyping the threat, so as to create greater demand for their services. Plus, we
already seem to have politicians leaping on the issue as a way to grab some pork for their states.
Second, there are lots of different problems being lumped under a single banner, whether the

label is "cyber-terror" or "cyber-war." One issue is the use of various computer tools to degrade an enemys military capabilities (e.g., by disrupting
communications nets, spoofing sensors, etc.). A second issue is the alleged threat that bad guys would penetrate computer networks and shut down
power grids, air traffic control, traffic lights, and other important elements of infrastructure, the way that internet terrorists (led by a disgruntled
computer expert) did in the movie Live Free and Die Hard. A third problem is web-based criminal activity, including identity theft or simple fraud (e.g.,
those emails we all get from someone in Nigeria announcing that they have millions to give us once we send them some account information). A
fourth potential threat is cyber-espionage; i.e., clever foreign hackers penetrate Pentagon or defense contractors computers and download
valuable classified information. And then there are annoying activities like viruses, denial-of-service attacks, and other things that affect the stability

of web-based activities and disrupt commerce (and my ability to send posts into FP). This sounds like a rich menu of


potential trouble, and putting the phrase "cyber" in front of almost any noun makes it
sound trendy and a bit more frightening. But notice too that these are all somewhat different problems of quite different
importance, and the appropriate response to each is likely to be different too. Some issues -- such as the danger of
cyber-espionage -- may not require elaborate technical fixes but simply more rigorous
security procedures to isolate classified material from the web. Other problems may not require big federal
programs to address, in part because both individuals and the private sector

have incentives to protect themselves (e.g., via firewalls or by backing up critical data). And as Greenwald
warns, there may be real costs to civil liberties if concerns about vague cyber dangers lead us to grant the NSA or some other government agency

greater control over the Internet. Third, this is another issue that cries out for some comparative cost-benefit analysis. Is the danger
that some malign hacker crashes a power grid greater than the likelihood that a blizzard
would do the same thing? Is the risk of cyber-espionage greater than the potential danger
from more traditional forms of spying? Without a comparative assessment of different risks and the costs of mitigating each

one, we will allocate resources on the basis of hype rather than analysis. In short, my fear is not that we won't take reasonable precautions against a
potential set of dangers; my concern is that we will spend tens of billions of dollars protecting ourselves against a set of threats that are not as
dangerous as we are currently being told they are.

2NC No Cyber
No cyber impact
Healey 3/20 Jason, Director of the Cyber Statecraft Initiative at the Atlantic
Council, "No, Cyberwarfare Isn't as Dangerous as Nuclear War", 2013,
www.usnews.com/opinion/blogs/world-report/2013/03/20/cyber-attacks-notyet-an-existential-threat-to-the-us
America does not face an existential cyberthreat today, despite recent
warnings. Our cybervulnerabilities are undoubtedly grave and the threats we face are severe

but far from comparable to nuclear war. The most recent alarms come in a Defense Science
Board report on how to make military cybersystems more resilient against advanced threats (in short, Russia or
China). It warned that the "cyber threat is serious, with potential consequences similar in some ways to the nuclear
threat of the Cold War." Such fears were also expressed by Adm. Mike Mullen, then chairman of the Joint Chiefs of
Staff, in 2011. He called cyber "The single biggest existential threat that's out there" because "cyber actually more
than theoretically, can attack our infrastructure, our financial systems."

While it is true that cyber

attacks might do these things, it is also true they have not only never
happened but are far more difficult to accomplish than mainstream
thinking believes. The consequences from cyber threats may be similar in some

ways to nuclear, as the Science Board concluded, but mostly, they are incredibly
dissimilar. Eighty years ago, the generals of the U.S. Army Air Corps were sure that their bombers would
easily topple other countries and cause their populations to panic, claims which did not stand up to reality. A
study of the 25-year history of cyber conflict, by the Atlantic Council and Cyber Conflict
Studies Association, has shown a similar dynamic where the impact of disruptive
cyberattacks has been consistently overestimated. Rather than theorizing about

future cyberwars or extrapolating from today's concerns, the history of cyberconflict that have actually been fought,
shows that cyber incidents have so far tended to have effects that are either widespread but fleeting or persistent but narrowly focused. No attacks, so far, have been both widespread and persistent.


There have been no authenticated cases of anyone dying from a cyber
attack. Any widespread disruptions, even the 2007 disruption against Estonia, have been
short-lived causing no significant GDP loss. Moreover, as with conflict in other domains, cyberattacks can

take down many targets but keeping them down over time in the face of determined defenses has so far been out
of the range of all but the most dangerous adversaries such as Russia and China. Of course, if the United States is
in a conflict with those nations, cyber will be the least important of the existential threats policymakers should be
worrying about. Plutonium trumps bytes in a shooting war. This is not all good news.
Policymakers have recognized the problems since at least 1998 with little significant progress. Worse, the threats
and vulnerabilities are getting steadily more worrying.

Still, experts have been warning of a cyber Pearl Harbor for 20 of the 70 years since the actual Pearl Harbor. The transfer of U.S. trade secrets through Chinese cyber espionage could someday accumulate into an existential threat. But it doesn't seem so just yet, with only

handwaving estimates of annual losses of 0.1 to 0.5 percent to the total U.S. GDP of around $15 trillion. That's bad,
but

it doesn't add up to an existential crisis or "economic cyberwar."

No impact to cyberterror
Green 2 editor of The Washington Monthly (Joshua, 11/11, The Myth of
Cyberterrorism,
http://www.washingtonmonthly.com/features/2001/0211.green.html, AG)
There's just one problem:

There is no such thing as cyberterrorism--no instance of

anyone ever having been killed by a terrorist (or anyone else) using a computer.
Nor is there compelling evidence that al Qaeda or any other terrorist
organization has resorted to computers for any sort of serious destructive activity. What's more,
outside of a Tom Clancy novel, computer security specialists believe it is virtually
impossible to use the Internet to inflict death on a large scale, and many scoff at the
notion that terrorists would bother trying. "I don't lie awake at night worrying about cyberattacks ruining my life,"

says Dorothy Denning, a computer science professor at Georgetown University and


one of the country's foremost cybersecurity experts. "Not only does
[cyberterrorism] not rank alongside chemical, biological, or nuclear weapons, but it is not anywhere
near as serious as other potential physical threats like car bombs or suicide bombers." Which
is not to say that cybersecurity isn't a serious problem--it's just not one that involves terrorists. Interviews with
terrorism and computer security experts, and current and former government and military officials, yielded near
unanimous agreement that the real danger is from the criminals and other hackers who did $15 billion in damage to
the global economy last year using viruses, worms, and other readily available tools. That figure is sure to balloon if
more isn't done to protect vulnerable computer systems, the vast majority of which are in the private sector. Yet
when it comes to imposing the tough measures on business necessary to protect against the real cyberthreats, the

Bush administration has balked. Crushing BlackBerrys When ordinary people imagine cyberterrorism, they tend to think along Hollywood plot lines, doomsday scenarios in which terrorists hijack nuclear weapons, airliners, or military computers from halfway around the world.

Given the colorful history of federal boondoggles--billion-dollar weapons systems that misfire, $600 toilet seats-that's an understandable concern. But, with few exceptions, it's not one that applies to preparedness for a
cyberattack. "The government is miles ahead of the private sector when it comes to cybersecurity," says Michael
Cheek, director of intelligence for iDefense, a Virginia-based computer security company with government and
private-sector clients. "Particularly the most sensitive military systems." Serious effort and plain good fortune have
combined to bring this about. Take nuclear weapons. The biggest fallacy about their vulnerability, promoted in
action thrillers like WarGames, is that they're designed for remote operation. "[The movie] is premised on the
assumption that there's a modem bank hanging on the side of the computer that controls the missiles," says Martin
Libicki, a defense analyst at the RAND Corporation. "I assure you, there isn't." Rather, nuclear weapons and other
sensitive military systems enjoy the most basic form of Internet security: they're "air-gapped," meaning that they're
not physically connected to the Internet and are therefore inaccessible to outside hackers. (Nuclear weapons also
contain "permissive action links," mechanisms to prevent weapons from being armed without inputting codes
carried by the president.) A retired military official was somewhat indignant at the mere suggestion: "As a general
principle, we've been looking at this thing for 20 years. What cave have you been living in if you haven't considered

this [threat]?" When it comes to cyberthreats, the Defense Department has been particularly vigilant to protect key systems by isolating them from the Net and

even from the Pentagon's internal network. All new software must be submitted to the National Security Agency for
security testing. "Terrorists

could not gain control of our spacecraft, nuclear

weapons, or any

other type of high-consequence asset," says Air Force Chief Information


Officer John Gilligan. For more than a year, Pentagon CIO John Stenbit has enforced a moratorium on new wireless
networks, which are often easy to hack into, as well as common wireless devices such as PDAs, BlackBerrys, and
even wireless or infrared copiers and faxes. The September 11 hijackings led to an outcry that airliners are
particularly susceptible to cyberterrorism. Earlier this year, for instance, Sen. Charles Schumer (D-N.Y.) described
"the absolute havoc and devastation that would result if cyberterrorists suddenly shut down our air traffic control
system, with thousands of planes in mid-flight." In fact, cybersecurity experts give some of their highest marks to
the FAA, which reasonably separates its administrative and air traffic control systems and strictly air-gaps the latter.

And there's a reason the 9/11 hijackers used box-cutters instead of keyboards: It's impossible to hijack a plane remotely, which eliminates the possibility of a high-tech 9/11 scenario in which planes are used as weapons. Another source of concern is terrorist infiltration of our intelligence agencies. But here, too, the risk is slim. The CIA's classified computers are also air-gapped, as is the FBI's entire computer system. "They've been paranoid about this forever," says Libicki, adding that

paranoia is a sound governing principle when it comes to cybersecurity. Such concerns are manifesting themselves
in broader policy terms as well. One notable characteristic of last year's Quadrennial Defense Review was how
strongly it focused on protecting information systems.

Cyberattacks impossible -- empirics and defenses solve


Rid 12 (Thomas Rid, reader in war studies at King's College London, is

author of "Cyber War Will Not Take Place" and co-author of "CyberWeapons.", March/April 2012, Think Again: Cyberwar,
http://www.foreignpolicy.com/articles/2012/02/27/cyberwar?page=full)
"Cyberwar Is Already Upon Us." No way. "Cyberwar

is coming!" John Arquilla and David Ronfeldt predicted in


a celebrated Rand paper back in 1993. Since then, it seems to have arrived -- at least by the account of
the U.S. military establishment, which is busy competing over who should get what share of the fight. Cyberspace is
"a domain in which the Air Force flies and fights," Air Force Secretary Michael Wynne claimed in 2006. By 2012,
William J. Lynn III, the deputy defense secretary at the time, was writing that

cyberwar is "just as critical

to military operations as land, sea, air, and space ." In January, the Defense Department vowed to
equip the U.S. armed forces for "conducting a combined arms campaign across all domains -- land, air, maritime,
space, and cyberspace." Meanwhile, growing piles of books and articles explore the threats of cyberwarfare,

cyberterrorism, and how to survive them. Time for a reality check: Cyberwar is still more


hype than hazard. Consider the definition of an act of war: It has to be potentially violent, it
has to be purposeful, and it has to be political. The cyberattacks we've seen so far , from
Estonia to the Stuxnet virus, simply don't meet these criteria. Take the dubious story of a Soviet pipeline
explosion back in 1982, much cited by cyberwar's true believers as the most destructive cyberattack
ever. The account goes like this: In June 1982, a Siberian pipeline that the CIA had virtually booby-trapped with a

so-called "logic bomb" exploded in a monumental fireball that could be seen from space. The U.S. Air Force
estimated the explosion at 3 kilotons, equivalent to a small nuclear device. Targeting a Soviet pipeline linking gas
fields in Siberia to European markets, the operation sabotaged the pipeline's control systems with software from a

Canadian firm that the CIA had doctored with malicious code. No one died, according to Thomas Reed, a U.S. National Security Council aide at the time who revealed the incident in his 2004 book, At the Abyss; the only harm came to the Soviet economy. But did it really happen? After Reed's account came out, Vasily Pchelintsev, a former KGB head of the Tyumen region, where the alleged explosion supposedly took place, denied the story. There are also no media reports from 1982 that confirm such an

explosion, though accidents and pipeline explosions in the Soviet Union were regularly reported in the early 1980s.
Something likely did happen, but Reed's book is the only public mention of the incident and his account relied on a
single document. Even after the CIA declassified a redacted version of Reed's source, a note on the so-called
Farewell Dossier that describes the effort to provide the Soviet Union with defective technology, the agency did not
confirm that such an explosion occurred. The available evidence on the Siberian pipeline blast is so thin that it
shouldn't be counted as a proven case of a successful cyberattack. Most other commonly cited cases of cyberwar
are even less remarkable. Take the attacks on Estonia in April 2007, which came in response to the controversial
relocation of a Soviet war memorial, the Bronze Soldier. The well-wired country found itself at the receiving end of a
massive distributed denial-of-service attack that emanated from up to 85,000 hijacked computers and lasted three
weeks. The attacks reached a peak on May 9, when 58 Estonian websites were attacked at once and the online
services of Estonia's largest bank were taken down. "What's the difference between a blockade of harbors or
airports of sovereign states and the blockade of government institutions and newspaper websites?" asked Estonian
Prime Minister Andrus Ansip. Despite his analogies, the attack was no act of war. It was certainly a nuisance and an
emotional strike on the country, but the bank's actual network was not even penetrated; it went down for 90
minutes one day and two hours the next. The attack was not violent, it wasn't purposefully aimed at changing
Estonia's behavior, and no political entity took credit for it. The same is true for the vast majority of cyberattacks on

record. Indeed, there is no known cyberattack that has caused the loss of human life. No cyberoffense has ever injured a person or damaged a building. And if an act is not at least potentially violent, it's not an act of war. Separating war from physical

violence makes it a metaphorical notion; it would mean that there is no way to distinguish between World War II,
say, and the "wars" on obesity and cancer. Yet those ailments, unlike past examples of cyber "war," actually do kill
people. "A Digital Pearl Harbor Is Only a Matter of Time ." Keep waiting. U.S. Defense
Secretary Leon Panetta delivered a stark warning last summer: "We could face a cyberattack that could be the

alarmist predictions have been ricocheting inside the


Beltway for the past two decades, and some scaremongers have even upped the
ante by raising the alarm about a cyber 9/11. In his 2010 book, Cyber War, former White House
equivalent of Pearl Harbor." Such

counterterrorism czar Richard Clarke invokes the specter of nationwide power blackouts, planes falling out of the
sky, trains derailing, refineries burning, pipelines exploding, poisonous gas clouds wafting, and satellites spinning

the empirical record is


less hair-raising, even by the standards of the most drastic example available . Gen.
Keith Alexander, head of U.S. Cyber Command (established in 2010 and now boasting a budget of more
out of orbit -- events that would make the 2001 attacks pale in comparison. But

than $3 billion), shared his worst fears in an April 2011 speech at the University of Rhode Island: "What I'm
concerned about are destructive attacks," Alexander said, "those that are coming." He then invoked a remarkable
accident at Russia's Sayano-Shushenskaya hydroelectric plant to highlight the kind of damage a cyberattack might
be able to cause. Shortly after midnight on Aug. 17, 2009, a 900-ton turbine was ripped out of its seat by a socalled "water hammer," a sudden surge in water pressure that then caused a transformer explosion. The turbine's
unusually high vibrations had worn down the bolts that kept its cover in place, and an offline sensor failed to detect
the malfunction. Seventy-five people died in the accident, energy prices in Russia rose, and rebuilding the plant is
slated to cost $1.3 billion. Tough luck for the Russians, but here's what the head of Cyber Command didn't say: The
ill-fated turbine had been malfunctioning for some time, and the plant's management was notoriously poor. On top
of that, the key event that ultimately triggered the catastrophe seems to have been a fire at Bratsk power station,
about 500 miles away. Because the energy supply from Bratsk dropped, authorities remotely increased the burden
on the Sayano-Shushenskaya plant. The sudden spike overwhelmed the turbine, which was two months shy of reaching the end of its 30-year life cycle, sparking the catastrophe. If anything, the Sayano-Shushenskaya incident highlights how difficult a devastating attack would be to mount.
The plant's washout was an accident at the end of a complicated and unique chain
of events. Anticipating such vulnerabilities in advance is extraordinarily difficult
even for insiders; creating comparable coincidences from cyberspace would be a
daunting challenge at best for outsiders. If this is the most drastic incident Cyber Command
can conjure up, perhaps it's time for everyone to take a deep breath. " Cyberattacks Are Becoming
Easier." Just the opposite. U.S. Director of National Intelligence James R. Clapper warned last
year that the volume of malicious software on American networks had more than
tripled since 2009 and that more than 60,000 pieces of malware are now discovered every day. The United
States, he said, is undergoing "a phenomenon known as 'convergence, ' which amplifies
the opportunity for disruptive cyberattacks, including against physical infrastructures." ("Digital

convergence" is a snazzy term for a simple thing: more and more devices able to talk to each other, and formerly

separate industries and activities able to work together.) Just because there's more malware, however,


doesn't mean that attacks are becoming easier. In fact, potentially damaging or
life-threatening cyberattacks should be more difficult to pull off . Why? Sensitive
systems generally have built-in redundancy and safety systems, meaning
an attacker's likely objective will not be to shut down a system , since merely
forcing the shutdown of one control system, say a power plant, could trigger a backup
and cause operators to start looking for the bug. To work as an effective weapon,
malware would have to influence an active process -- but not bring it to a screeching
halt. If the malicious activity extends over a lengthy period, it has to remain
stealthy. That's a more difficult trick than hitting the virtual off-button. Take Stuxnet,
the worm that sabotaged Iran's nuclear program in 2010. It didn't just crudely shut down the
centrifuges at the Natanz nuclear facility; rather, the worm subtly manipulated the
system. Stuxnet stealthily infiltrated the plant's networks, then hopped onto the protected control systems,

intercepted input values from sensors, recorded these data, and then provided the legitimate controller code with
pre-recorded fake input signals, according to researchers who have studied the worm. Its objective was not just to
fool operators in a control room, but also to circumvent digital safety and monitoring systems so it could secretly

manipulate the actual processes. Building and deploying Stuxnet required extremely detailed intelligence about the systems it was supposed to compromise, and the same will be true for other dangerous cyberweapons. Yes, "convergence," standardization, and sloppy defense of control-systems software could increase the risk of generic attacks, but the same trend has also caused defenses against the most coveted targets to improve steadily and has made reprogramming highly specific installations on legacy systems more complex, not less.
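The "record, then replay" behavior the card attributes to Stuxnet can be shown with a small sketch. Everything here is hypothetical -- it models no real PLC, SCADA product, or the actual worm -- but it captures Rid's structural point: a weapon that must keep a process running has to keep feeding monitors believable data while the manipulation happens, which is far harder than simply switching something off.

# Illustrative record-and-replay wrapper -- hypothetical; not modeled on any real control system.
from collections import deque

class ReplaySpoofedSensor:
    """Hypothetical wrapper: first records 'normal' readings, later replays them to mask manipulation."""

    def __init__(self, real_sensor, window: int = 100):
        self.real_sensor = real_sensor      # any object exposing read() -> float; assumed interface
        self.recorded = deque(maxlen=window)
        self.replaying = False              # flipped to True once the sabotage phase starts

    def read(self) -> float:
        if not self.replaying or not self.recorded:
            value = self.real_sensor.read()
            self.recorded.append(value)     # learning phase: capture what "normal" looks like
            return value
        self.recorded.rotate(-1)            # sabotage phase: loop old, normal-looking values back to monitors
        return self.recorded[0]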

Cyber-Vulnerability Adv

Notes
30 second explainer: yeah whatevs nuke war outweighs.

CX Questions

1NC Util
Prefer consequences
Goodin 95
Robert E. Goodin, Professor of Philosophy at the University of Australia, Utilitarianism as a Public Philosophy, pg
26 1995
This focus on the moral importance of modal shifts can be shown to have important implications for nuclear
weapons policy. The preconditions for applying my argument surely all exist. Little need be said to justify the claim

that the consequences in view matter morally. Maybe consequentialistic


considerations are not the only ones that should guide our choices, of
military policies or any others; but where the consequences in view are so
momentous as those involved in an all-out nuclear war, it would be sheer
lunacy to deny such considerations any role at all.

Morality is vacuous -- infinite regress


Stelzig 98
[Tim Stelzig, B.A. 1990, West Virginia University; M.A. 1995, University of
Illinois at Chicago; J.D. Candidate 1998, University of Pennsylvania. , 3/98,
"COMMENT: DEONTOLOGY, GOVERNMENTAL ACTION, AND THE DISTRIBUTIVE
EXEMPTION: HOW THE TROLLEY PROBLEM SHAPES THE RELATIONSHIP
BETWEEN RIGHTS AND POLICY", 146 U. Pa. L. Rev. 901, lexis law]
Take first the epistemological problem. Every view of morality must ultimately give some
account of how it is that we come to know what is right. An otherwise
impressive moral metaphysics is pointless if epistemologically implausible . 103
With general norms, it is plausible that we may come to learn them gradually, refining our understanding through
practice. Naturalistically learning through practice, however, is foreclosed to one who sees deontology as both

pervasive and particularist. Almost every situation is morally different from the


rest, even if only slightly so. If deontology is exhaustive of morality, there must be a separate
injunction for each situation. The epistemological [*922] problem is that learning an
essentially infinite number of separate rules to govern our conduct is
implausible. It initially might be thought that the epistemological problem could be overcome by allowing

generality within the specific norms, thus making it possible for the student of morality to learn these general
principles and then derive the specific deontological prohibitions from them. The trouble with this response is that
the important theoretic work is performed by the underlying principles by which the specific deontological maxims
can be learned. This is problematic because theoretic entities are abstract. As such, Ockham's Razor and the principles of pragmatism dictate that we do better to recognize conceptually the general principles. There is
no logical inconsistency in positing a deontological norm for every morally
distinct situation. But if pervasive, deontological maxims would be superfluous.

Thus, it is theoretically preferable to deny them this exclusivity. 106 Suppose the epistemological problem can be

skirted by allowing that some theoretically benign generality informs our moral understanding. If deontology
may be exhaustive without being particularist, then a separate objection, the
conflicts problem, arises. As was true of the epistemological problem, the conflicts problem
arises because morality has something to say about almost everything.
Because the world is complex, if rights are general, then the evaluation of most
morally interesting situations will either depend on more than one rights
claim or on some other moral element, each problematic for the claim that
deontology is exhaustive of morality. The reason is structural. Our moral intuitions are highly
nuanced -- often minor changes to a factual situation alter the normative evaluation of that situation. But since a limited number of general norms, because they are general, cannot account for this contextual sensitivity, some other explanation must be

offered. Positing a greater number of more specific deontological norms


could account for this factual sensitivity. Doing so, however, threatens to
reincarnate the epistemological problem. If our norms are relatively few in number, thereby
putting them within our epistemic reach, either many norms will apply to each situation to give us the contextual
sensitivity that is evident, or some other principles must be at work.

Turn -- morality undercuts political responsibility leading to


political failures and greater evils
Isaac 02
Isaac, poli sci prof at Indiana Bloomington, dir Center for the Study of
Democracy and Public life, 02 (Jeffrey, PhD from Yale, Dissent Magazine, Vol.
49, Iss. 2, Ends, Means, and Politics, p. Proquest)
As writers such as Niccolo Machiavelli, Max Weber, Reinhold Niebuhr, and Hannah Arendt have taught, an

unyielding concern with moral goodness undercuts political responsibility.


The concern may be morally laudable, reflecting a kind of personal integrity, but it suffers
from three fatal flaws: (1) It fails to see that the purity of ones intention does
not ensure the achievement of what one intends. Abjuring violence or refusing to make
common cause with morally compromised parties may seem like the right thing; but if such tactics entail
impotence, then it is hard to view them as serving any moral good beyond the
clean conscience of their supporters; (2) it fails to see that in a world of real
violence and injustice, moral purity is not simply a form of powerlessness; it is often a form of
complicity in injustice. This is why, from the standpoint of politics--as opposed to religion--pacifism is
always a potentially immoral stand. In categorically repudiating violence , it refuses in principle
to oppose certain violent injustices with any effect; and (3) it fails to see that politics
is as much about unintended consequences as it is about intentions; it is the
effects of action, rather than the motives of action, that is most significant.
Just as the alignment with good may engender impotence, it is often the
pursuit of good that generates evil. This is the lesson of communism in the
twentieth century: it is not enough that ones goals be sincere or idealistic; it is equally
important, always, to ask about the effects of pursuing these goals and to judge these
effects in pragmatic and historically contextualized ways. Moral absolutism inhibits this
judgment. It alienates those who are not true believers. It promotes arrogance. And it
undermines political effectiveness.

Solvency

1NC No Solvency
Aff is insufficient because it doesn't seek international commitments -- their evidence
CCIA 12 (international not-for-profit membership organization dedicated to
innovation and enhancing societys access to information and
communications)
(Promoting Cross-Border Data Flows: Priorities for the Business Community,
http://www.ccianet.org/wpcontent/uploads/library/PromotingCrossBorderDataFlows.pdf)
The movement of electronic information across borders is critical to businesses around the world, but the
international rules governing flows of digital goods, services, data and infrastructure are incomplete. The global
trading system does not spell out a consistent, transparent framework for the treatment of cross border flows of
digital goods, services or information, leaving businesses and individuals to deal with a patchwork of national,
bilateral and global arrangements covering significant issues such as the storage, transfer, disclosure, retention and
protection of personal, commercial and financial data. Dealing with these issues is becoming even more important
as a new generation of networked technologies enables greater crossborder collaboration over the Internet, which
has the potential to stimulate economic development and job growth. Despite the widespread benefits of cross
border data flows to innovation and economic growth, and due in large part to gaps in global rules and inadequate
enforcement of existing commitments, digital protectionism is a growing threat around the world. A number of
countries have already enacted or are pursuing restrictive policies governing the provision of digital commercial and
financial services, technology products, or the treatment of information to favor domestic interests over
international competition. Even where policies are designed to support legitimate public interests such as national
security or law enforcement, businesses can suffer when those rules are unclear, arbitrary, unevenly applied or
more traderestrictive than necessary to achieve the underlying objective. Whats more, multiple governments may
assert jurisdiction over the same information, which may leave businesses subject to inconsistent or conflicting
rules. In response, the United States should drive the development and adoption of transparent and highquality
international rules, norms and best practices on crossborder flows of digital data and technologies while also
holding countries to existing international obligations. Such efforts must recognize and accommodate legitimate
differences in regulatory approaches to issues such as privacy and security between countries as well as across
sectors. They should also be grounded in key concepts such as nondiscrimination and national treatment that have
underpinned the trading system for decades.

The U.S. Government should seek


international commitments on several key objectives, including: prohibiting
measures that restrict legitimate crossborder data flows or link commercial
benefit to local investment; addressing emerging legal and policy issues
involving the digital economy; promoting industry driven international
standards, dialogues and best practices; and expanding trade in digital
goods, services and infrastructure. U.S. efforts should ensure that trade
agreements cover digital technologies that may be developed in the future.
At the same time, the United States should work with governments around
the world to pursue other policies that support crossborder data flows,
including those endorsed in the Communiqué on Principles for Internet
Policymaking related to intellectual property protection and limiting
intermediary liability developed by the Organization for Economic
Cooperation and Development (OECD) in June 2011. U.S. negotiators should
pursue these issues in a variety of forums around the world, including the
World Trade Organization (WTO), Asia Pacific Economic Cooperation (APEC)
forum, OECD, and regional trade negotiations such as the TransPacific
Partnership as appropriate in each forum. In addition, the U.S. Government
should solicit ideas and begin to develop a plurilateral framework to set a
new global gold standard to improve innovation. Finally, the U.S.
Government should identify and seek to resolve through WTO or bilateral
consultations or other processes violations of current international rules

concerning digital goods, services and information. Promoting CrossBorder Data Flows:
Priorities for the Business Community 2 The importance of crossborder commercial and financial flows Access to
computers, servers, routers and mobile devices, services such as cloud computing whereby remote data centers
host information and run applications over the Internet, and information is vital to the success of billions of
individuals, businesses and entire economies. In the United States alone, the goods, services and content flowing
through the Internet have been responsible for 15 percent of GDP growth over the past five years. Open, fair and
contestable international markets for information and communication technologies (ICT) and information are
important to electronic retailers, search engines, social networks, web hosting providers, registrars and the range of
technology infrastructure and service providers who rely directly on the Internet to create economic value. But they
are also critical to the much larger universe of manufacturers, retailers, wholesalers, financial services and logistics
firms, universities, labs, hospitals and other organizations which rely on hardware, software and reliable access to
the Internet to improve their productivity, extend their reach across the globe, and manage international networks
of customers, suppliers, and researchers. For example, financial institutions rely heavily on gathering, processing,
and analyzing customer information and will often process data in regional centers, which requires reliable and
secure access both to networked technologies and crossborder data flows. According to McKinsey, more than
threequarters of the value created by the Internet accrues to traditional industries that would exist without the
Internet. The overall impact of the Internet and information technologies on productivity may surpass the effect of
any other technology enabler in history, including electricity and the combustion engine, according to the OECD.
Networked technologies and data flows are particularly important to small businesses, nonprofits and
entrepreneurs. Thanks to the Internet and advances in technology, small companies, NGOs and individuals can
customize and rapidly scale their IT systems at a lower cost and collaborate globally by accessing on line services
and platforms. Improved access to networked technologies also creates new opportunities for entrepreneurs and
innovators to design applications and to extend their reach internationally to the more than two billion people who
are now connected to the Internet. In fact, advances in networked technologies have led to the emergence of
entirely new business platforms. Kiva, a microlending service established in 2005, has used the Internet to
assemble a network of nearly 600,000 individuals who have lent over $200 million to entrepreneurs in markets
where access to traditional banking systems is limited. Millions of others use online advertising and platforms such
as eBay, Facebook, Google Docs, Hotmail, Skype and Twitter to reach customers, suppliers and partners around the
world. More broadly, economies that are open to international trade in ICT and information grow faster and are
more productive Limiting network access dramatically undermines the economic benefits of technology and can
slow growth across entire economies.

Backdoor reform is key to solve, not abolishment


Burger et al 14
(Eric, Research Professor of Computer Science at Georgetown, L. Jean Camp,
Associate professor at the Indiana University School of Information and
Computing, Dan Lubar, Emerging Standards Consultant at RelayServices, Jon
M Pesha, Carnegie Mellon University, Terry Davis, MicroSystems Automation
Group, Risking It All: Unlocking the Backdoor to the Nations Cybersecurity,
IEEE USA, 7/20/2014, pg. 1-5, Social Science Research Network,
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2468604)//duncan
This paper addresses government policies that can influence commercial practices to weaken security in products and services sold on the commercial market. The debate on information surveillance for national security must include consideration of the potential cybersecurity risks and economic implications of the information collection strategies employed. As IEEE-USA, we write to comment on current discussions with respect to weakening standards, or altering commercial products and services for intelligence, or law enforcement. Any policy that seeks to weaken technology sold on the commercial market has many serious downsides, even if it temporarily advances the intelligence and law enforcement missions of facilitating legal and authorized government surveillance. Specifically, we define and address the risks of installing backdoors in commercial products, introducing malware and spyware into products, and weakening standards. We illustrate that these are practices that harm America's cybersecurity posture and put the resilience of American cyberinfrastructure at risk. We write as a technical society to clarify the potential harm should these strategies be adopted. Whether or not these strategies ever have been used in practice is outside the scope of this paper. Individual computer users, large corporations and government agencies all depend on security features built into information technology products and services they buy on the commercial market. If the security features of these widely available products and services are weak, everyone is in greater danger. There recently have been allegations that U.S. government agencies (and some private entities) have engaged in a number of activities deliberately intended to weaken mass market, widely used technology. Weakening commercial products and services does have the benefit that it becomes easier for U.S. intelligence agencies to conduct surveillance on targets that use the weakened technology, and more information is available for law enforcement purposes. On the surface, it would appear these motivations would be reasonable. However, such strategies also inevitably make it easier for foreign powers, criminals and terrorists to infiltrate these systems for their own purposes. Moreover, everyone who uses backdoor technologies may be vulnerable, and not just the handful of surveillance targets for U.S. intelligence agencies. It is the opinion of IEEE-USA's Committee on Communications Policy that no entity should act to reduce the security of a product or service sold on the commercial market without first conducting a careful and methodical risk assessment. A complete risk assessment would consider the interests of the large swath of users of the technology who are not the intended targets of government surveillance. A methodical risk assessment would give proper weight to the asymmetric nature of cyberthreats, given that technology is equally advanced and ubiquitous in the United States, and the locales of many of our adversaries. Vulnerable products should be corrected, as needed, based on this assessment. The next section briefly describes some of the government policies and technical strategies that might have the undesired side effect of reducing security. The following section discusses why the effect of these practices may be a decrease, not an increase, in security. Government policies can affect greatly the


security of commercial products, either positively or negatively. There are a number of
methods by which a government might affect security negatively as a
means of facilitating legal government surveillance. One inexpensive
method is to exploit pre-existing weaknesses that are already present in
commercial software, while keeping these weaknesses a secret. Another
method is to motivate the designer of a computer or communications
system to make those systems easier for government agencies to access.
Motivation may come from direct mandate or financial incentives. There are many ways
that a designer can facilitate government access once so motivated. For example, the system may be
equipped with a backdoor. The company that creates it and, presumably, the
government agency that requests it would know the backdoor , but not the products
(or services) purchaser(s). The hope is that the government agency will use this feature
when it is given authority to do so, but no one else will. However, creating a
backdoor introduces the risk that other parties will find the vulnerability,
especially when capable adversaries, who are actively seeking security
vulnerabilities, know how to leverage such weaknesses . History illustrates
that secret backdoors do not remain secret and that the more widespread a
backdoor, the more dangerous its existence . The 1988 Morris worm, the first
widespread Internet attack, used a number of backdoors to infect systems and
spread widely. The backdoors in that case were a set of secrets then known only by a small, highly technical
community. A single, putatively innocent error resulted in a large-scale attack that
disabled many systems. In recent years, Barracuda had a completely undocumented
backdoor that allowed high levels of access from the Internet addresses assigned to Barracuda.
However, when it was publicized, as almost inevitably happens, it became extremely unsafe, and
Barracudas customers rejected it. One example of how attackers can subvert
backdoors placed into systems for benign reasons occurred in the network of the largest commercial
cellular operator in Greece. Switches deployed in the system came equipped with built-in
wiretapping features, intended only for authorized law enforcement agencies. Some
unknown attacker was able to install software , and made use of these embedded wiretapping
features to surreptitiously and illegally eavesdrop on calls from many cell phones
including phones belonging to the Prime Minister of Greece, a hundred high-ranking Greek
dignitaries, and an employee of the U.S. Embassy in Greece before the security breach finally
was discovered. In essence, a backdoor created to fight crime was used to commit crime.
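The Barracuda example in the card is a reminder of what a shipped backdoor typically looks like in code: a hard-wired bypass that is identical in every unit sold. The sketch below is invented for illustration -- the address range, names, and logic are hypothetical, not Barracuda's actual code -- but it shows why such a feature is a single point of failure: once the built-in constant is known, every deployed system is exposed at once.

# Hypothetical hard-coded maintenance backdoor -- illustration only, not any vendor's real code.
import ipaddress

MAINTENANCE_NET = ipaddress.ip_network("203.0.113.0/24")   # made-up "trusted" vendor range

def is_authorized(client_ip: str, password_ok: bool) -> bool:
    # Undocumented bypass: anyone who can source traffic from the magic range gets in,
    # with or without credentials. Every shipped unit shares the same secret.
    if ipaddress.ip_address(client_ip) in MAINTENANCE_NET:
        return True
    return password_ok

Because the same secret ships everywhere, one leak scales across the entire customer base, which is the asymmetry the card's call for a methodical risk assessment is aimed at.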

2NC No Solvency
Aff doesn't solve -- their author

Kehl et al 14 (Danielle Kehl is a Policy Analyst at New Americas Open Technology Institute
(OTI). Kevin Bankston is the Policy Director at OTI, Robyn Greene is a Policy Counsel at OTI,
and Robert Morgus is a Research Associate at OTI, New Americas Open Technology
Institute Policy Paper, Surveillance Costs: The NSAs Impact on the Economy, Internet
Freedom & Cybersecurity, July 2014// rck)
The U.S. government has already taken some limited steps to mitigate this damage and begin the slow, difficult
process of rebuilding trust in the United States as a responsible steward of the Internet. But the reform efforts to
date have been relatively narrow, focusing primarily on the surveillance programs impact on the rights of U.S.
citizens. Based on our findings, we recommend that the U.S. government take the following steps to address the
broader concern that the NSAs programs are impacting our economy, our foreign relations, and our cybersecurity:
Strengthen privacy protections for both Americans and non-Americans, within the United States and
extraterritorially. Provide for increased transparency around government surveillance, both

from the government and companies. Recommit to the Internet Freedom agenda in a way
that directly addresses issues raised by NSA surveillance, including moving toward
international human-rights based standards on surveillance. Begin the process of restoring
trust in cryptography standards through the National Institute of Standards and Technology.
Ensure that the U.S. government does not undermine cybersecurity by inserting surveillance backdoors into
hardware or software products. Help to eliminate security vulnerabilities in software, rather than

stockpile them. Develop clear policies about whether, when, and under what legal standards
it is permissible for the government to secretly install malware on a computer or in a
network. Separate the offensive and defensive functions of the NSA in order to minimize
conflicts of interest.

1NC Circumvention
Circumvention -- the NSA will force companies to build
backdoors
Trevor Timm 15, Trevor Timm is a Guardian US columnist and executive
director of the Freedom of the Press Foundation, a non-profit that supports
and defends journalism dedicated to transparency and accountability. 3-4-2015, "Building backdoors into encryption isn't only bad for China, Mr
President," Guardian,
http://www.theguardian.com/commentisfree/2015/mar/04/backdoorsencryption-china-apple-google-nsa)//GV
Want to know why forcing tech companies to build backdoors into encryption is a terrible idea? Look no further than
President Obama's stark criticism of China's plan to do exactly that on Tuesday. If only he would tell the FBI and NSA the same thing. In a stunningly short-sighted move, the FBI - and more recently the NSA - have been pushing for a new US law that would force tech companies like Apple and Google to hand over the encryption keys or build backdoors into

their products and tools so the government would always have access to our communications. It was only a matter
of time before other governments jumped on the bandwagon, and China wasted no time in demanding the same
from tech companies a few weeks ago. As President Obama himself described to Reuters, China has proposed an
expansive new anti-terrorism bill that would essentially force all foreign companies, including US companies, to
turn over to the Chinese government mechanisms where they can snoop and keep track of all the users of those
services. Obama continued: "Those kinds of restrictive practices I think would ironically hurt the Chinese economy over the long term because I don't think there is any US or European firm, any international firm, that could credibly get away with that wholesale turning over of data, personal data, over to a government." Bravo! Of course these
are the exact arguments for why it would be a disaster for US government to force tech companies to do the same.
(Somehow Obama left that part out.) As Yahoos top security executive Alex Stamos told NSA director Mike Rogers
in a public confrontation last week, building backdoors into encryption is like drilling a hole into a windshield.
Even if it's technically possible to produce the flaw - and we, for some reason, trust the US government never to abuse it - other countries will inevitably demand access for themselves. Companies will no longer be in a position to say no, and even if they did, intelligence services would find the backdoor unilaterally - or just steal the keys outright. For an example on how this works, look no further than last week's Snowden revelation that the UK's intelligence service and the NSA stole the encryption keys for millions of Sim cards used by many of the world's most popular cell phone providers. It's happened

many times before too. Security expert Bruce Schneier has documented with numerous examples, Back-door
access built for the good guys is routinely used by the bad guys. Stamos repeatedly (and commendably) pushed
the NSA director for an answer on what happens when China or Russia also demand backdoors from tech
companies, but Rogers didnt have an answer prepared at all. He just kept repeating I think we can work through
this. As Stamos insinuated, maybe Rogers should ask his own staff why we actually cant work through this,
because virtually every technologist agrees backdoors just cannot be secure in practice. (If you want to further
understand the details behind the encryption vs. backdoor debate and how what the NSA director is asking for is
quite literally impossible, read this excellent piece by surveillance expert Julian Sanchez.) Its downright bizarre that
the US government has been warning of the grave cybersecurity risks the country faces while, at the very same
time, arguing that we should pass a law that would weaken cybersecurity and put every single citizen at more risk
of having their private information stolen by criminals, foreign governments, and our own. Forcing backdoors will
also be disastrous for the US economy as it would be for Chinas. US tech companies - which already have suffered
billions of dollars of losses overseas because of consumer distrust over their relationships with the NSA - would lose
all credibility with users around the world if the FBI and NSA succeed with their plan. The White House is supposedly
coming out with an official policy on encryption sometime this month, according to the New York Times but the
President can save himself a lot of time and just apply his comments about China to the US government. If he
knows backdoors in encryption are bad for cybersecurity, privacy, and the economy, why is there even a debate?

#WeWinCyberwar2.0 (ST)

Notes
Brought to you by KWei and Amy from the SWS heg lab.
Email me at ghskwei@gmail.com for help/with questions.
The thing about backdoor Affs is that all of their evidence will talk about past
attacks. Press them on why their scenario is different and how these past
attacks prove that empirically, there is no impact to break-ins through
backdoors.
Also, a lot of their ev about mandating backdoors is in the context of future
legislation, not the squo.
Also, their internal links are totally fabricated.
Links to the networks, neolib, and gender privacy Ks can be found in the generics.

Links
Some links I don't have time to cut but that I think will have good args/cards:
Going dark terrorism links: http://judiciary.house.gov/_files/hearings/printers/112th/112-59_64581.PDF
Front doors CP: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes
Military DA i/l ev: https://cyberwar.nl/d/20130200_Offensive-Cyber-Capabilities-are-Needed-Because-of-Deterrence_Jarno-Limnell.pdf
http://www.inss.org.il/uploadImages/systemFiles/MASA4-3Engc_Cilluffo.pdf
Military DA Iran impact: http://www.sobiad.org/ejournals/journal_ijss/arhieves/2012_1/sanghamitra_nath.pdf
Military DA Syria impact: http://nationalinterest.org/commentary/syria-preparing-the-cyber-threat-8997

T-Domestic

1NC
NSA spies on foreign corporations through backdoors
NYT 14
(David E. Sanger and Nicole Perlroth. "N.S.A. Breached Chinese Servers Seen as Security Threat," New York Times. 3-22-2014. http://www.nytimes.com/2014/03/23/world/asia/nsa-breached-chinese-servers-seen-as-spy-peril.html//ghs-kw)
WASHINGTON - American officials have long considered Huawei, the Chinese telecommunications giant, a security threat, blocking it from business deals in the United States for fear that the company would create back doors in its equipment that could allow the Chinese military or Beijing-backed hackers to steal corporate and government secrets. But even as the United States made a public case about the dangers of buying from Huawei, classified documents show that the National Security Agency was creating its own back doors directly into Huawei's networks. The agency pried its way into the servers in Huawei's sealed headquarters in Shenzhen, China's industrial heart, according to N.S.A. documents provided by the former contractor Edward J. Snowden. It obtained information about the workings of the giant routers and complex digital switches that Huawei boasts connect a third of the world's population, and monitored communications of the company's top executives. One of the goals of the operation, code-named Shotgiant, was to find any links between Huawei and the People's Liberation Army, one 2010 document made clear. But the plans went further: to exploit Huawei's technology so that when the company sold equipment to other countries - including both allies and nations that avoid buying American products - the N.S.A. could roam through their computer and telephone networks to conduct surveillance and, if ordered by the president, offensive cyberoperations.

NSA targets foreign systems with backdoors


Zetter 13
(Kim Zetter. "NSA Laughs at PCs, Prefers Hacking Routers and Switches," WIRED. 9-4-2013.
http://www.wired.com/2013/09/nsa-router-hacking///ghs-kw)

THE NSA RUNS a massive, full-time hacking operation targeting foreign


systems, the latest leaks from Edward Snowden show. But unlike conventional cybercriminals, the agency
is less interested in hacking PCs and Macs. Instead, Americas spooks have their eyes on the internet
routers and switches that form the basic infrastructure of the net , and are largely
overlooked as security vulnerabilities. Under a $652-million program codenamed Genie, U.S. intel agencies
have hacked into foreign computers and networks to monitor communications
crossing them and to establish control over them , according to a secret black budget document

leaked to the Washington Post. U.S. intelligence agencies conducted 231 offensive cyber operations in 2011 to
penetrate the computer networks of targets abroad. This included not only installing covert implants in foreign
desktop computers but also on routers and firewalls tens of thousands of machines every year in all. According to
the Post, the government planned to expand the program to cover millions of additional foreign machines in the
future and preferred hacking routers to individual PCs because it gave agencies access to data from entire networks
of computers instead of just individual machines. Most of the hacks targeted the systems and communications of
top adversaries like China, Russia, Iran and North Korea and included activities around nuclear proliferation. The
NSAs focus on routers highlights an often-overlooked attack vector with huge advantages for the intruder, says
Marc Maiffret, chief technology officer at security firm Beyond Trust. Hacking routers is an ideal way for an
intelligence or military agency to maintain a persistent hold on network traffic because the systems arent updated
with new software very often or patched in the way that Windows and Linux systems are. No one updates their
routers, he says. If you think people are bad about patching Windows and Linux (which they are) then they are
horrible about updating their networking gear because it is too critical, and usually they dont have redundancy to
be able to do it properly. He also notes that routers dont have security software that can help detect a breach.
The challenge [with desktop systems] is that while antivirus dont work well on your desktop, they at least do
something [to detect attacks], he says. But you dont even have an integrity check for the most part on routers
and other such devices like IP cameras. Hijacking routers and switches could allow the NSA to do more than just
eavesdrop on all the communications crossing that equipment. It would also let them bring down networks or
prevent certain communication, such as military orders, from getting through, though the Post story doesnt report
any such activities. With control of routers, the NSA could re-route traffic to a different location, or intelligence
agencies could alter it for disinformation campaigns, such as planting information that would have a detrimental
political effect or altering orders to re-route troops or supplies in a military operation. According to the budget

document, the CIA's Tailored Access Programs and NSA's software engineers possess templates for breaking into common brands and models of routers, switches and firewalls. The article doesn't say it, but this would likely involve pre-written scripts or backdoor
tools and root kits for attacking known but unpatched vulnerabilities in these systems, as well as for attacking
zero-day vulnerabilities that are yet unknown to the vendor and customers. [Router software is] just an
operating system and can be hacked just as Windows or Linux would be hacked,
Maiffret says. Theyve tried to harden them a little bit more [than these other systems], but for folks at a
place like the NSA or any other major government intelligence agency, its pretty
standard fare of having a ready-to-go backdoor for your [off-the-shelf] Cisco or Juniper
models.

T-Surveillance

1NC
Backdoors are also used for cyberwarfare, not surveillance
Gellman and Nakashima 13
(Barton Gellman. Barton Gellman writes for the national staff. He has contributed to three Pulitzer
Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service. He is also a
senior fellow at the Century Foundation and visiting lecturer at Princetons Woodrow Wilson School.
After 21 years at The Post, where he served tours as legal, military, diplomatic, and Middle East
correspondent, Gellman resigned in 2010 to concentrate on book and magazine writing. He returned
on temporary assignment in 2013 and 2014 to anchor The Post's coverage of the NSA disclosures after
receiving an archive of classified documents from Edward Snowden. Ellen Nakashima is a national
security reporter for The Washington Post. She focuses on issues relating to intelligence, technology
and civil liberties. She previously served as a Southeast Asia correspondent for the paper. She wrote
about the presidential candidacy of Al Gore and co-authored a biography of Gore, and has also covered
federal agencies, Virginia state politics and local affairs. She joined the Post in 1995. "U.S. spy
agencies mounted 231 offensive cyber-operations in 2011, documents show," Washington Post. 8-30-2013. https://www.washingtonpost.com/world/national-security/us-spy-agencies-mounted-231-offensive-cyber-operations-in-2011-documents-show/2013/08/30/d090a6ae-119e-11e3-b4cb-fd7ce041d814_story.html//ghs-kw)

Sometimes an implant's purpose is to create a back door for future access. You pry open the window somewhere and leave it so when you come back the owner doesn't know it's unlocked, but you can get back in when you want to, said one intelligence official, who was speaking generally about the topic and was not privy to the budget. The official spoke on the condition of anonymity to discuss sensitive technology. Under U.S. cyberdoctrine, these operations are known as exploitation, not attack, but they are essential precursors both to attack and defense. By the end of this year, GENIE is projected to control at least 85,000 implants in strategically chosen machines around the world. That is quadruple the number - 21,252 - available in 2008, according to the U.S. intelligence budget. The NSA appears to be planning a rapid expansion of those numbers, which were limited until recently by the need for human operators to take remote control of compromised machines. Even with a staff of 1,870 people, GENIE made full use of only 8,448 of the 68,975 machines with active implants in 2011. For GENIE's next phase, according to an authoritative reference document, the NSA has brought online an automated system, code-named TURBINE, that is capable of managing potentially millions of implants for intelligence gathering and active attack.

T-Surveillance (ST)

1NC
Undermining encryption standards includes commercial fines
against illegal exports
Goodwin and Procter 14
(Goodwin and Proctor, legal firm. Software Companies Now on Notice That Encryption Exports May Be
Treated More Seriously: $750,000 Fine Against Intel Subsidiary, Client Alert, 10-15-2014.
http://www.goodwinprocter.com/Publications/Newsletters/Client-Alert/2014/1015_Software-CompaniesNow-on-Notice-That-Encryption-Exports-May-Be-Treated-More-Seriously.aspx//ghs-kw)

On October 8, 2014, the Department of Commerce's Bureau of Industry and Security (BIS) announced the issuance of a $750,000 penalty against Wind River Systems, an Intel subsidiary, for the unlawful exportation of encryption software products to foreign government end-users and to organizations on the BIS Entity List. Wind River Systems exported its software to China, Hong Kong, Russia, Israel, South Africa, and South Korea. BIS significantly mitigated what would have been a much larger fine because the company voluntarily disclosed the violations. We believe this to be the first penalty BIS has ever issued for the unlicensed export of encryption software that did not also involve
comprehensively sanctioned countries (e.g., Cuba, Iran, North Korea, Sudan or Syria). This suggests a fundamental
change in BISs treatment of violations of the encryption regulations. Historically, BIS has resolved voluntarily
disclosed violations of the encryption regulations with a warning letter but no material consequence, and has shown

This fine dramatically increases the


compliance stakes for software companies a message that BIS seemed intent upon making in its
announcement. Encryption is ubiquitous in software products. Companies making these
products should reexamine their product classifications, export eligibility, and
internal policies and procedures regarding the export of software that uses or
leverages encryption (even open source or third-party encryption libraries), particularly where a
potential transaction on the horizon e.g., an acquisition, financing, or initial public
offering will increase the likelihood that violations of these laws will be identified.
itself unlikely to pursue such violations that were not disclosed.

If you would like additional information about the issues addressed in this Client Alert, please contact Rich Matheny,
who chairs Goodwin Procters National Security & Foreign Trade Regulation Practice, or the Goodwin Procter
attorney with whom you typically consult.

CPs

Foreign Backdoors CP

CX
In the world of the AFF, does the government no longer have access to backdoors? So we don't use or possess backdoors in the world of the AFF, right?

1NC
(KQ) Counterplan: the United States federal government should ban the creation of backdoors as outlined in the Secure Data Act of 2015, but should not ban surveillance through backdoors, and should mandate clandestine corporate disclosure of foreign-government-mandated backdoors to the United States federal government.
(CT) Counterplan: The United States federal government should not mandate the creation of surveillance backdoors in products or request privacy keys, and should terminate current backdoors created either by government mandates or government-requested keys, but should not cease the use of backdoors.
Backdoors are inevitable - we'll use backdoors created by foreign governments
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-ii-debate-merits//ghs-kw)

Still another approach is to let other governments do the dirty work. The
computer scientists' report cites the possibility of other sovereigns adopting their
own extraordinary access regimes as a reason for the U.S. to go slow: Building in
exceptional access would be risky enough even if only one law enforcement agency
in the world had it. But this is not only a US issue. The UK government promises
legislation this fall to compel communications service providers, including US-based
corporations, to grant access to UK law enforcement agencies, and other countries
would certainly follow suit. China has already intimated that it may require
exceptional access. If a British-based developer deploys a messaging application
used by citizens of China, must it provide exceptional access to Chinese law
enforcement? Which countries have sufficient respect for the rule of law to participate in an international

exceptional access framework? How would such determinations be made? How would timely approvals be given for
the millions of new products with communications capabilities? And how would this new surveillance ecosystem be
funded and supervised? The US and UK governments have fought long and hard to keep the governance of the
Internet open, in the face of demands from authoritarian countries that it be brought under state control. Does not the push for exceptional access represent a breathtaking policy reversal? I am certain that the computer scientists are correct that foreign governments will move in this direction, but I think they are misreading the consequences of this. China and Britain will do this irrespective of what the United States does, and that fact may well create potential opportunity for the U.S. After all, if China and Britain are going to force U.S. companies to think through the problem of how to provide extraordinary access without compromising general security, perhaps the need to do business in those countries will provide much of the incentive to think through the hard problems of how to do it. Perhaps countries far less solicitous than ours of the plight of technology companies or the privacy interests of their users will force the research that Comey can only hypothesize. Will Apple then take the view that it can offer phones to users in China which can be decrypted for Chinese authorities when they require it but that it's technically impossible to do so in the United States?

2NC O/V
Counterplan solves 100% of the case - we mandate the USFG publicly stop creating backdoors but instead use backdoors that are inevitably mandated by foreign nations for surveillance - solves perception and doesn't link to the net benefit - that's Wittes

2NC Backdoors Inev


India has backdoors
Ragan 12
(Steve Ragan. Steve Ragan is a security reporter and contributor for SecurityWeek. Prior to joining the
journalism world in 2005, he spent 15 years as a freelance IT contractor focused on endpoint security
and security training. "Hackers Expose India's Backdoor Intercept Program," No Publication. 1-9-2012.
http://www.securityweek.com/hackers-expose-indias-backdoor-intercept-program//ghs-kw)
Symantec confirmed with SecurityWeek on Friday that hackers did access source code from Symantec
Endpoint Protection 11.0 and Symantec Antivirus 10.2. According to a Symantec spokesperson, SEP 11 was four
years ago to be exact. In addition, Symantec Antivirus 10.2 has been discontinued, though the company continues
to service it. Were taking this extremely seriously and are erring on the side of caution to develop and long-range
plan to take care of customers still using those products, Cris Paden, Senior Manager of Corporate Communications
at Symantec told SecurityWeek. Over the weekend, the story expanded. The Lords of Dharmaraja released a purported memo outlining the intercept program known as RINOA, which earns its name from the vendors involved - RIM, Nokia, and Apple. The memo said the vendors provided India with backdoors into their technology in order for them to maintain a presence in the local market space. India's Ministry of Defense has an agreement with all major device vendors to provide the country with the source code and information needed for their SUR (surveillance) platform, the memo explains. These backdoors allowed the military to conduct surveillance (RINOA SUR) against the US-China Economic and Security Review Commission. Personnel from Indian Naval Military Intelligence were dispatched to the People's Republic of China to undertake Telecommunications Surveillance (TESUR) using the RINOA backdoors and CYCADA-based technologies.

China has backdoors in 80% of global communications


Protalinski 12
(Emil Protalinski. Reporter for CNet and ZDNet. "Former Pentagon analyst: China has backdoors to
80% of telecoms," ZDNet. 7-14-2012. http://www.zdnet.com/article/former-pentagon-analyst-china-has-backdoors-to-80-of-telecoms///ghs-kw)

The Chinese government reportedly has "pervasive access" to some 80


percent of the world's communications, thanks to backdoors it has ordered to be
installed in devices made by Huawei and ZTE Corporation. That's according to sources cited by
Michael Maloof, a former senior security policy analyst in the Office of the Secretary
of Defense, who now writes for WND: In 2000, Huawei was virtually unknown outside China, but by 2009 it had
grown to be one of the largest, second only to Ericsson. As a consequence, sources say that any information
traversing "any" Huawei equipped network isn't safe unless it has military encryption . One source
warned, "even then, there is no doubt that the Chinese are working very hard to decipher
anything encrypted that they intercept." Sources add that most corporate telecommunications
networks use "pretty light encryption" on their virtual private networks, or VPNs. I found about Maloof's report via
this week's edition of The CyberJungle podcast. Here's my rough transcription of what he says, at about 18 minutes

and 30 seconds: The Chinese government and the People's Liberation Army are so much into cyberwarfare now that they have looked at not just Huawei but also ZTE Corporation as providing through the equipment that they install in about 145 countries around in the world, and in 45 of the top 50 telecom centers around the world, the potential for backdooring into data. Proprietary information could be not only spied upon but also could be altered and in some cases could be sabotaged. That's coming from technical experts who know Huawei, they know the company and they know the Chinese. Since that story came out I've done a subsequent one in which sources tell me that it's giving Chinese access to approximately 80 percent of the world telecoms and it's working on the other 20 percent now.

China is mandating backdoors


Mozur 1/28
(Paul Mozur. Reporter for the NYT. "New Rules in China Upset Western Tech Companies," New York
Times. 1-28-2015. http://www.nytimes.com/2015/01/29/technology/in-china-new-cybersecurity-rules-perturb-western-tech-companies.html//ghs-kw)
HONG KONG

The Chinese government has adopted new regulations requiring

companies that sell computer equipment to Chinese banks to turn over secret source code, submit
to invasive audits and build so-called back doors into hardware and software ,
according to a copy of the rules obtained by foreign technology companies that do billions of dollars worth of business in China. The new rules, laid out in a 22-page document approved at the end of last year, are the first in a series of policies expected to be unveiled in the coming months that Beijing says are intended to strengthen cybersecurity in critical Chinese industries. As copies have spread in the past
month, the regulations have heightened concern among foreign companies that the authorities are trying to force
them out of one of the largest and fastest-growing markets. In a letter sent Wednesday to a top-level Communist
Party committee on cybersecurity, led by President Xi Jinping, foreign business groups objected to the new policies
and complained that they amounted to protectionism. The groups, which include the U.S. Chamber of Commerce,
called for urgent discussion and dialogue about what they said was a growing trend toward policies that cite
cybersecurity in requiring companies to use only technology products and services that are developed and
controlled by Chinese companies. The letter is the latest salvo in an intensifying tit-for-tat between China and the
United States over online security and technology policy. While the United States has accused Chinese military
personnel of hacking and stealing from American companies, China has pointed to recent disclosures of United
States snooping in foreign countries as a reason to get rid of American technology as quickly as possible. Although
it is unclear to what extent the new rules result from security concerns, and to what extent they are cover for
building up the Chinese tech industry, the Chinese regulations go far beyond measures taken by most other
countries, lending some credibility to industry claims that they are protectionist. Beijing also has long used the Internet to keep tabs on its citizens and ensure the Communist Party's hold on power. Chinese companies must also follow the new regulations, though they will find it easier since for most, their core customers are in China. China's Internet filters have increasingly created a world with two Internets, a Chinese one and a global one. The new policies could further split the tech world, forcing hardware and software makers to sell either to China or the United States, or to create significantly different products for the two countries. While the Obama administration will almost certainly complain that the new rules are protectionist in nature, the Chinese will be able to make a case that they differ only in degree from Washington's own requirements.

2NC AT Perm do Both


Permutation links to the net benefit - the AFF stops use of backdoors; that was 1AC cross-ex

2NC AT Perm do the CP


The counterplan bans the creation of backdoors but not the use of them - that's different from the plan - that was cross-ex
The permutation is severance - that's a voting issue:
1. NEG ground - makes the AFF a shifting target which makes it impossible to garner offense - stop copying K AFFs, vote NEG to be Dave Strauss
2. Kills advocacy skills - they never have to defend implementation of an advocacy

Cyberterror Advantage CP

1NC
Counterplan: the United States federal government should
substantially increase its support for renewable energy
technologies and grid decentralization.
Grid decentralization and renewables solve terror attacks
Lawson 11
(Lawson, Sean. Sean Lawson is an assistant professor in the Department of Communication at the
University of Utah. He holds a PhD in Science and Technology Studies from Rensselaer Polytechnic
Institute, a MA in Arab Studies from Georgetown University, and a BA in History from California State
University, Stanislaus. BEYOND CYBER-DOOM: Cyberattack Scenarios and the Evidence of History,
Mercatus Center at George Mason University. Working Paper No. 11-01, January 2011.
http://mercatus.org/sites/default/files/publication/beyond-cyber-doom-cyber-attack-scenarios-evidence-history_1.pdf//ghs-kw)

Cybersecurity policy should promote decentralization and self-organization in efforts


to prevent, defend against, and respond to cyberattacks. Disaster researchers have
shown that victims are often themselves the first responders and that centralized,
hierarchical, bureaucratic responses can hamper their ability to respond in the
decentralized, self-organized manner that has often proved to be more effective
(Quarantelli, 2008: 895-896). One way that officials often stand in the way of decentralized self-organization is by hoarding information (Clarke & Chess, 2009: 1000-1001). Similarly, over the last 50 years, U.S. military doctrine increasingly has identified decentralization, self-organization, and information sharing as the keys to effectively operating in ever-more complex conflicts that move at an ever-faster pace and over ever-greater geographical distances (LeMay & Smith, 1968;

Romjue, 1984; Cebrowski & Garstka, 1998; Hammond, 2001). In the case of preventing or defending against
cyberattacks on critical infrastructure, we must recognize that most cyber and physical infrastructures are owned by private actors. Thus, a centralized, military-led effort to protect the fortress at every point will not work. A combination of incentives, regulations, and public-private partnerships will be necessary. This will be complex, messy, and difficult. But a cyberattack, should it occur, will be equally complex, messy, and difficult, occurring instantaneously over global distances via a

The owners
and operators of our critical infrastructures are on the front lines and will be the first
responders. They must be empowered to act. Similarly, if the worst should occur,
average citizens must be empowered to act in a decentralized , self-organized way
to help themselves and others. In the case of critical infrastructures like the
electrical grid, this could include the promotion of alt ernative energy
generation and distribution methods. In this way, Instead of being passive consumers,
[citizens] can become actors in the energy network. Instead of waiting for blackouts,
they can organize alternatives and become less vulnerable to either terror or natural
medium that is almost incomprehensible in its complex interconnections and interdependencies.

catastrophe (Nye, 2010: 203)

2NC O/V
Counterplan solves all of their grid and cyber-terrorism impacts - we mandate the USFG provide incentives, regulations, and P3s for widespread adoption of alt energy and grid decentralization - this means each building has its own microgrid, which allows for local, decentralized responses to cyberterror attacks and solves their impact - that's Lawson

2NC CP>AFF
Only the CP solves - a centralized grid results in inevitable failures and kills the economy
Warner 10
(Guy Warner. Guy Warner is a leading economist and the founder and CEO of Pareto Energy. "Moving
U.S. energy policy to a decentralized grid," Grist. 6-4-2010. http://grist.org/article/2010-06-03-moving-u-s-energy-policy-to-a-decentralized-grid-rethinking-our///ghs-kw)

And, while the development of renewable energy technology has sped up rapidly in recent years, the technology to deliver this energy to the places where it is most needed is decades behind. America's current electricity transmission and distribution grid was built more than a century ago. Relying on the grid to relay power from wind farms in the Midwest to cities on the east and west coast is simply not feasible. Our dated infrastructure cannot handle the existing load - power outages and disruptions currently cost the nation an estimated $164 billion each year. Wind and solar power produce intermittent power, which, in small doses, has little impact on grid operations. As we introduce increasingly larger amounts of intermittent power, our transmission system will require significant upgrades and perhaps even a total grid infrastructure redesign, which could take decades and cost billions. With 9,200 power
plants that link homes and business via 164,000 miles of lines, a national retrofit is both cost-prohibitive and improbable. One solution to this challenge is the development of microgrids. Also known as distributed generation, microgrids produce energy closer to the user rather than transmitting it from remote power plants. Power is generated and stored locally and works in parallel with the main grid, providing power as needed and utilizing the main grid at other times. Microgrids offer a decentralized power source that can be introduced incrementally in modules now without having to deal with the years of delay realistically associated with building central generation facilities (e.g. nuclear) and their associated
transmission and distribution system add-ons. There is also a significant difference in the up-front capital costs that
are ultimately assigned the consumer. Introducing generation capacity into a microgrid as needed is far less capital
intensive, and some might argue more economical, than building a new nuclear plant at a cost of $5-12 billion
dollars.

Technological advancements in connectivity mean that microgrids can now be


developed for high energy use building clusters, such as trading floors and
hospitals, relieving stress on the macrogrid, and providing more reliable power. In fact,
microgrids can be viewed as the ultimate smart grid, providing local power that
meets local needs and utilizing energy sources, including renewables, that best fit
the location and use profile. For example, on the East Coast, feasibility studies are underway to retrofit
obsolete paper mills into biomass fuel generators utilizing left over pulp wood. Pulp wood, the waste left over from
logging, can be easily pelletized, is inexpensive to produce, easy to transport, and has a minimal net carbon output.
Wood pellets are also easily adaptable to automated combustion systems, making them a valuable domestic
resource that can supplement and replace our use of fossil fuels, particularly in microgrids which can be designed to
provide heating and cooling from these biomass products.

2NC Terror Solvency


Decentralization solves terror threats
Verclas 12
(Verclas, Kirsten. Kirsten Verclas works as International Program Officer at the National Association of Regulatory Utility Commissioners (NARUC) in Washington, DC. She holds a BA in International Relations with a Minor in Economics from Franklin and Marshall College and an MA in International Relations with a concentration in Security Studies from The Elliott School at The George Washington University. She also earned an MS in Energy Policy and Climate from Johns Hopkins University in August 2013. "The Decentralization of the Electricity Grid - Mitigating Risk in the Energy Sector," American Institute for Contemporary German Studies at Johns Hopkins University. 4-27-2012. http://www.aicgs.org/publication/the-decentralization-of-the-electricity-grid-mitigating-risk-in-the-energy-sector///ghs-kw)

A decentralized electricity grid has many environmental and security benefits.


Microgrids in combination with distributed energy generation provide a system of small power
generation and storage systems, which are located in a community or in individual
houses. These small power generators produce on average about 10 kW (for individual homes) to 2 MW (for
communities) of electricity. While connected to and able to feed excess energy into the grid, these generators
are simultaneously independent from the grid in that they can provide power even
when power from the main grid is not available. Safety benefits from a
decentralized grid are immense, as it has build-in redundancies. These
redundancies are needed should the main grid become inoperable due to a natural
disaster or terrorist attack. Communities or individual houses can then rely on microgrids with
distributed electricity generation for their power supply. Furthermore, having less
centralized electricity generation and fewer main critical transmission lines reduces
targets for terrorist attacks and natural disasters. Fewer people would then be impacted by subsequent
power outages. Additionally, decentralized power reduces the obstacles to disaster
recovery by allowing the focus to shift first to critical infrastructure and then to flow
outward to less integrated outlets.[ 10] Thus critical facilities such as hospitals or
police stations would be the first to have electricity restored, while non-essential
infrastructure would have energy restored at a later date. Power outages are not only
dangerous for critical infrastructure, they also cost money to business and the economy overall. EPRI reported that power outages and quality disturbances cost American businesses $119 billion per year.[11] Decentralized grids are also more energy efficient than centralized electricity grids because as electricity streams through a power line a small fraction of it is lost to various factors. The longer the distance the greater the loss.[12] Savings that are realized by having shorter transmission lines could be used to install the renewable energy sources close to homes and communities. The decrease of transmission costs and the increase in efficiency would cause lower electricity usage overall. A decrease in the need to generate electricity would also increase energy security - fewer imports of energy would be needed. The U.S. especially has
been concerned with energy dependence in the last decades; decentralized electricity generation could be one of
the policies to address this issue.

Decentralization solves cyberattacks


Kiger 13
(Patrick J. Kiger. "Will Renewable Energy Make Blackouts Into a
Thing of the Past?," National Geographic Channel. 10-2-2013.
http://channel.nationalgeographic.com/american-blackout/articles/will-renewable-energy-make-blackouts-into-a-thing-of-the-past///ghs-kw)
The difference is that Germany's grid of the future, unlike the present U.S. system, won't rely on big power plants and long transmission lines. Instead, Germany is creating a decentralized smart grid - essentially, a system composed of many small, potentially self-sufficient grids, that will obtain much of their power at the local level from renewable energy sources, such as solar
panels, wind turbines and biomass generators. And the system will be equipped with
sophisticated information and communications technology (ICT) that will enable it to
make the most efficient use of its energy resources. Some might scoff at the idea that a nation
could depend entirely upon renewable energy for its electrical needs, because both sunshine and wind tend to be
variable, intermittent producers of electricity. But the Germans plan to get around that problem by using linked
renewablesthat is, by combining multiple sources of renewable energy, which has the effect of smoothing out
the peaks and valleys of the supply. As Kurt Rohrig, the deputy director of Germanys Fraunhofer Institute for Wind
Energy and Energy System Technology, explained in a recent article on Scientific American's website: "Each source of energy - be it wind, sun or bio-gas - has its strengths and weaknesses. If
we manage to skillfully combine the different characteristics of the regenerative
energies, we can ensure the power supply for Germany." A decentralized smart grid
powered by local renewable energy might help protect the U.S. against a
catastrophic blackout as well, proponents say. A more diversified supply with more
distributed generation inherently helps reduce vulnerability, Mike Jacobs, a
senior energy analyst at the Union of Concerned Scientists, noted in a recent blog post on the organization's website. According to the U.S. Department of Energy's SmartGrid.gov website, such a system would have


the ability to bank surplus electricity from wind turbines and solar panels in
numerous storage locations around the system. Utility operators could tap into
those reserves if electricity generation ebbed. Additionally , in the event of a large-scale
disruption, a smart grid would have the ability to switch areas over to power generated
by utility customers themselves, such as solar panels that neighborhood residents
have installed on their roofs. By combining these "distributed generation" resources,
a community could keep its health center, police department, traffic lights, phone
system, and grocery store operating during emergencies, DOEs website notes. "There are
lots of resources that contribute to grid resiliency and flexibility," Allison Clements, an
official with the Natural Resource Defense Council, wrote in a recent blog post on the NRDC website. "Happily,
they are the same resources that are critical to achieving a clean energy, low
carbon future." Joel Gordes, electrical power research director for the U.S. Cyber Consequences Unit, a
private-sector organization that investigates terrorist threats against the electrical grid and
other targets, also thinks that such a decentralized grid "could carry benefits
not only for protecting us to a certain degree from cyber-attacks but also providing power during any number of natural hazards." But Gordes does offer a caveat - such a system might also offer more
potential points of entry for hackers to plant malware and disrupt the entire grid. Unless that vulnerability is
addressed, he warned in an e-mail, "full deployment of [smart grid] technology could end up to be disastrous."

Patent Reform Advantage CP

Notes
Specify reform + look at law reviews
Read the 500 bil card in the 1NC
Cut different versions w/ different mechanisms

1NC Comprehensive Reform


Counterplan: the United States federal government should
comprehensively reform its patent system for the purpose of
eliminating non-practicing entities.
Patent trolls cost the economy half a trillion and counting - larger internal link to tech and the economy
Lee 11
(Timothy B. Lee. Timothy B. Lee covers tech policy for Ars, with a particular focus on patent and
copyright law, privacy, free speech, and open government. While earning his CS master's degree at
Princeton, Lee was the co-author of RECAP, a Firefox plugin that helps users liberate public documents
from the federal judiciary's paywall. Before grad school, he spent time at the Cato Institute, where he
is an adjunct scholar. He has written for both online and traditional publications, including Slate,
Reason, Wired.com, and the New York Times. When not screwing around on the Internet, he can be
seen rock climbing, ballroom dancing, and playing soccer. He lives in Philadelphia. He has a blog at
Forbes and you can follow him on Twitter. "Study: patent trolls have cost innovators half a trillion
dollars," Ars Technica. xx-xx-xxxx. http://arstechnica.com/tech-policy/2011/09/study-patent-trolls-have-cost-innovators-half-a-trillion-bucks///ghs-kw)

By now, the story of patent trolls has become well-known: a small company with no products of its own threatens lawsuits against larger companies who inadvertently infringe its portfolio of broad patents. The scenario has become so common that we don't even try to cover all the cases here at Ars. If we did, we'd have little time to write about much else. But anecdotal evidence is one thing. Data is another. Three Boston University researchers have produced a rigorous empirical estimate of the cost of patent trolling. And the number is breath-taking: patent trolls ("nonpracticing entity" is the clinical term) have cost publicly traded defendants $500 billion since 1990. And the problem has become most severe in recent years. In the last four years, the costs have averaged $83 billion per year. The study says this is more than a quarter of US industrial research and development spending during those years. Two of the study's authors, James Bessen and Mike Meurer, wrote Patent Failure, an empirical

study of the patent system that has been widely read and cited since its publication in 2008. They were joined for this paper by a colleague, Jennifer Ford. It's hard to measure the costs of litigation directly. The most obvious costs for defendants are legal fees and payouts to plaintiffs, but these are not necessarily the largest costs. Often, indirect costs like employee distraction, legal uncertainty, and the need to redesign or drop key products are even more significant. The trio use a clever method known as a stock market event study to estimate these costs. The theory is simple: a company's stock price represents the stock market's best estimation of the company's value. If
the company's stock drops by, say, two percent in the days after a lawsuit is filed, then the market thinks the
lawsuit will cost the company two percent of its market capitalization. Of course, this wouldn't be a very rigorous
technique if they were looking at a single lawsuit. Any number of factors could have affected the firm's stock price that same week. Maybe the company released a bad earnings report the next day. But with a large sample of companies, these random factors should mostly cancel each other out, leaving the market's rough estimate of how much patent lawsuits cost their targets. The authors used a database of 1,630 patent troll lawsuits compiled by Patent Freedom. Because many of the lawsuits had multiple defendants, there was a total of 4,114 plaintiff-defendant pairs. The median defendant over all of these pairs lost $20.4 million in market capitalization, while the mean loss was $122 million.

2NC Solvency
(Senator Orrin Hatch. "Senator Hatch: It's Time to Kill Patent Trolls for Good," WIRED. 3-16-2015. http://www.wired.com/2015/03/opinion-must-finally-legislate-patent-trolls-existence///ghs-kw)
There is broad agreement - among both big and small businesses - that any serious solution must include:

Fee shifting, which will require patent trolls to pay legal fees when their
suits are unsuccessful;

Heightened pleading and discovery standards, which will raise the bar
on litigation procedure, making it increasingly difficult for trolls to file
frivolous lawsuits;

Demand letter reforms, which will require those sending demand


letters to be more specific and transparent;

Stays of customer suits, which will allow a manufacturers case to


move forward first, without binding the end user to the result of that case;

A mechanism to enable recovery of fees, which will prevent insolvent


plaintiffs from litigating and dashing.
Some critics argue that these proposals will help only large technology
companies and might even hurt startups and small businesses. In my
discussions with stakeholders, however, I have repeatedly been told that a
multi-pronged approach that tackles each of these issues is needed to
effectively combat patent trolls across all levels of industry. These
stakeholder discussions have included representatives from the hotel,
restaurant, retail, real estate, financial services, and high-tech industries, as
well as start-up and small business owners.
Enacting legislation on any topic is a major undertaking, and the added
complexities inherent in patent law make passing patent reforms especially
challenging. Crucially, we will probably have only one chance to do so for a
long while, so whatever we do must work. We must not pass any bill that
fails to provide an effective deterrent against patent trolls at all stages of
litigation.
It is my belief that any viable legislation must ensure that those who
successfully defend against abusive patent litigation and are awarded fees
will actually get paid. Even when a patent troll is a shell company with no
assets, there are usually other parties with an interest in the litigation who
do have assets. These parties, however, often keep themselves beyond the
jurisdiction of the courts. They reap benefits if the plaintiff forces a
settlement, but are protected from any liability if they lose.
Right now, thats a win-win situation for these parties, and a lose-lose
situation for Americas innovators.

Because Congress cannot force parties outside a courts jurisdiction to join in


a case, we must instead incentivize interested parties to do the right thing
and pay court-ordered fee awards. This is why we must pass legislation that
includes a recovery provision. Fee shifting without recovery is like writing a
check on an empty account. Its purporting to convey something that isnt
there. Only fee shifting coupled with a recovery provision will stop patent
trolls from litigating-and-dashing.
There is no question that American ingenuity fuels our economy. We must
ensure that our patent system is strong and vibrant and helps to protect our
countrys premier position in innovation.

Reform solves patent trolling


Roberts 14
(Jeff John Roberts. Jeff reports on legal issues that impact the future of the tech industry, such as
privacy, net neutrality and intellectual property. He previously worked as a reporter for Reuters in
Paris and New York, and his free-lance work includes clips for the Economist, the New York Times and
the Globe & Mail. A frequent guest on media outlets like NPR and Fox, Jeff is also a lawyer, having
passed the bar in New York and Ontario. "Patent reform is likely in 2015. Here's what it could look like," No Publication. 11-19-2014. https://gigaom.com/2014/11/19/patent-reform-is-likely-in-2015-heres-what-it-could-look-like///ghs-kw)

As patent scholar Dennis Crouch notes, the question is how far the new law will go. In particular, real reform will depend on changing the economic asymmetries in patent litigation that allow trolls to flourish, and that lead troll victims to simply pay up rather than engage in costly litigation. Here are some measures we are likely to see under the Goodlatte bill, according to Crouch and legal sources like IAM and Law.com (subscription required): Fee-shifting: Right now, trolls typically have nothing to lose by filing a lawsuit since they
are shell companies with no assets. New fee-shifting measures, however, could put
them on the hook for their victims legal fees. Discovery limits: Currently, trolls can
exploit the discovery process in which each side must offer up documents and
depositions by drowning their targets in expensive and time-consuming requests.
Limiting the scope of discovery could take that tactic off the table. Heightened
pleading requirements: Right now, patent trolls dont have to specify how exactly a
company is infringing their technology, but can simply serve cookie-cutter
complaints that list the patents and the defendant. Pleading reform would force the
trolls to explain what exactly they are suing over, and give defendants a better
opportunity to assess the case. Identity requirements: This reform proposal is known
as real party of interest and would make it harder for those filing patent lawsuits
(often lawyers working with private equity firms) to hide behind shell companies,
and require them instead to identify themselves. Crouch also notes the possibility of
expanded post-grant review, which gives defendants a fast and cheaper tool to
invalidate bad patents at the Patent Office rather than in federal court.

2NC O/V
The status quo patent system is hopelessly broken and allows patent trolls to game the system by gaining broad patents for practices such as selling objects on the internet - those firms sue innovators and startups who violate their patents, costing the US economy half a trillion and stifling innovation - that's Lee
The counterplan eliminates patent trolls through a set of comprehensive reforms we'll describe below - solves their innovation arguments and independently is a bigger internal link to innovation and the economy
Patent reform is key to prevent patent trolling that stifles innovation and reduces R&D by half
Bessen 14
(James Bessen. Bessen is a Lecturer in Law at the Boston
University School of Law. Bessen was also a Fellow at the
Berkman Center for Internet and Society. "The Evidence Is In:
Patent Trolls Do Hurt Innovation," Harvard Business Review.
November 2014. https://hbr.org/2014/07/the-evidence-is-in-patent-trolls-do-hurt-innovation//ghs-kw)
Over the last two years, much has been written about patent trolls, firms that make their money asserting patents against other companies, but do not make a useful product of their own. Both the White House and Congressional leaders have called for patent reform to fix the underlying problems that give rise to patent troll lawsuits. Not so fast, say Stephen Haber and Ross Levine in a Wall Street Journal Op-Ed (The Myth of the

Wicked Patent Troll). We shouldnt reform the patent system, they say, because there is no evidence that trolls are
hindering innovation; these calls are being driven just by a few large companies who dont want to pay inventors.
But there is evidence of significant harm. The White House and the Congressional Research Service both cited many research studies suggesting that patent litigation harms innovation. And three new empirical studies provide strong confirmation that patent litigation is reducing venture capital investment in startups and is reducing R&D spending, especially in small firms. Haber and Levine admit that patent litigation is surging. There were six times as many patent lawsuits last year than in the 1980s. The number of firms sued by patent trolls grew nine-fold over the last decade; now a majority of patent lawsuits are filed by trolls. Haber and Levine argue that this is not a problem: it

might instead reflect a healthy, dynamic economy. They cite papers finding that patent trolls tend to file suits in
innovative industries and that during the nineteenth century, new technologies such as the telegraph were
sometimes followed by lawsuits. But this does not mean that the explosion in patent litigation is somehow normal.
Its true that plaintiffs, including patent trolls, tend to file lawsuits in dynamic, innovative industries. But thats just
because they follow the money. Patent trolls tend to sue cash rich companies, and innovative new technologies
generate cash. The economic burden of todays patent lawsuits is, in fact, historically unprecedented.

Research shows that patent trolls cost defendant firms $29 billion per year
in direct out-of-pocket costs; in aggregate, patent litigation destroys over
$60 billion in firm wealth each year. While mean damages in a patent lawsuit ran around
$50,000 (in todays dollars) at the time the telegraph, mean damages today run about $21 million. Even taking into
account the much larger size of the economy today, the economic impact of patent litigation today is an order of

these costs fall


disproportionately on innovative firms: the more R&D a firm performs, the
more likely it is to be sued for patent infringement , all else equal. And,
magnitude larger than it was in the age of the telegraph. Moreover,

although this fact alone does not prove that this litigation reduces firms innovation, other evidence suggests that this is exactly what happens. A researcher at MIT found, for example, that medical imaging businesses sued by a patent troll reduced revenues and innovations relative to comparable companies that were not sued. But the biggest impact is on small startup firms - contrary to Haber and Levine, most patent trolls target firms selling less than $100 million a year. One survey of software startups found that 41% reported significant operational impacts from patent troll lawsuits, causing them to exit business lines or change strategy. Another survey of venture capitalists found that 74% had companies that experienced significant impacts from patent demands. Three recent econometric studies confirm these negative effects. Catherine Tucker of MIT analyzed venture capital investing relative to

patent lawsuits in different industries and different regions of the country. Controlling for the influence of other factors, she estimates that lawsuits from frequent litigators (largely patent trolls) were responsible for a decline of $22 billion in venture investing over a five-year period. That represents a 14% decline. Roger Smeets of Rutgers looked at R&D spending by small firms,

comparing firms that were hit by extensive lawsuits to a carefully chosen comparable sample. The comparison
sample allowed him to isolate the effect of patent lawsuits from other factors that might also influence R&D spending. Prior to the lawsuit, firms devoted 20% of their operating expenditures to R&D; during the years after the lawsuit, after controlling for other factors, they reduced that spending by 3% to 5% of operating expenditures, representing about a 19% reduction in relative R&D spending. And researchers from Harvard and the University of Texas recently examined R&D spending

of publicly listed firms that had been sued by patent trolls. They compared firms where the suit was dismissed,
representing a clear win for the defendant, to those where the suit was settled or went to final adjudication
(typically much more costly). As in the previous paper, this comparison helped them isolate the effect of lawsuits

firms reduced their R&D


spending by $211 million and reduced their patenting significantly in subsequent
years. The reduction in R&D spending represents a 48% decline. Importantly,
from other factors. They found that when lawsuits were not dismissed,

these studies are initial releases of works in progress; the researchers will refine their estimates of harm over the

across a significant
number of studies using different methodologies and performed by different
researchers, a consistent picture is emerging about the effects of patent litigation: it
costs innovators money; many innovators and venture capitalists report that it
significantly impacts their businesses; innovators respond by investing less in R&D
and venture capitalists respond by investing less in startups. Haber and Levine might not like
the results of this research. But the weight of the evidence from these many studies cannot be ignored; patent
trolls do, indeed, cause harm. Its time for Congress to do something about it.
coming months. Perhaps some of the estimates may shrink a bit. Nevertheless,

2NC Comprehensive Reform


Comprehensive reform solves patent trolling
Downes 7/6
(Larry Downes. Larry Downes is an author and project director at the Georgetown Center for Business
and Public Policy. His new book, with Paul Nunes, is Big Bang Disruption: Strategy in the Age of
Devastating Innovation. Previous books include the best-selling Unleashing the Killer App: Digital
Strategies for Market Dominance. "What would 'real' patent reform look like?," CNET. 7-6-2015.
http://www.cnet.com/news/what-does-real-patent-reform-look-like///ghs-kw)

And a new report (PDF) from technology think tank Lincoln Labs argues that reversing the damage to the innovation economy caused by years of overly generous patent policies requires far stronger medicine than Congress is considering or the courts seem willing to swallow on their own. The bills making their way through Congress, for example, focus almost entirely on curbing abuses by companies that buy up often overly broad patents and then, rather than produce goods, simply sue manufacturers and users they argue are infringing their patents. These nonpracticing entities, referred to derisively as patent "trolls," are widely seen as a serious drag on innovation, particularly in fast-evolving technology industries. Trolling behavior, according to studies from Stanford Law School professor and patent expert Mark Lemley, does little to nothing to promote the Constitutional goal of patents to encourage innovation by granting inventors temporary monopolies during which they can recover their investment. The House of Representatives passed antitrolling legislation in 2013, but a Senate version was killed by then-Majority Leader Harry Reid (D-Nev.) in May 2014. "Patent trolls," said Gary Shapiro, president and CEO of the Consumer Electronics Association, "bleed $1.5 billion a week from the US economy -- that's almost $120 billion since the House passed a patent reform bill in December of 2013." A call for 'real' patent reform: The Lincoln Labs report agrees with these and other criticisms of patent trolling, but argues for more fundamental changes to the system, or what the report calls "real" patent reform. The report, authored by former Republican Congressional staffer Derek Khanna, urges a complete overhaul of the process by which the Patent Office reviews applications, as well as the elimination of patents for software, business methods, and a special class of patents for design elements -- a category that figured prominently in the smartphone wars. Khanna claims that the Patent Office has demonstrated an "abject failure" to enforce fundamental legal requirements that patents only be granted for inventions that are novel, nonobvious and useful. To reverse that trend, the report calls on Congress to change incentives for patent examiners that today weigh the scales in favor of approval, add a requirement for two examiners to review the most problematic categories of patents, and allow crowdsourced contributions to Patent Office databases of "prior art" to help filter out nonnovel inventions. Khanna estimates these reforms alone "would knock out a large number of software patents, perhaps 75-90%, where the economic argument for patents is exceedingly difficult to sustain." The report also calls for the elimination of design patents, which offer protection for ornamental features of manufactured products, such as the original design of the Coca-Cola bottle.

Reg-Neg CP

1NC Shell
Text: the United States federal government should enter into a
process of negotiated rulemaking over _______<insert
plan>______________ and implement the results of negotiation.
The CP is plan minus--it doesn't mandate the plan, just that a regulatory negotiations committee is created to discuss the plan.
And, it competes--reg neg is not normal means
USDA 06
(The U.S. Department of Agriculture's Agricultural Marketing Service administers programs that facilitate the efficient, fair marketing of U.S. agricultural products, including food, fiber, and specialty crops. "What is Negotiated Rulemaking?" Last updated June 6th, 2014.
http://www.ams.usda.gov/AMSv1.0/getfile?dDocName=STELPRDC5089434) //ghs-kw)

How reg-neg differs from traditional notice-and-comment rulemaking: The traditional notice-and-comment rulemaking provided in the Administrative Procedure Act (APA) requires an agency planning to adopt a rule on a particular subject to publish a proposed rule (NPRM) in the Federal Register and to offer the public an opportunity to comment. The APA does not specify who is to draft the proposed rule nor any particular procedure to govern the drafting process. Ordinarily, agency staff performs this function, with discretion to determine how much opportunity is allowed for public input. Typically, there is no opportunity for interchange of views among potentially affected parties, even where an agency chooses to conduct a hearing. The traditional notice-and-comment rulemaking can be very adversarial. The dynamics encourage parties to take extreme positions in their written and oral statements in both pre-proposal contacts as well as in comments on any published proposed rule as well as withholding of information that might be viewed as damaging. This adversarial atmosphere may contribute to the expense and delay associated with regulatory proceedings, as parties try to position themselves for the expected litigation. What is lacking is an opportunity for the parties to exchange views, share information, and focus on finding constructive, creative solutions to problems. In negotiated rulemaking, the agency, with the assistance of one or more neutral advisors known as convenors, assembles a committee of representatives of all affected interests to negotiate a proposed rule. Sometimes the law itself will specify which interests are to be included on the committee. Once assembled, the next goal is for members to receive training in interest-based problem-solving and consensus-decision making. They then must make sure that all views are heard and that each committee member agrees to a set of ground rules for the negotiated rulemaking process. The ultimate goal is to reach consensus on a text that all parties can accept. The agency is represented at the table by an official who is sufficiently senior to be able to speak authoritatively on its behalf. Negotiating sessions are chaired by a neutral mediator or facilitator skilled in assisting in the resolution of multiparty disputes. The Checklist: Advantages as well as Misperceptions. The advantages of negotiated rulemaking include: Producing greater information sharing and better communication; Enhancing public awareness and involvement; Providing a reality check to agencies and other interests; Encouraging discovery of more creative options for rulemaking; Increasing compliance with rules; Saving time, money and effort in the long run; Allowing earlier implementation dates; Building cooperative relationships among key parties; Increasing the certainty of the outcome for all and thus enabling better planning; Producing superior rules on technically complex topics because of the input of all parties; Giving rise to fewer legislative end runs against the rule; and Reducing post-issuance contentiousness and litigation. What negotiating rulemaking does not do: It does not cause the agency to delegate its ultimate obligation to determine the content of the proposed and final regulations; It does not exempt the agency from any statutory or other requirements; It does not eliminate the agency's obligation to produce any economic analysis, paperwork or other regulatory analysis requirements imposed by law or agency policy; It does not require parties or non-parties to set aside their legal or political rights as a condition of participating; and It is not compulsory, participation is voluntary, for the agency and for others.

<Insert specific solvency advocate>


Reg neg solves--empirics prove
Knaster 10
(Alana Knaster is the Deputy Director of the Resource Management Agency. She was Senior Executive
in the Monterey County Planning Department for five years with responsibility for planning, building,
and code enforcement programs. Prior to joining Monterey County, Alana was the President of the
Mediation Institute, a national non-profit firm specializing in the resolution of complex land use
planning and environmental disputes. Many of the disputes that she successfully mediated, involved
dozens of stakeholder groups including government agencies, major corporations and public interest
groups. She served in that capacity for 15 years. Alana was Mayor of the City of Hidden Hills,
California from 1981-88 and represented her City on a number of regional planning agencies and
commissions. She also has been on the faculty of Pepperdine University Law School since 1989,
teaching courses in environmental and public policy mediation. Knaster, A. Resolving Conflicts Over
Climate Change Solutions: Making the Case for Mediation, Pepperdine Dispute Resolution Law
Journal, Vol 10, No 3, 2010. 465-501. http://law.pepperdine.edu/dispute-resolution-law-journal/issues/volumeten/Knaster%20Article.pdf//ghs-kw)

Federal and international dispute resolution process models. There are also models in U.S. and Canadian legislation supporting the use of consensus-based processes. These processes have been successfully applied to resolve dozens of disputes that involved multiple stakeholder interests, on technically and politically complex environmental and public policy issues. For example, the Negotiated Rulemaking Act of 1990 was enacted by Congress to formalize a process for negotiating contentious new regulations.118 The Act provides a process called reg neg by which representatives of interest groups that could be substantially affected by the provisions of a regulation, and agency staff negotiate the provisions.119 The meetings are open to the public; however, the process does enable negotiators to hold private interest group caucuses. If a consensus is reached on the provisions of the rule, the Agency commits to publish the consensus rule in the Federal Register for public comment.120 The participants in the reg neg agree that as long as the final regulation is consistent with what they have jointly recommended, they will not challenge it in court. The assumption is that parties will support a product that they negotiated.121 Reg neg has been utilized by numerous federal agencies to negotiate rules pertaining to a diverse range of topics including safe drinking water, fugitive gasoline emissions, eligibility for educational loans, and passenger safety.122 In 1991, in Canada, an initiative was launched by the National Task Force on Consensus and Sustainability to develop a guidance document that would govern how federal, provincial, and municipal governments would address resource management disputes. The document that was negotiated, Building Consensus for a Sustainable Future: Guiding Principles, was adopted by consensus in 1994.123 The document outlined principles for building a consensus and process steps. The ten principles included provisions regarding inclusivity of the process (this was particularly important in Canada with respect to inclusion of Aboriginal peoples), voluntary participation, accountability to constituencies, respect for diverse interests, and commitment to any agreement adopted.124 The consensus principles were subsequently utilized to resolve disputes over issues that included sustainable forest management, siting of solid waste facilities, impacts of pulp mill expansion, and economic diversification based on sustainable wildlife resources.125 The reg neg and Consensus for Sustainable Future model represent codified mediated negotiation processes that have withstood the test of legal challenge and have been strongly endorsed by the groups that have participated in these processes.

1NC Ptix NB
Doesn't link to politics--empirics prove
USDA 6/6
(The U.S. Department of Agriculture's Agricultural Marketing Service administers programs that facilitate the efficient, fair marketing of U.S. agricultural products, including food, fiber, and specialty crops. "What is Negotiated Rulemaking?" Last updated June 6th, 2014 @
http://www.ams.usda.gov/AMSv1.0/getfile?dDocName=STELPRDC5089434)

History: In 1990, Congress endorsed use by federal agencies of an alternative procedure known as "negotiated rulemaking," also called "regulatory negotiation," or "reg-neg." It has been used by agencies to bring interested parties into the rule-drafting process at an early stage, under circumstances that foster cooperative efforts to achieve solutions to regulatory problems. Where successful, negotiated rulemaking can lead to better, more acceptable rules, based on a clearer understanding of the concerns of all those affected. Negotiated rules may be easier to enforce and less likely to be challenged in litigation. The results of reg-neg usage by the federal government, which began in the early 1980s, are impressive: such large-scale regulators as the Environmental Protection Agency, Nuclear Regulatory Commission, Federal Aviation Administration, and the Occupational Safety and Health Administration used the process on many occasions. Building on these positive experiences, several states, including Massachusetts, New York, and California, have also begun using the procedure for a wide range of rules. The very first negotiated rule-making was convened by the Federal Mediation and Conciliation Service (FMCS) working with the Department of Transportation, the Federal Aviation Administration, airline pilots and other interested groups to deal with regulations concerning flight and duty time for pilots. The negotiated rulemaking was a success and a draft rule was agreed upon that became the final rule. Since that first reg-neg, FMCS has assisted in both the convening and facilitating stages in many such procedures at the Departments of Labor, Health and Human Services (HRSA), Interior, Housing and Urban Development, and the EPA, as well as state-level processes, and other forms of consensus-based decision-making programs such as public policy dialogues, hearings, focus groups, and meetings.

1NC Fism NB
Failure to use reg neg results in a federalism crisis--REAL ID proves
Ryan 11
(Erin Ryan holds a B.A. 1991 Harvard-Radcliffe College, cum laude, M.A. 1994 Wesleyan University, J.D.
2001 Harvard Law School, cum laude. Erin Ryan teaches environmental and natural resources law,
property and land use, water law, negotiation, and federalism. She has presented at academic and
administrative venues in the United States, Europe, and Asia, including the Ninth Circuit Judicial
Conference, the U.S.D.A. Office of Ecosystem Services and Markets, and the United Nations Institute
for Training and Research. She has advised National Sea Grant multilevel governance studies
involving Chesapeake Bay and consulted with multiple institutions on developing sustainability
programs. She has appeared in the Chicago Tribune, the London Financial Times, the PBS Newshour
and Christian Science Monitors Patchwork Nation project, and on National Public Radio. She is the
author of many scholarly works, including Federalism and the Tug of War Within (Oxford, 2012).
Professor Ryan is a graduate of Harvard Law School, where she was an editor of the Harvard Law
Review and a Hewlett Fellow at the Harvard Negotiation Research Project. She clerked for Chief Judge
James R. Browning of the U.S. Court of Appeals for the Ninth Circuit before practicing environmental,
land use, and local government law in San Francisco. She began her academic career at the College of
William & Mary in 2004, and she joined the faculty at the Northwestern School of Law at Lewis & Clark
College in 2011. Ryan spent 2011-12 as a Fulbright Scholar in China, during which she taught
American law, studied Chinese governance, and lectured throughout Asia. Ryan, E. Boston Law Review,
2011. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583132//ghs-kw)

b. A Cautionary Tale: The REAL ID Act. The value of negotiated rulemaking to federalism bargaining may be best understood in relief against the failure of alternatives in federalism-sensitive [*57] contexts. Particularly informative are the strikingly different state responses to the two approaches Congress has recently taken in tightening national security through identification reform--one requiring regulations through negotiated rulemaking, and the other through traditional notice and comment. After the 9/11 terrorist attacks, Congress ordered the Department of Homeland Security (DHS) to establish rules regarding valid identification for federal purposes (such as boarding an aircraft or accessing federal buildings). n291 Recognizing the implications for state-issued driver's licenses and ID cards, Congress required DHS to use negotiated rulemaking to forge consensus among the states about how best to proceed. n292 States leery of the staggering costs associated with proposed reforms participated actively in the process. n293 However, the subsequent REAL ID Act of 2005 repealed the ongoing negotiated rulemaking and required DHS to prescribe top-down federal requirements for state-issued licenses. n294 The resulting DHS rules have been bitterly opposed by the majority of state governors, legislatures, and motor vehicle administrations, n295 prompting a virtual state rebellion that cuts across the red-state/blue-state political divide. n296 No state met the December 2009 deadline initially contemplated by the statute, and over half have enacted or considered legislation prohibiting compliance with the Act, defunding its implementation, or calling for its repeal. n297 In the face of this unprecedented state hostility, DHS has extended compliance deadlines even for those that did not request extensions, and bills have been introduced in both houses of Congress to repeal the Act. n298 Efforts to repeal what is increasingly referred to as a "failed" policy have won endorsements [*58] from organizations across the political spectrum. n299 Even the Executive Director of the ACLU, for whom federalism concerns have not historically ranked highly, opined in USA Today that the REAL ID Act violates the Tenth Amendment. n300

US federalism will be modelled globally--solves human rights, free trade, war, and economic growth
Calabresi 95
(Steven G. Calabresi is a Professor of Law at Northwestern University and is a graduate of the Yale
Law School (1983) and of Yale College (1980). Professor Calabresi was a Scholar in Residence at
Harvard Law School from 2003 to 2005, and he has been a Visiting Professor of Political Science at
Brown University since 2010. Professor Calabresi was also a Visiting Professor at Yale Law School in
the Fall of 2013. Professor Calabresi served as a Law Clerk to Justice Antonin Scalia of the United
States Supreme Court, and he also clerked for U.S. Court of Appeals Judges Robert H. Bork and Ralph
K. Winter. From 1985 to 1990, he served in the Reagan and first Bush Administrations working both in

the West Wing of the Reagan White House and before that in the U.S. Department of Justice. In 1982,
Professor Calabresi co-founded The Federalist Society for Law & Public Policy Studies, a national
organization of lawyers and law students, and he currently serves as the Chairman of the Societys
Board of Directors a position he has held since 1986. Since joining the Northwestern Faculty in 1990,
he has published more than sixty articles and comments in every prominent law review in the country.
He is the author with Christopher S. Yoo of The Unitary Executive: Presidential Power from
Washington to Bush (Yale University Press 2008); and he is also a co-author with Professors Michael
McConnell, Michael Stokes Paulsen, and Samuel Bray of The Constitution of the United States (2nd ed.
Foundation Press 2013), a constitutional law casebook. Professor Calabresi has taught Constitutional
Law I and II; Federal Jurisdiction; Comparative Law; Comparative Constitutional Law; Administrative
Law; Antitrust; a seminar on Privatization; and several other seminars on topics in constitutional law.
Calabresi, S. G. Government of Limited and Enumerated Powers: In Defense of United States v.
Lopez, A Symposium: Reflections on United States v. Lopez, Michigan Law Review, Vol 92, No 3,
December 1995. Ghs-kw)

We have seen that a desire for both international and devolutionary federalism has swept across the world in recent years. To a significant extent, this is due to global fascination with and emulation of our own American federalism success story. The global trend toward federalism is an enormously positive development that greatly increases the likelihood of future peace, free trade, economic growth, respect for social and cultural diversity, and protection of individual human rights. It depends for its success on the willingness of sovereign nations to strike federalism deals in the belief that those deals will be kept.233 The U.S. Supreme Court can do its part to encourage the future striking of such deals by enforcing vigorously our own American federalism deal. Lopez could be a first step in that process, if only the Justices and the legal academy would wake up to the importance of what is at stake.

Federalism solves economic growth


Bruekner 05
(Jan K. Bruekner is a Professor of Economics at the University of California, Irvine. He is a member of
the Institute of Transportation Studies, Institute for Mathematical Behavioral Sciences, and a former
editor of the Journal of Urban Economics. Bruekner, J. K. Fiscal Federalism and Economic Growth,
CESifo Working Paper No. 1601, November 2005. https://www.cesifogroup.de/portal/page/portal/96843357AA7E0D9FE04400144FAFBA7C//ghs-kw)

The analysis in this paper suggests that faster economic growth may constitute an additional benefit of fiscal federalism beyond those already well recognized. This result, which matches the conjecture of Oates (1993) and the expectations of most empirical researchers who have studied the issue, arises from an unexpected source: a greater incentive to save when public-good levels are tailored under federalism to suit the differing demands of young and old consumers. This effect grows out of a novel interaction between the rules of public-good provision, which apply cross-sectionally at a given time and involve the young and old consumers of different generations, and the savings decision of a given generation, which is intertemporal in nature. This cross-sectional/intertemporal interaction yields the link between federalism and economic growth. While it is encouraging that the paper's results match recent empirical findings showing a positive growth impact from fiscal decentralization, additional theoretical work exploring other possible sources of such a link is clearly needed. The present results emerge from a model based on very minimal assumptions, but exploration of richer models may also be fruitful.

US economic growth solves war, collapse ensures instability


National Intelligence Council, 12 (December, Global Trends 2030:
Alternative Worlds
http://www.dni.gov/files/documents/GlobalTrends_2030.pdf)

Big Stakes for the International System. The optimistic scenario of a reinvigorated US economy would increase the prospects that the growing global and regional challenges would be addressed. A stronger US economy dependent on trade in services and cutting-edge technologies would be a boost for the world economy, laying the basis for stronger multilateral cooperation. Washington would have a stronger interest in world trade, potentially leading a process of World Trade Organization reform that streamlines new negotiations and strengthens the rules governing the international trading system. The US would be in a better position to boost support for a more democratic Middle East and prevent the slide of failing states. The US could act as balancer ensuring regional stability, for example, in Asia where the rise of multiple powers--particularly India and China--could spark increased rivalries. However, a reinvigorated US would not necessarily be a panacea. Terrorism, proliferation, regional conflicts, and other ongoing threats to the international order will be affected by the presence or absence of strong US leadership but are also driven by their own dynamics. The US impact is much more clear-cut in the negative case in which the US fails to rebound and is in sharp economic decline. In that scenario, a large and dangerous global power vacuum would be created--and in a relatively short space of time. With a weak US, the potential would increase for the European economy to unravel. The European Union might remain, but as an empty shell around a fragmented continent. Progress on trade reform as well as financial and monetary system reform would probably suffer. A weaker and less secure international community would reduce its aid efforts, leaving impoverished or crisis-stricken countries to fend for themselves, multiplying the chances of grievance and peripheral conflicts. In this scenario, the US would be more likely to lose influence to regional hegemons--China and India in Asia and Russia in Eurasia. The Middle East would be riven by numerous rivalries which could erupt into open conflict, potentially sparking oil-price shocks. This would be a world reminiscent of the 1930s when Britain was losing its grip on its global leadership role.

2NC O/V
The counterplan convenes a regulatory negotiation committee to discuss the implementation of the plan. Stakeholders decide how and if the plan is implemented, and the decision is then implemented--solves better than the AFF:
2. Agency action--traditional notice-and-comment rulemaking incentivizes actors to withhold information, which prevents agency action and guts implementation of the plan. The CP facilitates cooperation--that's Siegel 9.
3. Collaboration--reg neg facilitates government-civilian cooperation and results in greater satisfaction with regulations and better compliance after implementation--social psychology and empirics prove.
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford
University, a Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a
Doctors of Jurisdictional Science from Harvard University. She served as Counselor for Energy and
Climate Change in the Obama White House in 2009-2010. Freeman is a prominent scholar of
regulation and institutional design, and a leading thinker on collaborative and contractual
approaches to governance. After leaving the White House, she advised the National Commission
on the Deepwater Horizon oil spill on topics of structural reform at the Department of the Interior.
She has been appointed to the Administrative Conference of the United States, the government
think tank for improving the effectiveness and efficiency of federal agencies, and is a member of
the American College of Environmental Lawyers. Laura I Langbein is the Professor of Quantitative
Methods, Program Evaluation, Policy Analysis, and Public Choice and American College. She holds
a PhD in Political Science from the University of North Carolina, a BA in Government from Oberlin
College. Freeman, J. Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U.
Environmental Journal, Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy
%20benefit.pdf/)
D. Compliance. The compliance implications of consensus-based processes remain a matter of speculation.360 No one has yet produced empirical data on the relationship between negotiated rulemaking and compliance, let alone data comparing the compliance implications of negotiated and conventional rules.361 However, the Phase II results introduce interesting new findings into the debate. The data shows reg-neg participants to be significantly more likely than conventional rulemaking participants to report the perception that others will be able to comply with the final rule.362 Perceiving that others will comply might induce more compliance among competitors, along the lines of game theoretic models, at least until evidence of defection emerges.363 Moreover, to the extent that compliance failures are at least partly due to technical and information deficits--rather than to mere political resistance--it seems plausible that reports of the learning effect and more horizontal sharing of information might help to improve compliance in the long run.364 The claim that reg-neg could improve compliance is consistent with social psychology studies showing that in both legal and organizational settings, fair procedures lead to greater compliance with the rules and decisions with which they are associated.365 Similarly, negotiated rulemaking might facilitate compliance by bringing to the surface some of the contentious issues earlier in the rulemaking process, where they might be solved collectively rather than dictated by the agency. Although speculative, these hypotheses seem to fit better with Kerwin and Langbein's data than do the rather negative expectations about compliance. Higher satisfaction could well translate into better long-term compliance, even if litigation rates remained the same. Consistent with our contention that process matters, we expect it to matter to compliance as well. In any event, empirical studies of compliance should no longer be so difficult to produce. A number of negotiated rules are now several years old, with some in the advanced stages of implementation. A study of compliance might compare numbers of enforcement actions for negotiated as compared to conventional rules, measured by notices of violation, or penalties, for example.366 It might investigate as well whether compliance methods differ between the two types of rules: perhaps the enforcement of negotiated rules occurs more cooperatively, or informally, than enforcement of conventional rules. Possibly, relationships struck during the negotiated rulemaking make a difference at the compliance stage.367 To date, the effects of how the rule is developed on eventual compliance remain a matter of speculation, even though it is ultimately an empirical issue on which both theory and empirical evidence must be brought to bear.

And, we'll win new net benefits here that ALL turn the aff:
a. Delays--the CP's regulatory negotiation means that rules won't be challenged during the regulation creation process--empirics prove the CP solves faster than the AFF
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl
F. Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the
design of many of the major developments of administrative law in the past 40 years. He is the
author of more than 50 papers and books on administrative law and has been a visiting professor
or guest lecturer internationally, including at the University of Paris II, Humboldt University
(Berlin) and the University of the Western Cape (Cape Town). He has consulted on environmental
mediation and public participation in rulemaking in China, including a project sponsored by the
Supreme Peoples Court. He has received multiple awards for his achievements in administrative
law. He is listed in Who's Who in America and is a member of the Administrative Conference of the
United States.Harter, P. J. Assessing the Assessors: The Actual Performance of Negotiated
Rulemaking, December 1999. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghskw)

Properly understood, therefore, the average length of EPA's negotiated rulemakings--the time it took EPA to fulfill its goal--was 751 days or 32% faster than traditional rulemaking. This knocks a full year off the average time it takes EPA to develop a rule by the traditional method. And, note these are highly complex and controversial rules and that one of them survived Presidential intervention. Thus, the dynamics surrounding these rules are by no means average. This means that reg neg's actual performance is much better than that. Interestingly and consistently, the average time for all of EPA's reg negs when viewed in context is virtually identical to that of the sample drawn by Kerwin and Furlong77 differing by less than a month. Furthermore, if all of the reg negs that were conducted by all the agencies that were included in Coglianese's table78 were analyzed along the same lines as discussed here,79 the average time for all negotiated rulemakings drops to less than 685 days.80 No Substantive Review of Rules Based on Reg Neg Consensus. Coglianese argues that negotiated rules are actually subjected to a higher incident of judicial review than are rules developed by traditional methods, at least those issued by EPA.81 But, like his analysis of the time it takes to develop rules, Coglianese fails to look at either what happened in the negotiated rulemaking itself or the nature of any challenge. For example, he makes much of the fact that the Grand Canyon visibility rule was challenged by interests that were not a party to the negotiations;82 yet, he also points out that this rule was not developed under the Negotiated Rulemaking Act83 which explicitly establishes procedures that are designed to ensure that each interest can be represented. This challenge demonstrates the value of convening negotiations.84 And, it is significantly misleading to include it when discussing the judicial review of negotiated rules since the process of reg neg was not followed. As for Reformulated Gasoline, the rule as issued by EPA did not reflect the consensus but rather was modified by EPA under the direction of President Bush.85 There were, indeed, a number of challenges to the application of the rule,86 but amazingly little to the rule itself given its history. Indeed, after the proposal was changed, many members of the committee continued to meet in an effort to put Humpty Dumpty back together again, which they largely did; the fact that the rule had been negotiated not only resulted in a much better rule,87 it enabled the rule to withstand in large part a massive assault. Coglianese also somehow attributes a challenge within the World Trade Organization to a shortcoming of reg neg even though such issues were explicitly outside the purview of the committee; to criticize reg neg here is like saying surgery is not effective when the patient refused to undergo it. While the Underground Injection rule was challenged, the committee never reached an agreement88 and, moreover, the convening report made clear that there were very strong disagreements over the interpretation of the governing statute that would likely have to be resolved by a Court of Appeals. Coglianese also asserts that the Equipment Leaks rule was the subject of review; it was, but only because the Clean Air Act requires parties to file challenges in a very short period, and a challenger therefore filed a defensive challenge while it worked out some minor details over the regulation. Those negotiations were successful and the challenge was withdrawn. The Chemical Manufacturers Association, the challenger, had no intention of a substantive challenge.89 Moreover, a challenge to other parts of the HON should not be ascribed to the Equipment Leaks part of the rule. The agreement in the Asbestos in Schools negotiation explicitly contemplated judicial review--strange, but true--and hence it came as no surprise and as no violation of the agreement. As for the Wood Furniture Rule, the challenges were withdrawn after informal negotiations in which EPA agreed to propose amendments to the rule.90 Similarly, the challenge to EPA's Disinfectant By-Products Rule91 was withdrawn. In short, the rules that have emerged from negotiated rulemaking have been remarkably resistant to substantive challenges. And, indeed, this far into the development of the process, the standard of review and the extent to which an agreement may be binding on either a signatory or someone whom a party purports to represent are still unknown--the speculation of many an administrative law class.92 Thus, here too, Coglianese paints a substantially misleading picture by failing to distinguish substantive challenges to rules that are based on a consensus from either challenges to issues that were not the subject of negotiations or were filed while some details were worked out. Properly understood, reg negs have been phenomenally successful in warding off substantive review.

b. More democratic--reg neg encourages private sector participation--means that regulations aren't unilaterally created by the USFG--the CP results in a fair playing field for the entirety of the private sector
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate
Change in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to
governance. Laura Langbein is the Professor of Quantitative Methods, Program Evaluation, Policy
Analysis, and Public Choice and American College. She holds a PhD in Political Science from the
University of North Carolina, a BA in Government from Oberlin College. Freeman, J. Langbein, R. I.
Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal, Volume 9,
2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

2. Negotiated Rulemaking Is Fairer to Regulated Parties than Conventional Rulemaking. To test whether reg neg was fairer to regulated parties, Kerwin and Langbein asked respondents whether EPA solicited their participation and whether they believed anyone was left out of the process. They also examined how much the parties learned in each process, and whether they experienced resource or information disparities. Negotiated rule participants were significantly more likely to say that the EPA encouraged their participation than conventional rule participants (65% versus 33% respectively). Although a higher proportion of conventional rulemaking participants reported that a party that should have been represented in the rulemaking was omitted, the difference is not statistically significant. Specifically, "a majority of both negotiated and conventional rule participants believed that the parties who should have been involved were involved (66% versus 52% respectively)." In addition, as reported above, participants in regulatory negotiations reported significantly more learning than their conventional rulemaking counterparts. Indeed, the disparity between the two types of participants in terms of their reports about learning was one of the study's most striking results. At the same time, the resource disadvantage of poorer, smaller groups was no greater in negotiated rulemaking than in conventional rulemaking. So, while smaller groups did report suffering from a lack of resources during regulatory negotiation, they reported the same in conventional rulemakings; no disparity existed between the two processes on this score. Finally, the data suggest that the agency is equally responsive to the parties in both negotiated and conventional rulemakings. This result, together with the finding that participants in regulatory negotiations perceived disproportionate influence to be about evenly distributed, suggests that reg neg is at least as fair to the parties as conventional rulemaking. Indeed, because participant learning was so much greater in regulatory negotiation, the process may in fact be more fair.

2NC Solves Better


Reg neg is better for complex rules
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I Langbein is the Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice and American College. She holds a PhD in Political
Science from the University of North Carolina, a BA in Government from Oberlin College. Freeman, J.
Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

4. Complex Rules Are More Likely To Be Settled Through Negotiated Rulemaking. Recall that theorists disagree over whether complex or simple issues are best suited for negotiation. The data suggest that negotiated and conventional rules differ in systematic ways, indicating that EPA officials do not select just any rule for negotiation. When asked how the issues for rulemaking were established, reg neg participants reported more often than their counterparts that the participants established at least some of them (44% versus 0%). Conventional rulemaking participants more often admitted to being uninformed of the process for establishing issues (17% versus 0%) or offered that regulated entities set the issues (11% to 0%). A majority of both groups reported that the EPA or the governing legislation established at least some of the issues. Kerwin and Langbein found that the types of issues indeed appeared to differ between negotiated and conventional rules. When asked about the type of issues to be decided, 52% of participants in conventional groups identified issues regarding the standard, including its level, timing, or measurement (compared to 31% of negotiated rule participants), while 58% of the negotiating group identified compliance and implementation issues (compared to 39% of participants in the conventional group). More reg neg participants (53%) also cited compliance issues as causing the greatest conflict, compared to 32% of conventional participants. Conventional participants more often reported that the rulemaking failed to resolve all of the issues (30% versus 14%), but also more often reported that they encountered no "surprise" issues (74% versus 44%). Participants perceived negotiated rules to be more complex, with more issues and more sides per issue than conventional rules. Kerwin and Langbein learned in interviews that reg neg participants tended to develop a more detailed view about the issues to be decided than did their conventional counterparts. The researchers interpreted this disparity in reported detail as a perception of complexity. To measure it they computed a complexity score: the more issues and the more sides to each issue that respondents in a rulemaking could identify, relative to the number of respondents, the more nuanced or complex the rulemaking. Using this calculation, the rules ranged in complexity from 1.9 to 5.0, with a mean complexity score of 3.6. The mean complexity score for reg negs (4.1) was significantly higher than the score (2.5) for conventional rulemaking. Reg neg participants also presented a clearer understanding of the issues to be decided than did conventional participants. To test clarity, Kerwin and Langbein developed a measure that would reflect the striking variation among respondents in the number of different issues and different sides they perceived in their rulemaking. Some respondents could identify very few separate issues and sides (e.g., "the level of the standard is the single issue and the sides are business, environmentalists, and EPA"), while others detected as many as four different issues, with three sides on some and two on others. Kerwin and Langbein's measurement was in units of issue/sides, representing a combination of the two variables, the recognition of which they were measuring; the mentions ranged from 3 to 10 issue/sides, with a mean of 7.9. Negotiated rulemaking participants mentioned an average of 8.9 issue/sides, compared to an average of 6 issue/sides mentioned by their conventional counterparts, a statistically significant difference. To illustrate the difference between complexity and clarity: If a party identified the compliance standard as the sole issue, but failed to identify a number of sub-issues, they would be classified as having a clear understanding but not a complex one. Similarly, if the party identified two sides (business vs. environment) without recognizing distinctions among business participants or within an environmental coalition, they would also be classified as clear but not complex in their understanding. The differences in complexity might be explained by the higher reported rates of learning by reg neg participants, rather than by differences in the types of rules processed by reg neg versus conventional rulemaking. Kerwin and Langbein found that complexity and clarity were both positively and significantly correlated with learning by respondents, but the association between learning and complexity/clarity disappeared when the type of rulemaking was held constant. However, when the amount learned was held constant, the association between complexity/clarity and the type of rulemaking remained positive and significant. This signifies that the association between learning and complexity/clarity was due to the negotiation process. In other words, the differences in complexity/clarity are not attributable to higher learning but rather to differences between the processes. The evidence is consistent with the hypothesis that issues selected for regulatory negotiation are different from and more complicated than those chosen for conventional rulemaking. The data associating reg negs with complexity, together with the finding that more issues settle in reg negs, are consistent with the proposition that issues with more (and more diverse) sub-issues and sides settle more easily than simple issues.

Reg neg is better than conventional rulemaking


Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I Langbein is the Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice and American College. She holds a PhD in Political
Science from the University of North Carolina, a BA in Government from Oberlin College. Freeman, J.
Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)
In this article, we present an original analysis and summary of new empirical evidence from Neil Kerwin and Laura
Langbein's two-phase study of Environmental Protection Agency (EPA) negotiated rulemakings. n5 Their qualitative
and (*62) quantitative data reveal more about reg neg than any empirical study to date; although not published in
a law review article until now, they unquestionably bear upon the ongoing debate among legal scholars over the
desirability of negotiating rules. Most importantly, this is the first study to compare participant attitudes toward
negotiated rulemaking with attitudes toward conventional rulemaking. The findings of the studies tend, on balance,
to undermine arguments made by the critics of regulatory negotiation and to bolster the claims of proponents.

Kerwin and Langbein found that, according to participants in the study, reg neg generates more learning, better quality rules, and higher satisfaction compared to conventional rulemaking. n6 At the same time, stakeholder influence on the agency remains about the same using either approach. n7 Based on the results, we recommend more frequent use of regulatory negotiation, accompanied by further comparative and empirical study, for the purposes of establishing regulatory standards and resolving implementation and compliance issues. This recommendation contradicts the prevailing view that the process is best used sparingly, n8 and even then, only for narrow questions of implementation. n9

Reg negs solve better


Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme Peoples Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing

the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.


http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)
The Primary Objective of Negotiated Rulemaking Is To Create Better and More Widely Accepted Rules. Coglianese argues throughout his article that the primary benefits of negotiated rules were seen by its advocates as being the reduction in time and in the incidence of litigation.93 While both benefits have been realized, neither was seen by those who established it as the predominant factor in its use. For example, Peter Schuck wrote an important early article in which he described the benefits of negotiated solutions over those imposed by a hierarchy.94 Schuck emphasized a number of shortcomings of the adjudicatory nature of hybrid rulemaking and many benefits of direct negotiations among the affected parties. The tenor of his thinking is reflected by his statement, "a bargained solution depends for its legitimacy not upon its objective rationality, inherent justice, or the moral capital of the institution that fashioned it, but upon the simple fact that it was reached by consent of the parties affected."95 And, it encourages diversity, stimulates the parties to develop relevant information about facts and values, provides a counter-weight to concentrations of power, and advances participation by those the decisions affect.96 Nowhere in his long list of benefits was either speed or reduced litigation, except by implication of the acceptability of the results. My own article that developed the recommendations97 on which the ACUS Recommendation,98 the Negotiated Rulemaking Act, and the practice itself are based describes the anticipated benefits of negotiated rulemaking: Negotiating has many advantages over the adversarial process. The parties participate directly and immediately in the decision. They share in its development and concur in it, rather than participate by submitting information that the decisionmaker considers in reaching the decision. Frequently, those who participate in the negotiations are closer to the ultimate decisionmaking authority of the interest they represent than traditional intermediaries that represent the interest in an adversarial proceeding. Thus, participants in negotiations can make substantive decisions, rather than acting as experts in the decisionmaking process. In addition, negotiation can be a less expensive means of decisionmaking because it reduces the need to engage in defensive research in anticipation of arguments made by adversaries. Undoubtedly the prime benefit of direct negotiations is that it enables the participants to focus squarely on their respective interests.99 The article quotes John Dunlop, a true pioneer in using negotiations among the affected interests in the public sphere,100 as saying "In our society, a rule that is developed with the involvement of the parties who are affected is more likely to be accepted and to be effective in accomplishing its intended purposes."101 Reducing time and litigation exposure was not emphasized if even mentioned directly. To be sure, the Congressional findings that precede the Negotiated Rulemaking Act mention the savings of time and litigation, but they are largely the by-product of far more significant benefits:102 (2) Agencies currently use rulemaking procedures that may discourage the affected parties from meeting and communicating with each other, and may cause parties with different interests to assume conflicting and antagonistic positions and to engage in expensive and time-consuming litigation over agency rules. (3) Adversarial rulemaking deprives the affected parties and the public of the benefits of face-to-face negotiations and cooperation in developing and reaching agreement on a rule. It also deprives them of the benefits of shared information, knowledge, expertise, and technical abilities possessed by the affected parties. (4) Negotiated rulemaking, in which the parties who will be significantly affected by a rule participate directly in the development of the rule, can provide significant advantages over adversarial rulemaking. (5) Negotiated rulemaking can increase the acceptability and improve the substance of rules, making it less likely that the affected parties will resist enforcement or challenge such rules in court. It may also shorten the amount of time needed to issue final rules. Thus, those who were present at the creation of reg neg sought neither expedition nor a shield against litigation. Rather, they saw direct negotiations among the parties--a form of representational democracy not explicitly recognized in the Administrative Procedure Act--as resulting in rules that are substantively better and more widely accepted. Those benefits were seen as flowing from the participation of those affected who bring with them a practical insight and expertise that can result in rules that are better informed, more tailored to achieving the actual regulatory goal and hence more effective, and able to be enforced.

Reg negs are the best type of negotiations


Hsu 02
(Shi-Ling Hsu is the Larson Professor of Law at the Florida State University College of Law. Professor
Hsu has a B.S. in Electrical Engineering from Columbia University, and a J.D. from Columbia Law
School. He also has an M.S. in Ecology and a Ph.D. in Agricultural and Resource Economics, both from
the University of California, Davis. Professor Hsu has taught in the areas of environmental and natural
resource law, law and economics, quantitative methods, and property. Prior to his current
appointment, Professor Hsu was a Professor of Law and Associate Dean for Special Projects at the
University Of British Columbia Faculty Of Law. He has also served as an Associate Professor at the
George Washington University Law School, a Senior Attorney and Economist for the Environmental
Law Institute in Washington D.C, and a Deputy City Attorney for the City and County of San Francisco.
A Game Theoretic Approach to Regulatory Negotiation: A Framework for Empirical Analysis, Harvard
Environmental Law Review, Vol 26, No 2, February 2002. http://papers.ssrn.com/sol3/papers.cfm?
abstract_id=282962//ghs-kw)

There are reasons to be optimistic about what regulatory negotiations can produce in even a troubled administrative state. Jody Freeman noted that one important finding from the Kerwin and Langbein studies were that parties involved in negotiated rulemaking were able to use the face-to-face contact as a learning experience.49 Barton Thompson has noted in his article on common-pool resources problems50 that one reason that resource users resist collective action solutions is that it is evidently human nature to blame others for the existence of resource shortages. That in turn leads to an extreme reluctance by resource users to agree to a collective action solution if it involves even the most minimal personal sacrifices. Thompson suggests that the one hope for curing resource users of such self-serving myopia is face-to-face contact and the exchange of views. The vitriol surrounding some environmental regulatory issues suggests that there is a similar human reaction occurring with respect to some resource conflicts.51 Solutions to environmental problems and resource conflicts on which regulated parties and environmental organizations hold such strong and disparate views may require face-to-face contact to defuse some of the tension and remove some of the demonization that has arisen in these conflicts. Reinvention, with the emphasis on negotiations and face-to-face contact, provides such an opportunity.52 Farber has argued for making the best of this trend towards regulatory negotiation characterizing negotiated rulemaking and reinvention.53 Faced with the reality that some negotiation will inevitably take place because of the slippage inherent in our system of regulation, Farber argues that the best model for allowing it to go forward is a bilateral one. A system of bilateral negotiation would clearly be superior to a system of self-regulation, as such a system would inevitably descend into a tragedy of the commons.54 But a system of bilateral negotiation between agencies and regulated parties would even be superior to a system of multilateral negotiation, due to the transaction costs of assembling all of the affected stakeholders in a multilateral effort, and the difficulties of reaching a consensus among a large number of parties. Moreover, multilateral negotiation gives rise to the troubling idea that there should be joint governance among the parties. Since environmental organizations lack the resources to participate in post-negotiation governance, there is a heightened danger of regulatory capture by the better-financed regulated parties.55 The correct balance between regulatory flexibility and accountability, argues Farber, is to allow bilateral negotiation but with built-in checks to ensure that the negotiation process is not captured by regulated parties. Built-in checks would include transparency, so that environmental organizations can monitor regulatory bargains, and the availability of citizen suits, so that environmental organizations could remedy regulatory bargains that exceed the dictates of the underlying statute. Environmental organizations would thus play the role of the watchdog, rather than the active participant in negotiations. The finding of Kerwin and Langbein that resource constraints sometimes caused environmental organizations, especially smaller local ones, to skip negotiated rulemakings would seem to support this conclusion.56 A much more efficient use of limited resources would require that the environmental organization attempt to play a deterrent role in monitoring negotiated rulemakings.

2NC Cybersecurity Solvency


Reg neg solves cybersecurity
Sales 13
(Sales, Nathan Alexander. Assistant Professor of Law, George Mason University School of Law.
REGULATING CYBERSECURITY, Northwestern University Law Review. 2013.
http://www.rwu.edu/sites/default/files/downloads/cyberconference/cyber_threats_and_cyber_realities_r
eadings.pdf//ghs-kw)

An alternative would be a form of enforced self-regulation324 in which private companies develop the new cybersecurity protocols in tandem with the government.325 These requirements would not be handed down by administrative agencies, but rather would be developed through a collaborative partnership in which both regulators and regulated would play a role. In particular, firms might prepare sets of industrywide security standards. (The National Industrial Recovery Act, famously invalidated by the Supreme Court in 1935, contained such a mechanism,326 and today the energy sector develops reliability standards in the same way.327) Or agencies could sponsor something like a negotiated rulemaking in which regulators, firms, and other stakeholders forge a consensus on new security protocols.328 In either case, agencies then would ensure compliance through standard administrative techniques like audits, investigations, and enforcement actions.329 This approach would achieve all four of the benefits of private action mentioned above: It avoids (some) problems with information asymmetries, takes advantage of distributed private sector knowledge about vulnerabilities and threats, accommodates rapid technological change, and promotes innovation. On the other hand, allowing firms to help set the standards that will be enforced against them may increase the risk of regulatory capture--the danger that agencies will come to promote the interests of the companies they regulate instead of the public's interests.330 The risk of capture is always present in regulatory action, but it is probably even more acute when regulated entities are expressly invited to the decisionmaking table.331

2NC Encryption Advocate


Here's a solvency advocate
DMCA 05
(Digital Millenium Copyright Act, Supplement in 2005. https://books.google.com/books?id=nL0s81xgVwC&pg=PA481&lpg=PA481&dq=encryption+AND+(+%22regulatory+negotiation%22+OR+
%22negotiated+rulemaking%22)&source=bl&ots=w9mrCaTJs4&sig=1mVsh_Kzk1p26dmT9_DjozgVQI&hl=en&sa=X&ved=0CB4Q6AEwAGoVChMIxtPG5YH9xgIVwx0eCh2uEgMJ#v=onepa
ge&q&f=false//ghs-kw)

Some encryption supporters advocate use of advisory committee and negotiated rulemaking procedures to achieve consensus around an encryption standard. See Motorola Comments at 10-11; Veridian Reply Comments at 20-23.

Reg negs are key to wireless technology innovation


Chamberlain 09
(Chamberlain, Inc. Comments before the Federal Communications Commission. 11-05-2009.
https://webcache.googleusercontent.com/search?
q=cache:dfYcw45dQZsJ:apps.fcc.gov/ecfs/document/view%3Bjsessionid
%3DSQnySfcTVd22hL6ZYShTpQYGY1X27xB14p3CS1y01XW15LQjS1jj!-1613185479!153728702%3Fid
%3D7020245982+&cd=2&hl=en&ct=clnk&gl=us//ghs-kw)

Chamberlain supports solutions that will balance the needs of stakeholders in both the licensed and unlicensed bands. Chamberlain and other manufacturers of unlicensed devices such as Panasonic are also uniquely able to provide valuable contributions from the perspective of unlicensed operators with a long history of innovation in the unlicensed bands. Moreover, as the Commission has recognized in recent proceedings, alternative mechanisms for gathering data and evaluating options may assist the Commission in reaching a superior result.19 For these reasons, Chamberlain would support a negotiated rulemaking process, the use of workshops--both large and small--or any other alternative process that ensures the widest level of participation from stakeholders across the wireless market.

2NC Privacy Solvency


Reg neg is key to privacy
Rubinstein 09
(Rubinstein, Ira S. Adjunct Professor of Law and Senior Fellow, Information Law Institute, New York
University School of Law. PRIVACY AND REGULATORY INNOVATION: MOVING BEYOND VOLUNTARY
CODES, Workshop for Federal Privacy Regulation, NYU School of Law. 10/2/2009.
https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-commentproject-no.p095416-544506-00103/544506-00103.pdf//ghs-kw)

Whatever its shortcoming, and despite its many critics, self-regulation is a recurrent theme in the US approach to online privacy and perhaps a permanent part of the regulatory landscape. This Article's goal has been to consider new strategies for overcoming observed weaknesses in self-regulatory privacy programs. It began by examining the FTC's intermittent embrace of self-regulation, and found that the Commission's most recent foray into self regulatory guidelines for online behavioral advertising is not very different from earlier efforts, which ended in frustration and a call for legislation. It also reviewed briefly the more theoretical arguments of privacy scholars for and against self-regulation, but concluded that the market oriented views of those who favor open information flows clashed with the highly critical views of those who detect a market failure and worry about the damaging consequences of profiling and surveillance not only to individuals, but to society and to democratic self-determination. These views seem irreconcilable and do not pave the way for any applied solutions. Next, this Article presented three case studies of mandated self-regulation. This included overviews of the NAI Principles and the SHA, as well as a more empirical analysis of the CARU safe harbor program. An assessment of these case studies against five criteria (completeness, free rider problems, oversight and enforcement, transparency, and formation of norms) concluded that self-regulation undergirded by law--in other words, a statutory safe harbor--is a more effective and efficient instrument than any self-regulatory guidelines in which industry is chiefly responsible for developing principles and/or enforcing them. In a nutshell, well-designed safe harbors enable policy makers to imagine new forms of self-regulation that build on its strengths while compensating for its weaknesses.268 This embrace of statutory safe harbors led to a discussion of how to improve them by importing second-generation strategies from environmental law. Rather than summarizing these strategies and how they translate into the privacy domain, this Article concludes with a set of specific recommendations based on the ideas discussed in Part III.C. If Congress enacts comprehensive privacy legislation based on FIPPs, the first recommendation is that the new law include a safe harbor program, which should echo the COPPA safe harbor to the extent of encouraging groups to submit self-regulatory guidelines and, if approved by the FTC, treat compliance with these guidelines as deemed compliance with statutory requirements. The FTC should be granted APA rulemaking powers to implement necessary rules including a safe harbor rule. Congress should also consider whether to mandate a negotiated rulemaking for an OBA safe harbor or for safe harbor programs more generally. In any case, FTC should give serious thought to using the negotiated rulemaking process in developing a safe harbor program or approving specific guidelines. In addition, the safe harbor program should be overhauled to reflect second-generation strategies. Specifically, the statute should articulate default requirements but allow FTC more discretion in determining whether proposed industry guidelines achieve desired outcomes, without firms having to match detailed regulatory requirements on a point by point basis.

2NC Fism NB
Reg negs are better and solve federalism--plan fails
Ryan 11
(Erin Ryan holds a B.A. 1991 Harvard-Radcliffe College, cum laude, M.A. 1994 Wesleyan University, J.D.
2001 Harvard Law School, cum laude. Erin Ryan teaches environmental and natural resources law,
property and land use, water law, negotiation, and federalism. She has presented at academic and
administrative venues in the United States, Europe, and Asia, including the Ninth Circuit Judicial
Conference, the U.S.D.A. Office of Ecosystem Services and Markets, and the United Nations Institute
for Training and Research. She has advised National Sea Grant multilevel governance studies
involving Chesapeake Bay and consulted with multiple institutions on developing sustainability
programs. She has appeared in the Chicago Tribune, the London Financial Times, the PBS Newshour
and Christian Science Monitors Patchwork Nation project, and on National Public Radio. She is the
author of many scholarly works, including Federalism and the Tug of War Within (Oxford, 2012).
Professor Ryan is a graduate of Harvard Law School, where she was an editor of the Harvard Law
Review and a Hewlett Fellow at the Harvard Negotiation Research Project. She clerked for Chief Judge
James R. Browning of the U.S. Court of Appeals for the Ninth Circuit before practicing environmental,
land use, and local government law in San Francisco. She began her academic career at the College of
William & Mary in 2004, and she joined the faculty at the Northwestern School of Law at Lewis & Clark
College in 2011. Ryan spent 2011-12 as a Fulbright Scholar in China, during which she taught
American law, studied Chinese governance, and lectured throughout Asia. Ryan, E. Boston Law Review,
2011. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583132//ghs-kw)
1. Negotiated Rulemaking. Although the most conventional of the less familiar forms, "negotiated rulemaking" between federal agencies and state stakeholders is a sparingly used tool that holds promise for facilitating sound administrative policymaking in disputed federalism contexts, such as those implicating environmental law, national security, and consumer safety. Under the Administrative Procedure Act, the traditional "notice and comment" administrative rulemaking process allows for a limited degree of participation by state stakeholders who comment on a federal agency's proposed rule. The agency publishes the proposal in the Federal Register, invites public comments critiquing the draft, and then uses its discretion to revise or defend the rule in response to comments. n256 Even this iterative process constitutes a modest negotiation, but it leaves participants so frequently unsatisfied that many agencies began to informally use more extensive negotiated rulemaking in the 1970s. n257 In 1990, Congress passed the Negotiated Rulemaking Act, amending the Administrative Procedure Act to allow a more dynamic [*52] and inclusive rulemaking process, n258 and a subsequent Executive Order required all federal agencies to consider negotiated rulemaking when developing regulations. n259 Negotiated rulemaking allows stakeholders much more influence over unfolding regulatory decisions. Under notice and comment, public participation is limited to criticism of well-formed rules in which the agency is already substantially invested. n260 By contrast, stakeholders in negotiated rulemaking collectively design a proposed rule that takes into account their respective interests and expertise from the beginning. n261 The concept, outline, and/or text of a rule is hammered out by an advisory committee of carefully balanced representation from the agency, the regulated public, community groups and NGOs, and state and local governments. n262 A professional intermediary leads the effort to ensure that all stakeholders are appropriately involved and to help interpret problem-solving opportunities. n263 Any consensus reached by the group becomes the basis of the proposed rule, which is still subject to public comment through the normal notice-and-comment procedures. n264 If the group does not reach consensus, then the agency proceeds through the usual notice-and-comment process. n265 The negotiated rulemaking process, a tailored version of interest group bargaining within established legislative constraints, can yield important benefits. n266 The process is usually more subjectively satisfying [*53] for all stakeholders, including the government agency representatives. n267 More cooperative relationships are established between the regulated parties and the agencies, facilitating future implementation and enforcement of new rules. n268 Final regulations include fewer technical errors and are clearer to stakeholders, so that less time, money and effort is expended on enforcement. n269 Getting a proposed rule out for public comment takes more time under negotiated rulemaking than standard notice and comment, but thereafter, negotiated rules receive fewer and more moderate public comment, and are less frequently challenged in court by regulated entities. n270 Ultimately, then, final regulations can be implemented more quickly following their debut in the Federal Register, and with greater compliance from stakeholders. n271 The process also confers valuable learning benefits on participants, who come to better understand the concerns of other stakeholders, grow invested in the consensus they help create, and ultimately campaign for the success of the regulations within their own constituencies. n272 Negotiated rulemaking offers additional procedural benefits because it ensures that agency personnel will be unambiguously informed about the full federalism implications of a proposed rule by the impacted state interests. Federal agencies are already required by executive order to prepare a federalism impact statement for rulemaking with federalism implications, n273 but the quality of state-federal communication within negotiated rulemaking enhances the likelihood that federal agencies will appreciate and understand the full extent of state [*54] concerns. Just as the consensus-building process invests participating stakeholders with respect for the competing concerns of other stakeholders, it invests participating agency personnel with respect for the federalism concerns of state stakeholders. n274 State-side federalism bargainers interviewed for this project consistently reported that they always prefer negotiated rulemaking to notice and comment--even if their ultimate impact remains small--because the products of fully informed federal consultation are always preferable to the alternative. n275

Reg negs solve federalism--traditional rulemaking fails


Ryan 11
(Erin Ryan holds a B.A. 1991 Harvard-Radcliffe College, cum laude, M.A. 1994 Wesleyan University, J.D.
2001 Harvard Law School, cum laude. Erin Ryan teaches environmental and natural resources law,
property and land use, water law, negotiation, and federalism. She has presented at academic and
administrative venues in the United States, Europe, and Asia, including the Ninth Circuit Judicial
Conference, the U.S.D.A. Office of Ecosystem Services and Markets, and the United Nations Institute
for Training and Research. She has advised National Sea Grant multilevel governance studies
involving Chesapeake Bay and consulted with multiple institutions on developing sustainability
programs. She has appeared in the Chicago Tribune, the London Financial Times, the PBS Newshour
and Christian Science Monitors Patchwork Nation project, and on National Public Radio. She is the
author of many scholarly works, including Federalism and the Tug of War Within (Oxford, 2012).
Professor Ryan is a graduate of Harvard Law School, where she was an editor of the Harvard Law
Review and a Hewlett Fellow at the Harvard Negotiation Research Project. She clerked for Chief Judge
James R. Browning of the U.S. Court of Appeals for the Ninth Circuit before practicing environmental,
land use, and local government law in San Francisco. She began her academic career at the College of
William & Mary in 2004, and she joined the faculty at the Northwestern School of Law at Lewis & Clark
College in 2011. Ryan spent 2011-12 as a Fulbright Scholar in China, during which she taught
American law, studied Chinese governance, and lectured throughout Asia. Ryan, E. Boston Law Review,
2011. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583132//ghs-kw)

Unsurprisingly, bargaining in which the normative leverage of federalism values heavily influences the exchange offers the most reliable interpretive tools, smoothing out leverage imbalances and focusing bargainers' interlinking interests. n619 Negotiations in which participants are motivated by shared regard for checks, localism, accountability, and synergy naturally foster constitutional process and hedge against non-consensual dealings. All federalism bargaining trades on the normative values of federalism to some degree, and any given negotiation may feature it more or less prominently based on the factual particulars. n620 Yet the taxonomy reveals several forms in which federalism values predominate by design, and which may prove especially valuable in fraught federalism contexts: negotiated rulemaking, policymaking laboratory negotiations, and iterative federalism. n621 These examples indicate the potential for purposeful federalism engineering to reinforce procedural regard for state and federal roles within the American system. (1) Negotiated Rulemaking between state and federal actors improves upon traditional administrative rulemaking in fostering participation, localism, and synergy by incorporating genuine state input into federal regulatory planning. n622 Most negotiated rulemaking also uses professional intermediaries to ensure that all stakeholders are appropriately engaged and to facilitate the search for outcomes that meet parties' dovetailing interests. n623 For example, after discovering that extreme local variability precluded a uniform federal program, Phase II stormwater negotiators invited municipal dischargers to design individually [*123] tailored programs within general federal limits. n624 Considering the massive number of municipalities involved, the fact that the rule faced legal challenge from only a handful of Texas municipalities testifies to the strength of the consensus through which it was created. By contrast, the iterative exchange within standard notice-and-comment rulemaking--also an example of federalism bargaining--can frustrate state participation by denying participants meaningful opportunities for consultation, collaborative problem-solving, and real-time accountability. The contrast between notice-and-comment and negotiated rulemaking, exemplified by the two phases of REAL ID rulemaking, demonstrates the difference between more and less successful instances of federalism bargaining. n625 Moreover, the difficulty of asserting state consent to the products of the REAL ID notice-and-comment rulemaking (given the outright rebellion that followed) limits its interpretive potential. Negotiated rulemakings take longer than other forms of administrative rulemaking, but are more likely to succeed over time. Regulatory matters best suited for state-federal negotiated rulemaking include those in which a decisive federal rule is needed to overcome spillover effects, holdouts, and other collective action problems, but unique and diverse state expertise is needed for the creation of wise policy. Matters in contexts of overlap least suited for negotiated rulemaking include those in which the need for immediate policy overcomes the need for broad participation--but even these leave open possibilities for incremental rulemaking, in which the initial federal rule includes mechanisms for periodic reevaluation with local input.

2NC Fism NB Heg Impact


Fast growth promotes US leadership and solves great power
war
Khalilzad 11 PhD, Former Professor of Political Science @ Columbia,
Former ambassador to Iraq and Afghanistan
(Zalmay Khalilzad was the United States ambassador to Afghanistan, Iraq,
and the United Nations during the presidency of George W. Bush and the
director of policy planning at the Defense Department from 1990 to 1992.
"The Economy and National Security" Feb 8
http://www.nationalreview.com/articles/259024/economy-and-nationalsecurity-zalmay-khalilzad)//BB
Today, economic and fiscal trends pose the most severe long-term threat to the United States' position as global leader. While the United States suffers from fiscal imbalances and low economic growth, the economies of rival powers are developing rapidly. The continuation of these two trends could lead to a shift from American primacy toward a multi-polar global system, leading in turn to increased geopolitical rivalry and even war among the great powers. The current recession is the result of a deep financial crisis, not a mere fluctuation in the business cycle. Recovery is likely to be protracted. The crisis was preceded by the buildup over two decades of enormous amounts of debt throughout the U.S. economy, ultimately totaling almost 350 percent of GDP, and the development of credit-fueled asset bubbles, particularly in the housing sector. When the bubbles burst, huge amounts of wealth were destroyed, and unemployment rose to over 10 percent. The decline of tax revenues and massive countercyclical spending put the U.S. government on an unsustainable fiscal path. Publicly held national debt rose from 38 to over 60 percent of GDP in three years. Without faster economic growth and actions to reduce deficits, publicly held national debt is projected to reach dangerous proportions. If interest rates were to rise significantly, annual interest payments, which already are larger than the defense budget, would crowd out other spending or require substantial tax increases that would undercut economic growth. Even worse, if unanticipated events trigger what economists call a sudden stop in credit markets for U.S. debt, the United States would be unable to roll over its outstanding obligations, precipitating a sovereign-debt crisis that would almost certainly compel a radical retrenchment of the United States internationally. Such scenarios would reshape the international order. It was the economic devastation of Britain and France during World War II, as well as the rise of other powers, that led both countries to relinquish their empires. In the late 1960s, British leaders concluded that they lacked the economic capacity to maintain a presence east of Suez. Soviet economic weakness, which crystallized under Gorbachev, contributed to their decisions to withdraw from Afghanistan, abandon Communist regimes in Eastern Europe, and allow the Soviet Union to fragment. If the U.S. debt problem goes critical, the United States would be compelled to retrench, reducing its military spending and shedding international commitments. We face this domestic challenge while other major powers are experiencing rapid economic growth. Even though countries such as China, India, and Brazil have profound political, social, demographic, and economic problems, their economies are growing faster than ours, and this could alter the global distribution of power. These trends could in the long term produce a multi-polar world. If U.S. policymakers fail to act and other powers continue to grow, it is not a question of whether but when a new international order will emerge. The closing of the gap between the United States and its rivals could intensify geopolitical competition among major powers, increase incentives for local powers to play major powers against one another, and undercut our will to preclude or respond to international crises because of the higher risk of escalation. The stakes are high. In modern history, the longest period of peace among the great powers has been the era of U.S. leadership. By contrast, multi-polar systems have been unstable, with their competitive dynamics resulting in frequent crises and major wars among the great powers. Failures of multi-polar international systems produced both world wars. American retrenchment could have devastating consequences. Without an American security blanket, regional powers could rearm in an attempt to balance against emerging threats. Under this scenario, there would be a heightened possibility of arms races, miscalculation, or other crises spiraling into all-out conflict. Alternatively, in seeking to accommodate the stronger powers, weaker powers may shift their geopolitical posture away from the United States. Either way, hostile states would be emboldened to make aggressive moves in their regions.

Slow growth leads to hegemonic wars--relative gap is key


Goldstein 7 - Professor of Global Politics and International Relations @
University of Pennsylvania,
(Avery Goldstein, Power transitions, institutions, and China's rise in East
Asia: Theoretical expectations and evidence, Journal of Strategic Studies,
Volume30, Issue 4 & 5 August, EBSCO)
Two closely related, though distinct, theoretical arguments focus explicitly on the consequences for international politics of a shift in power between a dominant state and a rising power. In War and Change in World Politics, Robert Gilpin suggested that peace prevails when a dominant state's capabilities enable it to govern an international order that it has shaped. Over time, however, as economic and technological diffusion proceeds during eras of peace and development, other states are empowered. Moreover, the burdens of international governance drain and distract the reigning hegemon, and challengers eventually emerge who seek to rewrite the rules of governance. As the power advantage of the erstwhile hegemon ebbs, it may become desperate enough to resort to the ultima ratio of international politics, force, to forestall the increasingly urgent demands of a rising challenger. Or as the power of the challenger rises, it may be tempted to press its case with threats to use force. It is the rise and fall of the great powers that creates the circumstances under which major wars, what Gilpin labels hegemonic wars, break out.13 Gilpin's argument logically encourages pessimism about the implications of a rising China. It leads to the expectation that international trade, investment, and technology transfer will result in a diffusion of American economic power, benefiting the rapidly developing states of the world, including China. As the US simultaneously scurries to put out the many brushfires that threaten its far-flung global interests (i.e., the classic problem of overextension), it will be unable to devote sufficient resources to maintain or restore its former advantage over emerging competitors like China. While the erosion of the once clear American advantage plays itself out, the US will find it ever more difficult to preserve the order in Asia that it created during its era of preponderance. The expectation is an increase in the likelihood for the use of force, either by a Chinese challenger able to field a stronger military in support of its demands for greater influence over international arrangements in Asia, or by a besieged American hegemon desperate to head off further decline. Among the trends that alarm those who would look at Asia through the lens of Gilpin's theory are China's expanding share of world trade and wealth (much of it resulting from the gains made possible by the international economic order a dominant US established); its steady acquisition of technology in key sectors that have both civilian and military applications (e.g., information, communications, and electronics linked with the revolution in military affairs); and an expanding military burden for the US (as it copes with the challenges of its global war on terrorism and especially its struggle in Iraq) that limits the resources it can devote to preserving its interests in East Asia.14 Although similar to Gilpin's work insofar as it emphasizes the importance of shifts in the capabilities of a dominant state and a rising challenger, the power-transition theory A. F. K. Organski and Jacek Kugler present in The War Ledger focuses more closely on the allegedly dangerous phenomenon of crossover--the point at which a dissatisfied challenger is about to overtake the established leading state.15 In such cases, when the power gap narrows, the dominant state becomes increasingly desperate. Though suggesting why a rising China may ultimately present grave dangers for international peace when its capabilities make it a peer competitor of America, Organski and Kugler's power-transition theory is less clear about the dangers while a potential challenger still lags far behind and faces a difficult struggle to catch up. This clarification is important in thinking about the theory's relevance to interpreting China's rise because a broad consensus prevails among analysts that Chinese military capabilities are at a minimum two decades from putting it in a league with the US in Asia.16 Their theory, then, points with alarm to trends in China's growing wealth and power relative to the United States, but especially looks ahead to what it sees as the period of maximum danger--that time when a dissatisfied China could be in a position to overtake the US on dimensions believed crucial for assessing power. Reports beginning in the mid-1990s that offered extrapolations suggesting China's growth would give it the world's largest gross domestic product (GDP aggregate, not per capita) sometime in the first few decades of the twentieth century fed these sorts of concerns about a potentially dangerous challenge to American leadership in Asia.17 The huge gap between Chinese and American military capabilities (especially in terms of technological sophistication) has so far discouraged prediction of comparably disquieting trends on this dimension, but inklings of similar concerns may be reflected in occasionally alarmist reports about purchases of advanced Russian air and naval equipment, as well as concern that Chinese espionage may have undermined the American advantage in nuclear and missile technology, and speculation about the potential military purposes of China's manned space program.18 Moreover, because a dominant state may react to the prospect of a crossover and believe that it is wiser to embrace the logic of preventive war and act early to delay a transition while the task is more manageable, Organski and Kugler's power-transition theory also provides grounds for concern about the period prior to the possible crossover.19

2NC Ptix NB
Reg negs are bipartisan
Copeland 06
(Curtis W. Copeland, PhD, was formerly a specialist in American government at the Congressional
Research Service (CRS) within the U.S. Library of Congress. Copeland received his PhD degree in
political science from the University of North Texas.His primary area of expertise is federal rulemaking
and regulatory policy. Before coming to CRS in January 2004, Dr. Copeland worked at the U.S. General
Accounting Office (GAO, now the Government Accountability Office) for 23 years on a variety of issues,
including federal personnel policy, pay equity, ethics, procurement policy, management reform, the
Office of Management and Budget (OMB), and, since the mid-1990s, multiple aspects of the federal
rulemaking process. At CRS, he wrote reports and testified before Congress on such issues as federal
rulemaking, regulatory reform, the Congressional Review Act, negotiated rulemaking, the Paperwork
Reduction Act, the Regulatory Flexibility Act, OMBs Office of Information and Regulatory Affairs,
Executive Order 13422, midnight rulemaking, peer review, and risk assessment. He has also written
and testified on federal personnel policies, the federal workforce, GAOs pay-for-performance system,
and efforts to oversee the implementation of the Troubled Asset Relief Program. From 2004 until 2007,
Dr. Copeland headed the Executive Branch Operations section within CRSs Government and Finance
Division. Copeland, C. W. Negotiated Rulemaking, Congressional Research Service, September 18,
2006. http://crs.wikileaks-press.org/RL32452.pdf//ghs-kw)

Negotiated rulemaking (sometimes referred to as regulatory negotiation or reg-neg) is a supplement to the traditional APA rulemaking process in which agency representatives and representatives of affected parties work together to develop what can ultimately become the text of a proposed rule.1 In this approach, negotiators try to reach consensus by evaluating their priorities and making tradeoffs, with the end result being a draft rule that is mutually acceptable. Negotiated rulemaking has been encouraged (although not usually required) by both congressional and executive branch actions, and has received bipartisan support as a way to involve affected parties in rulemaking before agencies have developed their proposals. Some questions have been raised, however, regarding whether the approach actually speeds rulemaking or reduces litigation.

Reg neg solves controversy--no link to ptix


Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme Peoples Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Recent Agency Use of Reg Neg. And, indeed, in the past few years agencies have used reg neg to develop some of their most contentious rules. For example, the Federal Aviation Administration and the National Park Service used a variant of the process to write the regulations and policies governing sightseeing flights over national parks; the issue had been sufficiently controversial that the President had to intervene and direct the two agencies to develop rules for the management of sightseeing aircraft in the National Parks where it is deemed necessary to reduce or prevent the adverse effects of such aircraft.22 The Department of Transportation used it to write a regulation governing the delivery of propane and other compressed gases when the regulation became ensnared in litigation and Congressional action.23 The Occupational Safety and Health Administration used it to address the erection of steel structures, an issue that had been on its docket for more than a decade with two abortive attempts at rulemaking when OSHA turned to reg neg.24 The Forest Service has just published a notice of intent to establish a reg neg committee to develop policies governing the use of fixed anchors for rock climbing in designated wilderness areas administered by the Forest Service.25 This issue has become extremely controversial.26 Negotiated rulemaking has proven enormously successful in developing agreements in highly polarized situations and has enabled the parties to address the best, most effective or efficient way of solving a regulatory controversy. Agencies have therefore turned to it to help resolve particularly difficult, contentious issues that have eluded closure by means of traditional rulemaking procedures.

2NC CP Solves Ptix Link


The counterplan breaks down adversarialism, is seen as
legitimate, and is key to effective regulation
Mee 97
(Siobhan, JD, an attorney in the Complex and Class Action Litigation Group, focuses her practice on a broad range of commercial litigation, Negotiated Rulemaking and Combined Sewer Overflows (CSOs): Consensus Saves Ossification?, Fall 1997, 25 B.C. Envtl. Aff. L. Rev. 213, pg lexis//um-ef)
Benefits that accrue to negotiated rulemaking participants correspond to the criticisms of traditional rulemaking. n132 In particular, proponents of negotiated rulemaking claim that it increases public participation, n133 fosters nonadversarial relationships, n134 and reduces long-term regulatory costs. n135 Traditionally, agencies have limited the avenues for public participation in the rulemaking process to reaction and criticism, releasing rules for the public's comment after they have been developed [*229] internally. n136 In contrast, negotiated rulemaking elicits wider involvement at the early stages of production. n137 Input from non-agency and nongovernmental actors, who may possess the most relevant knowledge and who will be most affected by the rule, is a prerequisite to effective regulation. n138 Increased participation also leads to what Professor Harter considers the overarching benefit of negotiations: greater legitimacy. n139 Whereas traditional rulemaking lends itself to adversarialism, n140 negotiated rulemaking is designed to foster cooperation and accommodation. n141 Rather than clinging to extreme positions, parties prioritize the underlying issues and seek trade-offs to maximize their overall interests. n142 Participants, including the agency, discover and address one another's concerns directly. n143 The give-and-take of this process provides an opportunity for parties with differing viewpoints to test data and arguments directly. n144 The resultant exploration of different approaches is more likely than the usual notice and comment process to generate creative solutions and avoid ossification. n145 [*230] Whether or not it results in a rule, negotiated rulemaking establishes valuable links between groups that otherwise would only communicate in an adversarial context. n146 Rather than trying to outsmart one another, former competitors become part of a team which must consider the needs of each member. n147 Working relationships developed during negotiations give participants an understanding of the other side. n148 As one negotiator reflected, in "working with the opposition you find they're not quite the ogres you thought they were, and they don't hate you as much as you thought." n149 The chance to iron out what are often long-standing disagreements can only improve future interactions. n150

2NC AT Perm do Both


Perm do both links to the net benefit--it does the entirety of the AFF, which _____________

2NC AT Perm do the CP


CP is plan minus since it only mandates the creation of a reg neg committee--it only does the plan if and only if the committee decides to do so--that means that the CP is uncertain. Perm severs the certainty of the plan:
Substantially means certain and real
Words and Phrases 1964 (40 W&P 759) (this edition of W&P is out of print;
the page number no longer matches up to the current edition and I was
unable to find the card in the new edition. However, this card is also
available on google books, Judicial and statutory definitions of words and
phrases, Volume 8, p. 7329)
The words outward, open, actual, visible, substantial, and exclusive, in connection with a change of possession, mean substantially the same thing. They mean not concealed; not hidden; exposed to view; free from concealment, dissimulation, reserve, or disguise; in full existence; denoting that which not merely can be, but is; opposed to potential, apparent, constructive, and imaginary; veritable; genuine; certain; absolute; real at present time, as a matter of fact, not merely nominal; opposed to form; actually existing; true; not including admitting, or pertaining to any others; undivided; sole; opposed to inclusive. Bass v. Pease, 79 Ill. App. 308, 318.

Should means must--it's certain


Supreme Court of Oklahoma 94
(Kelsey v. Dollarsaver Food Warehouse of Durant, Supreme
Court of Oklahoma, 1994.
http://www.oscn.net/applications/oscn/DeliverDocument.asp?
CiteID=20287#marker3fn14//ghs-kw)
The turgid phrase - "should be and the same hereby is" - is a tautological absurdity. This is so because "should" is synonymous with ought or must
and is in itself sufficient to effect an inpraesenti ruling - one that is couched in "a present indicative synonymous with ought." See infra
note 15. 3 Carter v. Carter, Okl., 783 P.2d 969, 970 (1989); Horizons, Inc. v. Keo Leasing Co., Okl., 681 P.2d 757, 759 (1984); Amarex, Inc. v. Baker, Okl.,
655 P.2d 1040, 1043 (1983); Knell v. Burnes, Okl., 645 P.2d 471, 473 (1982); Prock v. District Court of Pittsburgh County, Okl., 630 P.2d 772, 775 (1981);
Harry v. Hertzler, 185 Okl. 151, 90 P.2d 656, 659 (1939); Ginn v. Knight, 106 Okl. 4, 232 P. 936, 937 (1925). 4 "Recordable" means that by force of 12 O.S.
1991 24 an instrument meeting that section's criteria must be entered on or "recorded" in the court's journal. The clerk may "enter" only that which is
"on file." The pertinent terms of 12 O.S. 1991 24 are: "Upon the journal record required to be kept by the clerk of the district court in civil cases . . . shall
be entered copies of the following instruments on file: 1. All items of process by which the court acquired jurisdiction of the person of each defendant in
the case; and 2. All instruments filed in the case that bear the signature of the and judge and specify clearly the relief granted or order made." [Emphasis
added.] 5 See 12 O.S. 1991 1116 which states in pertinent part: "Every direction of a court or judge made or entered in writing, and not included in a
judgment is an order." [Emphasis added.] 6 The pertinent terms of 12 O.S. 1993 696.3 , effective October 1, 1993, are: "A. Judgments, decrees and
appealable orders that are filed with the clerk of the court shall contain: 1. A caption setting forth the name of the court, the names and designation of the
parties, the file number of the case and the title of the instrument; 2. A statement of the disposition of the action, proceeding, or motion, including a
statement of the relief awarded to a party or parties and the liabilities and obligations imposed on the other party or parties; 3. The signature and title of
the court; . . ." 7 The court holds that the May 18 memorial's recital that "the Court finds that the motions should be overruled" is a "finding" and not a
ruling. In its pure form, a finding is generally not effective as an order or judgment. See, e.g., Tillman v. Tillman, 199 Okl. 130, 184 P.2d 784 (1947), cited in
the court's opinion. 8 When ruling upon a motion for judgment n.o.v. the court must take into account all the evidence favorable to the party against
whom the motion is directed and disregard all conflicting evidence favorable to the movant. If the court should conclude the motion is sustainable, it must
hold, as a matter of law, that there is an entire absence of proof tending to show a right to recover. See Austin v. Wilkerson, Inc., Okl., 519 P.2d 899, 903
(1974). 9 See Bullard v. Grisham Const. Co., Okl., 660 P.2d 1045, 1047 (1983), where this court reviewed a trial judge's "findings of fact", perceived as a
basis for his ruling on a motion for judgment n.o.v. (in the face of a defendant's reliance on plaintiff's contributory negligence). These judicial findings were
held impermissible as an invasion of the providence of the jury and proscribed by OKLA. CONST. ART, 23, 6 . Id. at 1048. 10 Everyday courthouse
parlance does not always distinguish between a judge's "finding", which denotes nisi prius resolution of fact issues, and "ruling" or "conclusion of law". The
latter resolves disputed issues of law. In practice usage members of the bench and bar often confuse what the judge "finds" with what that official
"concludes", i.e., resolves as a legal matter. 11 See Fowler v. Thomsen, 68 Neb. 578, 94 N.W. 810, 811-12 (1903), where the court determined a ruling that
"[1] find from the bill of particulars that there is due the plaintiff the sum of . . ." was a judgment and not a finding. In reaching its conclusion the court
reasoned that "[e]ffect must be given to the entire in the docket according to the manifest intention of the justice in making them." Id., 94 N.W. at 811. 12
When the language of a judgment is susceptible of two interpretations, that which makes it correct and valid is preferred to one that would render it
erroneous. Hale v. Independent Powder Co., 46 Okl. 135, 148 P. 715, 716 (1915); Sharp v. McColm, 79 Kan. 772, 101 P. 659, 662 (1909); Clay v. Hildebrand,
34 Kan. 694, 9 P. 466, 470 (1886); see also 1 A.C. FREEMAN LAW OF JUDGMENTS 76 (5th ed. 1925). 13 "Should" not only is used as a "present indicative"
synonymous with ought but also is the past tense of "shall" with various shades of meaning not always easy to analyze. See 57 C.J. Shall 9, Judgments
121 (1932). O. JESPERSEN, GROWTH AND STRUCTURE OF THE ENGLISH LANGUAGE (1984); St. Louis & S.F.R. Co. v. Brown, 45 Okl. 143, 144 P. 1075, 1080-81 (1914). For a more detailed explanation, see the Partridge quotation infra note 15. Certain contexts mandate a construction of the term "should" as more than merely indicating preference or desirability. Brown, supra at 1080-81 (jury instructions stating that jurors "should" reduce the amount of damages in proportion to the amount of contributory negligence of the plaintiff was held to imply an obligation and to be more than advisory); Carrigan v. California Horse Racing Board, 60 Wash. App. 79, 802 P.2d 813 (1990) (one of the Rules of Appellate Procedure requiring that a party "should devote a section of the brief to the request for the fee or expenses" was interpreted to mean that a party is under an obligation to include the requested segment); State v. Rack, 318 S.W.2d 211, 215 (Mo. 1958) ("should" would mean the same as "shall" or "must" when used in an instruction to the jury which tells the triers they "should disregard false testimony").

2NC AT Theory
Counterinterp: process CPs are legitimate if we have a solvency advocate
AND, process CPs good:
7. Key to education--we need to be able to debate the desirability of the plan's regulatory process; testing all angles of the AFF is key to determine the best policy option
8. Key to neg ground--it's the only CP we can run against regulatory AFFs
9. Predictability and fairness--there's a huge lit base and a solvency advocate ensures it's predictable
Applegate 98
(John S. Applegate holds a law degree from Harvard Law School and a bachelors degree in
English from Haverford College. Nationally recognized for his work in environmental risk
assessment and policy analysis, Applegate has written books and articles on the regulation of
toxic substances, defense nuclear waste, public participation in environmental decisions, and
international environmental law. He serves on the National Academy of Sciences Nuclear and
Radiation Studies Board. In addition, he is an award-winning teacher, known for his ability to
present complex information with an engaging style and wry wit. Before coming to IU,
Applegate was the James B. Helmer, Jr. Professor of Law at the University of Cincinnati College
of Law. He also was a visiting professor at the Vanderbilt University School of Law. From 1983
to 1987, Applegate practiced environmental law in Washington, D.C., with the law firm of
Covington & Burling. He clerked for the late Judge Edward S. Smith of the U.S. Court of
Appeals for the Federal Circuit. John S. Applegate was named Indiana Universitys first vice
president for planning and policy in July 2008. In March 2010, his portfolio was expanded and
his title changed to vice president for university regional affairs, planning, and policy. In
February 2011, he became executive vice president for regional affairs, planning, and policy.
As Executive Vice President for University Academic Affairs since 2013, his office ensures
coordination of university academic matters, strategic plans, external academic relations,
enterprise systems, and the academic policies that enable the university to most effectively
bring its vast intellectual resources to bear in serving the citizens of the state and nation. The
regional affairs mission of OEVPUAA is to lead the development of a shared identity and
mission for all of IU's regional campuses that complements each campus's individual identity
and mission. In addition, Executive Vice President Applegate is responsible for public safety
functions across the university, including police, emergency management, and environmental
health and safety. In appointing him in 2008, President McRobbie noted that "John Applegate
has proven himself to be very effective at many administrative and academic initiatives that
require a great deal of analysis and coordination within the university and with external
agencies, including the Indiana Commission for Higher Education. His experience and
understanding of both academia and the law make him almost uniquely suited to take on
these responsibilities. In 2006, John Applegate was appointed Indiana Universitys first
Presidential Fellow, a role in which he served both President Emeritus Adam Herbert and
current President Michael McRobbie. A distinguished environmental law scholar, Applegate
joined the IU faculty in 1998. He is the Walter W. Foskett Professor of Law at the Indiana
University Maurer School of Law in Bloomington and also served as the schools executive
associate dean for academic affairs from 2002-2009. Applegate, J. S. Beyond the Usual
Suspects: The Use of Citizen Advisory Boards in Environmental Decisionmaking, Indiana Law
Journal, Volume 73, Issue 3, July 1, 1998.
http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=1939&context=ilj//ghs-kw)

There is substantial literature on negotiated rulemaking. The interested reader might begin with the Negotiated Rulemaking Act of 1990, 5 U.S.C. 561-570 (1994 & Supp. II 1996), Freeman, supra note 53, Philip J. Harter, Negotiating Regulations: A Cure for Malaise, 71 GEO. L.J. 1 (1982), Henry E. Perritt, Jr., Negotiated Rulemaking Before Federal Agencies: Evaluation of the Recommendations by the Administrative Conference of the United States, 74 GEO. L.J. 1625 (1986), Lawrence Susskind & Gerard McMahon, The Theory and Practice of Negotiated Rulemaking, 3 YALE J. ON REG. 133 (1985), and an excellent, just-published issue on regulatory negotiation, Twenty-Eighth Annual Administrative Law Issue, 46 DUKE L.J. 1255 (1997)

10. Decision making skills--reg neg is uniquely key to decision making skills
Fiorino 88
(Daniel J. Fiorino holds a PhD & MA in Political Science from Johns Hopkins University and a BA
in Political Science & Minor in Economics from Youngstown State University. Daniel J. Fiorino is
the Director of the Center for Environmental Policy and Executive in Residence in the School of
Public Affairs at American University. As a faculty member in the Department of Public
Administration and Policy, he teaches courses on environmental policy, energy and climate
change, environmental sustainability, and public management. Dan is the author or co-author
of four books and some three dozen articles and book chapters in his field. According to
Google Scholar, his work has been cited some 2300 times in the professional literature. His
book, The New Environmental Regulation, won the Brownlow Award of the National Academy
of Public Administration (NAPA) for excellence in public administration literature in 2007.
Altogether his publications have received nine national and international awards from the
American Society for Public Administration, Policy Studies Organization, Academy of
Management, and NAPA. His most recent refereed journal articles were on the role of
sustainability in Public Administration Review (2010); explanations for differences in national
environmental performance in Policy Sciences (2011); and technology innovation in renewable
energy in Policy Studies Journal (2013). In 2009 he was a Public Policy Scholar at the Woodrow
Wilson International Center for Scholars. He also serves as an advisor on environmental and
sustainability issues for MDB, Inc., a Washington, DC consulting firm. Dan joined American
University in 2009 after a career at the U.S. Environmental Protection Agency (EPA). Among
his positions at EPA were the Associate Director of the Office of Policy Analysis, Director of the
Waste and Chemicals Policy Division, Senior Advisor to the Assistant Administrator for Policy,
and the Director of the National Environmental Performance Track. The Performance Track
program was selected as one of the top 50 innovations in American government 2006 and
recognized by Administrator Christine Todd Whitman with an EPA Silver Medal in 2002. In
1993, he received EPA's Lee M. Thomas Award for Management Excellence. He has appeared
on or been quoted in several media outlets: the Daily Beast, Newsweek, Christian Science
Monitor, Australian Broadcasting Corporation, Agence France-Presse, and CCTV, on such topics
as air quality, climate change, the BP Horizon Oil Spill, carbon trading, EPA, and U.S.
environmental and energy politics. He currently is co-director of a project on Conceptual
Innovations in Environmental Policy with James Meadowcroft of Carleton University, funded
by the Canada Research Council on Social Sciences and the Humanities. He is a member of the
Partnership on Technology and the Environment with the Heinz Center, Environmental Defense
Fund, Nicholas Institute, EPA, and the Wharton School. He is conducting research on the role
of sustainability in policy analysis and the effects of regulatory policy design and
implementation on technology innovation. In 2013, he created the William K. Reilly Fund for
Environmental Governance and Leadership within the Center for Environmental Policy, working
with associates of Mr. Reilly and several corporate and other sponsors. He is a Fellow of the
National Academy of Public Administration. Dan is co-editor, with Robert Durant, of the
Routledge series on Environmental Sustainability and Public Administration. He is often is
invited to speak to business and academic audiences, most recently as the keynote speaker at
a Tel Aviv University conference on environmental regulation in May 2013. In the summer of
2013 he will present lectures and take part in several events as the Sir Frank Holmes Visiting
Fellow at Victoria University in New Zealand. Fiorino, D. J. Regulatory Negotiations as a Policy
Process, Public Administration Review, Vol 48, No 4, pp 764-772, July-August 1988.
http://www.jstor.org/discover/10.2307/975600?
uid=3739728&uid=2&uid=4&uid=3739256&sid=21104541489843//ghs-kw)

Thus, in its premises, objectives, and techniques, regulatory negotiation reflects the trend toward alternative dispute settlement. However, because regulatory negotiation is prospective and general in its application rather than limited to a specific dispute, it also reflects another theme in American public policy making. That theme is pluralism, or what Robert Reich has described in the context of administrative rulemaking "interest-group mediation" (Reich 1985, pp. 1619-1620).[20] Reich's analysis sheds light on negotiation as a form of regulatory policy making, especially its contrasts with more analytical policy models. Reich proposes interest-group mediation and net-benefit maximization as the two visions that dominate administrative policy making. The first descends from pluralist political science and was more influential in the 1960s and early 1970s. The second descends from decision theory and micro-economics, and it was more influential in the late 1970s and early 1980s. In the first, the administrator is a referee who brings affected interests into the policy process to reconcile their demands and preferences. In the net-benefit model, the administrator is an analyst who defines policy options, quantifies the likely consequences of each, compares them to a given set of objectives, and then selects the option offering the greatest net benefit or social utility. Under the interest-group model, objectives emerge from the bargaining among influential groups, and a good decision is one to which the parties will agree. Under the net-benefit model, objectives are articulated in advance as external guides to the policy process. A good decision is one that meets the criterion of economic efficiency, defined ideally as a state in which no one party can improve its position without worsening that of another.21

11. Policy education: reg negs are a key part of the policy process
Spector 99,
(Bertram I. Spector, Senior Technical Director at Management Systems International (MSI) and
Executive Director of the Center for Negotiation Analysis. Ph.D. in Political Science from New
York University, May, 1999, "Negotiated Rulemaking: A Participative Approach to Consensus-Building for Regulatory Development and Implementation," Technical Notes: A Publication of
USAID's Implementing Policy Change Project, http://www.negotiations.org/Tn-10%20%20Negotiated%20Rulemaking.pdf) AJ

Why use negotiated rulemaking? What are the implications for policy reform, the
implementation of policy changes, and conflict between stakeholders and government? First, the
process generates an environment for dialogue that facilitates the reality
testing of regulations before they are implemented. It enables policy reforms
to be discussed in an open forum by stakeholders and for tradeoffs to be
made that expedite compliance among those who are directly impacted by
the reforms. Second, negotiated rulemaking is a process of
empowerment. It encourages the participation and enfranchisement of parties that have a stake in
reform. It provides voice to interests, concerns and priorities that otherwise
might not be heard or considered in devising new policy. Third, it is a
process that promotes creative but pragmatic solutions. By encouraging a
holistic examination of the policy area, negotiated rulemaking asks the participants to
assess the multiple issues and subissues involved, set priorities among them,
and make compromises. Such rethinking often yields novel and unorthodox
answers. Fourth, negotiated rulemaking offers an efficient mechanism
for policy implementation. Experience shows that it results in earlier
implementation; higher compliance rates; reduced time, money and effort
spent on enforcement; increased cooperation between the regulator and
regulated parties; and reduced litigation over the regulations. Regulatory
negotiations can yield both better solutions and more efficient
compliance.

12. At worst, reject the argument, not the team

2NC AT Agency Responsiveness


No difference in agency responsiveness
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctor of
Juridical Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I. Langbein is a Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice at American University. She holds a PhD in Political
Science from the University of North Carolina and a BA in Government from Oberlin College. Freeman, J.
Langbein, L. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Law Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)
3. Negotiated Rulemaking Does Not Abrogate the Agency's Responsibility to Execute Delegated Authority. Overall, the evidence from Phase II is generally inconsistent with the theoretical but empirically untested claim that EPA has failed to retain its responsibility for writing rules in negotiated settings. Recall that theorists disagree over whether reg neg will increase agency responsiveness. Most scholars assume that EPA retains more authority in conventional rulemaking, and that participants exert commensurately less influence over conventional as opposed to negotiated rules. To test this hypothesis, Kerwin and Langbein asked participants about disproportionate influence and about agency responsiveness to the respondent personally, as well as agency responsiveness to the public in general. The results suggest that the agency is equally responsive to participants in conventional and negotiated rulemaking, consistent with the hypothesis that the agency listens to the affected parties regardless of the method of rule development. Further, when asked what they disliked about the process, less than 10% of both negotiated and conventional participants volunteered "disproportionate influence." When asked whether any party had disproportionate influence during rule development, 44% of conventional respondents answered "yes," compared to 48% of reg neg respondents. In addition, EPA was as likely to be viewed as having disproportionate influence in negotiated as conventional rules (25% versus 32% respectively). It follows that roughly equal proportions of participants in negotiated and conventional rules viewed other participants, and especially EPA, as having disproportionate influence. Kerwin and Langbein asked those who reported disproportionate influence what about the rule led them to believe that lopsided influence existed. In response, negotiated rulemaking participants were significantly more likely to see excessive influence by one party in the process rather than in the rule itself, as compared to conventional participants (55% versus 13% respectively). However, when asked what it was about the process that fostered disproportionate influence, conventional rule participants were twice as likely as negotiated rule participants to point to the central role of EPA (63% versus 30% respectively). By contrast, negotiated rule participants pointed to other participants who were particularly vocal and active during the negotiation sessions (26% of negotiated rule respondents versus no conventional respondents). When asked about agency responsiveness, negotiated rule participants were significantly more likely than conventional rule participants to view both general participation, and their personal participation, as having a "major" impact on the proposed rule. By contrast, conventional participants were more likely to see "major" differences between the proposed and final rule and to believe that public participation and their own participation had a "moderate" or "major" impact on that change. These results conform to the researchers' expectations: negotiated rules are designed so that public participation should have its greatest impact on the proposed rule; conventional rules are structured so that public participation should have its greatest impact on the final rule. Given these differences in how the two processes are designed, Kerwin and Langbein sought to measure agency responsiveness overall, rather than at the two separate moments of access. Although the differences were not statistically significant, the results suggest that conventional participants perceived their public and personal contribution to rulemaking to have had slightly more impact than negotiated rule participants perceived their contribution to have had. Still, given the absence of statistical significance, we agree with the researchers that it is safer to conclude that the agency is equally responsive to both conventional and negotiated rule participants.

2NC AT Cost
Reg negs are more cost effective
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme People's Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Negotiated Rulemaking Has Fulfilled its Goals. If better rules were the aspirations for negotiated rulemaking, the question remains as to whether the process has lived up to the expectations. From my own personal experience, the rules that emerge from negotiated rulemaking tend to be both more stringent and yet more cost effective to implement. That somewhat paradoxical result comes precisely from the practical orientation of the committee: it can figure out what information is needed to make a reasonable, responsible decision and then what actions will best achieve the goal; it can, therefore, avoid common regulatory mistakes that are costly but do not contribute substantially to accomplishing the task. The only formal evaluation of negotiated rulemaking that has been conducted supports these observations. After his early article analyzing the time required for negotiated rulemaking, Neil Kerwin undertook an evaluation of negotiated rulemaking at the Environmental Protection Agency with Dr. Laura Langbein.103 Kerwin and Langbein conducted a study of negotiated rulemaking by examining what actually occurs in a reg neg versus the development of rules by conventional means. To establish the requisite comparison, they collected data on litigation, data from the comments on proposed rules, and data from systematic, open-ended interviews with participants in 8 negotiated rules . . . and in 6 comparable conventional rules.104 They interviewed 51 participants of conventional rulemaking and 101 from various negotiated rulemaking committees.105 Kerwin and Langbein's important work provides the only rigorous, empirical evaluation that compares a number of factors of conventional and negotiated rulemaking. Their overall conclusion is: Our research contains strong but qualified support for the continued use of negotiated rulemaking. The strong support comes in the form of positive assessments provided by participants in negotiated rulemaking compared to assessments offered by those involved in conventional form of regulation development. Further, there is no evidence that negotiated rules comprise an abrogation of agency authority, and negotiated rules appear no more (or less) subject to litigation than conventional rules. It is also true that negotiated rulemaking at the EPA is used largely to develop rules that entail particularly complex issues regarding the implementation and enforcement of legal obligations rather than those that set the substantive standards themselves. However, participants' assessments of the resulting rules are more positive when the issues to be decided entail those of establishing rather than enforcing the standard. Further, participants' assessments are also more positive when the issues to be decided are relatively more complex. Our research would support a recommendation that negotiated rulemaking continue to be applied to complex issues, and more widely applied to include those entailing the standard itself.106 Their findings are particularly powerful when comparing individual attributes of negotiated and conventional rules. Table 3 contains a summary of those comparisons. Importantly, negotiated rules were viewed more favorably in every criteria, and significantly so in several dimensions that are often contentious in regulatory debates: the economic efficiency of the rule and its cost effectiveness, the quality of the scientific evidence and the incorporation of appropriate technology, and, although personal experience is not usually considered in dialogues over regulatory procedure, Kerwin and Langbein's findings here too favor negotiated rules. Conclusion. The benefits envisioned by the proponents of negotiated rulemaking have indeed been realized. That is demonstrated both by Coglianese's own methodology when properly understood and by the only careful and comprehensive comparative study. Reg neg has proven to be an enormously powerful tool in addressing highly complex, politicized rules. These are the very kind that stall agencies when using traditional or conventional procedures.107 Properly understood and used appropriately, negotiated rulemaking does indeed fulfill its expectations.

Reg negs are cheaper


Langbein and Kerwin 00
(Laura I. Langbein is a quantitative methodologist and professor of public administration and policy at
American University in Washington, D.C. She teaches quantitative methods, program evaluation,
policy analysis, and public choice. Her articles have appeared in journals on politics, economics, policy
analysis and public administration. Langbein received a BA in government from Oberlin College in
1965 and a PhD in political science from the University of North Carolina at Chapel Hill in 1972. She
has taught at American University since 1973: until 1978 as an assistant professor in the School of
Government and Public Administration; from 1978 to 1983 as an associate professor in the School of
Government and Public Administration; and since 1983 as a professor in the School of Public Affairs.
She is also a private consultant on statistics, research design, survey research, and program
evaluation and an accomplished clarinetist. Cornelius Martin "Neil" Kerwin (born April 10, 1949) is
an American educator in public administration and president of American University. A 1971
undergraduate alumnus of American University, Kerwin continued his education with a Master of Arts
degree in political science from the University of Rhode Island in 1973. In 1975, Kerwin returned to his
alma mater and joined the faculty of the American University School of Public Affairs, then the School
of Government and Public Administration. Kerwin completed his doctorate in political science from
Johns Hopkins University in 1978 and continued to teach until 1989, when he became the dean of the
school. Langbein, L. I. Kerwin, C. M. Regulatory Negotiation versus Conventional Rule Making: Claims,
Counterclaims, and Empirical Evidence, Journal of Public Administration Research and Theory, July
2000. http://jpart.oxfordjournals.org/content/10/3/599.full.pdf//ghs-kw)

Our research contains strong but qualified support for the continued use of negotiated rule making. The strong support comes in the form of positive assessments provided by participants in negotiated rule making compared to assessments offered by those involved in conventional forms of regulation development. There is no evidence that negotiated rules comprise an abrogation of agency authority, and negotiated rules appear no more (or less) subject to litigation than conventional rules. It is also true that negotiated rule making at the EPA is used largely to develop rules that entail particularly complex issues regarding the implementation and enforcement of legal obligations rather than rules that set substantive standards. However, participants' assessments of the resulting rules are more positive when the issues to be decided entail those of establishing rather than enforcing the standard. Participants' assessments are also more positive when the issues to be decided are relatively less complex. But even when these and other variables are controlled, reg neg participants' overall assessments are significantly more positive than those of participants in conventional rule making. In short, the process itself seems to affect participants' views of the rule making, independent of differences between the types of rules chosen for conventional and negotiated rule making, and independent of differences among the participants, including differences in their views of the economic net benefits of the particular rule. This finding is consistent with theoretical expectations regarding the importance of participation and the importance of face-to-face communication to increase the likelihood of Pareto-improving social outcomes. With respect to participation, previous research indicates that compliance with a law or regulation and support for policy choice are more likely to be forthcoming not only when it is economically rational but also when the process by which the decision is made is viewed as fair (Tyler 1990; Kunreuther et al. 1993; Frey and Oberholzer-Gee 1996). While we did not ask respondents explicitly to rate the fairness of the rule-making process in which they participated, evidence presented in this study shows that reg neg participants rated the overall process (with and without statistical controls in exhibits 9 and 1 respectively) and the ability of EPA equitably to implement the rule (exhibit 1) significantly higher than conventional rule-making participants did. Further, while conventional rule-making participants were more likely to say that there was no party with disproportionate influence during the development of the rule, reg neg participants volunteered significantly more positive comments and significantly fewer negative comments about the process overall. In general, reg neg appears more likely than conventional rule making to leave participants with a warm glow about the decision-making process. While the regression results show that the costs and benefits of the rule being promulgated figure prominently into the respondents' overall assessment of the final rule, process matters too. Participants care not only about how rules and policies affect them economically, they also care about how the authorities who make and implement rules and policies treat them (and others). In fact, one reg neg respondent, the owner of a small shop that manufactured wood burning stoves, remarked about the woodstoves rule, which would put him out of business, that he felt satisfied even as he participated in his own "wake." It remains for further research to show whether this warm glow affects long term compliance and whether it extends to affected parties who were not direct participants in the negotiation process. It is unclear from our research whether greater satisfaction with negotiated rules implies that negotiated rules are Pareto-superior to conventionally written rules.13 Becker's (1983) theory of political competition among interest groups implies that in the absence of transactions costs, groups that bear large costs and opposing groups that reap large benefits have directly proportional and equal incentives to lobby. Politicians who seek to maximize net political support respond by balancing costs and benefits at the margin, and the resulting equilibrium will be no worse than market failure would be. Transactions costs, however, are not zero, and they may not be equal for interests on each side of an issue. For example, in many environmental policy issues, the benefits are dispersed and occur in the future, while some, but not all, costs are concentrated and occur now. The consequence is that transactions costs are different for beneficiaries than for losers. If reg neg reduces transactions costs compared to conventional rule making, or if reg neg reduces the imbalance in transactions costs between winners and losers, or among different kinds of winners and losers, then it might be reasonable to expect negotiated rules to be Pareto-superior to conventionally written rules. Reg neg may reduce transactions costs in two ways. First, participation in writing the proposed rule (which sets the agenda that determines the final rule) is direct, at least for the participants. In conventional rule making, each interest has a repeated, bilateral relation with the rule-making agency; the rule-making agency proposes the rule (and thereby controls the agenda for the final rule), and affected interests respond separately to what is in the agency proposal. In negotiated rule making, each interest (including the agency) is in a repeated N-person set of mutual relations; the negotiating group drafts the proposed rule, thereby setting the agenda for the final rule. Since the agency probably knows less about each group's costs and benefits than the group knows about its own costs and benefits, the rule that emerges from direct negotiation should be a more accurate reflection of net benefits than one that is written by the agency (even though the agency tries to be responsive to the affected parties). In effect, reg neg can be expected to better establish a core relationship of trust, reputation, and reciprocity that Ostrom (1998) argues is central to improving net social benefits. Reg neg may reduce transactions costs not only by entailing repeated mutual rather than bilateral relations, but also by face to face communication. Ostrom (1998, 13) argues that face-to-face communication reduces transactions costs by making it easier to assess trustworthiness and by lowering the decision costs of reaching a "contingent agreement," in which "individuals agree to contribute x resources to a common effort so long as at least y others also contribute." In fact, our survey results show that reg neg participants are significantly more likely than conventional rule-making participants to believe that others will comply with the final rule (exhibit 1). In the absence of outside assessments that compare net social benefits of the conventional and negotiated rules in this study,15 the hypothesis that reg neg is Pareto superior to conventional rule making remains an untested speculation. Nonetheless, it seems to be a plausible hypothesis based on recent theories regarding the importance of institutions that foster participation in helping to effect Pareto-preferred social outcomes.

2NC AT Consensus
Negotiating parties fear the alternative, which is worse than
reg neg
Perritt 86
(Professor Perritt earned his B.S. in engineering from MIT in 1966, a master's degree in management
from MIT's Sloan School in 1970, and a J.D. from Georgetown University Law Center in 1975. Henry H.
Perritt, Jr., is a professor of law at IIT Chicago-Kent College of Law. He served as Chicago-Kent's dean
from 1997 to 2002 and was the Democratic candidate for the U.S. House of Representatives in the
Tenth District of Illinois in 2002. Throughout his academic career, Professor Perritt has made it
possible for groups of law and engineering students to work together to build a rule of law, promote
the free press, assist in economic development, and provide refugee aid through "Project Bosnia,"
"Operation Kosovo" and "Destination Democracy." Professor Perritt is the author of more than 75 law
review articles and 17 books on international relations and law, technology and law, employment law,
and entertainment law, including Digital Communications Law, one of the leading treatises on Internet
law; Employee Dismissal Law and Practice, one of the leading treatises on employment-at-will; and
two books on Kosovo: Kosovo Liberation Army: The Inside Story of an Insurgency, published by the
University of Illinois Press, and The Road to Independence for Kosovo: A Chronicle of the Ahtisaari
Plan, published by Cambridge University Press. He is active in the entertainment field, as well, writing
several law review articles on the future of the popular music industry and of video entertainment. He
also wrote a 50-song musical about Kosovo, You Took Away My Flag, which was performed in Chicago
in 2009 and 2010. A screenplay for a movie about the same story and characters has a trailer online
and is being shopped to filmmakers. His two new plays, Airline Miles and Giving Ground, are scheduled
for performances in Chicago in 2012. His novel, Arian, was published by Amazon.com in 2012. He has
two other novels in the works. He served on President Clinton's Transition Team, working on
telecommunications issues, and drafted principles for electronic dissemination of public information,
which formed the core of the Electronic Freedom of Information Act Amendments adopted by Congress
in 1996. During the Ford administration, he served on the White House staff and as deputy under
secretary of labor. Professor Perritt served on the Computer Science and Telecommunications Policy
Board of the National Research Council, and on a National Research Council committee on "Global
Networks and Local Values." He was a member of the interprofessional team that evaluated the FBI's
Carnivore system. He is a member of the bars of Virginia (inactive), Pennsylvania (inactive), the
District of Columbia, Maryland, Illinois and the United States Supreme Court. He is a member of the
Council on Foreign Relations and served on the board of directors of the Chicago Council on Foreign
Relations, on the Lifetime Membership Committee of the Council on Foreign Relations, and as
secretary of the Section on Labor and Employment Law of the American Bar Association. He is vice president and a member of the board of directors of The Artistic Home theatre company, and is
president of Mass. Iota-Tau Association, the alumni corporation for the SAE fraternity chapter at MIT.
Perritt, H. H. Negotiated Rulemaking Before Federal Agencies: Evaluation of Recommendations By the
Administrative Conference of the United States, Georgetown Law Journal, Volume 74, August 1986.
http://www.kentlaw.edu/perritt/publications/74_GEO._L.J._1625.htm//ghs-kw)

The negotiations moved slowly until the FAA submitted a draft rule to the participants. This reinforced the view that the FAA would move unilaterally. It also reminded the parties that there would be things in a unilaterally promulgated rule that they would not like--thus reminding them that their BATNAs were worse than what was being considered at the negotiating table. Participation by the Vice President's Office, the Office of the Secretary of Transportation, and the OMB at the initial session discouraged participants from thinking they could influence the contents of the rule outside the negotiation process. One attempt to communicate with the Administrator while the negotiations were underway was rebuffed. [FN263] The participants tacitly agreed that it would not be feasible to develop a 'total package' to which the participants formally could agree. Instead, their objectives were to narrow differences, explore alternative ways of achieving objectives at less disruption to operational exigencies, and educate the FAA on practical issues. The mediator had an acute sense that the negotiation process should stop before agreement began to erode. Accordingly, he forbore to force explicit agreement on difficult issues, took few votes, and adjourned the negotiations when things began to unravel. In addition, the FAA, the mediator, and participants were tolerant of the political need of participants to adhere to positions formally, even though signals were given that participants could live with something else. Agency participation in the negotiating sessions was crucial to the usefulness of this type of process. Because the agency was there, it could form its own impressions of what a party's real position was, despite adherence to formal positions. In addition, it was easy for the agency to proceed with a consensus standard because it had an evolving sense of the consensus. Without agency participation, a more formal step would have been necessary to communicate negotiating group views to the agency. Taking this formal step could have proven difficult or impossible because it would have necessitated more formal participant agreement. In addition, the presence of an outside contractor who served as drafter was of some assistance. The drafter, a former FAA employee, assisted informally in resolving internal FAA disagreements over the proposed rule after negotiations were adjourned.

Reg neg produces participant satisfaction and reduces conflict: consensus will happen
Langbein and Kerwin 00
(Laura I. Langbein is a quantitative methodologist and professor of public administration and policy at
American University in Washington, D.C. She teaches quantitative methods, program evaluation,
policy analysis, and public choice. Her articles have appeared in journals on politics, economics, policy
analysis and public administration. Langbein received a BA in government from Oberlin College in
1965 and a PhD in political science from the University of North Carolina at Chapel Hill in 1972. She
has taught at American University since 1973: until 1978 as an assistant professor in the School of
Government and Public Administration; from 1978 to 1983 as an associate professor in the School of
Government and Public Administration; and since 1983 as a professor in the School of Public Affairs.
She is also a private consultant on statistics, research design, survey research, and program
evaluation and an accomplished clarinetist. Cornelius Martin "Neil" Kerwin (born April 10, 1949) is
an American educator in public administration and president of American University. A 1971
undergraduate alumnus of American University, Kerwin continued his education with a Master of Arts
degree in political science from the University of Rhode Island in 1973. In 1975, Kerwin returned to his
alma mater and joined the faculty of the American University School of Public Affairs, then the School
of Government and Public Administration. Kerwin completed his doctorate in political science from
Johns Hopkins University in 1978 and continued to teach until 1989, when he became the dean of the
school. Langbein, L. I. Kerwin, C. M. Regulatory Negotiation versus Conventional Rule Making: Claims,
Counterclaims, and Empirical Evidence, Journal of Public Administration Research and Theory, July
2000. http://jpart.oxfordjournals.org/content/10/3/599.full.pdf//ghs-kw)

Our research contains strong but qualified support for the continued use of negotiated rule making. The strong support comes in the form of positive assessments provided by participants in negotiated rule making compared to assessments offered by those involved in conventional forms of regulation development. There is no evidence that negotiated rules comprise an abrogation of agency authority, and negotiated rules appear no more (or less) subject to litigation than conventional rules. It is also true that negotiated rule making at the EPA is used largely to develop rules that entail particularly complex issues regarding the implementation and enforcement of legal obligations rather than rules that set substantive standards. However, participants' assessments of the resulting rules are more positive when the issues to be decided entail those of establishing rather than enforcing the standard. Participants' assessments are also more positive when the issues to be decided are relatively less complex. But even when these and other variables are controlled, reg neg participants' overall assessments are significantly more positive than those of participants in conventional rule making. In short, the process itself seems to affect participants' views of the rule making, independent of differences between the types of rules chosen for conventional and negotiated rule making, and independent of differences among the participants, including differences in their views of the economic net benefits of the particular rule. This finding is consistent with theoretical expectations regarding the importance of participation and the importance of face-to-face communication to increase the likelihood of Pareto-improving social outcomes. With respect to participation, previous research indicates that compliance with a law or regulation and support for policy choice are more likely to be forthcoming not only when it is economically rational but also when the process by which the decision is made is viewed as fair (Tyler 1990; Kunreuther et al. 1993; Frey and Oberholzer-Gee 1996). While we did not ask respondents explicitly to rate the fairness of the rule-making process in which they participated, evidence presented in this study shows that reg neg participants rated the overall process (with and without statistical controls in exhibits 9 and 1 respectively) and the ability of EPA equitably to implement the rule (exhibit 1) significantly higher than conventional rule-making participants did. Further, while conventional rule-making participants were more likely to say that there was no party with disproportionate influence during the development of the rule, reg neg participants volunteered significantly more positive comments and significantly fewer negative comments about the process overall. In general, reg neg appears more likely than conventional rule making to leave participants with a warm glow about the decision-making process. While the regression results show that the costs and benefits of the rule being promulgated figure prominently into the respondents' overall assessment of the final rule, process matters too. Participants care not only about how rules and policies affect them economically, they also care about how the authorities who make and implement rules and policies treat them (and others). In fact, one reg neg respondent, the owner of a small shop that manufactured wood burning stoves, remarked about the woodstoves rule, which would put him out of business, that he felt satisfied even as he participated in his own "wake." It remains for further research to show whether this warm glow affects long term compliance and whether it extends to affected parties who were not direct participants in the negotiation process. It is unclear from our research whether greater satisfaction with negotiated rules implies that negotiated rules are Pareto-superior to conventionally written rules.13 Becker's (1983) theory of political competition among interest groups implies that in the absence of transactions costs, groups that bear large costs and opposing groups that reap large benefits have directly proportional and equal incentives to lobby. Politicians who seek to maximize net political support respond by balancing costs and benefits at the margin, and the resulting equilibrium will be no worse than market failure would be. Transactions costs, however, are not zero, and they may not be equal for interests on each side of an issue. For example, in many environmental policy issues, the benefits are dispersed and occur in the future, while some, but not all, costs are concentrated and occur now. The consequence is that transactions costs are different for beneficiaries than for losers. If reg neg reduces transactions costs compared to conventional rule making, or if reg neg reduces the imbalance in transactions costs between winners and losers, or among different kinds of winners and losers, then it might be reasonable to expect negotiated rules to be Pareto-superior to conventionally written rules. Reg neg may reduce transactions costs in two ways. First, participation in writing the proposed rule (which sets the agenda that determines the final rule) is direct, at least for the participants. In conventional rule making, each interest has a repeated, bilateral relation with the rule-making agency; the rule-making agency proposes the rule (and thereby controls the agenda for the final rule), and affected interests respond separately to what is in the agency proposal. In negotiated rule making, each interest (including the agency) is in a repeated N-person set of mutual relations; the negotiating group drafts the proposed rule, thereby setting the agenda for the final rule. Since the agency probably knows less about each group's costs and benefits than the group knows about its own costs and benefits, the rule that emerges from direct negotiation should be a more accurate reflection of net benefits than one that is written by the agency (even though the agency tries to be responsive to the affected parties). In effect, reg neg can be expected to better establish a core relationship of trust, reputation, and reciprocity that Ostrom (1998) argues is central to improving net social benefits. Reg neg may reduce transactions costs not only by entailing repeated mutual rather than bilateral relations, but also by face to face communication. Ostrom (1998, 13) argues that face-to-face communication reduces transactions costs by making it easier to assess trustworthiness and by lowering the decision costs of reaching a "contingent agreement," in which "individuals agree to contribute x resources to a common effort so long as at least y others also contribute." In fact, our survey results show that reg neg participants are significantly more likely than conventional rule-making participants to believe that others will comply with the final rule (exhibit 1). In the absence of outside assessments that compare net social benefits of the conventional and negotiated rules in this study,15 the hypothesis that reg neg is Pareto superior to conventional rule making remains an untested speculation. Nonetheless, it seems to be a plausible hypothesis based on recent theories regarding the importance of institutions that foster participation in helping to effect Pareto-preferred social outcomes.

A consensus will be reached: parties have incentives to cooperate and compromise
Harter 09
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.

Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme People's Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States. Harter, P. J.
Collaboration: The Future of Governance, Journal of Dispute Resolution, Volume 2009, Issue 2,
Article 7. 2009. http://scholarship.law.missouri.edu/cgi/viewcontent.cgi?
article=1581&context=jdr//ghs-kw)

Consensus is often misunderstood. It is typically used, derisively, to mean a group decision that is the consequence of a "group think" that resulted from little or no exploration of the issues, with neither general inquiry, discussion, nor deliberation. A common example would be the boss's saying, "Do we all agree? . . . Good, we have a consensus!" In this context, consensus is the acquiescence to an accepted point of view. It is, as is often alleged, the lowest common denominator that is developed precisely to avoid controversy as opposed to generating a better answer. It is a decision resulting from the lack of diversity. It is in fact actually a cascade that may be more extreme than the views of any member! Thus, the question legitimately is, if this is the understanding of the term, would you want it if you could get it, or would the result be too diluted? A number of articles posit, with neither understanding nor research, that it always results in the least common denominator. Done right, however, consensus is exactly the opposite: it is the wisdom of crowds. It builds on the insights and experiences of diversity. And it is a vital element of collaborative governance in terms of actually reaching agreement and in terms of the quality of the resulting agreement. That undoubtedly sounds counterintuitive, especially for the difficult, complex, controversial matters that are customarily the subject of direct negotiations among governments and their constituents. Indeed, you often hear that it can't be done. One would expect that the controversy would make consensus unlikely or that if concurrence were obtained, it would likely be so watered down (that least common denominator again) that it would not be worth much. But, interestingly, it has exactly the opposite effect. Consensus can mean many things so it is important to understand what is consensus for these purposes. The default definition of consensus in the Negotiated Rulemaking Act is the "unanimous concurrence among the interests represented on [the] . . . committee." Thus, each interest has a veto over the decision, and any party may block a final agreement by withholding concurrence. Consensus has a significant impact on how the negotiations actually function: It makes it "safe" to come to the table. If the committee were to make decisions by voting, even if a supermajority were required, a party might fear being outvoted. In that case, it would logically continue to build power to achieve its will outside the negotiations. Instead, it has the power inside the room to prevent something from happening that it cannot live with. Thus, at least for the duration of the negotiations, the party can focus on the substance of the policy and not build political might. The committee is converted from a group of disparate, often antagonistic, interests into one with a common purpose: reaching a mutually acceptable agreement. During a policy negotiation such as this, you can actually feel the committee snap together into a coherent whole when the members realize that. It forces the parties to deal with each other, which prevents "rolling" someone: "OK, I have the votes, so shut up and let's vote." Rolling someone in a negotiation is a very good way to create an opponent, to you and to any resulting agreement. Having to actually listen to each other also creates a friction of ideas that results in better decisions: instead of a cascade, it generates the "wisdom of crowds." It enables the parties to make sophisticated proposals in which they agree to do something, but only if other parties agree to do something in return. These "if but only if" offers cannot be made in a voting situation for fear that the offeror would not obtain the necessary quid pro quo. It also enables the parties to develop and present information they might otherwise be reluctant to share for fear of its being misused or used against them. A veto prevents that. If a party cannot control the decision, it will logically amass as much factual information as possible in order to limit the discretion available to the one making the decision; the theory is that if you win on the facts, the range of choices as to what to do on the policy is considerably narrowed. Thus, records are stuffed with data that may well be irrelevant to the outcome or on which the parties largely agree. If the decision is made by consensus, the parties do control the outcome, and as a result, they can concentrate on making the final decision. The question for the committee then becomes, how much information do we need to make a responsible resolution? The committee may not need to resolve many of the underlying facts before a policy choice is clear. Interestingly, therefore, the use of consensus can significantly reduce the amount of defensive (or probably more accurately, offensive) record-building that customarily attends adversarial processes. It forces the parties to look at the agreement as a whole: consensus is reached only on the entire package, not its individual elements. The very essence of negotiation is that different parties value issues differently. What is important to one party is not so important to another, and that makes for trades that maximize overall value. The resulting agreement can be analogized to buying a house: something is always wrong with any house you would consider buying (price, location, kitchen needs repair, etc.), but you cannot buy only part of a house or move it to another location; the choice must be made as to which house, the entire thing, you will purchase. It also means that the resulting decision will not stray from the statutory mandate. That is because one of the parties to the negotiation is very likely to benefit from an adherence to the statutory requirements and would not concur in a decision that did not implement it. Finally, if all of the parties represented concur in the outcome, the likelihood of a successful challenge is greatly reduced so that the decision has a rare degree of finality.

2NC AT Speed
Reg neg is better: solves faster
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme People's Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Properly understood, therefore, the average length of EPA's negotiated rulemakings, the time it took EPA to fulfill its goal, was 751 days or 32% faster than traditional rulemaking. This knocks a full year off the average time it takes EPA to develop a rule by the traditional method. And, note these are highly complex and controversial rules and that one of them survived Presidential intervention. Thus, the dynamics surrounding these rules are by no means average. This means that reg neg's actual performance is much better than that. Interestingly and consistently, the average time for all of EPA's reg negs when viewed in context is virtually identical to that of the sample drawn by Kerwin and Furlong,77 differing by less than a month. Furthermore, if all of the reg negs that were conducted by all the agencies that were included in Coglianese's table78 were analyzed along the same lines as discussed here,79 the average time for all negotiated rulemakings drops to less than 685 days.80 No Substantive Review of Rules Based on Reg Neg Consensus. Coglianese argues that negotiated rules are actually subjected to a higher incident of judicial review than are rules developed by traditional methods, at least those issued by EPA.81 But, like his analysis of the time it takes to develop rules, Coglianese fails to look at either what happened in the negotiated rulemaking itself or the nature of any challenge. For example, he makes much of the fact that the Grand Canyon visibility rule was challenged by interests that were not a party to the negotiations;82 yet, he also points out that this rule was not developed under the Negotiated Rulemaking Act83 which explicitly establishes procedures that are designed to ensure that each interest can be represented. This challenge demonstrates the value of convening negotiations.84 And, it is significantly misleading to include it when discussing the judicial review of negotiated rules since the process of reg neg was not followed. As for Reformulated Gasoline, the rule as issued by EPA did not reflect the consensus but rather was modified by EPA under the direction of President Bush.85 There were, indeed, a number of challenges to the application of the rule,86 but amazingly little to the rule itself given its history. Indeed, after the proposal was changed, many members of the committee continued to meet in an effort to put Humpty Dumpty back together again, which they largely did; the fact that the rule had been negotiated not only resulted in a much better rule,87 it enabled the rule to withstand in large part a massive assault. Coglianese also somehow attributes a challenge within the World Trade Organization to a shortcoming of reg neg even though such issues were explicitly outside the purview of the committee; to criticize reg neg here is like saying surgery is not effective when the patient refused to undergo it. While the Underground Injection rule was challenged, the committee never reached an agreement88 and, moreover, the convening report made clear that there were very strong disagreements over the interpretation of the governing statute that would likely have to be resolved by a Court of Appeals. Coglianese also asserts that the Equipment Leaks rule was the subject of review; it was, but only because the Clean Air Act requires parties to file challenges in a very short period, and a challenger therefore filed a defensive challenge while it worked out some minor details over the regulation. Those negotiations were successful and the challenge was withdrawn. The Chemical Manufacturers Association, the challenger, had no intention of a substantive challenge.89 Moreover, a challenge to other parts of the HON should not be ascribed to the Equipment Leaks part of the rule. The agreement in the Asbestos in Schools negotiation explicitly contemplated judicial review (strange, but true) and hence it came as no surprise and as no violation of the agreement. As for the Wood Furniture Rule, the challenges were withdrawn after informal negotiations in which EPA agreed to propose amendments to the rule.90 Similarly, the challenge to EPA's Disinfectant By-Products Rule91 was withdrawn. In short, the rules that have emerged from negotiated rulemaking have been remarkably resistant to substantive challenges. And, indeed, this far into the development of the process, the standard of review and the extent to which an agreement may be binding on either a signatory or someone whom a party purports to represent are still unknown, the speculation of many an administrative law class.92 Thus, here too, Coglianese paints a substantially misleading picture by failing to distinguish substantive challenges to rules that are based on a consensus from either challenges to issues that were not the subject of negotiations or were filed while some details were worked out. Properly understood, reg negs have been phenomenally successful in warding off substantive review.

Reg negs solve faster and better: Coglianese's results concluded neg when properly interpreted
Harter 99
(Philip J. Harter received his AB (1964), Kenyon College, MA (1966), JD, magna cum laude (1969),
University of Michigan. Philip J. Harter is a scholar in residence at Vermont Law School and the Earl F.
Nelson Professor of Law Emeritus at the University of Missouri. He has been involved in the design of
many of the major developments of administrative law in the past 40 years. He is the author of more
than 50 papers and books on administrative law and has been a visiting professor or guest lecturer
internationally, including at the University of Paris II, Humboldt University (Berlin) and the University
of the Western Cape (Cape Town). He has consulted on environmental mediation and public
participation in rulemaking in China, including a project sponsored by the Supreme Peoples Court. He
has received multiple awards for his achievements in administrative law. He is listed in Who's Who in
America and is a member of the Administrative Conference of the United States.Harter, P. J. Assessing
the Assessors: The Actual Performance of Negotiated Rulemaking, December 1999.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=202808//ghs-kw)

Negotiated Rulemaking Has Fulfilled its Goals. If better rules were the aspirations for negotiated
rulemaking, the question remains as to whether the process has lived up to the expectations. From my own
personal experience, the

rules that emerge from negotiated rulemaking tend to be


both more stringent and yet more cost effective to implement. That
somewhat paradoxical result comes precisely from the practical orientation of the
committee: it can figure out what information is needed to make a reasonable,
responsible decision and then what actions will best achieve the goal; it can,
therefore, avoid common regulatory mistakes that are costly but do not contribute
substantially to accomplishing the task. The only formal evaluation of negotiated rulemaking that
has been conducted supports these observations. After his early article analyzing the time required for negotiated
rulemaking, Neil Kerwin undertook an evaluation of negotiated rulemaking at the Environmental Protection Agency
with Dr. Laura Langbein.103 Kerwin and Langbein conducted a study of negotiated
rulemaking by examining what actually occurs in a reg neg versus the development of rules by conventional
means. To establish the requisite comparison, they collected data on litigation, data from the comments on
proposed rules, and data from systematic, open-ended interviews with participants in 8 negotiated rules . . . and in
6 comparable conventional rules.104 They interviewed 51 participants of conventional rulemaking and 101 from
various negotiated rulemaking committees.105 Kerwin and Langbein's important work provides
the only rigorous, empirical evaluation that compares a number of factors of
conventional and negotiated rulemaking. Their overall conclusion is: Our research contains strong
but qualified support for the continued use of negotiated rulemaking. The strong
support comes in the form of positive assessments provided by participants in
negotiated rulemaking compared to assessments offered by those involved in
conventional form of regulation development. Further, there is no evidence that
negotiated rules comprise an abrogation of agency authority, and negotiated rules
appear no more (or less) subject to litigation than conventional rules. It is also true that
negotiated rulemaking at the EPA is used largely to develop rules that entail particularly complex issues regarding
the implementation and enforcement of legal obligations rather than those that set the substantive standards
themselves. However, participants' assessments of the resulting rules are more positive
when the issues to be decided entail those of establishing rather than enforcing the
standard. Further, participants' assessments are also more positive when the issues
to be decided are relatively more complex. Our research would support a recommendation that
negotiated rulemaking continue to be applied to complex issues, and more widely applied to include those entailing
the standard itself.106 Their findings are particularly powerful when comparing individual attributes of negotiated
and conventional rules. Table 3 contains a summary of those comparisons. Importantly, negotiated rules
were viewed more favorably in every criteria, and significantly so in several
dimensions that are often contentious in regulatory debates: the economic efficiency of
the rule and its cost effectiveness, the quality of the scientific evidence and the incorporation of appropriate
technology. And while personal experience is not usually considered in dialogues over regulatory procedure, Kerwin
and Langbein's findings here too favor negotiated rules. Conclusion. The benefits envisioned by the
proponents of negotiated rulemaking have indeed been realized. That is
demonstrated both by Coglianese's own methodology when properly
understood and by the only careful and comprehensive comparative study.
Reg neg has proven to be an enormously powerful tool in addressing highly
complex, politicized rules. These are the very kind that stall agencies when using
traditional or conventional procedures.107 Properly understood and used
appropriately, negotiated rulemaking does indeed fulfill its expectations.

2NC AT Transparency
The process is transparent
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I Langbein is the Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice and American College. She holds a PhD in Political
Science from the University of North Carolina, a BA in Government from Oberlin College. Freeman, J.
Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

Defenders of reg neg retorted that negotiated rules were far from secret deals. The
Negotiated Rulemaking Act of 1990 (NRA) requires federal agencies to provide
notice of regulatory negotiations in the Federal Register,50 to formally charter reg
neg committees,51 and to observe the transparency and accountability
requirements52 of the Federal Advisory Committee Act. 53 Any individual or organization that
might be significantly affected by a proposed rule can apply for membership in a reg neg committee,54 and even
if the agency rejects their application, they remain free to attend as spectators.55 Most significantly, the
NRA requires that the agency submit negotiated rules to traditional notice and
comment.56
NRA

2NC AT Undemocratic
The process is equal and fair
Freeman and Langbein 00
(Jody Freeman is the Archibald Cox Professor at Harvard Law School and a leading expert on
administrative law and environmental law. She holds a Bachelor of the Arts from Stanford University, a
Bachelor of Laws from the University of Toronto, and a Master of Laws in addition to a Doctors of
Jurisdictional Science from Harvard University. She served as Counselor for Energy and Climate Change
in the Obama White House in 2009-2010. Freeman is a prominent scholar of regulation and
institutional design, and a leading thinker on collaborative and contractual approaches to governance.
After leaving the White House, she advised the National Commission on the Deepwater Horizon oil spill
on topics of structural reform at the Department of the Interior. She has been appointed to the
Administrative Conference of the United States, the government think tank for improving the
effectiveness and efficiency of federal agencies, and is a member of the American College of
Environmental Lawyers. Laura I Langbein is the Professor of Quantitative Methods, Program
Evaluation, Policy Analysis, and Public Choice and American College. She holds a PhD in Political
Science from the University of North Carolina, a BA in Government from Oberlin College. Freeman, J.
Langbein, R. I. Regulatory Negotiation and the Legitimacy Benefit, N.Y.U. Environmental Journal,
Volume 9, 2000. http://www.law.harvard.edu/faculty/freeman/legitimacy%20benefit.pdf//ghs-kw)

On balance, the combined results of Phase I and II of the study suggest that reg neg is superior to
conventional rulemaking on virtually all of the measures that were considered. Strikingly, the process
engenders a significant learning effect, especially compared to conventional
rulemaking; participants report, moreover, that this learning has long-term value not
confined to a particular rulemaking. Most significantly, the negotiation of rules appears to
enhance the legitimacy of outcomes. Kerwin and Langbein's data indicate that process
matters to perceptions of legitimacy. Moreover, as we have seen, reg neg participant reports
of higher satisfaction could not be explained by their assessments of the outcome alone. Instead, higher
satisfaction seems to arise in part from a combination of process and substance variables. This suggests a link
between procedure and satisfaction, which is consistent with the mounting evidence in social psychology that
"satisfaction is one of the principal consequences of procedural fairness." This potential for procedure to enhance
satisfaction may prove especially salutary precisely when participants do not favor outcomes. As Tyler and Lind
have suggested, "hedonic glee" over positive outcomes may "obliterate" procedural effects; perceptions of
procedural fairness may matter more, however, "when outcomes are negative (and) organizations have the
greatest need to render decisions more palatable, to blunt discontent, and to give losers reasons to stay committed
to the organization." At a minimum, the data call into question, and sometimes flatly
contradict, most of the theoretical criticisms of reg neg that have surfaced in the
scholarly literature over the last twenty years. There is no evidence that
negotiated rulemaking abrogates an agency's responsibility to implement
legislation. Nor does it appear to exacerbate power imbalances or increase
the risk of capture. When asked whether any party seemed to have
disproportionate influence during the development of the rule, about the
same proportion of reg neg and conventional participants said yes.
Parties perceived their influence to be about the same for conventional
and negotiated rules, undermining the hypothesis that reg neg
exacerbates capture.

Commissions CP

1NC
Counterplan: The United States Congress should establish an
independent commission empowered to submit to Congress
recommendations regarding domestic federal government
surveillance. Congress will allow 60 days to pass legislation
overriding recommendations by a two-thirds majority. If
Congress doesn't vote within the specified period, those
recommendations will become law. The Commission should
recommend to Congress that _____<insert the plan>_______
Commission solves the plan
RWB 13
(Reporters Without Borders is a UNESCO and UN Consultant and a non-profit organization. US
congress urged to create commission to investigate mass snooping, RWB, 06-10-2013.
https://en.rsf.org/united-states-us-congress-urged-to-create-10-06-2013,44748.html//ghs-kw)

Reporters Without Borders calls on the US Congress to create a commission of


enquiry into the links between US intelligence agencies and nine leading
Internet sector companies that are alleged to have given them access to
their servers. The commission should also identify all the countries and
organizations that have contributed to the mass digital surveillance machinery that,
according to reports in the Washington Post and Guardian newspapers in the past few days, the US
authorities have created. According to these reports, the telephone company Verizon hands over
the details of the phone calls of millions of US and foreign citizens every day to the
National Security Agency (NSA), while nine Internet majors including Microsoft, Yahoo,
Facebook, Google and Apple have given the FBI and NSA direct access to
their users data under a secret programme called Prism. US intelligence agencies are
reportedly able to access all of the emails, audio and video files, instant messaging
conversations and connection data transiting through these companies servers .
According to The Guardian, Government Communication Headquarters (GCHQ), the NSA's British equivalent, also
has access to data collected under Prism. The proposed congressional commission should evaluate
the degree to which the collected data violates privacy and therefore also freedom
of expression and information. The commission's findings must not be classified as
defence secrets. These issues (protection of privacy and freedom of expression) are matters of
public interest.

2NC O/V
Counterplan solves 100% of the case -- Congress creates an
independent commission comprised of experts to debate the
merits of the plan, and the commission recommends to
Congress that it pass the plan -- Congress must pass
legislation specifically blocking those recommendations within
60 days or the commission's recommendations become law
AND, that solves the AFF -- commissions are empowered to
debate Internet backdoors and submit recommendations --
that's RWB

2NC Solvency
Empirics prove commissions solve
FT 10
(Andrews, Edmund. Deficit Panel Faces Obstacles in Poisonous Political Atmosphere, Fiscal Times.
02-18-2010. http://www.thefiscaltimes.com/Articles/2010/02/18/Fiscal-Commission-Faces-BigObstacles?page=0%2C1//ghs-kw)

Supporters of a bipartisan deficit commission note that at least two previous presidential commissions
succeeded at breaking through intractable political problems when Congress was
paralyzed. The 1983 Greenspan commission, headed by Alan Greenspan, who later became
chairman of the Federal Reserve, reached an historic agreement to gradually raise
Social Security taxes and gradually increase the minimum age at which
workers qualify for Social Security retirement benefits. Those
recommendations passed both the House and Senate, and averted a
potentially catastrophic financial crisis with Social Security.

2NC Solves Better


CP solves better -- technical complexity
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)

Obtaining Expertise Congress may choose to establish a commission when legislators and
their staffs do not currently have sufficient knowledge or expertise in a complex policy
area.22 By assembling experts with backgrounds in particular policy areas to focus
on a specific mission, legislators might efficiently obtain insight into complex public
policy problems.23

2NC Politics NB
No link to politics -- commissions result in bipartisanship and
bypass Congressional politics
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)
Overcoming Political Complexity Complex policy issues may also create institutional problems because they do not
fall neatly within the jurisdiction of any particular committee in Congress.26 By virtue of their ad hoc status,
commissions may circumvent such issues. Similarly, a commission may allow particular legislation


or policy solutions to bypass the traditional development process in Congress,
potentially removing some of the impediments inherent in a decentralized legislature. 27
Consensus Building Legislators seeking policy changes may be confronted by an array of
political interests, some in favor of proposed changes and some against. When
these interests clash, the resulting legislation may encounter gridlock in the highly
structured political institution of the modern Congress .28 By creating a
commission, Congress can place policy debates in a potentially more
flexible environment, where congressional and public attention can be
developed over time.29 Reducing Partisanship Solutions to policy problems produced
within the normal legislative process may also suffer politically from charges of
partisanship.30 Similar charges may be made against investigations conducted by Congress.31 The nonpartisan or bipartisan character of most congressional commissions may
make their findings and recommendations less susceptible to such charges
and more politically acceptable to diverse viewpoints . The bipartisan or
nonpartisan arrangement can potentially give their recommendations
strong credibility, both in Congress and among the public, even when
dealing with divisive issues of public policy. 32 Commissions may also give
political factions space to negotiate compromises in good faith, bypassing the short-term tactical political maneuvers that accompany public negotiations. 33 Similarly,
because commission members are not elected, they may be better suited to
suggesting unpopular, but necessary, policy solutions. 34 Solving Collective Action Problems A
commission may allow legislators to solve collective action problems, situations in
which all legislators individually seek to protect the interests of their own district,
despite widespread agreement that the collective result of such interests is
something none of them prefer. Legislators can use a commission to jointly tie their
hands in such circumstances, allowing general consensus about a particular policy
solution to avoid being impeded by individual concerns about the effect or
implementation of the solution.35 For example, in 1988 Congress established the Base
Closure and Realignment Commission (BRAC) as a politically and geographically
neutral body to make independent decisions about closures of military bases. 36 The
list of bases slated for closure by the commission was required to be either accepted or
rejected as a whole by Congress, bypassing internal congressional politics over
which individual bases would be closed, and protecting individual Members from political
charges that they didn't save their district's base.37

CP avoids the focus link to politics


Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)
Overcoming Issue Complexity

Complex policy issues may cause time management

challenges for Congress. Legislators often keep busy schedules and may not have
time to deal with intricate or technical policy problems, particularly if the issues
require consistent attention over a period of time. 24 A commission can devote itself
to a particular issue full-time, and can focus on an individual problem without
distraction.25

No link to politics -- commissions create bipartisan negotiations


Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

The third major reason for Congress to delegate to a commission is the


strategy of distancing itself from a politically risky decision. These instances

generally occur when Congress faces redistributive policy problems, such as Social Security, military base closures,
Medicare, and welfare. Such problems are the most difficult because legislators must take
a clear policy position on something that has greater costs to their districts than benefits, or that
shifts resources visibly from one group to another. Institutionally, Congress has to make national policy that has a
collective benefit, but the self-interest of lawmakers often gets in the way. Members realize that their
individual interests, based on constituents' demands, may be at odds with the
national interest, and this can lead to possible electoral repercussions. 55 Even when
pursuing policies that are in the interests of the country as a whole, legislators do
not want to be blamed for causing losses to their constituents. In such an event, the
split characteristics of the institution come into direct conflict. Many on Capitol Hill endorse a
commission for effectively resolving a policy problem rather than the other machinery available to Congress. A
commission finds remedies when the normal decision making process has stalled. A long-time Senate staff director
said of the proposed Second National Blue Ribbon Commission to Eliminate Waste in Government: At their
most effective, these panels allow Congress to realize purposes most
members cannot find the confidence to do unless otherwise done behind
the words of the commission. 56 When an issue imposes concentrated costs on individual districts

yet provides dispersed benefits to the nation, Congress responds by masking legislators' individual contributions
and delegates responsibility to a commission for making unpleasant decisions. 57 Members avoid blame
and promote good policy by saying something is out of their hands. This method
allows legislators, especially those aiming for reelection, to vote for the general
benefit of something without ever having to support a plan that directly imposes
large and traceable geographic costs on their constituents. The avoidance or share-the-blame
route was much of the way Congress and the president finally dealt with
the problem of financially shoring up Social Security in the late 1980s. One

senior staff assistant to a western Republican representative observed that the creation of the Social Security
Commission was largely for avoidance: There are sacred cows and then there is Social Security. Neither party or
any politician wants to cut this. Regardless of what you say or do about it, in the end, you defer. Everyone backs
away from this. Similarly, a legislative director to a southern Democratic representative summarized: So many
people are getting older and when you take a look at who turns out, who registers, people over sixty-five have the
highest turnout and they vote like clockwork. The Commission on Executive, Legislative, and Judicial Salaries, later
referred to as the Quadrennial Commission (1967), is another example. Lawmakers delegated to a commission the
power to set pay for themselves and other top federal officials, whose pay they linked to their own, to help them

Because the
proposal made by the commission would take effect unless Congress voted
to oppose it, the use of the commission helped insulate legislators from
avoid blame. Increasing their own pay is a decision few politicians willingly endorse.

political hazards. 58 That is, because it was the commission that granted pay raises, legislators could tell
their constituents that they would have voted against the increase if given the chance. Members could get the pay
raise and also the credit for opposing it. Redistribution is the most visible public policy type because it involves the
most conspicuous, long run allocations of values and resources. Most divisive socioeconomic issues affirmative
action, medical care for the aged, aid to depressed geographic areas, public housing, and the elimination of
identifiable governmental actions involve debates over equality or inequality and degrees of redistribution.

These are political hot potatoes, in which a commission is a good means of putting
a fire wall between you [the lawmaker] and that hot potato, the chief of staff to a
midwestern Democratic representative acknowledged. Base closing took on a redistributive character as federal
expenditures outpaced revenues. It was marked not only by extreme conflict but also by techniques to mask or

The Base Closure Commission (1991)


was created with an important provision that allowed for silent congressional
approval of its recommendations. Congress required the commission to submit its reports of proposed
sugarcoat the redistributions or make them more palatable.

closures to the secretary of defense. The president had fifteen days to approve or disapprove the list in its entirety.
If approved, the list of recommended base closures became final unless both houses of Congress adopted a joint
resolution of disapproval within forty-five days. Congress had to consider and vote on the recommendations en bloc
rather than one by one, thereby giving the appearance of spreading the misery equally to affected clienteles. A
former staff aide for the Senate Armed Services Committee who was active in the creation of the Base Closure
Commission contended, There was simply no political will by Congress. The then-secretary of
defense started the process [base closing] with an in-house commission [within the Defense Department].

Eventually, however, Congress used the commission idea as a scheme for a way
out of a box. CONCLUSION Many congressional scholars attribute delegation principally to electoral

considerations. 59 For example, in the delegation of legislative authority to standing committees, legislators, keen
on maximizing their reelection prospects, request assignments to committees whose jurisdictions coincide with the
interests of key groups in their districts. Delegation of legislative functions to the president, to nonelected officials
in the federal bureaucracy, or to ad hoc commissions also grows out of electoral motives. Here, delegation
fosters the avoidance of blame. 60 Mindful that most policies entail both costs and
benefits, and apprehensive that those suffering the costs will hold them responsible,
members of Congress often find that the most attractive option is to let someone
else make the tough choices. Others see congressional delegation as unavoidable (and even desirable)
in light of basic structural flaws in the design of Congress. 61 They argue that Congress is incapable of crafting
policies that address the full complexity of modern-day problems. 62 Another charge is that congressional
action can be stymied at several junctures in the legislative policymaking process.
Congress is decentralized, having few mechanisms for integrating or coordinating
its policy decisions; it is an institution of bargaining, consensus-seeking, and
compromise. The logic of delegation is broad: to fashion solutions to tough
problems, to broker disputes, to build consensus, and to keep fragile coalitions
together. The commission co-opts the most publicly ideological and privately
pragmatic, the liberal left and the conservative right. Leaders of both parties or
their designated representatives can negotiate a deal without the media, the public, or interest
groups present. When deliberations are private, parties can make offers without being
denounced either by their opponents or by affected groups. Removing external contact
reduces the opportunity to use an offer from the other side to curry favor with constituents.

2NC Commissions Popular


Commissions give political cover -- result in compromise
Fiscal Seminar 9
(The Fiscal Seminar is a group of scholars who meet on a regular basis, under the auspices of The
Brookings Institution and The Heritage Foundation, to discuss federal budget and fiscal policy issues.
The members of the Fiscal Seminar acknowledge the contributions of Paul Cullinan, a former colleague
and Brookings scholar, in the development of this paper, and the editorial assistance of Emily Monea.
THE POTENTIAL ROLE OF ENTITLEMENT OR BUDGET COMMISSIONS IN ADDRESSING LONG-TERM
BUDGET PROBLEMS, The Fiscal Seminar. 06-2009.)

In contrast, the Greenspan Commission provided a forum for developing a political


compromise on a set of politically unsavory changes. In this case, the political parties
shared a deep concern about the impending insolvency of the Social Security system but feared the
exposure of promoting their own solutions. The commission created political cover
for the serious background negotiations that resulted in the ultimate compromise.
The structure of the commission reflected these concerns and was composed of
fifteen members, with the President, the Senate Majority Leader, and the Speaker of
the House each appointing five members to the panel.

2NC AT Perm do the CP


Permutation is severance:
5. Severance: CP's mechanism is distinct -- delegates to the
commission and isn't Congressional action
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He
received his Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from
California State University, Chico. Prior to joining the National War College, Dr. Campbell was a
Legislative Aide to Representative Mike Thompson (CA-01), chair of the House Intelligence
Committee's Subcommittee on Terrorism, Analysis and Counterintelligence, where he handled
Appropriations, Defense and Trade matters for the congressman. Before that, he was an
Analyst in American National Government at the Congressional Research Service, an Associate
Professor of Political Science at Florida International University, and an American Political
Science Association Congressional Fellow, where he served as a policy adviser to Senator Bob
Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of 11 books on
Congress, most recently the Guide to Political Campaigns in America, and Impeaching Clinton:
Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT,
USA: Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

So why and when does Congress formulate policy by commissions rather than
by the normal legislative process? Lawmakers have historically delegated
authority to others who could accomplish ends they could not. Does this
form of congressional delegation thus reflect the particularities of an issue area? Or does it mirror deeper
structural reasons such as legislative organization, time, or manageability? In the end, what is the impact
on representation versus the effectiveness of delegating discretionary authority to temporary entities
composed largely of unelected officials, or are both attainable together?

6. Severs resolved: resolved means to enact by law -- not the
counterplan mandate
Words and Phrases 64 vol 37A
Definition of the word resolve, given by Webster is to express an opinion or determination
by resolution or vote; as it was resolved by the legislature; It is of similar force to the word
enact, which is defined by Bouvier as meaning to establish by law.

7. Severs should: Should requires immediate action


Summers 94 (Justice Oklahoma Supreme Court, Kelsey v.
Dollarsaver Food Warehouse of Durant, 1994 OK 123, 11-8,
http://www.oscn.net/applications/oscn/DeliverDocument.asp?
CiteID=20287#marker3fn13)

The legal question to be resolved by the court is whether the word


"should"13 in the May 18 order connotes futurity or may be deemed a ruling in
praesenti.14 The answer to this query is not to be divined from rules of grammar;15 it must be

governed by the age-old practice culture of legal professionals and its immemorial language usage. To
determine if the omission (from the critical May 18 entry) of the turgid phrase, "and the same hereby is",
(1) makes it an in futuro ruling - i.e., an expression of what the judge will or would do at a later stage - or
(2) constitutes an in in praesenti resolution of a disputed law issue, the trial judge's intent must be
garnered from the four corners of the entire record.16 [CONTINUES TO FOOTNOTE] 13 "Should" not only
is used as a "present indicative" synonymous with ought but also is the past tense of "shall" with various
shades of meaning not always easy to analyze. See 57 C.J. Shall 9, Judgments 121 (1932). O.
JESPERSEN, GROWTH AND STRUCTURE OF THE ENGLISH LANGUAGE (1984); St. Louis & S.F.R. Co. v. Brown,
45 Okl. 143, 144 P. 1075, 1080-81 (1914). For a more detailed explanation, see the Partridge quotation
infra note 15. Certain contexts mandate a construction of the term "should" as
more than merely indicating preference or desirability. Brown, supra at 1080-81 (jury
instructions stating that jurors "should" reduce the amount of damages in proportion to the amount of
contributory negligence of the plaintiff was held to imply an obligation and to be more
than advisory); Carrigan v. California Horse Racing Board, 60 Wash. App. 79, 802 P.2d 813 (1990) (one
of the Rules of Appellate Procedure requiring that a party "should devote a section of the brief to the
request for the fee or expenses" was interpreted to mean that a party is under an obligation to
include the requested segment); State v. Rack, 318 S.W.2d 211, 215 (Mo. 1958) ("should"
would mean the same as "shall" or "must" when used in an instruction to the jury which tells the triers
they "should disregard false testimony"). 14 In praesenti means literally "at the present
time." BLACK'S LAW DICTIONARY 792 (6th Ed. 1990). In legal parlance the phrase denotes that
which in law is presently or immediately effective, as opposed to something that
will or would become effective in the future [in futuro]. See Van Wyck v. Knevals, 106 U.S.

360, 365, 1 S.Ct. 336, 337, 27 L.Ed. 201 (1882).

8. Severs should again: should is mandatory


Summers 94 (Justice Oklahoma Supreme Court, Kelsey v.
Dollarsaver Food Warehouse of Durant, 1994 OK 123, 11-8,
http://www.oscn.net/applications/oscn/DeliverDocument.asp?
CiteID=20287#marker3fn13)

The legal question to be resolved by the court is whether the word "should"13 in the May 18 order
connotes futurity or may be deemed a ruling in praesenti.14 The answer to this query is not to be divined
from rules of grammar;15 it must be governed by the age-old practice culture of legal professionals and its
immemorial language usage. To determine if the omission (from the critical May 18 entry) of the turgid
phrase, "and the same hereby is", (1) makes it an in futuro ruling - i.e., an expression of what the judge will
or would do at a later stage - or (2) constitutes an in in praesenti resolution of a disputed law issue, the
trial judge's intent must be garnered from the four corners of the entire record.16 [CONTINUES TO
FOOTNOTE] 13 "Should" not only is used as a "present indicative" synonymous with ought but also is the
past tense of "shall" with various shades of meaning not always easy to analyze. See 57 C.J. Shall 9,
Judgments 121 (1932). O. JESPERSEN, GROWTH AND STRUCTURE OF THE ENGLISH LANGUAGE (1984); St.
Louis & S.F.R. Co. v. Brown, 45 Okl. 143, 144 P. 1075, 1080-81 (1914). For a more detailed explanation, see
the Partridge quotation infra note 15. Certain contexts mandate a construction of the
term "should" as more than merely indicating preference or desirability. Brown,
supra at 1080-81 (jury instructions stating that jurors "should" reduce the amount of damages in
proportion to the amount of contributory negligence of the plaintiff was held to imply an
obligation and to be more than advisory); Carrigan v. California Horse Racing Board, 60
Wash. App. 79, 802 P.2d 813 (1990) (one of the Rules of Appellate Procedure requiring that a party "should
devote a section of the brief to the request for the fee or expenses" was interpreted to mean that a party is
under an obligation to include the requested segment); State v. Rack, 318 S.W.2d 211, 215 (Mo. 1958)

("should" would mean the same as "shall" or "must" when used in an instruction to the
jury which tells the triers they "should disregard false testimony"). 14 In praesenti means literally "at the
present time." BLACK'S LAW DICTIONARY 792 (6th Ed. 1990). In legal parlance the phrase denotes that
which in law is presently or immediately effective, as opposed to something that will or would become
effective in the future [in futuro]. See Van Wyck v. Knevals, 106 U.S. 360, 365, 1 S.Ct. 336, 337, 27 L.Ed.
201 (1882).

Severance is a reason to reject the team:


3. Neg ground -- makes the AFF a shifting target and allows
them to spike out of offense
4. Unpredictable -- kills clash which destroys advocacy skills
and education

2NC AT Perm do Both


Permutation do both links to politics:
3. Congressional debates -- CP means Congress doesn't
debate the substance of the plan, only the commission
report -- perm makes Congress debate the plan, triggers
the link over partisan inclinations and electoral pressures --
that's the politics net benefit ev
4. Time crunch -- perm forces the plan now, doesn't give the
commission time to generate political support and links to
politics
Biggs 09
(Biggs, Andrews G. Andrew G. Biggs is a resident scholar at the American Enterprise Institute,
where his work focuses on Social Security and pensions. From 2003 through 2008, he served
at the Social Security Administration, as Associate Commissioner for Retirement Policy, Deputy
Commissioner for Policy, and ultimately the principal Deputy Commissioner of the agency.
During 2005, he worked at the White House National Economic Council on Social Security
reform, and in 2001 was on the staff of the President's Commission to Strengthen Social
Security. He blogs on Social Security-related issues at Notes on Social Security Reform.
Rumors Of Obama Social Security Reform Commission, Frum Forum. 02-17-2009.
http://www.frumforum.com/rumors-of-obama-social-security-reform-commission///ghs-kw)

One problem with President Bush's 2001 Commission was that it didn't
represent the reasonable spectrum of beliefs on Social Security reform. This didn't make it
a dishonest commission; like President Roosevelt's Committee on Economic Security, it was designed to
put flesh on the bones laid out by the President. In this case, the Commission was tasked with designing a

reform plan that included personal accounts and excluded tax increases. That said, a commission
only builds political capital toward enacting reform if it's seen as building a
consensus through a process in which all views have been heard. In both the
2001 Commission and the later 2005 reform drive, Democrats didn't feel they
were part of the process. They clearly will be a central part of the process this time, but the goal

will now be to include Republicans. Just as Republicans shouldnt reflexively oppose any Obama
administration reform plans for political reasons, so Democrats shouldn't seek to exclude Republicans from
the process. Second, a reform task force should include a variety of different
players, including members of government, both legislative and executive,
representatives of outside interest groups, and experts who can provide
technical advice and help ensure the integrity of the reforms decided upon.

The 2001 Bush Commission didn't include any sitting Members of Congress and only a small fraction of
commissioners had the technical expertise needed to make the plans the best they could be. A broader
group would be helpful. Third, any task force or commission needs time. The 2001
Commission ran roughly from May through December of that year and had to
conduct a number of public hearings. This was simply too much to do in too
little time, and as a result the plans were fairly bare bones. There is plenty
else on the policy agenda at the moment, so there's no reason not to give a
working group a year or more to put things together.

2NC AT Theory
Counterinterp: process CPs are legitimate if we have a
solvency advocate
AND, process CPs good:
5. Key to neg ground -- agent CPs are the only generics we
have on this topic
6. Policy education -- commissions are key to understanding
the policy process
Schwalbe, 03
(Steve,- PhD Public Policy from Auburn, former professor at the Air War College and Col. in the
USAF Independent Commissions: Their History, Utilization and Effectiveness)

FIFTH BRANCH Many analysts characterize commissions as an unofficial, separate
branch of government, much like the news media. Campbell referred to commissions as
the fifth arm of government, after the media, the often-referred-to fourth arm.17 However, the media and
independent commissions have as many similarities as differences. They are similar in that neither is mentioned in the Constitution. Both
conduct oversight functions. Both serve to educate and inform the public. Both allow elites to participate in shaping government policy. On
the other hand, the media and independent commissions are dissimilar in many ways. Where the news media responds to market forces, and
hence will likely operate in perpetuity, independent commissions respond to a federal requirement to resolve a difficult problem. Therefore,
they exist for a relatively short period of time, expiring once a final report is published and disseminated. Where the medias primary

a commissions primary responsibilities can range


from developing a recommended solution to a difficult problem to regulating
an entire department of the executive branch. The media receives its funding primarily from
functions are reporting and analyzing the news,

advertisers, where commissions receive their funding from Congress, the President, or from private sources. The news media deal with issues
foreign and domestic, while independent commissions generally focus on domestic issues. PURPOSE

Commissions serve

numerous purposes in the U.S. Government. Campbell cited three

primary reasons for the establishment


of federal independent commissions. First, they are established to provide expertise the Congress does not have among its own elected
officials or their staffs. Next, he noted that the second most frequently cited reason by members of Congress for establishing a commission
was to reduce the workload in Congress. Finally, they are formed to provide a convenient scapegoat to deflect the wrath of the electorate; i.e.,
blame avoidance.18 Fisher found three advantages of regulatory commissions. First, commission members bring essential expert insights
to a commission because the regulated industries are normally complex and highly technical. Second, appointing commissioners for
extended terms of full-time work allows commissioners to become very familiar with the technical aspects of an industry, through periodic
contacts that Congress would not be able to accomplish. As a result of their tenure, varied membership, and shared responsibility,
commissioners would be resistant to external pressures. Finally, regulatory commissions provide policy continuity essential to the stability of
a regulated industry.19 What the taxpayers are primarily looking for from independent commissions are non- partisan solutions to current
problems. A good example of establishing a commission to find non-partisan solutions is Congress regulating its own ethical behavior.
University of Florida Professor Beth Rosenson researched this issue and concluded that authorizing an ethics commission may be based on
the fear of electoral retaliation if legislators do not take aggressive action to regulate their own ethics.20 Campbell noted that commissions
perform several other functions besides providing recommendations to the President and Congress. The most common reason provided by
analysts is that members of Congress generally want to avoid making difficult decisions that may adversely affect their chances for reelection.
As he noted, Incentives to avoid blame lead members of Congress to adopt a distinctive set of political strategies, such as passing the buck
or deflection.21 Another technique legislators use to avoid incurring the wrath of the voters is to schedule any controversial independent
commissions for after the next election. Establish- ing a commission to research the issue and come up with recommendations after a preset
period of time is an effective way to do that. The most clear-cut example demonstrating this technique is the timing of the BRAC commissions
in the 1990s all three made their base closure recommendations in non-election years (1991, 1993, and 1995). Even the next BRAC
commission, established by the National Defense Authorization Act for Fiscal Year 2002, is not required to submit its base closure
recommendations until 2005. Congress certainly is not the most efficient organization in the U.S.; hence, there are times when an
independent commission is the more efficient and effective way to go. Law- makers are almost always short on time and information, which
makes the option of delegating authority to a commission very appealing. Oftentimes, the expertise and necessary information is very costly
for Congress to acquire. Commissions are generally the most inexpensive way for Congress to solve complex problems. From 1993-1997,
Campbell found that 92 congressional offices introduced legislation that included proposals to establish ad hoc commissions.22 There are
numerous other reasons for establishing independent commissions. They are created as a symbolic response to a crisis or to satisfy the
electorate at home. They have served as trial balloons to test the political waters, or to make political gains with the voters. They can be
created to gain public or political consensus. Often, when Congress has exhausted all its other options, a commission serves as an option of
last resort.23 Commissions are a relatively impartial way to help resolve problems between the executive and legislative branches of
government, especially during periods of congressional gridlock. Wolanin also noted that commissions are particularly useful for problems
and in circumstances marked by federal executive branch incapacity. Federal bureaucracies suffer from many of the same shortcomings
attributed to Congress when considering commissions. They often lack the expertise, information, and time to conduct the research and make
recommendations to resolve internal problems. They can be afflicted by groupthink, not being able to think outside the box, or by not being

Commissions offer a non-partisan, neutral option to address bureaucratic


Secretary
Donald Rumsfeld
has decided to implement the
recommendations of the congressionally- chartered Commission on Space,
which he chaired prior to being appointed Secretary of Defense!25 One of the more important functions of independent
able to see the big picture.
policy

problems.24

Defense

commissions is educating and persuading. Due to the high visibility of most appointed commissioners, a policy issue will automatically tend
to gain public attention. According to Wolanin, the prestige and visibility of commissions give them the capability to focus attention on a
problem, and to see that thinking about it permeates more rapidly. A recent example of a high-visibility commission chair appointment was
Henry Kissinger, selected to chair the commission to look into the perceived intelligence failure regarding the September 11, 2001 terrorist
attack on the U.S. .26 Wolanin cited four educational impacts of commissions: 1) educating the general public; 2) educating government
officials; 3) serving as intellectual milestones; and, 4) educating the commission members themselves. Regarding education of the general
public, he stated that, Commissions have helped to place broad new issues on the national agenda, to elevate them to a level of legitimate

and pressing matters about which government should take affirmative action. Regarding educating government officials, he noted that,

The educational impact of commissions within government ... make it safer


for congressmen and federal executives to openly discuss or advocate a
proposal that has been sanctioned by such an august group. Commission reports have often been so
influential that they serve as milestones in affected fields. Such reports have become source material for

analysts, commentators, and even students, particularly when commission reports are widely published and disseminated. Finally, by serving
on a commission, members also learn much about the issue, and about the process of analyzing a problem and coming up with viable
recommendations. Commissioners also learn from one another.27

7. Predictability -- commissions are widely used and
predictable and solvency advocate checks
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He
received his Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from
California State University, Chico. Prior to joining the National War College, Dr. Campbell was a
Legislative Aide to Representative Mike Thompson (CA-01), chair of the House Intelligence
Committee's Subcommittee on Terrorism, Analysis and Counterintelligence, where he handled
Appropriations, Defense and Trade matters for the congressman. Before that, he was an
Analyst in American National Government at the Congressional Research Service, an Associate
Professor of Political Science at Florida International University, and an American Political
Science Association Congressional Fellow, where he served as a policy adviser to Senator Bob
Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of 11 books on
Congress, most recently the Guide to Political Campaigns in America, and Impeaching Clinton:
Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT,
USA: Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

Ad hoc commissions as instruments of government have a long


history. They are used by almost all units and levels of government
for almost every conceivable task. Ironically, the use which Congress makes
of commissions preparing the groundwork for legislation, bringing public
issues into the spotlight, whipping legislation into shape, and giving priority
to the consideration of complex, technical, and critical developments receives
relatively little attention from political scientists. As noted in earlier chapters, following the logic of rational
choice theory, individual decisions to delegate are occasioned by imperfect information; legislators
who want to develop effective policies, but who lack the necessary expertise,
often delegate fact-finding and policy development. Others contend that some commissions are set up to shift
blame in order to maximize benefits and minimize losses.

8. At worst, reject the argument, not the team

2NC AT Certainty
Counterplan solves your certainty argsexpertise
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

By delegating some of its policymaking authority to expertise commissions,


Congress creates institutions that reduce uncertainty. Tremendous gains accrue as a
result of delegating tasks to other organizations with a comparative advantage in
performing them. Commissions are especially adaptable devices for addressing problems that do not fall
neatly within committees jurisdictional boundaries. They can complement and supplement the
regular committees. In the 1990s, it became apparent that committees were ailing beset by mounting
workloads, duplication and jurisdictional battles, and conflicts between program and funding panels. But relevant
expertise can be mobilized by a commission that brings specialized information to
its tasks, especially if commission members and staff are selected on the basis of
education, their training, and their experience in the area which cross-cut the
responsibilities of several standing committees.

2NC AT Commissions Bad


No disads -- commissions are inevitable due to Congressional
structure
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

Others see congressional delegation as unavoidable (and even desirable) in light of basic
structural flaws in the design of Congress. 61 They argue that Congress is incapable of
crafting policies that address the full complexity of modern-day problems.
62 Another charge is that congressional action can be stymied at several junctures in the
legislative policymaking process. Congress is decentralized, having few mechanisms
for integrating or coordinating its policy decisions ; it is an institution of bargaining, consensusseeking, and compromise. The logic of delegation is broad: to fashion solutions to tough
problems, to broker disputes, to build consensus, and to keep fragile coalitions
together. The commission co-opts the most publicly ideological and privately pragmatic, the liberal left and the
conservative right. Leaders of both parties or their designated representatives can negotiate a deal without the
media, the public, or interest groups present. When deliberations are private, parties can make offers without being
denounced either by their opponents or by affected groups. Removing external contact reduces the opportunity to
use an offer from the other side to curry favor with constituents.

2NC AT Congress Doesn't Pass Recommendations


Recommendations are passed -- either bipartisan or perceived
as non-partisan
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)

Solutions to policy problems produced within the normal legislative


process may also suffer politically from charges of partisanship.30 Similar charges
may be made against investigations conducted by Congress. 31 The non-partisan
or bipartisan character of most congressional commissions may make their
findings and recommendations less susceptible to such charges and more
politically acceptable to diverse viewpoints. The bipartisan or nonpartisan
arrangement can potentially give their recommendations strong
credibility, both in Congress and among the public, even when dealing
with divisive issues of public policy.32 Commissions may also give political
factions space to negotiate compromises in good faith, bypassing the short-term
tactical political maneuvers that accompany public negotiations. 33 Similarly,
because commission members are not elected, they may be better suited to
suggesting unpopular, but necessary, policy solutions. 34

Recommendations are passed -- BRAC Commission proves


Fiscal Seminar 9
(The Fiscal Seminar is a group of scholars who meet on a regular basis, under the auspices of The
Brookings Institution and The Heritage Foundation, to discuss federal budget and fiscal policy issues.
The members of the Fiscal Seminar acknowledge the contributions of Paul Cullinan, a former colleague
and Brookings scholar, in the development of this paper, and the editorial assistance of Emily Monea.
THE POTENTIAL ROLE OF ENTITLEMENT OR BUDGET COMMISSIONS IN ADDRESSING LONG-TERM
BUDGET PROBLEMS, The Fiscal Seminar. 06-2009.)

On the other hand, the success of BRAC seems to have resulted more from the defined structure and process of the commission.5 Under BRAC, a package of recommendations originated with the Department of Defense, was modified by the BRAC commission, and was then reviewed by the President. Congress then had to consider the package as a whole with no amendments allowed; if it failed to pass a resolution of disapproval, the recommendations would be implemented as if they had been enacted in law. Not one of the five sets of BRAC recommendations has been rejected by the Congress.6

2NC AT No Authority
Commissions have broad authority
Campbell 01
(Campbell, Colton C. Dr. Colton Campbell is Professor of National Security Strategy. He received his
Ph.D. from the University of California, Santa Barbara, and his B.A. and M.A. from California State
University, Chico. Prior to joining the National War College, Dr. Campbell was a Legislative Aide to
Representative Mike Thompson (CA-01), chair of the House Intelligence Committee's Subcommittee on
Terrorism, Analysis and Counterintelligence, where he handled Appropriations, Defense and Trade
matters for the congressman. Before that, he was an Analyst in American National Government at the
Congressional Research Service, an Associate Professor of Political Science at Florida International
University, and an American Political Science Association Congressional Fellow, where he served as a
policy adviser to Senator Bob Graham of Florida. Dr. Campbell is the author, co-author, and co-editor of
11 books on Congress, most recently the Guide to Political Campaigns in America, and Impeaching
Clinton: Partisan Strife on Capitol Hill. He has also written more than two dozen chapters and articles
on the legislative process. Discharging Congress : Government by Commission. Westport, CT, USA:
Greenwood Press, 2001. ProQuest ebrary. Web. 27 July 2015. Ghs-kw.)

Congressional commissions have reached the point where they can take over various fact-finding functions formerly performed by Congress itself. Once the facts have been found by a commission, it is possible for Congress to subject those facts to the scrutiny of cross-examination and debate. And if the findings stand up under such scrutiny, there remains for

Congress the major task of determining the policy to be adopted with reference to the known factual situation. Once
it was clear, for example, that the acquired immune deficiency syndrome (AIDS) yielded an extraordinary range of
newfound political and practical difficulties, the need for legislative action was readily apparent. The question that
remained was one of policy: how to prevent the spread of AIDS. Should it be by accelerated research? By public
education? By facilitating housing support for people living with AIDS? Or by implementing a program of AIDS
counseling and testing? The AIDS Commission could help Congress answer such questions.

2NC AT Perception
CP solves your perception arguments
Glassman and Straus 15
(Glassman, Matthew E. and Straus, Jacob R. Analysts on Congress at the Congressional Research
Service. Congressional Commissions: Overview, Structure, and Legislative Considerations ,
Congressional Research Service. 01-27-2015. http://fas.org/sgp/crs/misc/R40076.pdf//ghs-kw)

By establishing a commission, Congress can often provide a


highly visible forum for important issues that might otherwise receive
scant attention from the public.38 Commissions often are composed of notable public
figures, allowing personal prestige to be transferred to policy solutions. 39 Meetings
and press releases from a commission may receive significantly more
attention in the media than corresponding information coming directly
from members of congressional committees. Upon completion of a commissions work
product, public attention may be temporarily focused on a topic that otherwise would
receive scant attention, thus increasing the probability of congressional action within the policy area.40

Private Sector CP

1NC
Counterplan: the private sector should implement and enforce
default encryption standards on a level equivalent with those
announced by Apple in 2014.
Apple's new standards are unhackable even by Apple -- eliminates backdoors
Green 10/4
(Green, Matthew D. Matthew D. Green is an Assistant Research Professor at the Johns Hopkins
Information Security Institute. He completed his PhD in 2008. His research includes techniques for
privacy-enhanced information storage, anonymous payment systems, and bilinear map-based
cryptography. "A Few Thoughts on Cryptographic Engineering: Why can't Apple decrypt your iPhone?
10-4-2014. http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-youriphone.html//ghs-kw)
In the rest of this post I'm going to talk about how these protections may work and how

Apple can

realistically claim not to possess a back door.

One caveat: I should probably point out that


Apple isn't known for showing up at parties and bragging about their technology -- so while a fair amount of this is
based on published information provided by Apple, some of it is speculation. I'll try to be clear where one ends and
the other begins. Password-based encryption 101 Normal password-based file encryption systems take in a
password from a user, then apply a key derivation function (KDF) that converts a password (and some salt) into an
encryption key. This approach doesn't require any specialized hardware, so it can be securely implemented purely in
software provided that (1) the software is honest and well-written, and (2) the chosen password is strong, i.e., hard
to guess. The problem here is that nobody ever chooses strong passwords. In fact, since most passwords are
terrible, it's usually possible for an attacker to break the encryption by working through a 'dictionary' of likely
passwords and testing to see if any decrypt the data. To make this really efficient, password crackers often use
special-purpose hardware that takes advantage of parallelization (using FPGAs or GPUs) to massively speed up the
process. Thus a common defense against cracking is to use a 'slow' key derivation function like PBKDF2 or scrypt.
Each of these algorithms is designed to be deliberately resource-intensive, which does slow down normal login
attempts -- but hits crackers much harder. Unfortunately, modern cracking rigs can defeat these KDFs by simply
throwing more hardware at the problem. There are some approaches to dealing with this -- this is the approach of memory-hard KDFs like scrypt -- but this is not the direction that Apple has gone.

How Apple's encryption works

Apple doesn't use scrypt. Their approach is to add a 256-bit device-unique
secret key called a UID to the mix, and to store that key in hardware
where it's hard to extract from the phone. Apple claims that it does not record
these keys nor can it access them. On recent devices (with A7 chips), this key and the
mixing process are protected within a cryptographic co-processor called the Secure
Enclave. The Apple Key Derivation function 'tangles' the password with the UID key
by running both through PBKDF2-AES -- with an iteration count tuned to require
about 80ms on the device itself.** The result is the 'passcode key'. That key is then
used as an anchor to secure much of the data on the phone. Overview of Apple key
derivation and encryption (iOS Security Guide, p.10). Since only the device itself knows UID -- and
the UID can't be removed from the Secure Enclave -- this means all password
cracking attempts have to run on the device itself. That rules out the use of FPGA or
ASICs to crack passwords. Of course Apple could write a custom firmware
that attempts to crack the keys on the device but even in the best case
such cracking could be pretty time consuming, thanks to the 80ms PBKDF2 timing. (Apple
pegs such cracking attempts at 5 1/2 years for a random 6-character password consisting of
lowercase letters and numbers. PINs will obviously take much less time, sometimes as little as half an hour. Choose a good passphrase!) So one view of Apple's process is that it depends on the user picking a strong
password. A different view is that it also depends on the attacker's inability to obtain the UID. Let's explore this a bit more.

Securing the Secure Enclave

The Secure Enclave is designed to prevent exfiltration of the UID key. On earlier Apple devices this key lived in the application processor itself. Secure Enclave provides an extra level of protection that holds even if the software on the application processor is compromised -- e.g., jailbroken. One worrying thing about this approach is

that, according to Apple's documentation, Apple controls the signing keys that sign the Secure Enclave firmware. So
using these keys, they might be able to write a special "UID extracting" firmware update that would undo the

protections described above, and potentially allow crackers to run their attacks on specialized hardware. Which leads to the following question: How does Apple avoid holding a backdoor signing key that allows them to extract the UID from the Secure Enclave? It seems to me that there are a few possible ways forward here. No software can extract the UID. Apple's documentation even claims that this is the case; that software can only see the output of encrypting something with UID, not the UID itself. The problem with this explanation is that it isn't really clear that this guarantee covers malicious Secure Enclave firmware written and signed by Apple. Update 10/4: Comex and others (who have forgotten more about iPhone internals than I've ever known) confirm that #1 is the right answer. The UID appears to be connected to the AES circuitry by a dedicated path, so software can set it as a key, but never extract it. Moreover this appears to be the same for both the Secure Enclave and older pre-A7 chips. So ignore options 2-4 below.
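
To make the scheme the Green evidence describes more concrete, here is a minimal Python sketch, assuming a hypothetical device secret and an illustrative iteration count (neither is Apple's actual implementation): the passcode is "tangled" with a device-unique UID through PBKDF2, and the closing arithmetic checks the "5 1/2 years" brute-force figure the card cites.

# Illustrative sketch only -- not Apple's code or parameters. It mirrors the idea in
# the Green evidence: the passcode is mixed ("tangled") with a device-unique secret
# via PBKDF2, so guessing has to happen on the device itself at ~80ms per attempt.
import hashlib
import os
import secrets

DEVICE_UID = secrets.token_bytes(32)   # stand-in for the 256-bit hardware UID
ITERATIONS = 100_000                   # placeholder; Apple tunes the count to ~80ms on-device

def passcode_key(passcode: str, salt: bytes) -> bytes:
    """Derive a 'passcode key' by mixing the passcode with the device UID."""
    # Folding the UID into the salt approximates the tangling step; the real
    # mixing runs inside the Secure Enclave and never exposes the UID.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID + salt, ITERATIONS)

key = passcode_key("example-passcode", salt=os.urandom(16))

# Back-of-the-envelope check on the "5 1/2 years" figure: 36^6 six-character
# lowercase/digit passcodes at 80 ms per on-device guess.
total_seconds = (36 ** 6) * 0.080
print(f"{total_seconds / (3600 * 24 * 365):.1f} years to exhaust the passcode space")  # ~5.5

The point of the card falls out of the arithmetic: because the UID never leaves the hardware, the 80ms-per-guess cost cannot be parallelized away on FPGAs or GPUs, so even a modest passcode becomes expensive to crack.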

2NC O/V
The counterplan solves 100% of the case -- private corporations will institute strong encryption standards on all their products and store decryption mechanisms on individual devices without retaining separate decryption programs -- this means nobody but the owner of the device can decrypt the information -- that's Green
AND, solves backdoors -- companies are technologically incapable of providing backdoors in the world of the CP -- solves the AFF -- that's Green

AT Perception
Other companies follow -- solves their credibility internal links
Whittaker 14
(Zack Whittaker. "Apple doubles-down on security, shuts out law enforcement from accessing iPhones,
iPads," ZDNet. 9-18-2014. http://www.zdnet.com/article/apple-doubles-down-on-security-shuts-outlaw-enforcement-from-accessing-iphones-ipads///ghs-kw)

The new encryption methods prevent even Apple from accessing even the relatively
small amount of data it holds on users. "Unlike our competitors, Apple cannot bypass your
passcode and therefore cannot access this data ," the company said in its new privacy policy,
updated Wednesday. "So it's not technically feasible for us to respond to government warrants for the extraction of
this data from devices in their possession running iOS 8." There are some caveats, however. For the iCloud data it
stores, Apple still has the ability (and the legal responsibility) to turn over data it stores on its own servers, or third-party servers it uses to support the service. iCloud data can include photos, emails, music, documents, and contacts. In the wake of the Edward Snowden disclosures, Apple has set itself apart from the rest of the crowd by bolstering its encryption efforts in such a way that makes it impossible for it to decrypt the data. Apple chief executive Tim Cook said in a recent interview with PBS' Charlie Rose that if the government "laid a subpoena" at its doors, Apple "can't provide" the data. He said, bluntly: "We don't have a key. The door is closed." Although the iPhone and iPad maker was late to

the transparency report party, the company has rocketed up the ranks of the civil liberties table. The Electronic
Frontier Foundation's annual reports for 2012 and 2013 showed Apple as having poor privacy practices around user
data, gaining just one star out of five each year. In 2014, Apple scored the full five stars, a massive turnaround from two years prior. In the meantime, Yahoo is bolstering encryption between its datacenters, and recently turned on encryption-by-default on its email service. Microsoft is also encrypting its network traffic amid reports of the National Security Agency's datacenter tapping program. And Google is working hard to crackdown on government spies cracking into its networks and cables. Privacy and security are, and have been for a while, the pinnacle of tech credibility. And Apple just scored about a billion points on that scale, leaving most of its Silicon Valley partners in the dust.

AT Links to Terror
No link to their disads -- other sources of data
NYT 14
(David E. Sanger and Brian X. Chen. "Signaling Post-Snowden Era, New iPhone Locks Out N.S.A. ," New
York Times. 9-26-2014. http://www.nytimes.com/2014/09/27/technology/iphone-locks-out-the-nsasignaling-a-post-snowden-era-.html?_r=0//ghs-kw)

Mr. Zdziarski said that concerns about Apple's new encryption to hinder law enforcement seemed overblown. He said there were still plenty of ways for the police to get customer data for investigations. In the example of a kidnapping victim, the police can still request information on call records and geolocation information from phone carriers like AT&T and Verizon Wireless. Eliminating the iPhone as one source I don't think is going to wreck a lot of cases, he said. There is such a mountain of other evidence from call logs, email logs, iCloud, Gmail logs. They're tapping the whole Internet.

XO CP

1NC
XOs solve the Secure Data Act
Castro and McQuinn 15
(Castro, Daniel and McQuinn, Alan. Information Technology and Innovation Foundation. The
Information Technology and Innovation Foundation (ITIF) is a Washington, D.C.-based think tank at the
cutting edge of designing innovation strategies and technology policies to create economic
opportunities and improve quality of life in the United States and around the world. Founded in 2006,
ITIF is a 501(c) 3 nonprofit, non-partisan organization that documents the beneficial role technology
plays in our lives and provides pragmatic ideas for improving technology-driven productivity, boosting
competitiveness, and meeting todays global challenges through innovation. Daniel Castro is the vice
president of the Information Technology and Innovation Foundation. His research interests include
health IT, data privacy, e-commerce, e-government, electronic voting, information security, and
accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability
Office (GAO) where he audited IT security and management controls at various government agencies.
He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security
Technology and Management from Carnegie Mellon University. Alan McQuinn is a research assistant
with the Information Technology and Innovation Foundation. Prior to joining ITIF, Mr. McQuinn was a
telecommunications fellow for Congresswoman Anna Eshoo and an intern for the Federal
Communications Commission in the Office of Legislative Affairs. He got his B.S. in Political
Communications and Public Relations from the University of Texas at Austin. Beyond the USA
Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness, ITIF. June 2015.
http://www2.itif.org/2015-beyond-usa-freedom-act.pdf//ghs-kw)

Second, the U.S. government should draw a clear line in the sand and declare that the policy of the U.S. government is to strengthen not weaken information security. The U.S. Congress should pass legislation, such as the Secure Data Act introduced by Sen. Wyden (D-OR), banning any government efforts to introduce backdoors in software or weaken encryption.43 In the short term, President Obama, or his successor, should sign an executive order formalizing this policy as well. In addition, when U.S. government agencies discover vulnerabilities in software or hardware products, they should

responsibly notify these companies in a timely manner so that the companies can fix these flaws. The best way to protect U.S. citizens from digital threats is to promote strong
cybersecurity practices in the private sector.

Zero-Days Adv CP

1NC
Counterplan: the United States federal government should
legalize and regulate the zero-day exploit market.
Regulation is key to stop zero days from falling into enemy
hands
Gallagher 13
(Ryan Gallagher. "The Secretive Hacker Market for Software Flaws," Slate Magazine. 1-16-2013.
http://www.slate.com/articles/technology/future_tense/2013/01/zero_day_exploits_should_the_hacker_g
ray_market_be_regulated.html//ghs-kw)

Behind computer screens from France to Fort Worth, Texas, elite hackers hunt for security vulnerabilities worth thousands of dollars on a secretive unregulated marketplace.

Using sophisticated techniques to detect weaknesses in widely used programs like Google Chrome, Java, and Flash,

they spend hours crafting zero-day exploits -- complex codes custom-made to


target a software flaw that has not been publicly disclosed, so they can bypass antivirus or firewall detection to help infiltrate a computer system. Like most technologies, the
exploits have a dual use. They can be used as part of research efforts to help strengthen computers against
intrusion. But they can also be weaponized and deployed aggressively for everything from
government spying and corporate espionage to flat-out fraud. Now, as cyberwar escalates across the globe,
there are fears that the burgeoning trade in finding and selling exploits is spiralling out of
control -- spurring calls for new laws to rein in the murky trade. Some legitimate
companies operate in a legal gray zone within the zero-day market, selling exploits to governments
and law enforcement agencies in countries across the world. Authorities can use
them covertly in surveillance operations or as part of cybersecurity or espionage
missions. But because sales are unregulated, there are concerns that some
gray market companies are supplying to rogue foreign regimes that may
use exploits as part of malicious targeted attacks against other countries
or opponents. There is also an anarchic black market that exists on invite-only
Web forums, where exploits are sold to a variety of actors -- often for criminal
purposes. The importance of zero-day exploits, particularly to governments, has become
increasingly apparent in recent years. Undisclosed vulnerabilities in Windows played a crucial role in
how Iranian computers were infiltrated for surveillance and sabotage when the
countrys nuclear program was attacked by the Stuxnet virus (an assault reportedly launched
by the United States and Israel). Last year, at least eight zero days in programs like Flash and Internet Explorer
were discovered and linked to a Chinese hacker group dubbed the Elderwood gang, which targeted more than
1,000 computers belonging to corporations and human rights groups as part of a shady intelligence-gathering effort
allegedly sponsored by China. The most lucrative zero days can be worth hundreds of thousands of dollars in both
the black and gray markets. Documents released by Anonymous in 2011 revealed Atlanta-based security firm
Endgame Systems offering to sell 25 exploits for $2.5 million. Emails published alongside the documents showed
the firm was trying to keep a very low profile due to feedback we've received from our government clients. (In
keeping with that policy, Endgame didnt respond to questions for this story.) But not everyone working in the
business of selling software exploits is trying to fly under the radarand some have decided to blow the whistle on
what they see as dangerous and irresponsible behavior within their secretive profession. Adriel Desautels, for one,
has chosen to speak out. The 36-year-old exploit broker from Boston runs a company called Netragard, which
buys and sells zero days to organizations in the public and private sectors. (He wont name names, citing
confidentiality agreements.) The lowest-priced exploit that Desautels says he has sold commanded $16,000; the highest, more than $250,000. Unlike other companies and sole traders operating in the zero-day trade, Desautels has adopted a policy to sell his exploits only domestically within the United States, rigorously vetting all those he deals with. If he didn't have this principle, he says, he could sell to anyone he wanted, even Iran or China, because the field is unregulated. And that's exactly why he is concerned. As technology advances, the effect that zero-day exploits will have is going to become more physical and more real, he says. The software becomes a weapon. And if you don't have controls and regulations around weapons, you're really open to

introducing chaos and problems. Desautels says he knows of greedy and


irresponsible people who will sell to anybody, to the extent that some exploits
might be sold by the same hacker or broker to two separate governments not on
friendly terms. This can feasibly lead to these countries unwittingly targeting each
others computer networks with the same exploit, purchased from the same seller.
If I take a gun and ship it overseas to some guy in the Middle East and he uses it to
go after American troops -- it's the same concept, he says. The position Desautels has taken casts him as something of an outsider within his trade. France's Vupen, one of
the foremost gray-market zero-day sellers, takes a starkly different approach. Vupen
develops and sells exploits to law enforcement and intelligence agencies across the
world to help them intercept communications and conduct offensive cyber security
missions, using what it describes as extremely sophisticated codes that bypass
all modern security protections and exploit mitigation technologies. Vupens latest
financial accounts show it reported revenue of about $1.2 million in 2011, an overwhelming majority of which (86
percent) was generated from exports outside France. Vupen says it will sell exploits to a list of more than 60
countries that are members or partners of NATO, provided these countries are not subject to any export sanctions.
(This means Iran, North Korea, and Zimbabwe are blacklistedbut the likes of Kazakhstan, Bahrain, Morocco, and
Russia are, in theory at least, prospective customers, as they are not subject to any sanctions at this time.) As a
European company, we exclusively work with our allies and partners to help them protect their democracies and
citizens against threats and criminals, says Chaouki Bekrar, Vupens CEO, in an email. He adds that even if a given
country is not on a sanctions list, it doesn't mean Vupen will automatically work with it, though he declines to name specific countries or continents where his firm does or does not have customers. Vupen's policy of selling to a broad range of countries has attracted much controversy, sparking furious debate around zero-day sales, ethics, and the law. Chris Soghoian of the ACLU, a prominent privacy and security researcher who regularly spars with Vupen CEO Bekrar on Twitter, has accused Vupen of being modern-day merchants of death selling the bullets for cyberwar. Just as the engines on an airplane enable the military to deliver a bomb that kills people, so too can a zero day be used to deliver a cyberweapon that causes physical harm or loss of life, Soghoian says in an email. He is astounded that governments are sitting on flaws by purchasing zero-day exploits and keeping them secret. This ultimately entails exposing their own

citizens to espionage, he says, because it means that the government knows about software vulnerabilities but is
not telling the public about them. Some claim, however, that the zero-day issue is being overblown and politicized.
You dont need a zero day to compromise the workstation of an executive, let alone an activist, says Wim Remes,
a security expert who manages information security for Ernst & Young. Others argue that the U.S. government in
particular needs to purchase exploits to keep pace with what adversaries like China and Iran are doing. If were
going to have a military to defend ourselves, why would you disarm our military? says Robert Graham at the
Atlanta-based firm Errata Security. If the government cant buy exploits on the open market, they will just develop
them themselves, Graham says. He also fears that regulation of zero-day sales could lead to a crackdown on
legitimate coding work. Plus, digital arms don't exist -- it's an analogy. They don't kill people. Bad things really don't
happen with them. * * * So are zero days really a danger? The overwhelming majority of compromises of computer
systems happen because users failed to update software and patch vulnerabilities that are already known about.
However, there are a handful of cases in which undisclosed vulnerabilities -- that is, zero days -- have been used to target organizations or individuals. It was a zero day, for instance, that was recently used by malicious hackers to compromise Microsoft's Hotmail and steal emails and details of the victims' contacts. Last year, it was reported that a zero day was used to target a flaw in Internet Explorer and hijack Gmail accounts. Noted offensive security companies such as Italy's Hacking Team and the England-based Gamma Group are among those to make use of zero-day exploits to help law enforcement agencies install advanced spyware on target computers -- and both of these companies have been accused of supplying their technologies to countries with an authoritarian bent. Tracking and communications interception can have serious real-world consequences for dissidents in places like Iran, Syria, or the United Arab Emirates. In the wrong hands, it seems clear, zero days could do damage. This potential has been recognized in Europe, where Dutch politician Marietje Schaake has been crusading for groundbreaking new laws to curb the trade in what she calls digital weapons. Speaking on the phone from Strasbourg, France*, Schaake tells me she's concerned about security exploits,

particularly where they are being sold with the intent to help enable access to computers or mobile devices not authorized by the owner. She adds that she is considering pressing for the European Commission, the EU's executive body, to bring in a whole new regulatory framework that would encompass the trade in zero days, perhaps by looking at incentives for companies or hackers to report vulnerabilities that they find. Such a move would likely be welcomed by the handful

of organizations already working to encourage hackers and security researchers to responsibly disclose
vulnerabilities they find instead of selling them on the black or gray markets. The Zero Day Initiative, based in
Austin, Texas, has a team of about 2,700 researchers globally who submit vulnerabilities that are then passed on to
software developers so they can be fixed. ZDI, operated by Hewlett-Packard, runs competitions in which hackers
can compete for a pot of more than $100,000 in prize funds if they expose flaws. We believe our program is
focused on the greater good, says Brian Gorenc, a senior security researcher who works with the ZDI.

DAs

Terror

1NC - Generic
Terror risk is high -- maintaining current surveillance is key
Inserra, 6/8 (David Inserra is a Research Associate for Homeland Security and Cyber
Security in the Douglas and Sarah Allison Center for Foreign and National Security Policy of
the Kathryn and Shelby Cullom Davis Institute for National Security and Foreign Policy, at
The Heritage Foundation, 6-8-2015, "69th Islamist Terrorist Plot: Ongoing Spike in Terrorism
Should Force Congress to Finally Confront the Terrorist Threat," Heritage Foundation,
http://www.heritage.org/research/reports/2015/06/69th-islamist-terrorist-plot-ongoing-spikein-terrorism-should-force-congress-to-finally-confront-the-terrorist-threat)
On June 2 in Boston, Usaamah Abdullah Rahim drew a knife and attacked police officers and
FBI agents, who then shot and killed him. Rahim was being watched by Bostons Joint
Terrorism Task Force as he had been plotting to behead police officers as part of violent
jihad. A conspirator, David Wright or Dawud Sharif Abdul Khaliq, was arrested shortly
thereafter for helping Rahim to plan this attack. This plot marks the 69th publicly known
Islamist terrorist plot or attack against the U.S. homeland since 9/11, and is part of a recent
spike in terrorist activity. The U.S. must redouble its efforts to stop terrorists before they
strike, through the use of properly applied intelligence tools. The Plot According to the criminal
complaint filed against Wright, Rahim had originally planned to behead an individual outside the state of
Massachusetts,[1] which, according to news reports citing anonymous government officials, was Pamela Geller, the
organizer of the draw Mohammed cartoon contest in Garland, Texas.[2] To this end, Rahim had purchased multiple
knives, each over 1 foot long, from Amazon.com. The FBI was listening in on the calls between Rahim

and Wright and recorded multiple conversations regarding how these weapons would be
used to behead someone. Rahim then changed his plan early on the morning of June 2. He planned to go on
vacation right here in Massachusetts. Im just going to, ah, go after them, those boys in blue. Cause, ah, its the
easiest target.[3] Rahim and Wright had used the phrase going on vacation repeatedly in their conversations as
a euphemism for violent jihad. During this conversation, Rahim told Wright that he planned to attack a police officer
on June 2 or June 3. Wright then offered advice on preparing a will and destroying any incriminating evidence.
Based on this threat, Boston police officers and FBI agents approached Rahim to question him, which prompted him
to pull out one of his knives. After being told to drop his weapon, Rahim responded with you drop yours and
moved toward the officers, who then shot and killed him. While Rahims brother, Ibrahim, initially claimed that
Rahim was shot in the back, video surveillance was shown to community leaders and civil rights groups, who have
confirmed that Rahim was not shot in the back.[4 ] Terrorism Not Going Away This 69th Islamist plot is also
the seventh in this calendar year. Details on how exactly Rahim was radicalized are still forthcoming, but
according to anonymous officials, online propaganda from ISIS and other radical Islamist groups are
the source.[5] That would make this attack the 58th homegrown terrorist plot and continue

the recent trend of ISIS playing an important role in radicalizing individuals in the United
States. It is also the sixth plot or attack targeting law enforcement in the U.S., with a recent uptick in plots aimed
at police. While the debate over the PATRIOT Act and the USA FREEDOM Act is taking a break, the terrorists are not.
The result of the debate has been the reduction of U.S. intelligence and counterterrorism capabilities, meaning that
the U.S. has to do even more with less when it comes to connecting the dots on terrorist plots.[6] Other
legitimate intelligence tools and capabilities must be leaned on now even more. Protecting the
Homeland To keep the U.S. safe, Congress must take a hard look at the U.S. counterterrorism enterprise and
determine other measures that are needed to improve it. Congress should: Emphasize community outreach. Federal
grant funds should be used to create robust community-outreach capabilities in higher-risk urban areas. These
funds must not be used for political pork, or so broadly that they no longer target those communities at greatest
risk. Such capabilities are key to building trust within these communities, and if the United States is to thwart lonewolf terrorist attacks, it must place effective community outreach operations at the tip of the spear. Prioritize local
cyber capabilities. Building cyber-investigation capabilities in the higher-risk urban areas must become a primary
focus of Department of Homeland Security grants. With so much terrorism-related activity occurring on the Internet,
local law enforcement must have the constitutional ability to monitor and track violent extremist activity on the
Web when reasonable suspicion exists to do so. Push the FBI toward being more effectively driven by intelligence.
While the FBI has made high-level changes to its mission and organizational structure, the bureau is still working on
integrating intelligence and law enforcement activities. Full integration will require overcoming inter-agency cultural
barriers and providing FBI intelligence personnel with resources, opportunities, and the stature they need to
become a more effective and integral part of the FBI . Maintain essential counterterrorism tools.

Support for important investigative tools is essential to maintaining the security of the U.S.
and combating terrorist threats. Legitimate government surveillance programs are also a
vital component of U.S. national security and should be allowed to continue. The need for
effective counterterrorism operations does not relieve the government of its obligation to

follow the law and respect individual privacy and liberty. In the American system, the
government must do both equally well. Clear-Eyed Vigilance The recent spike in terrorist plots
and attacks should finally awaken policymakersall Americans, for that matterto the seriousness
of the terrorist threat. Neither fearmongering nor willful blindness serves the United States.
Congress must recognize and acknowledge the nature and the scope of the Islamist terrorist
threat, and take the appropriate action to confront it.

Backdoors are key to stop terrorism and child predators


Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-andgoing-dark-part-ii-debate-merits//ghs-kw)
On Thursday, I described the surprisingly warm reception FBI Director James Comey got in the Senate this week
with his warning that the FBI was "going dark" because of end-to-end encryption . In this post,
I want to take on the merits of the renewed encryption debate, which seem to me complicated and multi-faceted and not all pushing in the same direction. Let me start by breaking the encryption debate into two distinct sets of questions: One is the conceptual question of whether a world of end-to-end strong encryption is an attractive idea. The other is whether, assuming it is not an attractive idea and that one wants to ensure that authorities retain the ability to intercept decrypted signal, an extraordinary access scheme is technically possible without eroding other essential security and privacy objectives. These questions often get mashed together, both because tech

companies are keen to market themselves as the defenders of their users' privacy interests and because of the libertarian ethos of the tech community more generally. But the questions are not the same, and it's worth considering them separately. Consider the conceptual question first. Would it be a good idea to have a world-wide communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all attackers? That is, if we could snap our fingers and make all

device-to-device communications perfectly secure against interception from the Chinese, from hackers, from the
FSB but also from the FBI even wielding lawful process, would that be desirable? Or, in the alternative, do we want
to create an internet as secure as possible from everyone except government investigators exercising their legal
authorities with the understanding that other countries may do the same? Conceptually speaking, I am with Comey

the matter does not seem to me an especially close call. The


belief in principle in creating a giant world-wide network on which
surveillance is technically impossible is really an argument for the
creation of the world's largest ungoverned space. I understand why
techno-anarchists find this idea so appealing. I can't imagine for a moment,
however, why anyone else would. Consider the comparable argument in physical
space: the creation of a city in which authorities are entirely dependent on citizen
reporting of bad conduct but have no direct visibility onto what happens on the
streets and no ability to conduct search warrants (even with court orders) or to
patrol parks or street corners. Would you want to live in that city? The idea that
ungoverned spaces really suck is not controversial when you're talking
about Yemen or Somalia. I see nothing more attractive about the creation of a
worldwide architecture in which it is technically impossible to intercept and read ISIS
communications with followers or to follow child predators into chatrooms where
they go after kids. The trouble is that this conceptual position does not answer the entirety of the policy

question before us. The reason is that the case against preserving some form of law enforcement access to

decrypted signal is not only a conceptual embrace of the technological obsolescence of surveillance. It is also a series of arguments about the costs, including the security costs, of maintaining the capacity to decrypt captured signal.

Terrorists will use bioweapons -- guarantees extinction


Cooper 13 (Joshua, 1/23/13, University of South Carolina, Bioterrorism and the Fermi
Paradox, http://people.math.sc.edu/cooper/fermi.pdf, 7/15/15, SM)

We may conclude that, when a civilization reaches its space-faring age, it will more or less at the same
moment (1) contain many individuals who seek to cause large-scale destruction,

and
(2) acquire the capacity to tinker with its own genetic chemistry. This is a perfect
recipe for bioterrorism, and, given the many very natural pathways for its development
and the overwhelming evidence that precisely this course has been taken by humanity , it is
hard to see how bioterrorism does not provide a neat, if profoundly unsettling, solution to Fermis paradox. One
might object that, if omnicidal individuals are successful in releasing highly virulent and
deadly genetic malware into the wild, they are still unlikely to succeed in killing everyone. However,
even if every such mass death event results only in a high (i.e., not total) kill rate
and there is a large gap between each such event (so that individuals can build
up the requisite scientific infrastructure again ), extinction would be inevitable
regardless. Some of the engineered bioweapons will be more successful than others; the inter-apocalyptic eras
will vary in length; and post-apocalyptic environments may be so war-torn, disease-

stricken, and impoverished of genetic variation that they may culminate in true
extinction events even if the initial cataclysm only results in 90% death rates ,
since they may cause the effective population size to dip below the so-called
minimum viable population. This author ran a Monte Carlo simulation using as (admittedly very
crude and poorly informed, though arguably conservative) estimates the following Earth-like parameters:
bioterrorism event mean death rate 50% and standard deviation 25% (beta distribution), initial population 10^10, minimum viable population 4000, individual omnicidal act probability 10^-7 per annum, and population growth
rate 2% per annum. One thousand trials yielded an average post-space-age time until extinction of less than 8000
years. This is essentially instantaneous on a cosmological scale, and varying the parameters by quite a bit does
nothing to make the survival period comparable with the age of the universe.
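
As a rough sanity check on the parameters the Cooper evidence reports (this is not the author's simulation, whose event-timing and recovery assumptions the card does not spell out), a short Python calculation shows how many die-off events of the stated average severity it takes to fall from the initial population to the minimum viable population:

# Back-of-the-envelope check on the Cooper card's parameters -- not the author's
# model. With a mean kill rate of 50%, how many successive events does it take to
# drive a population of 10^10 below the minimum viable population of 4,000,
# ignoring regrowth between events?
import math

initial_pop = 10**10
min_viable = 4_000
mean_kill_rate = 0.5

events_needed = math.log(initial_pop / min_viable) / math.log(1 / (1 - mean_kill_rate))
print(math.ceil(events_needed))  # ~22 successive halvings

How long those roughly 22 events take to accumulate -- and hence the card's reported average of under 8,000 years -- depends on the event frequency, the recovery gaps between events, and the 2% regrowth rate that the full Monte Carlo varies.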

1NC - ISIS Version


ISIS will emerge as a serious threat to the US
Morell 15 (Michael Morell is the former deputy director of the CIA and has twice served
as acting director. He is the author of The Great War of Our Time: The CIA's Fight Against
Terrorism From al Qa'ida to ISIS. May 14, 2015 Time Magazine
ISIS Is a Danger on
U.S. Soil http://time.com/3858354/isis-is-a-danger-on-u-s-soil/)
The terrorist group poses a gathering threat. In the aftermath of the attempted terrorist attack on May 4 in Garland,
Texas -- for which ISIS claimed responsibility -- we find ourselves again considering the question of whether or not ISIS is a real threat. The answer is yes. A very serious one. Extremists inspired by Osama bin Laden's ideology consider themselves to be at war with the U.S.; they want to attack us. It is important to never forget that -- no matter how long it has been since 9/11. ISIS is just the latest manifestation of bin Laden's design. The group has grown faster than any terrorist group we can remember, and the threat it poses to us is as wide-ranging as any we have seen. What ISIS has that al-Qaeda doesn't is
a Madison Avenue level of sophisticated messaging and social media. ISIS has a multilingual propaganda arm
known as al-Hayat, which uses GoPros and cameras mounted on drones to make videos that appeal to its followers.
And ISIS uses just about every tool in the platform boxfrom Twitter to YouTube to Instagramto great effect,
attracting fighters and funding. Digital media are one of the groups most significant strengths; they have helped
ISIS become an organization that poses four significant threats to the U.S. First, it is a threat to the stability of the
entire Middle East. ISIS is putting the territorial integrity of both Iraq and Syria at risk. And a further collapse of
either or both of these states could easily spread throughout the region, bringing with it sectarian and religious
strife, humanitarian crises and the violent redrawing of borders, all in a part of the world that remains critical to U.S.
national interests. ISIS now controls more territory -- in Iraq and Syria -- than any other terrorist group anywhere in the
world. When al-Qaeda in Iraq joined the fight in Syria, the group changed its name to ISIS. ISIS added Syrians and
foreign fighters to its ranks, built its supply of arms and money and gained significant battlefield experience fighting
Bashar Assads regime. Together with the security vacuum in Iraq and Nouri al-Malikis alienation of the Sunnis, this
culminated in ISISs successful blitzkrieg across western Iraq in the spring and summer of 2014, when it seized large
amounts of territory. ISIS is not the first extremist group to take and hold territory. Al-Shabab in Somalia did so a
number of years ago and still holds territory there, al-Qaeda in the Islamic Maghreb did so in Mali in 2012, and al-Qaeda in Yemen did so there at roughly the same time. I fully expect extremist groups to attempt to take -- and sometimes be successful in taking -- territory in the years ahead. But no other group has taken so much territory so
quickly as ISIS has. Second, ISIS is attracting young men and women to travel to Syria and Iraq to join its cause. At
this writing, at least 20,000 foreign nationals from roughly 90 countries have gone to Syria and Iraq to join the fight.
Most have joined ISIS. This flow of foreigners has outstripped the flow of such fighters into Iraq during the war there
a decade ago. And there are more foreign fighters in Syria and Iraq today than there were in Afghanistan in the
1980s working to drive the Soviet Union out of that country. These foreign nationals are getting experience on the
battlefield, and they are becoming increasingly radicalized to ISISs cause. There is a particular subset of these
fighters to worry about. Somewhere between 3,500 and 5,000 jihadist wannabes have traveled to
Syria and Iraq from Western Europe, Canada, Australia and the U.S. They all have easy access to the U.S.
homeland, which presents two major concerns: that these fighters will leave the Middle East

and either conduct an attack on their own or conduct an attack at the direction of the ISIS
leadership. The former has already happened in Europe. It has not happened yet in the U.S.
but it will. In spring 2014, Mehdi Nemmouche, a young Frenchman who went to fight in Syria, returned to Europe
and shot three people at the Jewish Museum of Belgium in Brussels. The third threat is that ISIS is building a
following among other extremist groups around the world. The allied exaltation is happening at a faster pace than
al-Qaeda ever enjoyed. It has occurred in Algeria, Libya, Egypt and Afghanistan. More will follow. These groups,
which are already dangerous, will become even more so. They will increasingly target ISIS's enemies (including us), and they will increasingly take on ISIS's brutality. We saw the targeting play out in early 2015 when an ISIS-associated group in Libya killed an American in an attack on a hotel in Tripoli frequented by diplomats and
international businesspeople. And we saw the extreme violence play out just a few weeks after that when another
ISIS-affiliated group in Libya beheaded 21 Egyptian Coptic Christians. And fourth, perhaps most insidiously, ISISs
message is radicalizing young men and women around the globe who have never traveled to Syria or Iraq but who
want to commit an attack to demonstrate their solidarity with ISIS. These are the so-called lone wolves. Even before
May 4, such an ISIS-inspired attack had already occurred in the U.S.: an individual with sympathies for ISIS attacked
two New York City police officers with a hatchet. Al-Qaeda has inspired such U.S. attacksthe Fort Hood shootings in
late 2009 that killed 13 and the Boston Marathon bombing in spring 2013 that killed five and injured nearly 300.
The attempted attack in Texas is just the latest of these. We can expect more of these kinds of attacks in the U. S.
Attacks by ISIS-inspired individuals are occurring at a rapid pace around the world -- roughly 10 since ISIS took control of so much territory. Two such attacks have occurred in Canada, including the October 2014 attack on the Parliament building. And another occurred in Sydney, in December 2014. Many planning such attacks -- in Australia, Western Europe and the U.S. -- have been arrested before they could carry out their terrorist plans. Today an ISIS-

directed attack in the U. S. would be relatively unsophisticated (small-scale), but over time
ISISs capabilities will grow. This is what a long-term safe haven in Iraq and Syria would give ISIS, and it is

exactly what the group is planning to do. They have announced their intentions -- just like bin Laden did in the years
prior to 9/11.

Backdoors are key to stop ISIS recruitment


Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Jim Comey, ISIS, and "Going Dark"," Lawfare. 723-2015. http://www.lawfareblog.com/jim-comey-isis-and-going-dark//ghs-kw)

I had a lengthy conversation with FBI Director Jim Comey today about the nexus of our domestic ISIS problem and what the FBI calls the "going dark" issue. CNN the other day reported on some

remarks Comey made on the subject, remarks that have not gotten enough attention but reflect a problem at the front of his mind these days: FBI Director James Comey said Thursday his agency does not yet have the capabilities to limit ISIS attempts to recruit Americans through social media. It is becoming increasingly apparent that Americans are gravitating toward the militant organization by engaging with ISIS online, Comey said, but he told reporters that "we don't have the capability we need" to keep the "troubled minds" at home. "Our job is to find needles in a nationwide haystack, needles that are increasingly invisible to us because of end-to-end encryption," Comey said. "This is the 'going dark' problem in high definition." Comey said ISIS is increasingly communicating with Americans via mobile apps that are difficult for the FBI to decrypt. He also explained that he had to balance the desire to intercept the

communication with broader privacy concerns. "It is a really, really hard problem, but the collision that's going on
between important privacy concerns and public safety is significant enough that we have to figure out a way to
solve it," Comey said. Let's unpack this. As has been widely reported, the FBI has been busy recently dealing with ISIS threats. There have been a bunch of arrests, both because ISIS has gotten extremely good at inducing self-radicalization in disaffected souls worldwide using Twitter and because of the convergence of Ramadan and the run-up to the July 4 holiday. As has also been widely reported, the FBI is

concerned about the effect of end-to-end encryption on its ability to conduct counterterrorism operations and other law enforcement functions. The concern is two-fold: It's about data at rest on devices, data that is now being encrypted in a fashion that can't easily be cracked when those devices are lawfully seized. And it's also about data in transit between devices, data encrypted such that when captured with a lawful court-ordered wiretap, the signal intercepted is undecipherable. Comey

raised his concerns on both subjects at a speech at Brookings last year and has talked about them periodically since

then: What was not clear to me until today, however, was the extent to which the ISIS concerns and the "going dark" concerns have converged. In his Brookings speech, Comey did not focus on counterterrorism in the examples he gave of the going dark problem. In the remarks quoted by CNN, and in his conversation with me today, however, he made clear that the landscape is changing fast. Initial recruitment may take place on Twitter, but the promising ISIS candidate quickly gets moved onto messaging platforms that are encrypted end to end. As a practical matter, that means there are people in the United States whom authorities reasonably believe to be in contact with ISIS for whom surveillance is lawful and appropriate but for whom useful signals interception is not technically feasible. That's a pretty scary thought. I don't know what the right answer is to this problem, which involves a particularly complex mix of legitimate cybersecurity, investigative, and privacy questions. I do think the problem is a very different one if the costs of impaired law enforcement access to signal is enhanced ISIS ability to communicate with its recruits than if we're dealing primarily with more routine crimes, even serious ones.

ISIS is a threat to the grid


Landsbaum 14
(Mark, 9/5/2014, OC Register, Mark Landsbaum: Attack on power grid could
bring dark days, http://www.ocregister.com/articles/emp-633883-powerattack.html, 7/15/15, SM)

It could be worse. Terrorists pose an imminent threat to the U.S. electrical grid ,
which could leave the good ol USA looking like 19th century USA for a lot longer than three days. Dont take my
word for it. Ask

Peter Pry, former CIA officer and one-time House Armed Services Committee staffer,

who served on a congressional commission investigating such eventualities. There

is an imminent
threat from ISIS to the national electric grid and not just to a single U.S.
city, Pry warns. He points to a leaked U.S. Federal Energy Regulatory Commission report in March that said a

coordinated terrorist attack on just nine of the nations 55,000 electrical


power substations could cause coast-to-coast blackouts for up to 18 months .
Consider what youll have to worry about then. If you were uncomfortable watching looting and riots on TV last
month in Ferguson, Mo., as police stood by, project such unseemly behavior nationwide. For 18 months. Its likely
phones wont be reliable, so you wont have to watch police stand idly by. Chances are, police wont show up.
Worse, your odds of needing them will be excruciatingly more likely if terrorists attack the power grid using an
electromagnetic pulse (EMP) burst of energy to knock out electronic devices. The Congressional EMP Commission,

on which I served, did an extensive study of this, Pry says. We discovered to our own revulsion that critical systems in this country are distressingly unprotected. We calculated that, based on current realities, in the first year after a full-scale EMP event, we could expect about two-thirds of the national population -- 200 million Americans -- to perish from starvation and disease, as well as anarchy in the streets. Skeptical? Consider who is capable of engineering such measures before dismissing the likelihood. In his 2013 book, A Nation Forsaken, Michael Maloof reported that the 2008 EMP Commission considered whether a hostile nation or terrorist group could attack with a high-altitude EMP weapon and determined, any number of adversaries possess both the ballistic missiles and nuclear weapons capabilities, and could attack within 15 years. That was six years ago. North Korea, Pakistan, India, China and Russia are all in the position to launch an EMP attack against the United States now, Maloof wrote last year. Maybe

youll rest more comfortably knowing the House intelligence authorization bill passed in May told the intelligence
community to report to Congress within six months, on the threat posed by man-made electromagnetic pulse
weapons to United States interests through 2025, including threats from foreign countries and foreign nonstate
actors. Or, maybe thats not so comforting. In 2004 and again in 2008, separate congressional commissions gave
detailed, horrific reports on such threats. Now, Congress wants another report. In his book, Maloof quotes Clay
Wilson of the Congressional Research Service, who said, Several nations, including reported sponsors of terrorism,
may currently have a capability to use EMP as a weapon for cyberwarfare or cyberterrorism to disrupt
communications and other parts of the U.S. critical infrastructure. What would an EMP attack look like? Within an
instant, Maloof writes, we will have no idea whats happening all around us, because we will have no news. There
will be no radio, no TV, no cell signal. No newspaper delivered. Products wont flow into the nearby Wal-Mart. The
big trucks will be stuck on the interstates. Gas stations wont be able to pump the fuel they do have. Some police
officers and firefighters will show up for work, but most will stay home to protect their own families. Power lines will
get knocked down in windstorms, but nobody will care. Theyll all be fried anyway. Crops will wither in the fields
until scavenged since the big picking machines will all be idled, and there will be no way to get the crop to market
anyway. Nothing

thats been invented in the last 50 years based on computer


chips, microelectronics or digital technology will work. And it will get
worse.

Cyberterror leads to nuclear exchanges - traditional defense doesn't apply
Fritz 9 (Jason, Master in International Relations from Bond, BS from St.
Cloud), Hacking Nuclear Command and Control, International Commission
on Nuclear Non-proliferation and Disarmament, 2009, pnnd.org)//duncan
This paper will analyse the threat of cyber terrorism in regard to nuclear weapons. Specifically, this research will use open source knowledge to identify the structure of nuclear command and control centres, how those structures might be compromised through computer network operations, and how doing so would fit within established cyber terrorists' capabilities, strategies, and tactics. If access to command and control centres is obtained, terrorists could fake or actually cause one nuclear-armed state to attack another, thus provoking a nuclear response from another nuclear power. This may be an easier alternative for terrorist groups than building or acquiring a nuclear weapon or dirty bomb themselves. This would also act as a force equaliser, and provide terrorists with the asymmetric benefits of high speed, removal of geographical distance, and a relatively low cost. Continuing difficulties in developing computer tracking technologies which could trace the identity of intruders, and difficulties in establishing an internationally agreed upon legal framework to guide responses to computer network operations, point towards an inherent weakness in using computer networks to manage nuclear weaponry. This is particularly relevant to reducing the hair trigger posture of existing nuclear arsenals. All computers which are connected to the internet are susceptible to infiltration and remote control. Computers which operate on a closed network may also be compromised by various hacker methods, such as privilege escalation, roaming notebooks, wireless access points, embedded exploits in software and hardware, and maintenance entry points. For example, e-mail spoofing targeted at individuals who have access to a closed network, could lead to the installation of a virus on an open network. This virus could then be carelessly transported on removable data storage between the open and closed network. Efforts by militaries to place increasing reliance on computer networks, including experimental technology such as autonomous systems, and their desire to have multiple launch options, such as nuclear triad capability, enables multiple entry points for terrorists. Information found on the internet may also reveal how to access these closed networks directly. For example, if a terrestrial command centre is impenetrable, perhaps isolating one nuclear armed submarine would prove an easier task. There is evidence to suggest multiple attempts have been made by hackers to compromise the extremely low radio frequency once used by the US Navy to send nuclear launch approval to submerged submarines. Additionally, the alleged Soviet system known as Perimetr was designed to automatically launch nuclear weapons if it was unable to establish communications with Soviet leadership. This was intended as a retaliatory response in the event that nuclear weapons had decapitated Soviet leadership; however it did not account for the possibility of cyber terrorists blocking communications through computer network operations in an attempt to engage the system. Should a warhead be launched, damage could be further enhanced through additional computer network operations. By using proxies, multi-layered attacks could be engineered. Terrorists could remotely commandeer computers in China and use them to launch a US nuclear attack against Russia. Thus Russia would believe it was under attack from the US and the US would believe China was responsible. Further, emergency response communications could be disrupted, transportation could be shut down, and disinformation, such as misdirection, could be planted, thereby hindering the disaster relief effort and maximizing destruction. Disruptions in communication and the use of disinformation could also be used to provoke uninformed responses. For example, a nuclear strike between India and Pakistan could be coordinated with Distributed Denial of Service attacks against key networks, so they would have further difficulty in identifying what happened and be forced to respond quickly. Terrorists could also knock out communications between these states so they cannot discuss the situation. Alternatively, amidst the confusion of a traditional large-scale terrorist attack, claims of responsibility and declarations of war could be falsified in an attempt to instigate a hasty military response. These false claims could be posted directly on Presidential, military, and government websites. E-mails could also be sent to the media and foreign governments using the IP addresses and e-mail accounts of government officials. A sophisticated and all encompassing combination of traditional terrorism and cyber terrorism could be enough to launch nuclear weapons on its own, without the need for compromising command and control centres directly.

2NC UQ - ISIS
ISIS is mobilizing now and ready to take action.
DeSoto 5/7 (Randy DeSoto May 7, 2015 http://www.westernjournalism.com/isis-claims-to-have-71-trained-soldiers-in-targeted-u-s-states/ Randy DeSoto is a writer for
Western Journalism, which consistently ranks in the top 5 most popular conservative
online news outlets in the country)
Purported ISIS jihadists issued threats against the United States Tuesday, indicating the group has trained soldiers positioned throughout the country, ready to attack any target we desire. The online post singles out controversial blogger Pamela Geller, one of the organizers of the Draw the Prophet Muhammad cartoon contest in Garland, Texas, calling for her death to heal the hearts of our brothers and disperse the ones behind her. ISIS also claimed responsibility for the shooting, which marked the first time the terror group claimed responsibility for an attack on U.S. soil, according to the New York Daily News. The attack by the Islamic State in America is only the beginning of our efforts to establish a wiliyah [authority or governance] in the heart of our enemy, the ISIS post reads. As for Geller, the jihadists state: To those who protect her: this will be your only warning of housing this woman and her circus show. Everyone who houses her events, gives her a platform to spill her filth are legitimate targets. We have been watching closely who was present at this event and the shooter of our brothers. ISIS further claims to have known that the Muhammad cartoon contest venue would be heavily guarded, but conducted the attack to demonstrate the willingness of its followers to die for the Sake of Allah. The FBI and the Department of Homeland Security, in fact, issued a bulletin on April 20 indicating the event would be a likely terror target. ISIS drew its message to a close with an ominous threat: We have 71 trained soldiers in 15 different states ready at our word to attack any target we desire. Out of the 71 trained soldiers 23 have signed up for missions like Sunday, We are increasing in number bithnillah [if God wills]. Of the 15 states, 5 we will name: Virginia, Maryland, Illinois, California, and Michigan... The next six months will be interesting. Fox News reports that the U.S. intelligence community was assessing the threat and trying to determine if the source is directly related to ISIS leadership or an opportunist such as a low-level militant seeking to further capitalize on the Garland incident. Former Navy Seal Rob O'Neill told Fox News he believes the ISIS threat is credible, and the U.S. must be prepared. He added that the incident in Garland is a prime example of the difference between a gun free zone and Texas. They showed up at Charlie Hebdo, and it was a massacre. If these two guys had gotten into that building it would have been Charlie Hebdo times ten. But these two guys showed up because they were offended by something protected by the First Amendment, and were quickly introduced to the Second Amendment. Geller issued a statement regarding the ISIS posting: This threat illustrates the savagery and barbarism of the Islamic State. They want me dead for violating Sharia blasphemy laws. What remains to be seen is whether the free world will finally wake up and stand for the freedom of speech, or instead kowtow to this evil and continue to denounce me.

ISIS will attack - three reasons: its capabilities are growing, an attack would be good propaganda, and it basically hates all things America
Rogan 15 (Tom, panelist on The McLaughlin Group and holds the Tony Blankley Chair at
the Steamboat Institute, Why ISIS Will Attack America, National Review, 3-24-15,
http://www.nationalreview.com/article/415866/why-isis-will-attack-america-tom-rogan)//MJ

There is no good in you if they are secure and happy while you have a pulsing vein. Erupt volcanoes of jihad everywhere. Light the earth with fire upon all the [apostate rulers], their soldiers and supporters. ISIS leader Abu Bakr al-Baghdadi, November 2014. Those words weren't idle. The Islamic State (ISIS) is still advancing, across continents and cultures. It's attacking Shia Muslims in Yemen, gunning down Western tourists in Tunisia, beheading Christians in Libya, and murdering or enslaving all who do not yield in Iraq and Syria. Its black banner seen as undaunted by the international coalition against it, new recruits still flock to its service. The Islamic State's rise is, in other words, not over, and it is likely to end up involving an attack on America. Three reasons why such an attempt is inevitable: ISIS'S STRATEGY PRACTICALLY DEMANDS IT. Imbued with existential hatred against the United States, the group doesn't just oppose American power, it opposes America's identity. Where the United States is a secular democracy that binds law to individual freedom, the Islamic State is a totalitarian empire determined to sweep freedom from the earth. As an ideological and physical necessity, ISIS must ultimately conquer America. Incidentally, this kind of total-war strategy explains why counterterrorism experts are rightly concerned about nuclear proliferation. The Islamic State's strategy is also energized by its desire to replace al-Qaeda as Salafi jihadism's global figurehead. While al-Qaeda in the Arabian Peninsula (AQAP) and ISIS had a short flirtation last year, ISIS has now signaled its intent to usurp al-Qaeda's power in its home territory. Attacks by ISIS last week against Shia mosques in the Yemeni capital of Sanaa were, at least in part, designed to suck recruits, financial donors, and prestige away from AQAP. But to truly displace al-Qaeda, ISIS knows it must furnish a new 9/11. ITS CAPABILITIES ARE GROWING. Today, ISIS has thousands of European citizens in its ranks. Educated at the online University of Edward Snowden, ISIS operations officers have cut back intelligence services' ability to monitor and disrupt their communications. With EU intelligence services stretched beyond breaking point, ISIS has the means and confidence to attempt attacks against the West. EU passports are powerful weapons: ISIS could attack, as al-Qaeda has repeatedly, U.S. targets around the world. AN ATTACK ON THE U.S. IS PRICELESS PROPAGANDA. For transnational Salafi jihadists like al-Qaeda and ISIS, a successful blow against the U.S. allows them to claim the mantle of a global force and strengthens the narrative that they're on a holy mission. Holiness is especially important: ISIS knows that to recruit new fanatics and deter its enemies, it must offer an abiding narrative of strength and divine purpose. With the group's leaders styling themselves as Mohammed's heirs, Allah's chosen warriors on earth, attacking the infidel United States would reinforce ISIS's narrative. Of course, attacking America wouldn't actually serve the Islamic State's long-term objectives. Quite the opposite: Any atrocity would fuel a popular American resolve to crush the group with expediency. (Make no mistake, it would be crushed.) The problem, however, is that, until then, America is in the bull's eye.

2NC Cyber - ISIS


ISIS is a threat to the grid
Landsbaum 14
(Mark, 9/5/2014, OC Register, Mark Landsbaum: Attack on power grid could
bring dark days, http://www.ocregister.com/articles/emp-633883-powerattack.html, 7/15/15, SM)
It could be worse. Terrorists pose an imminent threat to the U.S. electrical grid, which could leave the good ol' USA looking like 19th century USA for a lot longer than three days. Don't take my word for it. Ask Peter Pry, former CIA officer and one-time House Armed Services Committee staffer, who served on a congressional commission investigating such eventualities. There is an imminent threat from ISIS to the national electric grid and not just to a single U.S. city, Pry warns. He points to a leaked U.S. Federal Energy Regulatory Commission report in March that said a coordinated terrorist attack on just nine of the nation's 55,000 electrical power substations could cause coast-to-coast blackouts for up to 18 months. Consider what you'll have to worry about then. If you were uncomfortable watching looting and riots on TV last month in Ferguson, Mo., as police stood by, project such unseemly behavior nationwide. For 18 months. It's likely phones won't be reliable, so you won't have to watch police stand idly by. Chances are, police won't show up. Worse, your odds of needing them will be excruciatingly more likely if terrorists attack the power grid using an electromagnetic pulse (EMP) burst of energy to knock out electronic devices. The Congressional EMP Commission, on which I served, did an extensive study of this, Pry says. We discovered to our own revulsion that critical systems in this country are distressingly unprotected. We calculated that, based on current realities, in the first year after a full-scale EMP event, we could expect about two-thirds of the national population – 200 million Americans – to perish from starvation and disease, as well as anarchy in the streets. Skeptical? Consider who is capable of engineering such measures before dismissing the likelihood. In his 2013 book, A Nation Forsaken, Michael Maloof reported that the 2008 EMP Commission considered whether a hostile nation or terrorist group could attack with a high-altitude EMP weapon and determined, any number of adversaries possess both the ballistic missiles and nuclear weapons capabilities, and could attack within 15 years. That was six years ago. North Korea, Pakistan, India, China and Russia are all in the position to launch an EMP attack against the United States now, Maloof wrote last year. Maybe you'll rest more comfortably knowing the House intelligence authorization bill passed in May told the intelligence community to report to Congress within six months, on the threat posed by man-made electromagnetic pulse weapons to United States interests through 2025, including threats from foreign countries and foreign nonstate actors. Or, maybe that's not so comforting. In 2004 and again in 2008, separate congressional commissions gave detailed, horrific reports on such threats. Now, Congress wants another report. In his book, Maloof quotes Clay Wilson of the Congressional Research Service, who said, Several nations, including reported sponsors of terrorism, may currently have a capability to use EMP as a weapon for cyberwarfare or cyberterrorism to disrupt communications and other parts of the U.S. critical infrastructure. What would an EMP attack look like? Within an instant, Maloof writes, we will have no idea what's happening all around us, because we will have no news. There will be no radio, no TV, no cell signal. No newspaper delivered. Products won't flow into the nearby Wal-Mart. The big trucks will be stuck on the interstates. Gas stations won't be able to pump the fuel they do have. Some police officers and firefighters will show up for work, but most will stay home to protect their own families. Power lines will get knocked down in windstorms, but nobody will care. They'll all be fried anyway. Crops will wither in the fields until scavenged since the big picking machines will all be idled, and there will be no way to get the crop to market anyway. Nothing that's been invented in the last 50 years based on computer chips, microelectronics or digital technology will work. And it will get worse.

2NC Links
Backdoors are key to prevent terrorism
Corn 7/13
(Corn, Geoffrey S. * Presidential Research Professor of Law, South Texas College of Law; Lieutenant
Colonel (Retired), U.S. Army Judge Advocate Generals Corps. Prior to joining the faculty at South
Texas, Professor Corn served in a variety of military assignments, including as the Armys Senior Law
of War Advisor, Supervisory Defense Counsel for the Western United States, Chief of International Law
for U.S. Army Europe, and as a Tactical Intelligence Officer in Panama. Averting the Inherent Dangers
of 'Going Dark': Why Congress Must Require a Locked Front Door to Encrypted Data, SSRN. 07-13-2015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes//ghs-kw)

The risks related to going dark are real. When the President of the United States,60 the Prime Minister of the United Kingdom,61 and the Director of the FBI62 all publically express deep concerns about how this phenomenon will endanger their respective nations, it is difficult to ignore. Today, encryption technologies are making it increasingly easy for individual users to prevent even lawful government access to potentially vital information related to crimes or other national security threats. This evolution of individual encryption capabilities represents a fundamental distortion of the balance between government surveillance authority and individual liberty central to the Fourth Amendment. And balance is the operative word. The right of The People to be secure against unreasonable government intrusions into those places and things protected by the Fourth Amendment must be vehemently protected. Reasonable searches, however, should not only be permitted, but they should be mandated where necessary. Congress has the authority to ensure that such searches are possible. While some argue that this could cause American manufacturers to suffer, saddled as they will appear to be by the Snowden Effect, the rules will apply equally to any manufacturer that wishes to do business in the United States. Considering that the United States economy is the largest in the world, it is highly unlikely that foreign manufacturers will forego access to our market in order to avoid having to create CALEA-like solutions to allow for lawful access to encrypted data. Just as foreign cellular telephone providers, such as T-Mobile, are active in the United States, so too will foreign device manufacturers and other communications services adjust their technology to comply with our laws and regulations. This will put American and foreign companies on an equal playing field while encouraging ingenuity and competition. Most importantly, the right of the people to be secure in their persons, houses, papers, and effects will be protected not only against unreasonable searches and seizures, but also against attacks by criminals and terrorists. And is not this, in essence, the primary purpose of government?

Backdoors are key to security - terror turns the case


Goldsmith 13
(Jack Goldsmith. Jack Goldsmith, a contributing editor, teaches at Harvard Law School and is a member
of the Hoover Institution Task Force on National Security and Law. "We Need an Invasive NSA," New
Republic. 10-10-2013. http://www.newrepublic.com/article/115002/invasive-nsa-will-protect-us-cyberattacks//ghs-kw)
Ever since stories about the National Security Agency's (NSA) electronic intelligence-gathering capabilities began tumbling out last June, The New York Times has published more than a dozen editorials excoriating the national surveillance state. It wants the NSA to end the mass warehousing of everyone's data and the use of back doors to break encrypted communications. A major element of the Times critique is that the NSA's domestic sweeps are not justified by the terrorist threat they aim to prevent. At the end of August, in the midst of the Times assault on the NSA, the newspaper suffered what it described as a malicious external attack on its domain name registrar at the hands of the Syrian Electronic Army, a group of hackers who support Syrian President Bashar Al Assad. The paper's website was down for several hours and, for some people, much longer. In terms of the sophistication of the attack, this is a big deal, said Marc Frons, the Times chief information officer. Ten months earlier, hackers stole the corporate passwords for every employee at the Times, accessed the computers of 53 employees, and breached the e-mail accounts of two reporters who cover China. We brought in the FBI, and the FBI said this had all the hallmarks of hacking by the Chinese military, Frons said at the time. He also acknowledged that the hackers were in the Times system on election night in 2012 and could have wreaked havoc on its coverage if they wanted. Such cyber-intrusions threaten corporate America and the U.S. government every day. Relentless assaults on America's computer networks by China and other foreign governments, hackers and criminals have created an urgent need for safeguards to protect these vital systems, the Times editorial page noted last year while supporting legislation encouraging the private sector to share cybersecurity information with the government. It cited General Keith Alexander, the director of the NSA, who had noted a 17-fold increase in cyber-intrusions on critical infrastructure from 2009 to 2011 and who described the losses in the United States from cyber-theft as the greatest transfer of wealth in history. If a catastrophic cyber-attack occurs, the Times concluded, Americans will be justified in asking why their lawmakers ... failed to protect them. When catastrophe strikes, the public will adjust its tolerance for intrusive government measures. The Times editorial board is quite right about the seriousness of the cyber-threat and the federal government's responsibility to redress it. What it does not appear to realize is the connection between the domestic NSA surveillance it detests and the governmental assistance with cybersecurity it cherishes. To keep our computer and telecommunication networks secure, the government will eventually need to monitor and collect intelligence on those networks using techniques similar to ones the Times and many others find reprehensible when done for counterterrorism ends. The fate of domestic surveillance is today being fought around the topic of whether it is needed to stop Al Qaeda from blowing things up. But the fight tomorrow, and the more important fight, will be about whether it is necessary to protect our ways of life embedded in computer networks. Anyone anywhere with a connection to the Internet can engage in cyber-operations within the United States. Most truly harmful cyber-operations, however, require group effort and significant skill. The attacking group or nation must have clever hackers, significant computing power, and the sophisticated software known as malware that enables the monitoring, exfiltration, or destruction of information inside a computer. The supply of all of these resources has been growing fast for many years – in governmental labs devoted to developing these tools and on sprawling black markets on the Internet. Telecommunication networks are the channels through which malware typically travels, often anonymized or encrypted, and buried in the billions of communications that traverse the globe each day. The targets are the communications networks themselves as well as the computers they connect – things like the Times servers, the computer systems that monitor nuclear plants, classified documents on computers in the Pentagon, the nasdaq exchange, your local bank, and your social-network providers. To keep these computers and networks secure, the government needs powerful intelligence capabilities abroad so that it can learn about planned cyber-intrusions. It also needs to raise defenses at home. An important first step is to correct the market failures that plague cybersecurity. Through law or regulation, the government must improve incentives for individuals to use security software, for private firms to harden their defenses and share information with one another, and for Internet service providers to crack down on the botnets – networks of compromised zombie computers – that underlie many cyber-attacks. More, too, must be done to prevent insider threats like Edward Snowden's, and to control the stealth introduction of vulnerabilities during the manufacture of computer components – vulnerabilities that can later be used as windows for cyber-attacks. And yet that's still not enough. The U.S. government can fully monitor air, space, and sea for potential attacks from abroad. But it has limited access to the channels of cyber-attack and cyber-theft, because they are owned by private telecommunication firms, and because Congress strictly limits government access to private communications. I can't defend the country until I'm into all the networks, General Alexander reportedly told senior government officials a few months ago. For Alexander, being in the network means having government computers scan the content and metadata of Internet communications in the United States and store some of these communications for extended periods. Such access, he thinks, will give the government a fighting chance to find the needle of known malware in the haystack of communications so that it can block or degrade the attack or exploitation. It will also allow it to discern patterns of malicious activity in the swarm of communications, even when it doesn't possess the malware's signature. And it will better enable the government to trace back an attack's trajectory so that it can discover the identity and geographical origin of the threat. Alexander's domestic cybersecurity plans look like pumped-up versions of the NSA's counterterrorism-related homeland surveillance that has sparked so much controversy in recent months. That is why so many people in Washington think that Alexander's vision has virtually no chance of moving forward, as the Times recently reported. Whatever trust was there is now gone, a senior intelligence official told the Times. There are two reasons to think that these predictions are wrong and that the government, with extensive assistance from the NSA, will one day intimately monitor private networks. The first is that the cybersecurity threat is more pervasive and severe than the terrorism threat and is somewhat easier to see. If the Times website goes down a few more times and for longer periods, and if the next penetration of its computer systems causes large intellectual property losses or a compromise in its reporting, even the editorial page would rethink the proper balance of privacy and security. The point generalizes: As cyber-theft and cyber-attacks continue to spread (and they will), and especially when they result in a catastrophic disaster (like a banking compromise that destroys market confidence, or a successful attack on an electrical grid), the public will demand government action to remedy the problem and will adjust its tolerance for intrusive government measures. At that point, the nation's willingness to adopt some version of Alexander's vision will depend on the possibility of credible restraints on the NSA's activities and credible ways for the public to monitor, debate, and approve what the NSA is doing over time. Which leads to the second reason why skeptics about enhanced government involvement in the network might be wrong. The public mistrusts the NSA not just because of what it does, but also because of its extraordinary secrecy. To obtain the credibility it needs to secure permission from the American people to protect our networks, the NSA and the intelligence community must fundamentally recalibrate their attitude toward disclosure and scrutiny. There are signs that this is happening and that, despite the undoubted damage he inflicted on our national security in other respects, we have Edward Snowden to thank. Before the unauthorized disclosures, we were always conservative about discussing specifics of our collection programs, based on the truism that the more adversaries know about what we're doing, the more they can avoid our surveillance, testified Director of National Intelligence James Clapper last month. But the disclosures, for better or worse, have lowered the threshold for discussing these matters in public. In the last few weeks, the NSA has done the unthinkable in releasing dozens of documents that implicitly confirm general elements of its collection capabilities. These revelations are bewildering to most people in the intelligence community and no doubt hurt some elements of collection. But they are justified by the countervailing need for public debate about, and public confidence in, NSA activities that had run ahead of what the public expected. And they suggest that secrecy about collection capacities is one value, but not the only or even the most important one. They also show that not all revelations of NSA capabilities are equally harmful. Disclosure that it sweeps up metadata is less damaging to its mission than disclosure of the fine-grained details about how it collects and analyzes that metadata.

2NC Turns Backdoors


Cyberattacks turn the case - public pressures for backdoors
Goldsmith 13
(Jack Goldsmith. Jack Goldsmith, a contributing editor, teaches at Harvard Law School and is a member
of the Hoover Institution Task Force on National Security and Law. "We Need an Invasive NSA," New
Republic. 10-10-2013. http://www.newrepublic.com/article/115002/invasive-nsa-will-protect-us-cyberattacks//ghs-kw)
There are two reasons to think that these predictions are wrong and that the government, with extensive assistance from the NSA, will one day intimately monitor private networks. The first is that the cybersecurity threat is more pervasive and severe than the terrorism threat and is somewhat easier to see. If the Times website goes down a few more times and for longer periods, and if the next penetration of its computer systems causes large intellectual property losses or a compromise in its reporting, even the editorial page would rethink the proper balance of privacy and security. The point generalizes: As cyber-theft and cyber-attacks continue to spread (and they will), and especially when they result in a catastrophic disaster (like a banking compromise that destroys market confidence, or a successful attack on an electrical grid), the public will demand government action to remedy the problem and will adjust its tolerance for intrusive government measures.

Ptix

1NC
Backdoors are popular now - national security concerns
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark: Part I,"
Lawfare. 7-23-2015. http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-i//ghs-kw)
In other words, I think Comey and Yates inevitably are asking for legislation, at least in the longer term. The administration has decided not to seek it now, so the conversation is taking place at a somewhat higher level of abstraction than it would if there were a specific legislative proposal on the table. But the current discussion should be understood as an effort to begin building a legislative coalition for some sort of mandate that internet platform companies retain (or build) the ability to permit, with appropriate legal process, the capture and delivery to law enforcement and intelligence authorities of decrypted versions of the signals they carry. This coalition does not exist yet, particularly not in the House of Representatives. But yesterday's hearings were striking in showing how successful Comey has been in the early phases of building it. A lot of members are clearly concerned already. That concern will likely grow if Comey is correct about the speed with which major investigative tools are weakening in their utility. And it could become a powerful force in the event an attack swings the pendulum away from civil libertarian orthodoxy.

2NC
(KQ) 1AC Macri 14 evidence magnifies the link to politics: The U.S. Senate voted down consideration of a bill on Tuesday that would have reigned in the NSA's powers to conduct domestic surveillance, upping the legal hurdles for certain types of spying Rogers repeated Thursday he was largely uninterested in.
Even if backdoors are unpopular now, that will inevitably change
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-andgoing-dark-part-ii-debate-merits//ghs-kw)

There's a final, non-legal factor that may push companies to work this problem as energetically as they are now moving toward end-to-end encryption: politics. We are at a very particular moment in the cryptography debate, a moment in which law enforcement sees a major problem as having arrived but the tech companies see that problem as part of the solution to the problems the Snowden revelations created for them. That is, we have an end-to-end encryption issue, in significant part, because companies are trying to assure customers worldwide that they have their backs privacy-wise and are not simply tools of NSA. I think those politics are likely to change. If Comey is right and we start seeing law enforcement and intelligence agencies blind in investigating and preventing horrible crimes and significant threats, the pressure on the companies is going to shift. And it may shift fast and hard. Whereas the companies now feel intense pressure to assure customers that their data is safe from NSA, the kidnapped kid with the encrypted iPhone is going to generate a very different sort of political response. In extraordinary circumstances, extraordinary access may well seem reasonable. And people will wonder why it doesn't exist.

Military DA

1NC
Cyber-deterrence is strong now but keeping our capabilities in
line with other powers is key to maintain stability
Healey 14
(Healey, Jason. Jason Healey is a Nonresident Senior Fellow for the Cyber Statecraft Initiative of the
Atlantic Council and Senior Research Scholar at Columbia University's School of International and
Public Affairs, focusing on international cooperation, competition, and conflict in cyberspace. From
2011 to 2015, he worked as the Director of the Council's Cyber Statecraft Initiative. Starting his career
in the United States Air Force, Mr. Healey earned two Meritorious Service Medals for his early work in
cyber operations at Headquarters Air Force at the Pentagon and as a plankholder (founding member)
of the Joint Task Force Computer Network Defense, the world's first joint cyber warfighting unit. He
has degrees from the United States Air Force Academy (political science), Johns Hopkins University
(liberal arts), and James Madison University (information security). "Commentary: Cyber Deterrence Is
Working," Defense News. 7-30-2014.
http://archive.defensenews.com/article/20140730/DEFFEAT05/307300017/Commentary-CyberDeterrence-Working//ghs-kw)

Despite the mainstream view of cyberwar professionals and theorists, cyber deterrence is not only possible but has been working for decades. Cyberwar professionals are in the midst of a decades-old debate on how America could deter adversaries from attacking us in cyberspace. In 2010, then-Deputy Defense Secretary Bill Lynn summed up the prevailing view that Cold War deterrence models do not apply to cyberspace because of low barriers to entry and the anonymity of Internet attacks. Cyber attacks, unlike intercontinental missiles, don't have a return address. But this view is too narrow and technical. The history of how nations have actually fought (or not fought) conflicts in cyberspace makes it clear deterrence is not only theoretically possible, but is actually keeping an upper threshold to cyber hostilities. The hidden hand of deterrence is most obvious in the discussion of a digital Pearl Harbor. In 2012, then-Defense Secretary Leon Panetta described his worries of such a bolt-from-the-blue attack that could cripple the United States or its military. Though his phrase raised eyebrows among cyber professionals, there was broad agreement with the basic implication: The United States is strategically vulnerable and potential adversaries have both the means for strategic attack and the will to do it. But worrying about a digital Pearl Harbor actually dates not to 2012 but to testimony by Winn Schwartau to Congress in 1991. So cyber experts have been handwringing about a digital Pearl Harbor for more than 20 of the 70 years since the actual Pearl Harbor. Waiting for Blow To Come? Clearly there is a different dynamic than recognized by conventional wisdom. For over two decades, the United States has had its throat bared to the cyber capabilities of potential adversaries (and presumably their throats are as bared to our capabilities), yet the blow has never come. There is no solid evidence anyone has ever been killed by any cyber attack; no massive power outages, no disruptions of hospitals or faking of hospital records, no tampering of dams causing a catastrophic flood. The Internet is a fierce domain and conflicts are common between nations. But deterrence or at least restraint has kept a lid on the worst. Consider: Large nations have never launched strategically significant disruptive cyber attacks against other large nations. China, Russia and the United States seem to have plans to do so not as surprise attacks from a clear sky, but as part of a major (perhaps even existential) international security crisis not unlike the original Pearl Harbor. Cyber attacks between equals have always stayed below the threshold of death and destruction. Larger nations do seem to be willing to launch significant cyber assaults against rivals but only during larger crises and below the threshold of death and destruction, such as Russian attacks against Estonia and Georgia or China egging on patriotic hackers to disrupt computers in dust-ups with Japan, Vietnam or the Philippines. The United States and Israel have perhaps come closest to the threshold with the Stuxnet attacks but even here, the attacks were against a very limited target (Iranian programs to enrich uranium) and hardly out of the blue. Nations seem almost completely unrestrained using cyber espionage to further their security (and sometimes commercial) objectives and only slightly more restrained using low levels of cyber force for small-scale disruption, such as Chinese or Russian disruption of dissidents' websites or British disruption of chat rooms used by Anonymous to coordinate protest attacks. In a discussion about any other kind of military power, such as nuclear weapons, we would have no problem using the word deterrence to describe nations' reluctance to unleash capabilities against one another. Indeed, a comparison with nuclear deterrence is extremely relevant, but not necessarily the one that Cold Warriors have recognized. Setting a Ceiling. Nuclear weapons did not make all wars unthinkable, as some early postwar thinkers had hoped. Instead, they provided a ceiling under which the superpowers fought all kinds of wars, regular and irregular. The United States and Soviet Union, and their allies and proxies, engaged in lethal, intense conflicts from Korea to Vietnam and through proxies in Africa, Asia and Latin America. Nuclear warheads did not stop these wars, but did set an upper threshold neither side proved willing to exceed. Likewise, the most cyber capable nations (including America, China and Russia) have been more than willing to engage in irregular cyber conflicts, but have stayed well under the threshold of strategic cyber warfare, creating a de facto norm. Nations have proved just as unwilling to launch a strategic attack in cyberspace as they are in the air, land, sea or space. The new norm is same as the old norm. This norm of strategic restraint is a blessing but still is no help to deter cyber crime or the irregular conflicts that have long occurred under the threshold. Cyber espionage and lesser state-sponsored cyber disruption seem to be increasing markedly in the last few years.

Backdoors are key to cyberoffensive capabilities


Schneier 13
(Schneier. Schneier is a fellow at the Berkman Center for Internet & Society at Harvard Law School and
a program fellow at the New America Foundation's Open Technology. He is an American cryptographer,
computer security and privacy specialist, and writer. He is the author of several books on general
security topics, computer security and cryptography. He is also a contributing writer for The Guardian
news organization. "US Offensive Cyberwar Policy." 06-21-2013.
https://www.schneier.com/blog/archives/2013/06/us_offensive_cy.html//ghs-kw)

Cyberattacks have the potential to be both immediate and devastating. They can disrupt communications systems, disable national infrastructure, or, as in the case of Stuxnet, destroy nuclear reactors; but only if they've been created and targeted beforehand. Before launching cyberattacks against another country, we have to go through several steps. We have to study the details of the computer systems they're running and determine the vulnerabilities of those systems. If we can't find exploitable vulnerabilities, we need to create them: leaving "back doors," in hacker speak. Then we have to build new cyberweapons designed specifically to attack those systems. Sometimes we have to embed the hostile code in those networks -- these are called "logic bombs" -- to be unleashed in the future. And we have to keep penetrating those foreign networks, because computer systems always change and we need to ensure that the cyberweapons are still effective. Like our nuclear arsenal during the Cold War, our cyberweapons arsenal must be pre-targeted and ready to launch. That's what Obama directed the US Cyber Command to do. We can see glimpses of how effective we are in Snowden's allegations that the NSA is currently penetrating foreign networks around the world: "We hack network backbones -- like huge Internet routers, basically -- that give us access to the communications of hundreds of thousands of computers without having to hack every single one."

Loss of cyber-offensive capabilities incentivizes China to take Taiwan - turns heg and the economy
Hjortdal 11
(Magnus Hjortdal received his BSc and MSc in Political Science, with a specialization in IR, from the
University of Copenhagen. He was an Assistant Lecturer at the University of Copenhagen, a Research
Fellow at the Royal Danish Defence College, and is now the Head of the Ministry of Foreign Affairs in
Denmark. China's Use of Cyber Warfare: Espionage Meets Strategic Deterrence , Journal of Strategic
Security, Vol. 4 No. 2, Summer 2011: Strategic Security in the Cyber Age, Article 2, pp 1-24.
http://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1101&context=jss//ghs-kw)

China's military strategy mentions cyber capabilities as an area that the People's Liberation Army (PLA) should invest in and use on a large scale.13 The U.S. Secretary of Defense, Robert Gates, has also declared that China's development in the cyber area increasingly concerns him,14 and that there has been a decade-long trend of cyber attacks emanating from China.15 Virtually all digital and electronic military systems can be attacked via cyberspace. Therefore, it is essential for a state to develop capabilities in this area if it wishes to challenge the present American hegemony. The interesting question then is whether China is developing capabilities in cyberspace in order to deter the United States.16 China's military strategists describe cyber capabilities as a powerful asymmetric opportunity in a deterrence strategy.19 Analysts consider that an "important theme in Chinese writings on computer-network operations (CNO) is the use of computer-network attack (CNA) as the spearpoint of deterrence."20 CNA increases the enemy's costs to become too great to engage in warfare in the first place, which Chinese analysts judge to be essential for deterrence.21 This could, for example, leave China with the potential ability to deter the United States from intervening in a scenario concerning Taiwan. CNO is viewed as a focal point for the People's Liberation Army, but it is not clear how the actual capacity functions or precisely what conditions it works under.22 If a state with superpower potential (here China) is to create an opportunity to ascend militarily and politically in the international system, it would require an asymmetric deterrence capability such as that described here.23 It is said that the "most significant computer network attack is characterized as a pre-emption weapon to be used under the rubric of the rising Chinese strategy of [...] gaining mastery before the enemy has struck."24 Therefore, China, like other states seeking a similar capacity, has recruited massively within the hacker milieu inside China.25 Increasing resources in the PLA are being allocated to develop assets in relation to cyberspace.26 The improvements are visible: The PLA has established "information warfare" capabilities,27 with a special focus on cyber warfare that, according to their doctrine, can be used in peacetime.28 Strategists from the PLA advocate the use of virus and hacker attacks that can paralyze and surprise its enemies.29 Aggressive and Widespread Cyber Attacks from China and the International Response: China's use of asymmetric capabilities, especially cyber warfare, could pose a serious threat to the American economy.30 Research and development in cyber espionage figure prominently in the 12th Five-Year Plan (2011-2015) that is being drafted by both the Chinese central government and the PLA.31 Analysts say that China could well have the most extensive and aggressive cyber warfare capability in the world, and that this is being driven by China's desire for "global-power status."32 These observations do not come out of the blue, but are a consequence of the fact that authoritative Chinese writings on the subject present cyber warfare as an obvious asymmetric instrument for balancing overwhelming (mainly U.S.) power, especially in case of open conflict, but also as a deterrent.33

Escalates to nuclear war and turns the economy


Landay 2k
(Jonathan S. Landay, National Security and Intelligence Correspondent, 2K, Top Administration Officials Warn Stakes for U.S. Are High in Asian Conflicts, Knight Ridder/Tribune News Service, March 10, p. Lexis //ghs-kw)

Few if any experts think China and Taiwan, North Korea and South Korea, or India and Pakistan are spoiling to fight. But even a minor miscalculation by any of them could destabilize Asia, jolt the global economy and even start a nuclear war. India, Pakistan and China all have nuclear weapons, and North Korea may have a few, too. Asia lacks the kinds of organizations, negotiations and diplomatic relationships that helped keep an uneasy peace for five decades in Cold War Europe. Nowhere else on Earth are the stakes as high and relationships so fragile, said Bates Gill, director of northeast Asian policy studies at the Brookings Institution, a Washington think tank. We see the convergence of great power interest overlaid with lingering confrontations with no institutionalized security mechanism in place. There are elements for potential disaster. In an effort to cool the region's tempers, President Clinton, Defense Secretary William S. Cohen and National Security Adviser Samuel R. Berger all will hopscotch Asia's capitals this month. For America, the stakes could hardly be higher. There are 100,000 U.S. troops in Asia committed to defending Taiwan, Japan and South Korea, and the United States would instantly become embroiled if Beijing moved against Taiwan or North Korea attacked South Korea. While Washington has no defense commitments to either India or Pakistan, a conflict between the two could end the global taboo against using nuclear weapons and demolish the already shaky international nonproliferation regime. In addition, globalization has made a stable Asia – with its massive markets, cheap labor, exports and resources – indispensable to the U.S. economy. Numerous U.S. firms and millions of American jobs depend on trade with Asia that totaled $600 billion last year, according to the Commerce Department.

2NC UQ
Cyber-capabilities strong now but it's close
NBC 13
(NBC citing Scott Borg, CEO of the US Cyber Consequences Unit, and independent, non-profit research
institute. Borg has lectured at Harvard, Yale, Columbia, London, and other leading universities.
"Expert: US in cyberwar arms race with China, Russia," NBC News. 02-20-2013.
http://investigations.nbcnews.com/_news/2013/02/20/17022378-expert-us-in-cyberwar-arms-race-withchina-russia//ghs-kw)

The United States is locked in a tight race with China and Russia to build destructive cyberweapons capable of seriously damaging other nations' critical infrastructure, according to a leading expert on hostilities waged via the Internet. Scott Borg, CEO of the U.S. Cyber Consequences Unit, a nonprofit institute that advises the U.S. government and businesses on cybersecurity, said all three nations have built arsenals of sophisticated computer viruses, worms, Trojan horses and other tools that place them atop the rest of the world in the ability to inflict serious damage on one another, or lesser powers. Ranked just below the Big Three, he said, are four U.S. allies: Great Britain, Germany, Israel and perhaps Taiwan. But in testament to the uncertain risk/reward ratio in cyberwarfare, Iran has used attacks on its nuclear program to bolster its offensive capabilities and is now developing its own "cyberarmy," Borg said. Borg offered his assessment of the current state of cyberwar capabilities Tuesday in the wake of a report by the American computer security company Mandiant linking hacking attacks and cyber espionage against the U.S. to a sophisticated Chinese group known as People's Liberation Army Unit 61398. In today's brave new interconnected world, hackers who can defeat security defenses are capable of disrupting an array of critical services, including delivery of water, electricity and heat, or bringing transportation to a grinding halt. U.S. senators last year received a closed-door briefing at which experts demonstrated how a power company employee could take down the New York City electrical grid by clicking on a single email attachment, the New York Times reported. U.S. officials rarely discuss offensive capability when discussing cyberwar, though several privately told NBC News recently that the U.S. could "shut down" the electrical grid of a smaller nation – Iran, for example – if it chose to do so. Borg echoed that assessment, saying the U.S. cyberwarriors, who work within the National Security Agency, are very good across the board. There is a formidable capability. Stuxnet and Flame (malware used to disrupt and gather intelligence on Iran's nuclear program) are demonstrations of that, he said. (The U.S.) could shut down most critical infrastructure in potential adversaries relatively quickly.

Cyber-deterrence works now


Healey 14
(Healey, Jason. Jason Healey is a Nonresident Senior Fellow for the Cyber Statecraft Initiative of the
Atlantic Council and Senior Research Scholar at Columbia University's School of International and
Public Affairs, focusing on international cooperation, competition, and conflict in cyberspace. From
2011 to 2015, he worked as the Director of the Council's Cyber Statecraft Initiative. Starting his career
in the United States Air Force, Mr. Healey earned two Meritorious Service Medals for his early work in
cyber operations at Headquarters Air Force at the Pentagon and as a plankholder (founding member)
of the Joint Task Force Computer Network Defense, the world's first joint cyber warfighting unit. He
has degrees from the United States Air Force Academy (political science), Johns Hopkins University
(liberal arts), and James Madison University (information security). "Commentary: Cyber Deterrence Is
Working," Defense News. 7-30-2014.
http://archive.defensenews.com/article/20140730/DEFFEAT05/307300017/Commentary-CyberDeterrence-Working//ghs-kw)

Nations have been unwilling to take advantage of each other's vulnerable infrastructures perhaps because, as Joe Nye notes in his book, The Future of Power, interstate deterrence through entanglement and denial still exist for cyber conflicts. The most capable cyber nations rely heavily on the same Internet infrastructure and global standards (though using significant local infrastructure), so attacks above a certain threshold are not obviously in any nation's self-interest. In addition, both deterrence by denial and deterrence by punishment are in force. Despite their vulnerabilities, nations may still be able to mount effective-enough defenses to deny any benefits to the adversary. Taking down a cyber target is spectacularly easy and well within the capability of the proverbial two-teenagers-in-a-basement. But keeping a target down over time in the face of determined defenses is very hard, demanding intelligence, battle damage assessment and the ability to keep restriking targets over time. These capabilities are still largely the province of the great cyber powers, meaning it can be trivially easy to determine the likely attacker. During all of the most disruptive cyber conflicts (such as Estonia, Georgia or Stuxnet) there was quick consensus on the obvious choice of which nation or nations were behind the assault. If any of those attacks had caused large numbers of deaths or truly strategic disruption, hiding behind Internet anonymity (It wasn't us and you can't prove otherwise) would ring flat and invite a retaliatory strike.

2NC Link - Backdoors


Backdoors and surveillance are key to winning the cyber arms
race
Spiegel 15
(Spiegel Online, Hamburg, Germany. "The Digital Arms Race: NSA Preps America for Future Battle,"
SPIEGEL ONLINE. 1-17-2015. http://www.spiegel.de/international/world/new-snowden-docs-indicatescope-of-nsa-preparations-for-cyber-battle-a-1013409.html//ghs-kw)
Potential interns are also told that research into third party computers might include plans to "remotely degrade or destroy opponent computers, routers, servers and network enabled devices by attacking the hardware." Using a program called Passionatepolka, for example, they may be asked to "remotely brick network cards." With programs like Berserkr they would implant "persistent backdoors" and "parasitic drivers". Using another piece of software called Barnfire, they would "erase the BIOS on a brand of servers that act as a backbone to many rival governments." An intern's tasks might also include remotely destroying the functionality of hard drives. Ultimately, the goal of the internship program was "developing an attacker's mindset." The internship listing is eight years old, but the attacker's mindset has since become a kind of doctrine for the NSA's data spies. And the intelligence service isn't just trying to achieve mass surveillance of Internet communication, either. The digital spies of the Five Eyes alliance -- comprised of the United States, Britain, Canada, Australia and New Zealand -- want more. The Birth of D Weapons. According to top secret documents from the archive of NSA whistleblower Edward Snowden seen exclusively by SPIEGEL, they are planning for wars of the future in which the Internet will play a critical role, with the aim of being able to use the net to paralyze computer networks and, by doing so, potentially all the infrastructure they control, including power and water supplies, factories, airports or the flow of money. During the 20th century, scientists developed so-called ABC weapons -- atomic, biological and chemical. It took decades before their deployment could be regulated and, at least partly, outlawed. New digital weapons have now been developed for the war on the Internet. But there are almost no international conventions or supervisory authorities for these D weapons, and the only law that applies is the survival of the fittest. Canadian media theorist Marshall McLuhan foresaw these developments decades ago. In 1970, he wrote, "World War III is a guerrilla information war with no division between military and civilian participation." That's precisely the reality that spies are preparing for today. The US Army, Navy, Marines and Air Force have already established their own cyber forces, but it is the NSA, also officially a military agency, that is taking the lead. It's no coincidence that the director of the NSA also serves as the head of the US Cyber Command. The country's leading data spy, Admiral Michael Rogers, is also its chief cyber warrior and his close to 40,000 employees are responsible for both digital spying and destructive network attacks. Surveillance only 'Phase 0'. From a military perspective, surveillance of the Internet is merely "Phase 0" in the US digital war strategy. Internal NSA documents indicate that it is the prerequisite for everything that follows. They show that the aim of the surveillance is to detect vulnerabilities in enemy systems. Once "stealthy implants" have been placed to infiltrate enemy systems, thus allowing "permanent accesses," then Phase Three has been achieved -- a phase headed by the word "dominate" in the documents. This enables them to "control/destroy critical systems & networks at will through pre-positioned accesses (laid in Phase 0)." Critical infrastructure is considered by the agency to be anything that is important in keeping a society running: energy, communications and transportation. The internal documents state that the ultimate goal is "real time controlled escalation". One NSA presentation proclaims that "the next major conflict will start in cyberspace." To that end, the US government is currently undertaking a massive effort to digitally arm itself for network warfare. For the 2013 secret intelligence budget, the NSA projected it would need around $1 billion in order to increase the strength of its computer network attack operations. The budget included an increase of some $32 million for "unconventional solutions" alone.

Back doors are key to cyber-warfare


Gellman and Nakashima 13
(Barton Gellman. Barton Gellman writes for the national staff. He has contributed to three Pulitzer
Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service. He is also a
senior fellow at the Century Foundation and visiting lecturer at Princeton's Woodrow Wilson School.

After 21 years at The Post, where he served tours as legal, military, diplomatic, and Middle East
correspondent, Gellman resigned in 2010 to concentrate on book and magazine writing. He returned
on temporary assignment in 2013 and 2014 to anchor The Post's coverage of the NSA disclosures after
receiving an archive of classified documents from Edward Snowden. Ellen Nakashima is a national
security reporter for The Washington Post. She focuses on issues relating to intelligence, technology
and civil liberties. She previously served as a Southeast Asia correspondent for the paper. She wrote
about the presidential candidacy of Al Gore and co-authored a biography of Gore, and has also covered
federal agencies, Virginia state politics and local affairs. She joined the Post in 1995. "U.S. spy
agencies mounted 231 offensive cyber-operations in 2011, documents show," Washington Post. 8-302013. https://www.washingtonpost.com/world/national-security/us-spy-agencies-mounted-231offensive-cyber-operations-in-2011-documents-show/2013/08/30/d090a6ae-119e-11e3-b4cbfd7ce041d814_story.html//ghs-kw)

"The policy debate has moved so that offensive options are more prominent now," said former deputy defense secretary William J. Lynn III, who has not seen the budget document and was speaking generally. "I think there's more of a case made now that offensive cyberoptions can be an important element in deterring certain adversaries." Of the 231 offensive operations conducted in 2011, the budget said, nearly three-quarters were against top-priority targets, which former officials say includes adversaries such as Iran, Russia, China and North Korea and activities such as nuclear proliferation. The document provided few other details about the operations. Stuxnet, a computer worm reportedly developed by the United States and Israel that destroyed Iranian nuclear centrifuges in attacks in 2009 and 2010, is often cited as the most dramatic use of a cyberweapon. Experts said no other known cyberattacks carried out by the United States match the physical damage inflicted in that case. U.S. agencies define offensive cyber-operations as activities intended "to manipulate, disrupt, deny, degrade, or destroy information resident in computers or computer networks, or the computers and networks themselves," according to a presidential directive issued in October 2012. Most offensive operations have immediate effects only on data or the proper functioning of an adversary's machine: slowing its network connection, filling its screen with static or scrambling the results of basic calculations. Any of those could have powerful effects if they caused an adversary to botch the timing of an attack, lose control of a computer or miscalculate locations. U.S. intelligence services are making routine use around the world of government-built malware that differs little in function from the "advanced persistent threats" that U.S. officials attribute to China. The principal difference, U.S. officials told The Post, is that China steals U.S. corporate secrets for financial gain. "The Department of Defense does engage in computer network exploitation," according to an e-mailed statement from an NSA spokesman, whose agency is part of the Defense Department. "The department does ***not*** engage in economic espionage in any domain, including cyber."

Millions of implants

The administration's cyber-operations sometimes involve what one budget document calls "field operations" abroad, commonly with the help of CIA operatives or clandestine military forces, to physically place hardware implants or software modifications. Much more often, an implant is coded entirely in software by an NSA group called Tailored Access Operations (TAO). As its name suggests, TAO builds attack tools that are custom-fitted to their targets. The NSA unit's software engineers would rather tap into networks than individual computers because there are usually many devices on each network. Tailored Access Operations has software templates to break into common brands and models of "routers, switches and firewalls from multiple product vendor lines," according to one document describing its work. The implants that TAO creates are intended to persist through software and equipment upgrades, to copy stored data, "harvest" communications and tunnel into other connected networks. This year TAO is working on implants that "can identify select voice conversations of interest within a target network and exfiltrate select cuts," or excerpts, according to one budget document. In some cases, a single compromised device opens the door to hundreds or thousands of others. Sometimes an implant's purpose is to create a back door for future access. "You pry open the window somewhere and leave it so when you come back the owner doesn't know it's unlocked, but you can get back in when you want to," said one intelligence official, who was speaking generally about the topic and was not privy to the budget. The official spoke on the condition of anonymity to discuss sensitive technology. Under U.S. cyberdoctrine, these operations are known as "exploitation," not "attack," but they are essential precursors both to attack and defense. By the end of this year, GENIE is projected to control at least 85,000 implants in strategically chosen machines around the world. That is quadruple the number, 21,252, available in 2008, according to the U.S. intelligence budget. The NSA appears to be planning a rapid expansion of those numbers, which were limited until recently by the need for human operators to take remote control of compromised machines. Even with a staff of 1,870 people, GENIE made full use of only 8,448 of the 68,975 machines with active implants in 2011. For GENIE's next phase, according to an authoritative reference document, the NSA has brought online an automated system, code-named TURBINE, that is capable of managing "potentially millions of implants" for intelligence gathering "and active attack."

The ROC

When it comes time to fight the cyberwar against the best of the NSA's global competitors, the TAO calls in its elite operators, who work at the agency's Fort Meade headquarters and in regional operations centers in Georgia, Texas, Colorado and Hawaii. The NSA's organizational chart has the main office as S321. Nearly everyone calls it "the ROC," pronounced "rock": the Remote Operations Center. "To the NSA as a whole, the ROC is where the hackers live," said a former operator from another section who has worked closely with the exploitation teams. "It's basically the one-stop shop for any kind of active operation that's not defensive." Once the hackers find a hole in an adversary's defense, "[t]argeted systems are compromised electronically, typically providing access to system functions as well as data. System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals," according to a 570-page budget blueprint for what the government calls its Consolidated Cryptologic Program, which includes the NSA. Teams from the FBI, the CIA and U.S. Cyber Command work alongside the ROC, with overlapping missions and legal authorities. So do the operators from the NSA's National Threat Operations Center, whose mission is focused primarily on cyberdefense. That was Snowden's job as a Booz Allen Hamilton contractor, and it required him to learn the NSA's best hacking techniques. According to one key document, the ROC teams give Cyber Command "specific target related technical and operational material (identification/recognition), tools and techniques that allow the employment of U.S. national and tactical specific computer network attack mechanisms." The intelligence community's cybermissions include defense of military and other classified computer networks against foreign attack, a task that absorbs roughly one-third of a total cyber operations budget of $1.02 billion in fiscal 2013, according to the Cryptologic Program budget. The ROC's breaking-and-entering mission, supported by the GENIE infrastructure, spends nearly twice as much: $651.7 million. Most GENIE operations aim for "exploitation" of foreign systems, a term defined in the intelligence budget summary as "surreptitious virtual or physical access to create and sustain a presence inside targeted systems or facilities." The document adds: "System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals." The NSA designs most of its own implants, but it devoted $25.1 million this year to additional covert purchases of software vulnerabilities from private malware vendors, a growing gray-market industry based largely in Europe.

2NC Link - Exports


Backdoors are inserted in US products and exported globally
Schneier indicates backdoors in networks are key to cyber-operations
Greenwald 14
(Glenn Greenwald. Glenn Greenwald is an ex-constitutional lawyer and a contributor for the Guardian,
NYT, LAT, and The Intercept. He received his BA from George Washington University and a JD from
NYU. "Glenn Greenwald: how the NSA tampers with US-made internet routers," Guardian. 5-12-2014.
http://www.theguardian.com/books/2014/may/12/glenn-greenwald-nsa-tampers-us-internet-routerssnowden//ghs-kw)
But while American companies were being warned away from supposedly untrustworthy Chinese routers, foreign organisations would have been well advised to beware of American-made ones. A June 2010 report from the head of the NSA's Access and Target Development department is shockingly explicit. The NSA routinely receives or intercepts routers, servers and other computer network devices being exported from the US before they are delivered to the international customers. The agency then implants backdoor surveillance tools, repackages the devices with a factory seal and sends them on. The NSA thus gains access to entire networks and all their users. The document gleefully observes that some "SIGINT tradecraft is very hands-on (literally!)". Eventually, the implanted device connects back to the NSA. The report continues: "In one recent case, after several months a beacon implanted through supply-chain interdiction called back to the NSA covert infrastructure. This call back provided us access to further exploit the device and survey the network." It is quite possible that Chinese firms are implanting surveillance mechanisms in their network devices. But the US is certainly doing the same.

Routers are key - gives us access to thousands of connected devices
Gellman and Nakashima 13
(Barton Gellman. Barton Gellman writes for the national staff. He has contributed to three Pulitzer
Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service. He is also a
senior fellow at the Century Foundation and visiting lecturer at Princeton's Woodrow Wilson School.
After 21 years at The Post, where he served tours as legal, military, diplomatic, and Middle East
correspondent, Gellman resigned in 2010 to concentrate on book and magazine writing. He returned
on temporary assignment in 2013 and 2014 to anchor The Post's coverage of the NSA disclosures after
receiving an archive of classified documents from Edward Snowden. Ellen Nakashima is a national
security reporter for The Washington Post. She focuses on issues relating to intelligence, technology
and civil liberties. She previously served as a Southeast Asia correspondent for the paper. She wrote
about the presidential candidacy of Al Gore and co-authored a biography of Gore, and has also covered
federal agencies, Virginia state politics and local affairs. She joined the Post in 1995. "U.S. spy
agencies mounted 231 offensive cyber-operations in 2011, documents show," Washington Post. 8-302013. https://www.washingtonpost.com/world/national-security/us-spy-agencies-mounted-231offensive-cyber-operations-in-2011-documents-show/2013/08/30/d090a6ae-119e-11e3-b4cbfd7ce041d814_story.html//ghs-kw)

"The policy debate has moved so that offensive options are more prominent now," said former deputy defense secretary William J. Lynn III, who has not seen the budget document and was speaking generally. "I think there's more of a case made now that offensive cyberoptions can be an important element in deterring certain adversaries." Of the 231 offensive operations conducted in 2011, the budget said, nearly three-quarters were against top-priority targets, which former officials say includes adversaries such as Iran, Russia, China and North Korea and activities such as nuclear proliferation. The document provided few other details about the operations. Stuxnet, a computer worm reportedly developed by the United States and Israel that destroyed Iranian nuclear centrifuges in attacks in 2009 and 2010, is often cited as the most dramatic use of a cyberweapon. Experts said no other known cyberattacks carried out by the United States match the physical damage inflicted in that case. U.S. agencies define offensive cyber-operations as activities intended "to manipulate, disrupt, deny, degrade, or destroy information resident in computers or computer networks, or the computers and networks themselves," according to a presidential directive issued in October 2012. Most offensive operations have immediate effects only on data or the proper functioning of an adversary's machine: slowing its network connection, filling its screen with static or scrambling the results of basic calculations. Any of those could have powerful effects if they caused an adversary to botch the timing of an attack, lose control of a computer or miscalculate locations. U.S. intelligence services are making routine use around the world of government-built malware that differs little in function from the "advanced persistent threats" that U.S. officials attribute to China. The principal difference, U.S. officials told The Post, is that China steals U.S. corporate secrets for financial gain. "The Department of Defense does engage in computer network exploitation," according to an e-mailed statement from an NSA spokesman, whose agency is part of the Defense Department. "The department does ***not*** engage in economic espionage in any domain, including cyber."

Millions of implants

The administration's cyber-operations sometimes involve what one budget document calls "field operations" abroad, commonly with the help of CIA operatives or clandestine military forces, to physically place hardware implants or software modifications. Much more often, an implant is coded entirely in software by an NSA group called Tailored Access Operations (TAO). As its name suggests, TAO builds attack tools that are custom-fitted to their targets. The NSA unit's software engineers would rather tap into networks than individual computers because there are usually many devices on each network. Tailored Access Operations has software templates to break into common brands and models of "routers, switches and firewalls from multiple product vendor lines," according to one document describing its work. The implants that TAO creates are intended to persist through software and equipment upgrades, to copy stored data, "harvest" communications and tunnel into other connected networks. This year TAO is working on implants that "can identify select voice conversations of interest within a target network and exfiltrate select cuts," or excerpts, according to one budget document. In some cases, a single compromised device opens the door to hundreds or thousands of others. Sometimes an implant's purpose is to create a back door for future access. "You pry open the window somewhere and leave it so when you come back the owner doesn't know it's unlocked, but you can get back in when you want to," said one intelligence official, who was speaking generally about the topic and was not privy to the budget. The official spoke on the condition of anonymity to discuss sensitive technology. Under U.S. cyberdoctrine, these operations are known as "exploitation," not "attack," but they are essential precursors both to attack and defense. By the end of this year, GENIE is projected to control at least 85,000 implants in strategically chosen machines around the world. That is quadruple the number, 21,252, available in 2008, according to the U.S. intelligence budget. The NSA appears to be planning a rapid expansion of those numbers, which were limited until recently by the need for human operators to take remote control of compromised machines. Even with a staff of 1,870 people, GENIE made full use of only 8,448 of the 68,975 machines with active implants in 2011. For GENIE's next phase, according to an authoritative reference document, the NSA has brought online an automated system, code-named TURBINE, that is capable of managing "potentially millions of implants" for intelligence gathering "and active attack."

The ROC

When it comes time to fight the cyberwar against the best of the NSA's global competitors, the TAO calls in its elite operators, who work at the agency's Fort Meade headquarters and in regional operations centers in Georgia, Texas, Colorado and Hawaii. The NSA's organizational chart has the main office as S321. Nearly everyone calls it "the ROC," pronounced "rock": the Remote Operations Center. "To the NSA as a whole, the ROC is where the hackers live," said a former operator from another section who has worked closely with the exploitation teams. "It's basically the one-stop shop for any kind of active operation that's not defensive." Once the hackers find a hole in an adversary's defense, "[t]argeted systems are compromised electronically, typically providing access to system functions as well as data. System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals," according to a 570-page budget blueprint for what the government calls its Consolidated Cryptologic Program, which includes the NSA. Teams from the FBI, the CIA and U.S. Cyber Command work alongside the ROC, with overlapping missions and legal authorities. So do the operators from the NSA's National Threat Operations Center, whose mission is focused primarily on cyberdefense. That was Snowden's job as a Booz Allen Hamilton contractor, and it required him to learn the NSA's best hacking techniques. According to one key document, the ROC teams give Cyber Command "specific target related technical and operational material (identification/recognition), tools and techniques that allow the employment of U.S. national and tactical specific computer network attack mechanisms." The intelligence community's cybermissions include defense of military and other classified computer networks against foreign attack, a task that absorbs roughly one-third of a total cyber operations budget of $1.02 billion in fiscal 2013, according to the Cryptologic Program budget. The ROC's breaking-and-entering mission, supported by the GENIE infrastructure, spends nearly twice as much: $651.7 million. Most GENIE operations aim for "exploitation" of foreign systems, a term defined in the intelligence budget summary as "surreptitious virtual or physical access to create and sustain a presence inside targeted systems or facilities." The document adds: "System logs and processes are modified to cloak the intrusion, facilitate future access, and accomplish other operational goals." The NSA designs most of its own implants, but it devoted $25.1 million this year to additional covert purchases of software vulnerabilities from private malware vendors, a growing gray-market industry based largely in Europe.

2NC Link - Zero Days


Zero-days are key to the cyber-arsenal
Cushing 14
(Cushing, Seychelle. Cushing received her MA with Distinction in Political Science and her BA in
Political Science from Simon Fraser University. She is the Manager of Strategic Initiatives and Special
Projects at the Office of the Vice-President, Research. Leveraging Information as Power: America's
Pursuit of Cyber Security, Simon Fraser University. 11-28-2014.
http://summit.sfu.ca/system/files/iritems1/14703/etd8726_SCushing.pdf//ghs-kw)
Nuclear or conventional weapons, once developed, can remain dormant yet functional until needed. In comparison, the zero-days used in cyber weapons require the US to constantly discover new vulnerabilities to maintain a deployable cyber arsenal. Holding a specific zero-day does not guarantee that the vulnerability will remain unpatched for a prolonged period of time by the targeted state.59 Complicating this is the fact that undetected vulnerabilities, once acquired, are rarely used immediately given the time and resources it takes to construct a cyber attack.60 In the time between acquisition and use, a patch for the vulnerability may be released, whether through routine patches or a specific identification of a security hole, rendering the vulnerability obsolete. To minimize this, America deploys several zero-days at once in a cyber attack to increase the odds that at least one (or more) of the vulnerabilities remains open to provide system access.61

2.4. One Attack, Multiple Vulnerabilities

Multiple backdoor entry points are preferable given that America cannot be absolutely certain of what vulnerabilities the target system will contain62 despite extensive pre-launch cyber attack testing63 and customization.64 A successful cyber attack needs a minimum of one undetected vulnerability to gain access to the target system. Each successive zero-day that works adds to the strength and sophistication of a cyber assault.65 As one vulnerability is patched, America can still rely on the other undetected vulnerabilities to continue its cyber strike. Incorporating multiple undetected vulnerabilities into a cyber attack reduces the need to create new cyber attacks after each zero-day fails. Stuxnet, a joint US-Israel operation, was a cyber attack designed to disrupt Iran's progress on its nuclear weapons program.66 The attack was designed to alter the code of Natanz's computers and industrial control systems to induce chronic fatigue, rather than destruction, of the nuclear centrifuges.67 The precision of Stuxnet ensured that all other control systems were ignored except for those regulating the centrifuges.68 What is notable about Stuxnet is its use of four zero-day exploits (of which one was allegedly purchased)69 in the attack.70 That is, to target one system, Stuxnet entered through four different backdoors. A target state aware of a specific vulnerability in its system will enact a patch upon detection and likely assume that the problem is fixed. Exploiting multiple vulnerabilities creates variations in how the attack is executed given that different backdoors alter how the attack enters the target system.71 One patch does not stop the cyber attack. The use of multiple zero-days thus capitalizes on a state's limited awareness of the vulnerabilities in its system. Each phase of Stuxnet was different from its previous phase which created confusion among the Iranians. Launched in 2009, Stuxnet was not discovered by the Iranians until 2010.72 Yet even upon the initial discovery of the attack, who the attacker was remained unclear. The failures in the Natanz centrifuges were first attributed to insider error73 and later to China74 before finally discovering the true culprits.75 The use of multiple undetected vulnerabilities helped to obscure the US and Israel as the actual attackers.76 The Stuxnet case helps illustrate the efficacy of zero-day attacks as a means of attaining political goals. Although Stuxnet did not produce immediate results in terminating Iran's nuclear program, it helped buy time for the Americans to consider other options against Iran. A nuclear Iran would not only threaten American security but possibly open a third conflict for America77 in the Middle East given Israel's proclivity to strike a nuclear Iran first. Stuxnet allowed the United States to delay Iran's nuclear program without resorting to kinetic action.78
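
A back-of-the-envelope illustration of Cushing's "several zero-days at once" logic (this sketch is ours, not hers; the patch probability p is a made-up assumption and the exploits are treated as independent):

# Illustrative sketch only, not drawn from the Cushing evidence: if each of n zero-days
# has an independent probability p_patched of already being fixed by the target,
# what are the odds the attacker still has at least one working entry point?

def attack_still_viable(p_patched: float, n_zero_days: int) -> float:
    """Probability that at least one of n zero-days remains unpatched."""
    return 1 - p_patched ** n_zero_days

if __name__ == "__main__":
    p = 0.5  # hypothetical coin-flip patch rate; purely an assumption for illustration
    for n in (1, 2, 4):  # 4 mirrors Stuxnet's reported four zero-days
        print(f"{n} zero-day(s): {attack_still_viable(p, n):.0%} chance of access")
    # Prints 50%, 75%, 94%: each extra undetected vulnerability buys robustness,
    # which is the card's point about multiple backdoor entry points.

The only real modeling choice is independence between exploits; if patches tend to arrive together (say, in one vendor update), the gain from stacking zero-days is smaller than this sketch suggests.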

Zero-days are key to effective cyber-war offensive capabilities


Gjelten 13
(Gjelten, Tom. TOM GJELTEN is a correspondent for NPR. Over the years, he has reported extensively

from Europe and Latin America, including Cuba. He was reporting live from the Pentagon when it was
attacked on September 11, 2001. Subsequently, he covered the war in Afghanistan and Iraq invasion
as NPR's lead Pentagon correspondent. Gjelten also covered the first Gulf War and the wars in Croatia
and Bosnia, Nicaragua, El Salvador, Guatemala, and Colombia. From Berlin (1990-1994), he covered
Europe's political and economic transition after the fall of the Berlin Wall. Gjelten's series From Marx
to Markets, documenting Eastern Europe's transition to a market economy, earned him an Overseas
Press Club award for the Best Business or Economic Reporting in Radio or TV. His reporting from
Bosnia earned him a second Overseas Press Club Award, a George Polk Award, and a Robert F Kennedy
Journalism Award. Gjelten's books include Sarajevo Daily: A City and Its Newspaper Under Siege, which
the New York Times called a chilling portrayal of a city's slow murder. His 2008 book, Bacardi and
the Long Fight for Cuba: The Biography of a Cause, was selected as a New York Times Notable
Nonfiction Book. "First Strike: US Cyber Warriors Seize the Offensive," World Affairs Journal.
January/February 2013. http://www.worldaffairsjournal.org/article/first-strike-us-cyber-warriors-seizeoffensive//ghs-kw)

That was then. Much of the cyber talk around the Pentagon these days is about offensive operations. It is no longer enough for cyber troops to be deployed along network perimeters, desperately trying to block the constant attempts by adversaries to penetrate front lines. The US military's geek warriors are now prepared to go on the attack, armed with potent cyberweapons that can break into enemy computers with pinpoint precision. The new emphasis is evident in a program launched in October 2012 by the Defense Advanced Research Projects Agency (DARPA), the Pentagon's experimental research arm. DARPA funding enabled the invention of the Internet, stealth aircraft, GPS, and voice-recognition software, and the new program, dubbed Plan X, is equally ambitious. DARPA managers said the Plan X goal was to create revolutionary technologies for understanding, planning, and managing cyberwarfare. The US Air Force was also signaling its readiness to go into cyber attack mode, announcing in August that it was looking for ideas on how "to destroy, deny, degrade, disrupt, deceive, corrupt, or usurp the adversaries [sic] ability to use the cyberspace domain for his advantage." The new interest in attacking enemies rather than simply defending against them has even spread to the business community. Like their military counterparts, cybersecurity experts in the private sector have become increasingly frustrated by their inability to stop intruders from penetrating critical computer networks to steal valuable data or even sabotage network operations. The new idea is to pursue the perpetrators back into their own networks. "We're following a failed security strategy in cyber," says Steven Chabinsky, formerly the head of the FBI's cyber intelligence section and now chief risk officer at CrowdStrike, a startup company that promotes aggressive action against its clients' cyber adversaries. "There's no way that we are going to win the cybersecurity effort on defense. We have to go on offense." The growing interest in offensive operations is bringing changes in the cybersecurity industry. Expertise in patching security flaws in one's own computer network is out; expertise in finding those flaws in the other guy's network is in. Among the hot jobs listed on the career page at the National Security Agency are openings for computer scientists who specialize in "vulnerability discovery." Demand is growing in both government and industry circles for technologists with the skills to develop ever more sophisticated cyber tools, including malicious software - malware - with such destructive potential as to qualify as cyberweapons when implanted in an enemy's network. "Offense is the biggest growth sector in the cyber industry right now," says Jeffrey Carr, a cybersecurity analyst and author of Inside Cyber Warfare. "But have we given sufficient thought to what we are doing?" Offensive operations in the cyber domain raise a host of legal, ethical, and political issues, and governments, courts, and business groups have barely begun to consider them.

The move to offensive operations in cyberspace was actually under way even as Pentagon officials were still insisting their strategy was defensive. We just didn't know it. The big revelation came in June 2012, when New York Times reporter David Sanger reported that the United States and Israel were behind the development of the Stuxnet worm, which had been used to damage computer systems controlling Iran's nuclear enrichment facilities. Sanger, citing members of President Obama's national security team, said the attacks were code-named Olympic Games and constituted America's first sustained use of cyberweapons. The highly sophisticated Stuxnet worm delivered computer instructions that caused some Iranian centrifuges to spin uncontrollably and self-destruct. According to Sanger, the secret cyber attacks had begun during the presidency of George W. Bush but were accelerated on the orders of Obama. The publication of such a highly classified operation provoked a firestorm of controversy, but government officials who took part in discussions of Stuxnet have not denied the accuracy of Sanger's reporting. "He nailed it," one participant told me. In the aftermath of the Stuxnet revelations, discussions about cyber war became more realistic and less theoretical. Here was a cyberweapon that had been designed and used for the same purpose and with the same effect as a kinetic weapon: like a missile or a bomb, it caused physical destruction. Security experts had been warning that a US adversary could use a cyberweapon to destroy power plants, water treatment facilities, or other critical infrastructure assets here in the United States, but the Stuxnet story showed how the American military itself could use an offensive cyberweapon against an enemy. The advantages of such a strike were obvious. A cyberweapon could take down computer networks and even destroy physical equipment without the civilian casualties that a bombing mission would entail. Used preemptively, it could keep a conflict from evolving in a more lethal direction. The targeted country would have a hard time determining where the cyber attack came from. In fact, the news that the United States had actually developed and used an offensive cyberweapon gave new significance to hints US officials had quietly dropped on previous occasions about the enticing potential of such tools. In remarks at the Brookings Institution in April 2009, for example, the then Air Force chief of staff, General Norton Schwartz, suggested that cyberweapons could be used to attack an enemy's air defense system. "Traditionally, we take down integrated air defenses via kinetic means," Schwartz said. "But if it were possible to interrupt radar systems or surface to air missile systems via cyber, that would be another very powerful tool in the tool kit allowing us to accomplish air missions." He added, "We will develop that - have [that] capability." A full two years before the Pentagon rolled out its defensive cyber strategy, Schwartz was clearly suggesting an offensive application. The Pentagon's reluctance in 2011 to be more transparent about its interest in offensive cyber capabilities may simply have reflected sensitivity to an ongoing dispute within the Obama administration. Howard Schmidt, the White House Cybersecurity Coordinator at the time the Department of Defense strategy was released, was steadfastly opposed to any use of the term "cyber war" and had no patience for those who seemed eager to get into such a conflict. But his was a losing battle. Pentagon planners had already classified cyberspace officially as a fifth domain of warfare, alongside land, air, sea, and space. As the 2011 cyber strategy noted, that designation "allows DoD to organize, train, and equip for cyberspace as we do in air, land, maritime, and space to support national security interests." That statement by itself contradicted any notion that the Pentagon's interest in cyber was mainly defensive. Once the US military accepts the challenge to fight in a new domain, it aims for superiority in that domain over all its rivals, in both offensive and defensive realms. Cyber is no exception. The US Air Force budget request for 2013 included $4 billion in proposed spending to achieve cyberspace superiority, according to Air Force Secretary Michael Donley. It is hard to imagine the US military settling for any less, given the importance of electronic assets in its capabilities. Even small unit commanders go into combat equipped with laptops and video links. "We're no longer just hurling mass and energy at our opponents in warfare," says John Arquilla, professor of defense analysis at the Naval Postgraduate School. "Now we're using information, and the more you have, the less of the older kind of weapons you need." Access to data networks has given warfighters a huge advantage in intelligence, communication, and coordination. But their dependence on those networks also creates vulnerabilities, particularly when engaged with an enemy that has cyber capabilities of his own. "Our adversaries are probing every possible entry point into the network, looking for that one possible weak spot," said General William Shelton, head of the Air Force Space Command, speaking at a CyberFutures Conference in 2012. "If we don't do this right, these new data links could become one of those spots." Achieving cyber superiority in a twenty-first-century battle space is analogous to the establishment of air superiority in a traditional bombing campaign. Before strike missions begin against a set of targets, air commanders want to be sure the enemy's air defense system has been suppressed. Radar sites, antiaircraft missile batteries, enemy aircraft, and command-and-control facilities need to be destroyed before other targets are hit. Similarly, when an information-dependent combat operation is planned against an opposing military, the operational commanders may first want to attack the enemy's computer systems to defeat his ability to penetrate and disrupt the US military's information and communication networks. Indeed, operations like this have already been carried out. A former ground commander in Afghanistan, Marine Lieutenant General Richard Mills, has acknowledged using cyber attacks against his opponent while directing international forces in southwest Afghanistan in 2010. "I was able to use my cyber operations against my adversary with great impact," Mills said, in comments before a military conference in August 2012. "I was able to get inside his nets, infect his command-and-control, and in fact defend myself against his almost constant incursions to get inside my wire, to affect my operations." Mills was describing offensive cyber actions. This is cyber war, waged on a relatively small scale and at the tactical level, but cyber war nonetheless. And, as DARPA's Plan X reveals, the US military is currently engaged in much larger scale cyber war planning. DARPA managers want contractors to come up with ideas for mapping the digital battlefield so that commanders could know where and how an enemy has arrayed his computer networks, much as they are now able to map the location of enemy tanks, ships, and aircraft. Such visualizations would enable cyber war commanders to identify the computer targets they want to destroy and then assess the battle damage afterwards. Plan X would also support the development of new cyber war architecture. The DARPA managers envision operating systems and platforms with mission scripts built in, so that a cyber attack, once initiated, can proceed on its own in a manner similar to the auto-pilot function in modern aircraft. None of this technology exists yet, but neither did the Internet or GPS when DARPA researchers first dreamed of it. As with those innovations, the government role is to fund and facilitate, but much of the experimental and research work would be done in the private sector. A computer worm with a destructive code like the one Stuxnet carried can probably be designed only with state sponsorship, in a research lab with resources like those at the NSA. But private contractors are in a position to provide many of the tools needed for offensive cyber activity, including the software bugs that can be exploited to provide a back door into a computer's operating system. Ideally, the security flaw or vulnerability that can be exploited for this purpose will be one of which the network operator is totally unaware. Some hackers specialize in finding these vulnerabilities, and as the interest in offensive cyber operations has grown, so has the demand for their services. The world-famous hacker conference known as Defcon attracts a wide and interesting assortment of people each year to Las Vegas: creative but often antisocial hackers who identify themselves only by their screen names, hackers who have gone legit as computer security experts, law enforcement types, government spies, and a few curious academics and journalists. One can learn what's hot in the hacker world just by hanging out there. In August 2012, several attendees were seated in the Defcon cafe when a heavy-set young man in jeans, a t-shirt, and a scraggly beard strolled casually up and dropped several homemade calling cards on the table. He then moved to the next table and tossed down a few more, all without saying a word. There was no company logo or brand name on the card, just this message: "Paying top dollar for 0-day and offensive technologies..." The card identified the buyer as "zer0daybroker" and listed an e-mail address. A zero-day is the most valuable of computer vulnerabilities, one unknown to anyone but the researcher who finds it. Hackers prize zero-days because no one knows to have prepared a defense against them. The growing demand for these tools has given rise to brokers like Zer0day, who identified himself in a subsequent e-mail exchange as Zer0 Day Haxor but provided no other identifying information. As a broker, he probably did not intend to hack into a computer network himself but only to act as an intermediary, connecting sellers who have discovered system vulnerabilities with buyers who want to make use of the tools and are willing to pay a high price for them. In the past, the main market for these vulnerabilities was software firms themselves who wanted to know about flaws in their products so that they could write patches to fix them. Big companies like Google and Microsoft employ penetration testers whose job it is to find and report vulnerabilities that would allow someone to hack into their systems. In some cases, such companies have paid a bounty to freelance cyber researchers who discover a vulnerability and alert the company engineers. But the rise in offensive cyber operations has transformed the vulnerability market, and hackers these days are more inclined to sell zero-days to the highest bidder. In most cases, these are governments. The market for back-door exploits has been boosted in large part by the burgeoning demand from militaries eager to develop their cyber warfighting capabilities. The designers of the Stuxnet code cleared a path into Iranian computers through the use of four or five separate zero-day vulnerabilities, an achievement that impressed security researchers around the world. The next Stuxnet would require the use of additional vulnerabilities. "If the president asks the US military to launch a cyber operation in Iran tomorrow, it's not the time to start looking for exploits," says Christopher Soghoian, a Washington-based cybersecurity researcher. "They need to have the exploits ready to go. And you may not know what kind of computer your target uses until you get there. You need a whole arsenal [of vulnerabilities] ready to go in order to cover every possible configuration you may meet." Not surprisingly, the National Security Agency - buying through defense contractors - may well be the biggest customer in the vulnerability market, largely because it pays handsomely. The US military's dominant presence in the market means that other possible purchasers cannot match the military's price. "Instead of telling Google or Mozilla about a flaw and getting a bounty for two thousand dollars, researchers will sell it to a defense contractor like Raytheon or SAIC and get a hundred thousand for it," says Soghoian, now the principal technologist in the Speech, Privacy and Technology Project at the American Civil Liberties Union and a prominent critic of the zero-day market. "Those companies will then turn around and sell the vulnerability upstream to the NSA or another defense agency. They will outbid Google every time."

2NC China
Cyber capabilities are key to deterrence and defending against China
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principal Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. "Waging Cyber War the American Way," Survival: Global Politics and
Strategy. August-September 2015. Vol. 57, no. 4, pp. 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

At the same time, the United States regards cyber war during armed conflict with a cyber-capable enemy as probable, if not inevitable. It both assumes that the computer systems on which its own forces rely to deploy, receive support and strike will be attacked, and intends to attack the computer systems that enable opposing forces to operate as well. Thus, the United States has said that it can and would conduct cyber war to support operational and contingency plans - a euphemism for attacking computer systems that enable enemy war fighting. US military doctrine now regards non-kinetic (that is, cyber) measures as an integral aspect of US joint offensive operations.8 Even so, the stated purposes of the US military regarding cyber war stress protecting the ability of conventional military forces to function as they should, as well as avoiding and preventing escalation, especially to non-military targets. Apart from its preparedness to conduct counter-military cyber operations during wartime, the United States has been reticent about using its offensive capabilities. While it has not excluded conducting cyber operations to coerce hostile states or non-state actors, it has yet to brandish such a threat.9 Broadly speaking, US policy is to rely on the threat of retaliation to deter a form of warfare it is keen to avoid. Chinese criticism that the US retaliatory policy and capabilities will up the ante on the Internet arms race is disingenuous in that China has been energetic in forming and using capabilities for cyber operations.10 Notwithstanding the defensive bias in US attitudes toward cyber war, the dual missions of deterrence and preparedness for offensive operations during an armed conflict warrant maintaining superb, if not superior, offensive capabilities. Moreover, the case can be made - and we have made it - that the United States should have superiority in offensive capabilities in order to control escalation.11 The combination of significant capabilities and declared reluctance to wage cyber war raises a question that is not answered by any US official public statements: when it comes to offence, what are US missions, desired effects, target sets and restraints - in short, what is US policy? To be clear, we do not take issue with the basic US stance of being at once wary and capable of cyber war. Nor do we think that the United States should advertise exactly when and how it would conduct offensive cyber war. However, the very fact that the United States maintains options for offensive operations implies the need for some articulation of policy. After all, the United States was broadly averse to the use of nuclear weapons during the Cold War, yet it elaborated a declaratory policy governing such use to inform adversaries, friends and world opinion, as well as to forge domestic consensus. Indeed, if the United States wants to discourage and limit cyber war internationally, while keeping its options open, it must offer an example. For that matter, the American people deserve to know what national policy on cyber war is, lest they assume it is purely defensive or just too esoteric to comprehend. Whether to set a normative example, warn potential adversaries or foster national consensus, US policy on waging cyber war should be coherent. At the same time, it must encompass three distinguishable offensive missions: wartime counter-military operations, which the United States intends to conduct; retaliatory missions, which the US must have the will and ability to conduct for reasons of deterrence; and coercive missions against hostile states, which could substitute for armed attack.12 Four cases serve to highlight the relevant issues and to inform the elaboration of an overall policy to guide US conduct of offensive cyber war. The first involves wartime counter-military cyber operations against a cyber-capable opponent, which may also be waging cyber war; the second involves retaliation against a cyber-capable opponent for attacking US systems other than counter-military ones; the third involves coercion of a cyber-weak opponent with little or no means to retaliate against US cyber attack; and the fourth involves coercion of a cyber-strong opponent with substantial means to retaliate against US cyber attack. Of these, the first and fourth imply a willingness to initiate cyber war.

Counter-military cyber war during wartime

Just as cyber war is war, armed hostilities will presumably include cyber war if the belligerents are both capable of and vulnerable to it. The reason for such certainty is that impairing opposing military forces' use of computer systems is operationally compelling. Forces with requisite technologies and skills benefit enormously from data communications and computation for command and control, intelligence, surveillance and reconnaissance (ISR), targeting, navigation, weapon guidance, battle assessment and logistics management, among other key functions. If the performance of forces is dramatically enhanced by such systems, it follows that degrading them can provide important military advantages. Moreover, allowing an enemy to use cyber war without reciprocating could mean military defeat. Thus, the United States and other advanced states are acquiring capabilities not only to use and protect computer systems, but also to disrupt those used by enemies. The intention to wage cyber war is now prevalent in Chinese planning for war with the United States and vice versa. Chinese military planners have long made known their belief that, because computer systems are essential for effective US military operations, they must be targeted. Chinese cyber capabilities may not (yet) pose a threat to US command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR) networks, which are well partitioned and protected. However, the networks that enable logistical support for US forces are inviting targets. Meant to disable US military operations, Chinese use of cyber war during an armed conflict would not be contingent on US cyber operations. Indeed, it could come early, first or even as a precursor of armed hostilities. For its part, the US military is increasingly aware not only that sophisticated adversaries like China can be expected to use cyber war to degrade the performance of US forces, but also that US forces must integrate cyber war into their capabilities and operations. Being more dependent on computer networks to enhance military performance than are its adversaries, including China, US forces have more to lose than to gain from the outbreak of cyber war during an armed conflict. This being so, would it make sense for the United States to wait and see if the enemy resorts to cyber war before doing so itself? Given US conventional military superiority, it can be assumed that any adversary that can use cyber war against US forces will do so. Moreover, waiting for the other side to launch a cyber attack could be disadvantageous insofar as US forces would be the first to suffer degraded performance. Thus, rather than waiting, there will be pressure for the United States to commence cyber attacks early, and perhaps first. Moreover, leading US military officers have strongly implied that cyber war would have a role in attacking enemy anti-access and area-denial (A2AD) capabilities irrespective of the enemy's use of cyber war.13 If the United States is prepared to conduct offensive cyber operations against a highly advanced opponent such as China, it stands to reason that it would do likewise against lesser opponents. In sum, offensive cyber war is becoming part and parcel of the US war-fighting doctrine. The nature of US counter-military cyber attacks during wartime should derive from the mission of gaining, or denying the opponent, operational advantage. Primary targets of the United States should mirror those of a cyber-capable adversary: ISR, command and control, navigation and guidance, transport and logistics support. Because this mission is not coercive or strategic in nature, economic and other civilian networks should not be targeted. However, to the extent that networks that enable military operations may be multipurpose, avoidance of non-military harm cannot be assured. There are no sharp firebreaks in cyber war.14

China would initiate preemptive cyber strikes on the US


Freedberg 13
(Freedberg, Sydney J. Sydney J. Freedberg Jr. is the deputy editor for Breaking Defense. He graduated
summa cum laude from Harvard with an AB in History and holds an MA in Security Studies from
Georgetown University and a MPhil in European Studies from Cambridge University. During his 13
years at National Journal magazine, he wrote his first story about what became known as "homeland
security" in 1998, his first story about "military transformation" in 1999, and his first story on
"asymmetrical warfare" in 2000. Since 2004 he has conducted in-depth interviews with more than 200
veterans of Afghanistan and Iraq about their experiences, insights, and lessons-learned, writing
stories that won awards from the association of Military Reporters & Editors in 2008 and 2009, as well
as an honorable mention in 2010. "Chinas Fear Of US May Tempt Them To Preempt: Sinologists,"
Breaking Defense. 10-1-2013. http://breakingdefense.com/2013/10/chinas-fear-of-us-may-tempt-themto-preempt-sinologists/2///ghs-kw)

Because China believes it is much weaker than the United


States, they are more likely to launch a massive preemptive strike in a crisis.
Heres the other bad news: The current US concept for high-tech warfare , known as Air-Sea Battle,
might escalate the conflict even further towards a limited nuclear war, says one of the top American experts
WASHINGTON:

on the Chinese military. [This is one in an occasional series on the crucial strategic relationship and the military
capabilities of the US, its allies and China.] What US analysts call an anti-access/area denial strategy is what

the Chinese approach is born of a deep


sense of vulnerability that dates back 200 years, China analyst Larry Wortzel said at the Institute
of World Politics: The Peoples Liberation Army still sees themselves as an inferior force
to the American military, and thats who they think their most likely enemy is. Thats
fine as long as it deters China from attacking its neighbors. But if deterrence fails, the Chinese are
likely to go big or go home. Chinese military history from the Korean War in 1950 to
China calls counter-intervention and active defense, and

the Chinese invasion of Vietnam in 1979 to more recent, albeit vigorous but nonviolent, grabs for the disputed Scarborough Shoal suggests a preference for a
sudden use of overwhelming force at a crucial point , what Clausewitz would call the enemys
center of gravity. What they do is very heavily built on preemption, Wortzel said. The problem with the
striking the enemys center of gravity is, for the United States, they see it as being
in Japan, Hawaii, and the West Coast.Thats very escalatory. (Students of the American
military will nod sagely, of course, as we remind everyone that President George Bush made preemption a
centerpiece of American strategy after the terror attacks of 2001.) Wortzel argued that the current version of US AirSea Battle concept is also likely to lead to escalation. Chinas dependent on these ballistic missiles and anti-ship
missiles and satellite links, he said. Since those are almost all land-based, any attack on them involves striking
the Chinese mainland, which is pretty escalatory. You dont know how theyre going to react, he said. They do
have nuclear missiles. They actually think were more allergic to nuclear missiles landing on our soil than they are
on their soil. They think they can withstand a limited nuclear attack, or even a big nuclear attack, and retaliate.

What War Would Look Like. So how would China's preemptive attack unfold? First would come weeks of escalating rhetoric and cyberattacks. There's no evidence the Chinese favor a bolt out of the blue without giving the adversary what they believe is a chance to back down, agreed retired Rear Adm. Michael McDevitt and Dennis Blasko, former Army defense attache in Beijing, speaking on a recent Wilson Center panel on Chinese strategy where they agreed on almost nothing else. That's not much comfort, though, considering that Imperial Japan showed clear signs they might attack and still caught the US flat-footed at Pearl Harbor. When the blow does fall, the experts believe it would be sudden. Stuxnet-style viruses, electronic jamming, and Israeli-designed Harpy radar-seeking cruise missiles (similar to the American HARM but slower and longer-ranged) would try to blind every land-based and shipborne radar. Long-range anti-aircraft missiles like the

Russian-built S-300 would go for every plane currently in the air within 125 miles of Chinas coast, a radius that
covers all of Taiwan and some of Japan. Salvos of ballistic missiles would strike every airfield within 1,250 miles.
Thats enough range to hit the four US airbases in Japan and South Korea which are, after all, static targets you
can look up on Google Maps to destroy aircraft on the ground, crater the runways, and scatter the airfield with
unexploded cluster bomblets to defeat repair attempts. Long-range cruise missiles launched from shore, ships, and
submarines then go after naval vessels. And if the Chinese get really good and really lucky, they just might get a
solid enough fix on a US Navy aircraft carrier to lob a precision-guided ballistic missile at it. But would this work?
Maybe. This is fundamentally terra incognita, Heritage Foundation research fellow Dean Cheng told me. There has
been no direct conventional clash between major powers since Korea in the 1950s, no large-scale use of anti-ship
missiles since the Falklands in 1982, and no war ever where both sides possessed todays space, cyber, electronic
warfare, and precision-guided missile capabilities. Perhaps the least obvious but most critical uncertainty in a Pacific

war would be invisible. I don't think we've seen electronic warfare on a scale that we'd see in a US-China confrontation, said Cheng. I doubt very much they are behind us when it comes to electronic warfare, [and] the Chinese are training every day on cyber: all those pings, all those attacks, all those attempts to penetrate. While the US has invested heavily in jamming and spoofing over the last decade, much of the focus has been on how to disable insurgents' roadside bombs, not on how to counter a high-tech nation-state. China, however, has focused its electronic warfare and cyber attack efforts on the United States. Conceptually, China may well be ahead of us in linking the two. (F-35 supporters may well disagree with this conclusion.) Traditional radar jammers, for example, can also be used to insert viruses into the highly computerized AESA radars (active electronically scanned array) that are increasingly common in the US military. Where there has been a fundamental difference, and perhaps the Chinese are better than we are at this, is the Chinese seem to have kept cyber and electronic warfare as a single integrated thing, Cheng said. We are only now coming round to the idea that electronic warfare is linked to computer network operations. In a battle for the electromagnetic spectrum, Cheng said, the worst case is that you thought your jammers, your sensors, everything was working great, and the next thing you know missiles are penetrating [your defenses], planes are being shot out of the sky.

China/Taiwan war goes nuclear


Glaser 11
(Charles, Professor of Political Science and International Affairs at the Elliott School of International
Affairs at George Washington University, Director of the Institute for Security and Conflict Studies,
Will Chinas Rise lead to War? , Foreign Affairs March/April 2011,
http://web.clas.ufl.edu/users/zselden/coursereading2011/Glaser.pdf)

THE PROSPECTS for avoiding intense military competition and war may be good, but growth in China's power may
nevertheless require some changes in U.S. foreign policy that Washington will find disagreeable--particularly
regarding Taiwan. Although it lost control of Taiwan during the Chinese Civil War more than six decades ago,

China still considers Taiwan to be part of its homeland, and unification remains a key political goal
for Beijing. China has made clear that it will use force if Taiwan declares
independence, and much of China's conventional military buildup has been dedicated
to increasing its ability to coerce Taiwan and reducing the United States' ability to intervene.
Because China places such high value on Taiwan and because the United States and
China--whatever they might formally agree to-- have such different attitudes regarding the
legitimacy of the status quo, the issue poses special dangers and challenges for the U.S.-Chinese
relationship, placing it in a different category than Japan or South Korea. A crisis over Taiwan could
fairly easily escalate to nuclear war, because each step along the way
might well seem rational to the actors involved. Current U.S. policy is designed to reduce the
probability that Taiwan will declare

independence and to make clear that the United States will not come to Taiwan's aid if it does. Nevertheless, the United States would find itself under pressure to protect Taiwan against any sort of attack, no matter how it originated. Given the different interests and perceptions of the various parties and the limited control Washington has over Taipei's behavior, a crisis could unfold in which the United States found itself following events rather than leading them. Such dangers have been around for decades, but ongoing improvements in China's military capabilities may make Beijing more willing to escalate a Taiwan crisis. In addition to its improved conventional capabilities, China is modernizing its nuclear forces to increase their ability to survive and retaliate following a large-scale U.S. attack. Standard deterrence theory holds that Washington's current ability to destroy most or all of China's nuclear force enhances its bargaining position. China's nuclear modernization might remove that check on Chinese action, leading Beijing to behave more boldly in future crises than it has in past ones. A U.S. attempt to preserve its ability to defend Taiwan, meanwhile, could fuel a conventional and nuclear arms race. Enhancements to U.S. offensive targeting capabilities and strategic ballistic missile defenses might be interpreted by China as a signal of malign U.S. motives, leading to further Chinese military efforts and a general poisoning of U.S.-Chinese relations.

2NC Cyber-Deterrence
Cyber-offensive strengths are key to cyber-deterrence and
minimizing damage
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principal Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. Waging Cyber War the American Way, Survival: Global Politics and
Strategy. AugustSeptember 2015. Vol 57., 4th ed, pp 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

Even with effective C2, there is a danger that US counter-military cyber operations will infect and damage systems other than those targeted, including civilian systems, because of the technical difficulties of controlling effects, especially for systems that support multiple services. As we have previously noted in these pages, an attack that uses a replicable

agent, such as a virus or worm, has substantial potential to spread, perhaps uncontrollably.19 The dangers of
collateral damage on non-combatants imply not only the possibility of violating the laws of war (as they might apply
to cyber war), but also of provoking escalation. While the United States would like there to be strong technical and

C2 safeguards against unwanted effects and thus escalation, it is not clear that there are. It follows that US doctrine concerning the conduct of wartime counter-military offensive operations
must account for these risks. This presents a dilemma, for dedicated military systems tend
to be harder to access and disrupt than multipurpose or civilian ones. Chinas
military, for example, is known for its attention to communications security, aided
by its reliance on short-range and land-based (for example, fibre-optical)
transmission of C4ISR. Yet, to attack less secure multipurpose systems on which the
Chinese military depends for logistics is to risk collateral damage and heighten the
risk of escalation. Faced with this dilemma, US policy should be to exercise care in attacking
military networks that also support civilian services. The better its offensive cyber-war
capabilities, the more able the United States will be to disrupt critical
enemy military systems and avoid indiscriminate effects. Moreover, US
offensive strength could deter enemy escalation. As we have argued before, US
superiority in counter-military cyber war would have the dual advantage of
delivering operational benefits by degrading enemy forces and averting a more
expansive cyber war than intended. While the United States should avoid the
spread of cyber war beyond military systems, it should develop and maintain an
unmatched capability to conduct counter-military cyber war. This would give it
operational advantages and escalation dominance. Such capabilities might enable
the United States to disrupt enemy C4ISR systems used for the control and
operation of nuclear forces. However, to attack such systems would risk causing the enemy to perceive
that the United States was either engaged in a non-nuclear-disarming first strike or preparing for a nuclear-disarming first strike. Avoiding such a misperception requires the avoidance of such systems, even if they also support enemy non-nuclear C4ISR (as China's may do). In sum,

US policy should be to create,


maintain and be ready to use superior cyber-war capabilities for counter-military
operations during armed conflict. Such an approach would deny even the most
capable of adversaries, China included, an advantage by resorting to cyber war in an
armed conflict. The paramount goal of the United States should be to retain its
military advantage in the age of cyber war, a tall order, but a crucial one for US interests.

2NC Russia
Deterrence solves cyber-war and Russian aggression
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principal Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. Waging Cyber War the American Way, Survival: Global Politics and
Strategy. AugustSeptember 2015. Vol 57., 4th ed, pp 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

Retaliation. While the United States should be ready to conduct cyber attacks against military forces in an armed conflict, it should in general otherwise try to avoid and prevent cyber war. (Possible exceptions to this posture of avoidance are taken up later in the cases concerning coercion.) In keeping with its commitment to an open, secure, interoperable and reliable internet that enables prosperity, public safety, and the free flow of commerce and ideas, the United States should


seek to minimise the danger of unrestricted cyber war, in which critical economic,
governmental and societal systems and services are disrupted. 20 Given how difficult it is to
protect such systems, the United States must rely to a heavy extent on deterrence
and thus the threat of retaliation. To this end, the US Defense Department has stated that a
would-be attacker could suffer unacceptable costs if it launches a cyber attack on
the United States.21 While such a warning is worth issuing, it raises the question of how these unacceptable

costs could be defined and levied. Short of disclosing specific targets and methods, which we do not advocate, the
United States could strengthen both the deterrence it seeks and the norms it favours by indicating what actions
might constitute retaliation. This is especially important because the most vulnerable targets of cyber retaliation

are computer networks that serve civilian life, starting with the internet. By definition, cyber retaliation that extends beyond military capabilities, as required for strong deterrence, might be considered indiscriminate. Whether it is also disproportionate depends in part on the enemy attack that

precipitated it. We can posit, for purposes of analysis, that an enemy attack would be aimed at causing severe
disruptions of such economic and societal functions as financial services, power-grid management, transport
systems, telecommunications services, media and government services, along with the expected military and
intelligence functions. In considering how the United States should retaliate, the distinction between the population
and the state of the attacker is useful. The United States would hold the latter, not the former, culpable, and thus
the rightful object of retaliation. This would suggest targeting propaganda and other societal-control systems;
government financial systems; state access to banks; political and economic elites on which the state depends;
industries on which the state depends, especially state-owned enterprises; and internal security forces and

functions. To judge how effective such a retaliation strategy could be, consider the case of Russia. The


Russian state is both sprawling and centralised: within Russias economy and
society, it is pervasive, heavy-handed and exploitative; power is concentrated in the
Kremlin; and elites of all sorts are beholden to it. Although the Russian state is well
entrenched and not vulnerable to being overthrown, it is porous and exposed,
especially in cyberspace. Even if the computer systems of the innermost circle of
Russian state decision-making may be inaccessible, there are many important
systems that are not. Insofar as those who control the Russian state are more concerned about their own

well-being than that of the masses, targeting their apparatus would cause acute apprehension. Of course, the

more important a computer system is to the state, the less accessible it is likely to be. Still, even if Russia
were to launch indiscriminate cyber attacks on the US economy and society, the
United States might get more bang for its bytes by retaliating against systems that
support Russian state power. Of course, US cyber targeting could also include the
systems on which Russian leaders rely to direct military and other security forces,
which are the ultimate means of state power and control. Likewise, Russian military and intelligence
systems would be fair game for retaliation. At the same time, it would be vital to observe the

stricture against disabling nuclear C2 systems, lest the Kremlin perceive that a US strategic strike of some sort was

in the works. With this exception, the Russian state's cyber vulnerabilities should be exploited


as much as possible. The United States could thus not only meet the standard of
unacceptable costs on which deterrence depends, but also gain escalation control
by giving Russias leaders a sense of their vulnerability. In addition to preventing further
escalation, this US targeting strategy would meet, more or less, normative standards of
discrimination and proportionality.

And the cyber threat is real. Multiple countries and terrorists are acquiring capabilities, increasing the risk of nuclear war and collapsing agriculture and the power grid
Habiger, 2k10
(Eugene Habiger, Retired Air Force General, Cyberwarfare and Cyberterrorism, The Cyber Security
Institute, p. 11-19)
However, there are reasons to believe that what is going on now amounts to a fundamental shift as opposed to business as usual. Todays network exploitation or information
operation trespasses possess a number of characteristics that suggest that the line between espionage and conflict has been, or is close to being, crossed. (What that suggests

for the proper response is a different matter.) First, the number of cyberattacks we are facing is growing significantly. Andrew Palowitch, a former CIA official now consulting with the US Strategic Command (STRATCOM), which oversees the Defense Department's Joint Task Force-Global Network Operations, recently told a meeting of experts that the Defense Department has experienced almost 80,000 computer attacks, and some number of these assaults have actually reduced the military's operational capabilities.20 Second, the nature of these attacks is starting to shift from penetration attempts aimed at gathering intelligence (cyber spying) to offensive efforts aimed at taking down systems

(cyberattacks). Palowitch put this in stark terms last November, We are currently in a cyberwar and war is going on today.21 Third, these recent attacks need to be taken in a

broader strategic context. Both Russia and China have stepped up their offensive efforts and taken a much more aggressive cyberwarfare posture. The Chinese have developed an openly discussed cyberwar strategy aimed at achieving electronic dominance over the U.S. and its allies by 2050. In 2007 the Department of Defense reported that for the first time China has developed first strike viruses, marking a major shift from prior investments in defensive measures.22 And in the intervening period China has launched

a series of offensive cyber operations against U.S. government and private sector networks and infrastructure. In 2007, Gen. James Cartwright, the former head of STRATCOM
and now the Vice Chairman of the Joint Chiefs of Staff, told the USChina Economic and Security Review Commission that Chinas ability to launch denial of service attacks to

overwhelm an IT system is of particular concern.23 Russia also has already begun to wage offensive cyberwar. At the outset of the recent hostilities with Georgia, Russian assets launched a series of cyberattacks against the Georgian government and its critical

infrastructure systems, including media, banking and transportation sites.24 In 2007, cyberattacks that many experts attribute, directly or indirectly, to Russia shut down the
Estonia governments IT systems. Fourth, the current geopolitical context must also be factored into any effort to gauge the degree of threat of cyberwar. The start of the new
Obama Administration has begun to help reduce tensions between the United States and other nations. And, the new administration has taken initial steps to improve bilateral
relations specifically with both China and Russia. However, it must be said that over the last few years the posture of both the Chinese and Russian governments toward America

has clearly become more assertive, and at times even aggressive. Some commentators have talked about the prospects of a cyber Pearl Harbor, and the pattern of Chinese and Russian behavior to date gives reason for concern along these lines: both nations have offensive cyberwarfare strategies in place; both nations have taken the cyber equivalent of building up their forces; both nations now regularly probe our cyber defenses looking for gaps to be exploited; both nations have begun taking actions that cross the line from cyberespionage to cyberaggression; and, our bilateral relations with both nations are increasingly fractious and complicated by areas of marked, direct competition. Clearly, there are sharp differences between current U.S. relations with these two nations and relations between the US and Japan just prior to World War II. However, from a strategic defense perspective, there are enough warning signs to warrant preparation. In addition to the threat of cyberwar, the limited resources required to carry out even a large scale cyberattack also makes likely the potential for a significant cyberterror attack against the United States. However, the lack of a long list of specific incidences of cyberterrorism should provide no comfort. There is strong evidence to suggest that al Qaeda has the ability to conduct cyberterror attacks against the United States and its allies. Al

Qaeda and other terrorist organizations are extremely active in cyberspace, using these technologies to communicate among themselves and others, carry out logistics, recruit
members, and wage information warfare. For example, al Qaeda leaders used email to communicate with the 911 terrorists and the 911 terrorists used the Internet to make

travel plans and book flights. Osama bin Laden and other al Qaeda members routinely post videos and other messages to online sites to communicate. Moreover, there is evidence of efforts that al Qaeda and other terrorist organizations are actively developing cyberterrorism capabilities and seeking to carry out cyberterrorist attacks.

For example, the Washington Post has reported that U.S. investigators have found evidence in the logs that mark a browser's path through the Internet that al Qaeda operators
spent time on sites that offer software and programming instructions for the digital switches that run power, water, transport and communications grids. In some interrogations .
. . al Qaeda prisoners have described intentions, in general terms, to use those tools.25 Similarly, a 2002 CIA report on the cyberterror threat to a member of the Senate stated
that al Qaeda and Hezbollah have become "more adept at using the internet and computer technologies.26 The FBI has issued bulletins stating that, U. S. law enforcement
and intelligence agencies have received indications that Al Qaeda members have sought information on Supervisory Control And Data Acquisition (SCADA) systems available on
multiple SCADArelated web sites.27 In addition a number of jihadist websites, such as 7hj.7hj.com, teach computer attack and hacking skills in the service of Islam.28 While al
Qaeda may lack the cyberattack capability of nations like Russia and China, there is every reason to believe its operatives, and those of its ilk, are as capable as the cyber
criminals and hackers who routinely effect great harm on the worlds digital infrastructure generally and American assets specifically. In fact, perhaps, the most troubling
indication of the level of the cyberterrorist threat is the countless, serious non terrorist cyberattacks routinely carried out by criminals, hackers, disgruntled insiders, crime
syndicates and the like. If runofthemill criminals and hackers can threaten powergrids, hack vital military networks, steal vast sums of money, take down a citys of traffic
lights, compromise the Federal Aviation Administrations air traffic control systems, among other attacks, it is overwhelmingly likely that terrorists can carry out similar, if not
more malicious attacks. Moreover, even if the worlds terrorists are unable to breed these skills, they can certainly buy them. There are untold numbers of cybermercenaries
around the worldsophisticated hackers with advanced training who would be willing to offer their services for the right price. Finally, given the nature of our understanding of
cyber threats, there is always the possibility that we have already been the victim or a cyberterrorist attack, or such an attack has already been set but not yet effectuated, and

we don't know it yet. Instead, a well-designed cyberattack has the capacity to cause widespread chaos, sow societal unrest, undermine national governments, spread paralyzing fear and anxiety, and create a state of utter turmoil, all without taking a single life. A sophisticated cyberattack could throw a nation's banking and finance system into chaos causing markets to crash, prompting runs on banks, degrading confidence in markets, perhaps even putting the nation's currency in play and making the government look helpless and hapless. In today's difficult economy, imagine how Americans would react if vast sums of money were taken from their accounts and their supporting financial records were destroyed. A truly nefarious cyberattacker could carry out an attack in such a way (akin to Robin Hood) as to engender populist support and deepen rifts within our society, thereby making efforts to restore the system all the more difficult. A modestly advanced enemy could use a cyberattack to shut down (if not physically damage) one or more regional power grids. An entire region could be cast into total darkness, power dependent systems could be shut down. An attack on one or more regional power grids could also cause cascading effects that could jeopardize our entire national grid. When word leaks that the blackout was caused by a cyberattack, the specter of a foreign enemy capable of sending the entire nation into darkness would only increase the fear, turmoil and unrest. While the finance and energy sectors are considered prime targets for a cyberattack, an attack on any of the 17 delineated critical infrastructure sectors could

have a major impact on the United States. For example, our healthcare system is already technologically driven and the Obama Administrations e health efforts will only
increase that dependency. A cyberattack on the U.S. ehealth infrastructure could send our healthcare system into chaos and put countless of lives at risk. Imagine if emergency

room physicians and surgeons were suddenly no longer able to access vital patient information. A cyberattack on our nation's water systems could likewise cause widespread disruption. An attack on the control systems for one or more dams could put entire communities at risk of being inundated, and could create ripple effects across the water, agriculture, and energy sectors. Similar water control system attacks could be used to at least temporarily deny water to otherwise arid regions, impacting everything from the quality of life in these areas to agriculture. In 2007, the U.S. Cyber Consequences Unit determined that the destruction from a single wave of

cyberattacks on critical infrastructures could exceed $700 billion, which would be the rough equivalent of 50 Katrina esque hurricanes hitting the United States all at the same
time.29 Similarly, one IT security source has estimated that the impact of a single day cyberwar attack that focused on and disrupted U.S. credit and debit card transactions
would be approximately $35 billion.30 Another way to gauge the potential for harm is in comparison to other similar noncyberattack infrastructure failures. For example, the
August 2003 regional power grid blackout is estimated to have cost the U.S. economy up to $10 billion, or roughly .1 percent of the nations GDP. 31 That said, a cyberattack of
the exact same magnitude would most certainly have a much larger impact. The origin of the 2003 blackout was almost immediately disclosed as an atypical system failure
having nothing to do with terrorism. This made the event both less threatening and likely a single time occurrence. Had it been disclosed that the event was the result of an
attack that could readily be repeated the impacts would likely have grown substantially, if not exponentially. Additionally, a cyberattack could also be used to disrupt our nations
defenses or distract our national leaders in advance of a more traditional conventional or strategic attack. Many military leaders actually believe that such a disruptive cyber
preoffensive is the most effective use of offensive cyber capabilities. This is, in fact, the way Russia utilized cyberattackerswhether government assets, governmentdirected/
coordinated assets, or allied cyber irregularsin advance of the invasion of Georgia. Widespread distributed denial of service (DDOS) attacks were launched on the Georgian
governments IT systems. Roughly a day later Russian armor rolled into Georgian territory. The cyberattacks were used to prepare the battlefield; they denied the Georgian
government a critical communications tool isolating it from its citizens and degrading its command and control capabilities precisely at the time of attack. In this way, these
attacks were the functional equivalent of conventional air and/or missile strikes on a nations communications infrastructure.32 One interesting element of the Georgian
cyberattacks has been generally overlooked: On July 20th, weeks before the August cyberattack, the website of Georgian President Mikheil Saakashvili was overwhelmed by a
more narrowly focused, but technologically similar DDOS attack.33 This should be particularly chilling to American national security experts as our systems undergo the same
sorts of focused, probing attacks on a constant basis. The ability of an enemy to use a cyberattack to counter our offensive capabilities or soften our defenses for a wider
offensive against the United States is much more than mere speculation. In fact, in Iraq it is already happening. Iraq insurgents are now using off theshelf software (costing just

$26) to hack U.S. drones (costing $4.5 million each), allowing them to intercept the video feed from these drones.34 By hacking these drones the insurgents have succeeded in greatly reducing one of our most valuable sources of realtime intelligence and situational awareness. If our enemies in Iraq are capable of such an effective cyberattack against one of

our more sophisticated systems, consider what a more technologically advanced enemy could do. At the strategic level, in 2008, as the United States Central Command was
leading wars in both Iraq and Afghanistan, a cyber intruder compromised the security of the Command and sat within its IT systems, monitoring everything the Command was

doing.35 This time the attacker simply gathered vast amounts of intelligence. However, it is clear that the attacker could have used this access to wage cyberwar - altering information, disrupting the flow of information, destroying information, taking down systems - against the United States forces already at war. Similarly, during 2003 as the United States prepared for and began the War in Iraq, the IT networks of the Department of Defense were hacked 294 times.36 By August of 2004, with America at war, these ongoing attacks compelled then-Deputy Secretary of Defense Paul Wolfowitz to write in a memo that, "Recent exploits have reduced operational capabilities on our networks."37 This wasn't the first time that our national security IT

infrastructure was penetrated immediately in advance of a U.S. military option.38 In February of 1998 the Solar Sunrise attacks systematically compromised a series of
Department of Defense networks. What is often overlooked is that these attacks occurred during the ramp up period ahead of potential military action against Iraq. The

attackers were able to obtain vast amounts of sensitive informationinformation that would have certainly been of value to an enemys military leaders. There is no way to
prove that these actions were purposefully launched with the specific intent to distract American military assets or degrade our capabilities. However, such ambiguitiesthe
inability to specifically attribute actions and motives to actorsare the very nature of cyberspace. Perhaps, these repeated patterns of behavior were mere coincidence, or
perhaps they werent. The potential that an enemy might use a cyberattack to soften physical defenses, increase the gravity of harms from kinetic attacks, or both, significantly
increases the potential harms from a cyberattack. Consider the gravity of the threat and risk if an enemy, rightly or wrongly, believed that it could use a cyberattack to degrade

our strategic weapons capabilities. Such an enemy might be convinced that it could win a war - conventional or even nuclear - against the United States. The effect of this would be to undermine our deterrence-based defenses, making us significantly more at risk of a major war.

And we control probability and magnitude; it causes extinction


Bostrom, 2k2
(Nick Bostrom, Ph.D. and Professor of Philosophy at Oxford University, March 2002, Journal of
Evolution and Technology, Existential Risks: Analyzing Human Extinction Scenarios and
Related Hazards)

A much greater existential risk emerged with the build-up of nuclear


arsenals in the US and the USSR. An all-out nuclear war was a possibility
with both a substantial probability and with consequences that might have been
persistent enough to qualify as global and terminal. There was a real worry among those best acquainted with the information
available at the time that a nuclear Armageddon would occur and that it might annihilate our species or permanently destroy human civilization. Russia and
the US retain large nuclear arsenals that could be used in a future
confrontation, either accidentally or deliberately. There is also a risk that other states may one day build up large
nuclear arsenals. Note however that a smaller nuclear exchange , between India and Pakistan for instance, is not an
existential risk, since it would not destroy or thwart humankinds potential permanently. Such a war might however be a local terminal risk for the cities most
likely to be targeted. Unfortunately, we shall see that nuclear Armageddon and comet or asteroid strikes are mere preludes to the existential risks that we will encounter in the
21st century.

2NC T/ Case
Cyber-deterrence turns terrorism, war, prolif, and human rights
Gompert and Libicki 7/22
(Gompert, David C. and Libicki, Martin. David C. Gompert is the Principal Deputy Director of National
Intelligence. He is a Senior Fellow at RAND and a Distinguished Visiting Professor at the National
Defense University's Center for Technology and National Security Policy. Gompert received his BA in
Engineering from the US Naval Academy and his MPA from Princeton University. Martin Libicki received
his PhD in Economics from UC Berkeley, his MA in City and Regional Planning from UC Berkeley, and his
BSc in Mathematics from MIT. He is a Professor at the RAND Graduate School and a Senior
Management Scientist at RAND. Waging Cyber War the American Way, Survival: Global Politics and
Strategy. AugustSeptember 2015. Vol 57., 4th ed, pp 7-28. 07-22-2015.
http://www.iiss.org/en/publications/survival/sections/2015-1e95/survival--global-politics-and-strategyaugust-september-2015-c6ba/57-4-02-gompert-and-libicki-eab1//ghs-kw)

Given that retaliation and counter-military cyber war require copious offensive
capabilities, questions arise about whether these means could and should also be
used to coerce hostile states into complying with US demands without requiring the
use of armed force. Examples include pressuring a state to cease international
aggression, intimidating behaviour or support for terrorists; or to abandon
acquisition of weapons of mass destruction; or to end domestic human-rights
violations. If, as some argue, it is getting harder, costlier and riskier for the
United States to use conventional military force for such ends, threatening
or conducting cyber war may seem to be an attractive alternative. 25 Of course,
equating cyber war with war suggests that conducting or threatening it to impose Americas will is an idea not to be
treated lightly. Whereas counter-military cyber war presupposes a state of armed conflict, and retaliation

presupposes that the United States has suffered a cyber attack, coercion (as meant here) presupposes neither a state of armed conflict nor an enemy attack. This means, in essence, the United States would threaten to start a cyber war outside of an armed conflict,

something US policy has yet to address. While the United States has intimated that it would conduct cyber war
during an armed conflict and would retaliate if deterrence failed, it is silent about using or threatening cyber war as
an instrument of coercion. Such reticence fits with the general US aversion to this form of warfare, as well as a

possible preference to carry out cyber attacks without attribution or admission. Notwithstanding US reticence, the
use of cyber war for coercion can be more attractive than the use of conventional
force: it can be conducted without regard to geography, without threatening death
and physical destruction, and with no risk of American casualties. While the United
States has other non-military options, such as economic sanctions and supporting regime opponents,
none is a substitute for cyber war. Moreover, in the case of an adversary with little
or no ability to return fire in cyberspace, the United States might have an even
greater asymmetric advantage than it does with its conventional military
capabilities.

China Tech DA

CX Questions
Customers are shifting to foreign products now; why does the plan reverse that trend?

1NC
NSA spying shifts tech dominance to China, but it's fragile; reversing the trend now kills China
Li and McElveen 13
(Cheng Li; Ryan Mcelveen. Cheng Li received a M.A. in Asian studies from the University of California,
Berkeley and a Ph.D. in political science from Princeton University. He is director of the John L.
Thornton China Center and a senior fellow in the Foreign Policy program at Brookings. He is also a
director of the National Committee on U.S.-China Relations. Li focuses on the transformation of
political leaders, generational change and technological development in China. "NSA Revelations Have
Irreparably Hurt U.S. Corporations in China," Brookings Institution. 12-12-2013.
http://www.brookings.edu/research/opinions/2013/12/12-nsa-revelations-hurt-corporations-china-limcelveen//ghs-kw)

For the Obama administration, Snowden's timing could not have been worse. The first story about the NSA appeared in The Guardian on June 5. When Obama and Xi met in California two days later, the United States had lost all credibility on the cyber security issue. Instead of providing Obama with the perfect opportunity to confront China about its years of intellectual property theft from U.S. firms, the Sunnylands meeting forced Obama to resort to a defensive posture. Reflecting on how the tables had turned, the media reported that President Xi chose to stay off-site at a nearby Hyatt hotel out of fear of eavesdropping. After the Sunnylands summit, the
Chinese government turned to official media to launch a public campaign
against U.S. technology firms operating in China through its de-Cisco (qu
Sike hua) movement. By targeting Cisco, the U.S. networking company that had helped many
local Chinese governments develop and improve their IT infrastructures beginning in the mid-1990s, the Chinese
government struck at the very core of U.S.-China technological and economic
collaboration. The movement began with the publication of an issue of China Economic Weekly titled Hes
Watching You that singled out eight U.S. firms as guardian warriors who had infiltrated
the Chinese market: Apple, Cisco, Google, IBM, Intel, Microsoft, Oracle and
Qualcomm. Cisco, however, was designated as the most horrible of these warriors because of its pervasive reach into
Chinas financial and governmental sectors. For these U.S. technology firms, China is a vital source
of business that represents a fast-growing slice of the global technology market.
After the Chinese official media began disparaging the guardian
warriors in June, the sales of those companies have fallen precipitously.
With the release of its third quarter earnings in November, Cisco reported that orders from China fell 18
percent from the same period a year earlier and projected that overall revenue would fall 8 to 10 percent as a result, according
to Reuters. IBM reported that its revenue from the Chinese market fell 22 percent , which
resulted in a 4 percent drop in overall profit. Similarly, Microsoft has said that China had become its
weakest market. However, smaller U.S. technology firms working in China have not seen the same slowdown in business.

Juniper Networks, a networking rival to Cisco, and EMC Corp, a storage system maker, both saw increased business in the third

quarter. As the Chinese continue to shun the guardian warriors, they may turn to similar but smaller
U.S. firms until domestic Chinese firms are ready to assume their role. In the meantime, trying to completely
de-Cisco would be too costly for China , as Ciscos network infrastructure has become too deeply embedded
around the country. Chinese technology firms have greatly benefited in the aftermath of the
Snowden revelations. For example, the share price of China National Software has increased 250 percent since June. In
addition, the Chinese government continues to push for faster development of its
technology industry, in which it has invested since the early 1990s, by funding the development of
supercomputers and satellite navigation systems . Still, Chinas current investment in cyber security

cannot compare with that of the United States. The U.S. government spends $6.5 billion annually on cyber security, whereas China

spends $400 million, according to NetentSec CEO Yuan Shengang. But that will not be the case for long. The Chinese
governments investment in both cyber espionage and cyber security will continue
to increase, and that investment will overwhelmingly benefit Chinese technology
corporations. Chinas reliance on the eight American guardian warrior
corporations will diminish as its domestic firms develop commensurate
capabilities. Bolstering Chinas cyber capabilities may emerge as one of the goals of Chinas National Security Committee,
spends $400 million, according to NetentSec CEO Yuan Shengang. But that will not be the case for long.

which was formed after the Third Plenary Meeting of the 18th Party Congress in November. Modeled on the U.S. National Security

Council and led by President Xi Jinping, the committee was established to centralize coordination and quicken response time,
although it is not yet clear how much of its efforts will be focused domestically or internationally. The Third Plenum also brought
further reform and opening of Chinas economy, including encouraging more competition in the private sector. The Chinese
leadership continues to solicit foreign investment, as evidenced by in the newly established Shanghai Free Trade Zone. However,

there is no doubt that investments by foreign technology companies are


less welcome than investments from other sectors because of the
Snowden revelations.

The AFF reclaims US tech leadership from China


Castro and McQuinn 15
(Castro, Daniel and McQuinn, Alan. Information Technology and Innovation Foundation. The
Information Technology and Innovation Foundation (ITIF) is a Washington, D.C.-based think tank at the
cutting edge of designing innovation strategies and technology policies to create economic
opportunities and improve quality of life in the United States and around the world. Founded in 2006,
ITIF is a 501(c) 3 nonprofit, non-partisan organization that documents the beneficial role technology
plays in our lives and provides pragmatic ideas for improving technology-driven productivity, boosting
competitiveness, and meeting todays global challenges through innovation. Daniel Castro is the vice
president of the Information Technology and Innovation Foundation. His research interests include
health IT, data privacy, e-commerce, e-government, electronic voting, information security, and
accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability
Office (GAO) where he audited IT security and management controls at various government agencies.
He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security
Technology and Management from Carnegie Mellon University. Alan McQuinn is a research assistant
with the Information Technology and Innovation Foundation. Prior to joining ITIF, Mr. McQuinn was a
telecommunications fellow for Congresswoman Anna Eshoo and an intern for the Federal
Communications Commission in the Office of Legislative Affairs. He got his B.S. in Political
Communications and Public Relations from the University of Texas at Austin. Beyond the USA
Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness, ITIF. June 2015.
http://www2.itif.org/2015-beyond-usa-freedom-act.pdf//ghs-kw)

CONCLUSION: When historians write about this period in U.S. history, it could very well be that one of the


themes will be how the United States lost its global technology leadership to other
nations. And clearly one of the factors they would point to is the long-standing privileging of
U.S. national security interests over U.S. industrial and commercial interests when it comes to U.S.
foreign policy. This has occurred over the last few years as the U.S. government has done relatively little
to address the rising commercial challenge to U.S. technology companies, all the while putting intelligence
gathering first and foremost. Indeed, policy decisions by the U.S. intelligence
community have reverberated throughout the global economy. If the U.S. tech
industry is to remain the leader in the global marketplace, then the U.S.
government will need to set a new course that balances economic interests with national
security interests. The cost of inaction is not only short-term economic losses for U.S. companies, but a
wave of protectionist policies that will systematically weaken U.S. technology
competiveness in years to come, with impacts on economic growth, jobs, trade balance, and
national security through a weakened industrial base. Only by taking decisive steps to reform
its digital surveillance activities will the U.S. government enable its tech
industry to effectively compete in the global market.

Growth is slowing now; innovation and tech are key to sustain


CCP legitimacy
Ebner 14
(Julia Ebner. Julia Ebner received her MSc in International Relations and Affairs and her MSc in Political
Economy, Development Economics, and Natural Resources from Peking University. She was a
researcher at the European Institute of Asia Studies. "Entrepreneurs: Chinas Next Growth Engine?,"
Diplomat. 8-7-2014. http://thediplomat.com/2014/08/entrepreneurs-chinas-next-growth-engine///ghskw)

Should China want to remain an international economic superpower, it will need to substitute its current growth model, one largely based on abundant, cheap labor, with a different
comparative advantage that can lay the foundation for a new, more sustainable
growth strategy. Chinese policymakers are hoping now that an emerging entrepreneurship
may fit that bill, with start-ups and family-run enterprises potentially
becoming a major driver of sustainable growth and thus replacing the


countrys current economic model. In 2014, international conferences on private
entrepreneurship and innovation were organized all across China : The China Council for the
Promotion of International Trade organized its first annual Global Innovation Economic Congress, while

numerous innovation-related conferences were held at well-known Chinese


universities such as Tsinghua University, Jilin University and Wuhan University. New

Growth Model Needed. Although China still ranks among the fastest growing economies in the world, the country's growth rates have decreased notably over the past few years. From the 1990s until the 2008 financial crisis, China's GDP growth was consistently in the double digits with only a brief interruption following the Asian financial crisis of 1997. Despite a relatively quick recovery after the global financial crisis, declining export rates resulting from the economic distress of China's main trading partners have left their mark on the Chinese economy. Today's GDP growth of 7.8 percent is just half the level recorded immediately before the 2008 crisis, according to the latest data provided by the World Bank. This recent slowdown in China's economic growth has naturally been a source of concern for the government. A continuation of the country's phenomenal economic growth is needed to maintain both social stability and the Communist Party's legitimacy. Sustainable economic growth has thus been identified as one of China's key challenges for the coming decade. That challenge is

complicated by demographic trends, which are set to have a strongly negative impact on the Chinese economy
within the next decade. Researchers anticipate that as a consequence of the countrys one-child policy, introduced
in 1977, China will soon experience a sharp decline of its working-age population, leading to a substantial labor
force bottleneck. A labor shortage is likely to mean climbing wages, threatening Chinas cheap labor edge. The
challenge is well described in a recent article published by the International Monetary Fund. Replacing the Cheap

Labor Strategy. Entrepreneurship is widely recognized as an important engine for


economic growth: It contributes positively to economic development by fuelling job
markets through the creation of new employment opportunities, by stimulating
technological change through increased levels of innovation, and by enhancing the
market environment through an intensification of market competition.
Entrepreneurship and innovation have the potential to halt the contraction
in China economic growth and to replace the countrys unsustainable
comparative advantage of cheap labor over the long term. As former Chinese
President Hu Jintao stressed in 2006, if China can transform its current growth strategy
into one based on innovation and entrepreneurship, it could sustain its growth
rates and secure a key role in the international world order. Indeed, increasing
levels of entrepreneurship in the Chinese private sector are likely to lead to
technological innovation and productivity increases. This could prove particularly useful in
offsetting the workforce bottleneck created by demographic trends. Greater
innovation would also make China more competitive and less dependent on the knowledge and

technology of traditional Western trading partners such as the EU and the U.S.

Economic growth is key to prevent CCP collapse and lashout


Friedberg 10, Professor of Politics and International Affairs Princeton,
Asia Expert CFR (Aaron, Implications of the Financial Crisis for the US-China Rivalry, Survival, Volume 52, Issue 4, August, p. 31-54)

Despite its magnitude, Beijing's stimulus programme was insufficient to forestall a sizeable spike in


unemployment. The regime acknowledges that upwards of 20 million migrant workers lost their jobs
in the first year of the crisis, with many returning to their villages, and 7m recent college graduates are reportedly on the streets in search of work.9 Not
surprisingly, tough times have been accompanied by increased social turmoil. Even before the crisis hit,
the number of so-called 'mass incidents' (such as riots or strikes) reported each year in China had been rising. Perhaps
because it feared that the steep upward trend might be unnerving to foreign investors, Beijing stopped publishing aggregate, national statistics in 2005.10 Nevertheless, there
is ample, if fragmentary, evidence that things got worse as the economy slowed . In Beijing, for example,
salary cuts, layoffs, factory closures and the failure of business owners to pay back
wages resulted in an almost 100% increase in the number of labour disputes brought before

the courts.11 Since the early days of the current crisis, the regime has clearly been bracing itself for trouble. Thus, at the start of 2009, an official news-agency story candidly warned

Chinese readers that the country was, 'without a doubt entering a peak period of mass incidents'.12 In anticipation of an expected increase in unrest, the regime for the first time summoned all 3,080 county-level police chiefs to the capital to learn the latest riot-control tactics, and over 200 intermediate and lower-level judges were also called in for special training.13 At least for the moment, the Chinese Communist Party (CCP) appears to be weathering the storm. But if in the next several years the economy slumps again or simply fails to return to its previous pace, Beijing's troubles will mount. The regime probably has enough repressive capacity to cope with a good deal more turbulence than it has thus far encountered, but a protracted crisis could eventually pose a challenge to the solidarity of the party's leadership and thus to its continued grip on political power. Sinologist Minxin Pei points out that the greatest danger to CCP rule comes not from below but from above. Rising societal discontent 'might be sufficient to tempt some members of the elite to exploit the situation to their own political advantage' using 'populist appeals to weaken their rivals and, in the process, open[ing] up divisions within the party's seemingly unified upper ranks'.14 If this happens, all bets will be off and a very wide range of outcomes, from a democratic transition to a bloody civil war, will suddenly become plausible. Precisely because it is aware of this danger, the regime has been very careful to keep whatever differences exist over how to deal with the current crisis within bounds and out of view. If there are significant rifts they could become apparent in the run-up to the pending change in leadership scheduled for 2012.

Short of causing the regime to unravel, a sustained economic crisis could induce it to
abandon its current, cautious policy of avoiding conflict with other countries while patiently accumulating all the
elements of 'comprehensive national power'. If they believe that their backs are to the wall, China's leaders
might even be tempted to lash out, perhaps provoking a confrontation with a foreign
power in the hopes of rallying domestic support and deflecting public attention from their day-to-day troubles. Beijing
might also choose to implement a policy of 'military Keynesianism', further accelerating its already ambitious plans for
military construction in the hopes of pumping up aggregate demand and resuscitating a sagging domestic economy.15 In sum, despite its impressive initial
performance, Beijing is by no means on solid ground . The reverberations from the 2008-09 financial
crisis may yet shake the regime to its foundations, and could induce it to behave in unexpected, and perhaps unexpectedly aggressive, ways.

Chinese lashout goes nuclear


Epoch Times 4
(The Epoch Times, Renxing San, 8/4/2004, 8/4, http://english.epochtimes.com/news/5-84/30931.html//ghs-kw)

Since the Partys life is above all else, it would not be surprising if the CCP resorts to the
use of biological, chemical, and nuclear weapons in its attempt to extend its life.
The CCP, which disregards human life, would not hesitate to kill two hundred million Americans,
along with seven or eight hundred million Chinese, to achieve its ends. These speeches let the public
see the CCP for what it really is. With evil filling its every cell the CCP intends to wage a war against
humankind in its desperate attempt to cling to life. That is the main theme of the speeches. This
theme is murderous and utterly evil. In China we have seen beggars who coerced people to give them money by
threatening to stab themselves with knives or pierce their throats with long nails. But we have never, until now,
seen such a gangster who would use biological, chemical, and nuclear weapons to threaten the world, that all will
die together with him. This bloody confession has confirmed the CCP's nature: that of a monstrous murderer who
has killed 80 million Chinese people and who now plans to hold one billion people hostage and gamble with their
lives.

2NC O/V
Disad outweighs and turns the AFF: NSA backdoors are causing foreign customers to switch to Chinese tech now, but the plan reverses that by closing backdoors and reclaiming US tech leadership. That kills Chinese growth and results in a loss of CCP legitimacy, which causes CCP lashout and extinction:
<insert o/w and t/ args>

2NC UQ
Extend uniqueness: the perception of NSA backdoors incentivizes the Chinese government and foreign customers to shift to Chinese tech, which boosts Chinese tech. US companies' foreign sales have been falling fast; that's Li and McElveen.
NSA spying boosts Chinese tech firms
Kan 13
(Kan, Michael. Michael Kan covers IT, telecommunications, and the Internet in China for the IDG News
Service. "NSA spying scandal accelerating China's push to favor local tech vendors," PCWorld. 12-32013. http://www.pcworld.com/article/2068900/nsa-spying-scandal-accelerating-chinas-push-to-favorlocal-tech-vendors.html//ghs-kw)

While China's demand for electronics continues to soar, the tech services market may be shrinking for U.S. enterprise vendors. Security concerns over U.S. secret surveillance are giving the Chinese government and local companies more reason to trust domestic vendors, according to industry experts. The country has always tried to support its homegrown tech industry, but lately it is increasingly favoring local brands over foreign competition. Starting this year, the nation's government tenders have required IT suppliers to source more products from local Chinese firms, said an executive at a U.S.-based storage supplier that sells to China. In some cases, the tenders have required 50 percent or more of the equipment to come from domestic brands, said the executive, who requested anonymity. Recent leaks by former U.S. National Security Agency contractor, Edward Snowden, about the U.S.'s secret spying program aren't helping the matter. I think in general China wants to favor local brands; they feel their technology is getting better, the executive said. Snowden has just caused this to accelerate incrementally. Last month, other U.S. enterprise vendors including Cisco and Qualcomm said the U.S. spying scandal has put strains on their China business. Cisco reported its revenue from the country fell 18 percent year-over-year in the last fiscal quarter. The Chinese government has yet to release an official document telling companies to stay away from U.S. vendors, said the manager of a large data center, who has knowledge of such developments. But state-owned telecom operators have already stopped orders for certain U.S. equipment to power their networks, he added. Instead, the operators are relying on Chinese vendors such as Huawei Technologies, to supply their telecommunications equipment. It will be hard for certain networking equipment made in the U.S. to enter the Chinese market, the manager said. It's hard for them (U.S. vendors) to get approval, to get certification from the related government departments. Other companies, especially banks, are concerned that buying enterprise gear from U.S. vendors may lead to scrutiny from the central government, said Bryan Wang, an analyst with Forrester Research. The NSA issue has been having an impact, but it hasn't been black and white, he added. In the future, China could create new regulations on where certain state industries should source their technology from, a possibility some CIOs are considering when making IT purchases, Wang said. The obstacles facing U.S. enterprise vendors come at a time when China's own homegrown companies are expanding in the enterprise market. Huawei Technologies, a major vendor for networking equipment, this August came out with a new networking switch that will put the company in closer competition with Cisco. Lenovo and ZTE are also targeting the enterprise market with products targeted at government, and closing the technology gap with their foreign rivals, Wang said. Overall in the longer-term, the environment is positive for local vendors. We definitely see them taking market share from multinational firms in China, he added. Chinese vendors are also expanding outside the country and targeting the U.S. market. But last year Huawei and ZTE saw a push back from U.S. lawmakers concerned with the two companies' alleged ties to the Chinese government. A Congressional panel eventually advised that U.S. firms buy networking gear from other vendors, calling Huawei and ZTE a security threat.

Europe is shifting to China now


Ranger 15
(Steve Ranger. "Rise of China tech, internet surveillance
revelations form background to CeBIT show," ZDNet. 3-172015. http://www.zdnet.com/article/rise-of-china-tech-internetsurveillance-revelations-form-background-to-cebit-show///ghskw)
CeBIT technology
show in Hannover reflects a gradual but important shift taking place in the European
technology world. Whereas in previous years US companies would have taken centre stage, this year
the emphasis is on China, both as a creator of technology and as a huge
potential market. "German business values China, not just as our most
important trade partner outside of Europe, but also as a partner in
developing sophisticated technologies," said Angela Merkel as she opened the
As well as showcasing new devices, from tablets to robotic sculptors and drones, this year's

show. "Especially in the digital economy, German and Chinese companies have core strengths ... and that's why
cooperation is a natural choice," she said. Chinese vice premier Ma Kai also attended the show, which featured a
keynote from Alibaba founder Jack Ma. China is CeBIT's 'partner country' this year, with over 600 Chinese

The UK is
also keen on further developing a historically close relationship: the China-Britain
Business Council is in Hannover to help UK firms set up meetings with Chinese
companies, and to provide support and advice to UK companies interested in doing
business in China. "China is mounting the biggest CeBIT partner country showcase ever. Attendees will
companies - including Huawei, Xiaomi, ZTE, and Neusoft - presenting their innovations at the show.

clearly see that Chinese companies are up there with the biggest and best of the global IT industry," said a

this activity is a result of the increasingly sophisticated output


of Chinese tech companies who are looking for new markets for their products.
Firms that have found it hard to make headway in the US, such as Huawei, have
been focusing their efforts on Europe instead. European tech companies are equally
keen to access the rapidly growing Chinese market. Revelations about mass
interception of communications by the US National Security Agency (including allegations
that spies had even tapped Angela Merkel's phone) have not helped US-European relations, either. So it's
spokesman for CeBIT. Some of

perhaps significant that an interview with NSA contractor-turned-whistleblower Edward Snowden is closing the
Hannover show.

2NC UQ: US Failing Now


US tech falling behind other countries
Kevin Ashton 06/2015 [the co-founder and former executive director of
the MIT Auto-ID Center, coined the term Internet of Things. His book How
to Fly a Horse: The Secret History of Creation, Invention, and Discovery was
published by Doubleday earlier this year] "America last?," The Agenda,
http://www.politico.com/agenda/story/2015/06/kevin-ashton-internet-of-things-in-the-us-000102
And, while they were not mentioning it, some key indicators began swinging away from the U.S. In 2005, China's high-tech exports exceeded America's for the first time. In 2009, just after Wen Jiabao spoke about the Internet of Things, Germany's high-tech exports exceeded America's as well. Today, Germany produces five times more high tech per capita than the United States. Singapore and Korea's high-tech exporters are also far more productive than America's and, according to the most recent data, are close to pushing the U.S. down to fifth place in the world's high-tech economy. And, as the most recent data are for 2013, that may have happened already. This decline will surprise many Americans, including many American policymakers and pundits, who assume U.S. leadership simply transfers from one tech revolution to the next. After all, that next revolution, the Internet of Things, was born in America, so perhaps it seems natural that America will lead. Many U.S. commentators spin a myth that America is No. 1 in high tech, then extend it to claims that Europe is lagging because of excessive government regulation, and hints that Asians are not innovators and entrepreneurs, but mere imitators with cheap labor. This is jingoistic nonsense that could not be more wrong. Not only does Germany, a leader of the European Union, lead the U.S. in high tech, but EU member states fund CERN, the European Organization for Nuclear Research, which invented the World Wide Web and built the Large Hadron Collider, likely to be a source of several centuries of high-tech innovation. (U.S. government intervention killed America's equivalent particle physics program, the Superconducting Super Collider, in 1993, an early symptom of declining federal investment in basic research.) Asia, the alleged imitator, is anything but. Apple's iPhone, for example, so often held up as the epitome of American innovation, looked a lot like a Korean phone, the LG KE850, which was revealed and released before Apple's product. Most of the technology in the iPhone was invented in, and is exported by, Asian countries.

2NC Link
Extend the link: the AFF stops the creation of backdoors and restores the perception that US tech is safe, which means the US regains customers and tech leadership from China; that's Castro and McQuinn.
If the US loses its tech dominance, Chinese and Indian
innovation will quickly replace it
Fannin 13 (Rebecca Fannin, Forbes magazine contributor, 7-12-2013, "China Still Likely To Take Over Tech Leadership If And When Silicon Valley Slips," Forbes, http://www.forbes.com/sites/rebeccafannin/2013/07/12/china-still-likely-to-take-over-tech-leadership-if-and-when-silicon-valley-slips)

Will Silicon Valley continue to maintain its market-leading position for technology innovation? It's a question that's often pondered and debated, especially in the Valley, which has the most to lose if the emerging markets of China or India take over leadership. KPMG took a look at this question and other trends in its annual Technology Innovation Survey, and found that the center of gravity may not be shifting quite so fast to the East as once predicted. The KPMG survey of 811 technology executives globally found that one-third believe the Valley will likely lose its tech trophy to an overseas market within just four years. That percentage might seem high, but it compares with nearly half (44 percent) in last year's survey. It's a notable improvement for the Valley, as the U.S. economy and tech sector pick up. Which country will lead in disruptive breakthroughs? Here, the U.S. again solidifies its long-standing reputation as the world's tech giant while China has slipped in stature from a year ago, according to the survey. In last year's poll, the U.S. and China were tied for the top spot. But today, some 37 percent predict that the U.S. shows the most promise for tech disruptions, little surprise considering Google's strong showing in the survey as top company innovator in the world with its Google glass and driver-less cars. Meanwhile, about one-quarter pick China, which is progressing from a reputation for just copying to also innovating or micro-innovating. India, with a heritage of leadership in outsourcing, a large talent pool of engineers, ample mentoring from networking groups such as TiE, and a vibrant mobile communications market, ranked right behind the U.S. and China two years in a row. Even though China's rank slid in this year's tech innovation survey, its Silicon Dragon tech economy is still regarded as the leading challenger and most likely to replace the Valley, fueled by the market's huge, fast-growing and towering brands such as Tencent, Baidu and Alibaba, and a growing footprint overseas. KPMG partner Egidio Zarrella notes that China is innovating at an impressive speed, driven by domestic consumption for local brands that are unique to the market. China will innovate for China's sake, he observes, adding that with improved research and development capabilities, China will bridge the gap in expanding globally. For another appraisal of China's tech innovation prowess, see Forbes post detailing how Mary Meeker's annual trends report singles out the market's merits, including the fact that China leads the world for the most Internet and mobile communications users and has a tech-savvy consumer class that embraces new technologies. Besides China, it's India that shines in the KPMG survey. India scores as the second-most likely country to topple the U.S. for tech leadership. And, significantly, this emerging tiger nation ranks first on an index that measures each country's confidence in its own tech innovation abilities. Based on ten factors, India rates highest on talent, mentoring, and customer adoption of new technologies. The U.S. came in third on the confidence index, while Israel's Silicon Wadi ranked second. Israel was deemed strong in disruptive technologies, talent and technology infrastructure. The U.S. was judged strongest in tech infrastructure, access to alliances and partnerships, talent, and technology breakthroughs, and weakest in educational system and government incentives. Those weaknesses for the U.S. are points that should be underscored in America's tech clusters and in the nation's capital as future tech leadership unfolds.

A second part of the comprehensive survey covering tech sectors pinpointed cloud computing and mobile communications as hardly a fad but here to stay at least for the next three years as the most disruptive technologies. Both were highlighted in the 2012 report as well. In a change from last year, however, big data and biometrics (face, voice and hand gestures that are digitally read) were identified as top sectors that will see big breakthroughs. It's a brave new tech world.

2NC Perception Link


The AFF restores trust in internet tech
Danielle Kehl et al 14, Senior Policy Analyst at New America's Open Technology Institute. Kevin Bankston is a Policy Director at OTI, Robyn Greene is a Policy Counsel at OTI, Robert Morgus is a Research Associate at OTI, "Surveillance Costs: The NSA's Impact on the Economy, Internet Freedom & Cybersecurity," July 2014, pg 40-1
The U.S. government should not require or request that new surveillance capabilities or security
vulnerabilities be built into communications technologies and services, even if these are intended only to facilitate

lawful surveillance. There is a great deal of evidence that backdoors fundamentally weaken the security of
hardware and software, regardless of whether only the NSA purportedly knows about said vulnerabilities, as some
of the documents suggest. A policy statement from the Internet Engineering Task Force in 2000 emphasized that
adding a requirement for wiretapping will make affected protocol designs considerably more complex. Experience
has shown that complexity almost inevitably jeopardizes the security of communications. 355 More recently, a May
2013 paper from the Center for Democracy and Technology on the risks of wiretap modifications to endpoints
concludes that deployment of an intercept capability in communications services, systems and applications
poses serious security risks. 356 The authors add that on balance mandating that endpoint software vendors build
intercept functionality into their products will be much more costly to personal, economic and governmental
security overall than the risks associated with not being able to wiretap all communications. 357 While NSA
programs such as SIGINT Enabling - much like proposals from domestic law enforcement agencies to update the Communications Assistance for Law Enforcement Act (CALEA) to require digital wiretapping capabilities in modern Internet-based communications services 358 - may aim to promote national security and law enforcement by
ensuring that federal agencies have the ability to intercept Internet communications, they do so at a huge cost to
online security overall. Because of the associated security risks, the U.S. government should not mandate or
request the creation of surveillance backdoors in products, whether through legislation, court order, or the
leveraging industry relationships to convince companies to voluntarily insert vulnerabilities. As Bellovin et al.
explain, complying with these types of requirements would also hinder innovation and impose a tax on software
development in addition to creating a whole new class of vulnerabilities in hardware and software that undermines
the overall security of the products. 359 An amendment offered to the NDAA for Fiscal Year 2015 (H.R. 4435) by
Representatives Zoe Lofgren (D-CA) and Rush Holt (D-NJ) would have prohibited inserting these kinds of
vulnerabilities outright. 360 The Lofgren-Holt proposal aimed to prevent the funding of any intelligence agency,
intelligence program, or intelligence related activity that mandates or requests that a device manufacturer,
software developer, or standards organization build in a backdoor to circumvent the encryption or privacy
protections of its products, unless there is statutory authority to make such a mandate or request. 361 Although
that measure was not adopted as part of the NDAA, a similar amendment sponsored by Lofgren along with
Representatives Jim Sensenbrenner (R-WI) and Thomas Massie (R-KY), did make it into the House-approved version
of the NDAA - with the support of Internet companies and privacy organizations 362 - passing on an
overwhelming vote of 293 to 123. 363 Like Representative Grayson's amendment on NSA's consultations with NIST
around encryption, it remains to be seen whether this amendment will end up in the final appropriations bill that
the President signs. Nonetheless, these legislative efforts are a heartening sign and are consistent with
recommendations from the President's Review Group that the U.S. government should not attempt to deliberately
weaken the security of commercial encryption products. Such mandated vulnerabilities, whether required under
statute or by court order or inserted simply by request, unduly threaten innovation in secure Internet technologies while introducing security flaws that may be exploited by a variety of bad actors. A clear policy against such vulnerability mandates is necessary to restore international trust in U.S. companies and technologies.

Policies such as the Secure Data Act are perceived as


strengthening security
Castro and McQuinn 15
(Castro, Daniel and McQuinn, Alan. Information Technology and Innovation Foundation. The
Information Technology and Innovation Foundation (ITIF) is a Washington, D.C.-based think tank at the
cutting edge of designing innovation strategies and technology policies to create economic
opportunities and improve quality of life in the United States and around the world. Founded in 2006,
ITIF is a 501(c) 3 nonprofit, non-partisan organization that documents the beneficial role technology
plays in our lives and provides pragmatic ideas for improving technology-driven productivity, boosting
competitiveness, and meeting today's global challenges through innovation. Daniel Castro is the vice
president of the Information Technology and Innovation Foundation. His research interests include
health IT, data privacy, e-commerce, e-government, electronic voting, information security, and
accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability
Office (GAO) where he audited IT security and management controls at various government agencies.

He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security
Technology and Management from Carnegie Mellon University. Alan McQuinn is a research assistant
with the Information Technology and Innovation Foundation. Prior to joining ITIF, Mr. McQuinn was a
telecommunications fellow for Congresswoman Anna Eshoo and an intern for the Federal
Communications Commission in the Office of Legislative Affairs. He got his B.S. in Political
Communications and Public Relations from the University of Texas at Austin. Beyond the USA
Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness, ITIF. June 2015.
http://www2.itif.org/2015-beyond-usa-freedom-act.pdf//ghs-kw)

Second, the U.S. government should draw a clear line in the sand and declare that the policy of the U.S. government is to strengthen not weaken information security. The U.S. Congress should pass legislation, such as the Secure Data Act introduced by Sen. Wyden (D-OR), banning any government efforts to introduce backdoors in software or weaken encryption.43 In the short term, President Obama, or his successor, should sign an executive order formalizing this policy as well. In addition, when U.S. government agencies discover vulnerabilities in software or hardware products, they should responsibly notify these companies in a timely manner so that the companies can fix these flaws. The best way to protect U.S. citizens from digital threats is to promote strong cybersecurity practices in the private sector.

2NC Chinese Markets Link


Domestic markets are key to Chinese tech - the plan steals Chinese market share
Lohr 12/2
(Steve Lohr. "In 2015, Technology Shifts Accelerate and China
Rules, IDC Predicts," NYT. 12-2-2014.
http://bits.blogs.nytimes.com/2014/12/02/in-2015-technology-shifts-accelerate-and-china-rules-idc-predicts///ghs-kw)
Beyond the detail, a couple of larger themes stand out. First is China. Most of the reporting and commentary recently on the Chinese economy has been about its slowing growth and challenges. In information technology, it's just the opposite, Frank Gens, IDC's chief analyst, said in an interview. China has a roaring domestic market in technology. In 2015, IDC estimates that nearly 500 million smartphones will be sold in China, three times the number sold in the United States and about one third of global sales. Roughly 85 percent of the smartphones sold in China will be made by its domestic producers like Lenovo, Xiaomi, Huawei, ZTE and Coolpad. The rising prowess of China's homegrown smartphone makers will make it tougher on outsiders, as Samsung's slowing growth and profits recently reflect. More than 680 million people in China will be online next year, or 2.5 times the number in the United States. And the China numbers are poised to grow further, helped by its national initiative, the Broadband China Project, intended to give 95 percent of the country's urban population access to high-speed broadband networks. In all, China's spending on information and communications technology will be more than $465 billion in 2015, a growth rate of 11 percent. The expansion of the China tech market will account for 43 percent of tech-sector growth worldwide.

The Chinese market is key to Chinese tech growth


Mozur 1/28
(Paul Mozur. Reporter for the NYT. "New Rules in China Upset Western Tech Companies," New York
Times. 1-28-2015. http://www.nytimes.com/2015/01/29/technology/in-china-new-cybersecurity-rules-perturb-western-tech-companies.html//ghs-kw)

Mr. Yao said 90 percent of high-end servers and mainframes in China were still produced by multinationals. Still, Chinese companies are catching up at the lower end. For all enterprise hardware, local brands represented 21.3 percent revenue share in 2010 in P.R.C. market and we expect in 2014 that number will reach 43.1 percent, he said, using the abbreviation for the People's Republic of China. That's a huge jump.

Chinese tech is key to the global industry


Lohr 12/2
(Steve Lohr. "In 2015, Technology Shifts Accelerate and China
Rules, IDC Predicts," NYT. 12-2-2014.
http://bits.blogs.nytimes.com/2014/12/02/in-2015-technology-shifts-accelerate-and-china-rules-idc-predicts///ghs-kw)

Beyond the detail, a couple of larger themes stand out. First is China. Most of the reporting and commentary recently on the Chinese economy has been about its slowing growth and challenges. In information technology, it's just the opposite, Frank Gens, IDC's chief analyst, said in an interview. China has a roaring domestic market in technology. In 2015, IDC estimates that nearly 500 million smartphones will be sold in China, three times the number sold in the United States and about one third of global sales. Roughly 85 percent of the smartphones sold in China will be made by its domestic producers like Lenovo, Xiaomi, Huawei, ZTE and Coolpad. The rising prowess of China's homegrown smartphone makers will make it tougher on outsiders, as Samsung's slowing growth and profits recently reflect. More than 680 million people in China will be online next year, or 2.5 times the number in the United States. And the China numbers are poised to grow further, helped by its national initiative, the Broadband China Project, intended to give 95 percent of the country's urban population access to high-speed broadband networks. In all, China's spending on information and communications technology will be more than $465 billion in 2015, a growth rate of 11 percent. The expansion of the China tech market will account for 43 percent of tech-sector growth worldwide.

2NC Tech K2 China Growth


Tech is key to Chinese growth
Xinhua 7/24
(Xinhua. Major Chinese news agency. "Industrial profits decline while high-tech sector shines in China," WCT. 7-24-2015. http://www.wantchinatimes.com/news-subclass-cnt.aspx?id=20150328000036&cid=1102//ghs-kw)

Driven by the country's restructuring efforts amid the economic "new normal" of slow but quality growth, China's high-tech industry flourished with the value-added output of the high-tech sector growing 12.3% year-on-year in 2014. The high-tech industry accounted for 10.6% of the country's overall industrial value-added output in 2014, which rose 7% from 2013 to 22.8 trillion yuan (US$3.71 trillion). The fast expansion of the high-tech and modern service industries shows China's economy is advancing to the "middle and high end," said Xie Hongguang, deputy chief of the NBS. China should work toward greater investment in "soft infrastructure" - like innovation - instead of "hard infrastructure" to climb the global value chain, said Zhang Monan, an expert with the China Center for International Economic Exchanges. Indeed, boosting innovation has been put at the top of the government's agenda as China has pledged to boost the implementation of the "Made in China 2025" strategy, which will upgrade the manufacturing sector and help the country achieve a medium-high level of economic growth.

China transitioning to tech-based economy


van Wyk 10
(Barry van Wyk, The Beijing Axis, "Upstart: China's emergence in technology and innovation," first published May 27, 2010, last updated June 3, 2010)

Significant progress has already been achieved with the MLP, and it is not hard to identify signs
of China's rapidly improving innovative abilities. GERD increased to 1.54 per cent in 2008 from 0.57 per cent in 1995. Occurring at a time when its GDP was growing exceptionally fast, China's GERD now
ranks behind only the US and Japan. The number of triadic patents (granted in all three of the major

patent offices in the US, Japan and Europe) granted to China remains relatively small, reaching 433 in 2005
(compared to 652 for Sweden and 3,158 for Korea), yet Chinese patent applications are increasing rapidly.
Chinese patent applications to the World Intellectual Property Office (WIPO), for example, increased by 44 per
cent in 2005 and by a further 57 per cent in 2006. From a total of about 20,000 in 1998, China's output
of scientific papers has increased fourfold to about 112,000 as of 2008, moving China to second
place in the global rankings, behind only the US. In the period 2004 to 2008, China produced about
400,000 papers, with the major focus areas being material science, chemistry, physics, mathematics and
engineering, but new fields like biological and medical science also gaining prominence.

China transitioning now


Segal and Wilson 06
(Adam Segal, Ira A. Lipman Senior Fellow for Counterterrorism and National Security Studies, and Ernest J. Wilson III, "Trends in China's Transition toward a Knowledge Economy," Asian Survey, January/February 2006, http://www.cfr.org/publication/9924/trends_in_chinas_transition_toward_a_knowledge_economy.html)
During the past decade, China has arguably placed more importance on reforming and modernizing
its information and communication technology (ICT) sector than any other developing country in the
world. Under former Premier Zhu Rongji, the Chinese leadership was strongly committed to making ICT
central to its national goalsfrom transforming Chinese society at home to pursuing its ambitions as a world
economic and political power. In one of his final speeches, delivered at the first session of the 10th National
People's Congress in 2003, Zhu implored his successors to energetically promote information
technology (IT) applications and use IT to propel and accelerate industrialization so that the
Chinese Communist Party (CCP) can continue to build a well-off society.1

2NC Global Econ I/L


Chinese economic crash goes global - outweighs the US and disproves resiliency empirics
Pesek 14
(Writer for Bloomberg, an edited economic publication. "What to Fear If China Crashes," Bloomberg View, http://www.bloombergview.com/articles/2014-07-16/what-to-fear-if-china-crashes)

Few moments in modern financial history were scarier than the week of Sept. 15, 2008, when first Lehman Brothers and then American International Group collapsed. Who could forget the cratering stock markets, panicky bailout negotiations, rampant foreclosures, depressing job losses and decimated retirement accounts -- not to mention the discouraging recovery since then? Yet a Chinese crash might make 2008 look like a garden party. As the risks of one increase, it's worth exploring how it might look. After all, China is now the world's biggest trading nation, the second-biggest economy and holder of some $4 trillion of foreign-currency reserves. If China does experience a true credit crisis, it would be felt around the world. "The example of how the global financial crisis began in one poorly-understood financial market and spread dramatically from there illustrates the capacity for misjudging contagion risk," Adam Slater wrote in a July 14 Oxford Economics report. Lehman and AIG, remember, were just two financial firms out of dozens. Opaque dealings and off-balance-sheet investment vehicles made it virtually impossible even for the managers of those companies to understand their vulnerabilities -- and those of the broader financial system. The term "shadow banking system" soon became shorthand for potential instability and contagion risk in world markets. Well, China is that and more. China surpassed Japan in 2011 in gross domestic product and it's gaining on the U.S. Some World Bank researchers even think China is already on the verge of becoming No. 1 (I'm skeptical). China's world-trade weighting has doubled in the last decade. But the real explosion has been in the financial sector. Since 2008, Chinese stock valuations surged from $1.8 trillion to $3.8 trillion and bank-balance sheets and the money supply jumped accordingly. China's broad measure of money has surged by an incredible $12.5 trillion since 2008 to roughly match the U.S.'s monetary stock. This enormous money buildup fed untold amounts of private-sector debt along with public-sector institutions. Its scale, speed and opacity are fueling genuine concerns about a bad-loan meltdown in an economy that's 2 1/2 times bigger than Germany's. If that happens, at a minimum it would torch China's property markets and could take down systemically important parts of Hong Kong's banking system. The reverberations probably wouldn't stop there, however, and would hit resource-dependent Australia, batter trade-driven economies Japan, Singapore, South Korea and Taiwan and whack prices of everything from oil and steel to gold and corn. "China's importance for the world economy and the rapid growth of its financial system, mean that there are widespread concerns that a financial crisis in China would also turn into a global crisis," says London-based Slater. "A bad asset problem on this scale would dwarf that seen in the major emerging financial crises seen in Russia and Argentina in 1998 and 2001, and also be more severe than the Japanese bad loan problem of the 1990s." Such risks belie President Xi Jinping's insistence that China's financial reform process is a domestic affair, subject neither to input nor scrutiny by the rest of the world. That's not the case. Just like the Chinese pollution that darkens Asian skies and contributes to climate change, China's financial vulnerability is a global problem. U.S. President Barack Obama made that clear enough in a May interview with National Public Radio. "We welcome China's peaceful rise," he said. In many ways, it would be a bigger national security problem for us if China started falling apart at the seams. China's ascent obviously preoccupies the White House as it thwarts U.S. foreign-policy objectives, taunts Japan and other nations with territorial claims in the Pacific and casts aspersions on America's moral leadership. But China's frailty has to be on the minds of U.S. policy makers, too. The potential for things careening out of control in China are real. What worries bears such as Patrick Chovanec of Silvercrest Asset Management in New York, is China's unaltered obsession with building the equivalent of new Manhattans almost overnight even as the nation's financial system shows signs of buckling. As policy makers in Beijing generate even more credit to keep bubbles from bursting, the shadow banking system continues to grow. The longer China delays its reckoning, the worse it might be for China -- and perhaps the rest of us.

CCP collapse causes the second Great Depression


Bhandari 10
(Maya Bhandari, Head of Emerging Markets Analysis, Lombard Street Research. "If the Chinese Bubble Bursts," The International Economy, http://www.international-economy.com/TIE_F10_ChinaBubbleSymp.pdf)
The latest financial crisis proved the central role of China in driving global economic outcomes. China is the chief overseas surplus country corresponding to the U.S. deficit, and it was excess ex ante Chinese savings which prompted ex post U.S. dis-saving. The massive ensuing build-up of debt triggered a Great Recession almost as bad as the Great Depression. This causal direction, from excess saving to excess spending, is confirmed by low global real interest rates through much of the Goldilocks period. Had over-borrowing been the cause rather than effect, then real interest rates would have been bid up to attract the required capital. A prospective hard landing in China might thus be expected to have serious global implications. The Chinese economy did slow sharply over the last eighteen months, but only briefly, as large-scale behind-the-scenes stimulus meant that it quickly returned to overheating. Given its 9-10 percent "trend" growth rate, and 30 per cent import ratio, China is nearly twice as powerful a global growth locomotive as the United States, based on its implied import gain. So while the surrounding export hubs, whose growth prospects are a "second derivative" of what transpires in China, would suffer most directly from Chinese slowing, the knock to global growth would be significant. Voracious Chinese demand has also been a crucial driver of global commodity prices, particularly metals and oil, so they too may face a hard landing if Chinese demand dries up.

CCP collapse deals a massive deflationary shock to the world.


Zhao 10
(Chen Zhao, Chief Global Strategist and Managing Editor for Global Investment Strategy, BCA Research Group. "If the Chinese Bubble Bursts," The International Economy, http://www.international-economy.com/TIE_F10_ChinaBubbleSymp.pdf)
At the onset, I believe the odds of a China asset bubble bursting are very low. It is difficult to argue that Chinese asset markets, particularly real estate, are indeed already in a "bubble." Property prices in tier two and tier three cities are actually quite cheap, but for purposes of discussion, there is always the danger that asset values could get massively inflated over the next few years. If so, a crash would be inevitable. In fact, China experienced a devastating real estate meltdown and "growth recession" in 1993-94, when then-premier Zhu Rongji initiated a credit crackdown to rein in spreading inflation and real estate speculation. Property prices in major cities dropped by over 40 percent and private sector GDP growth dropped to 3 percent from double-digit levels. Non-performing loans soared to 30 percent of total banking sector assets. It took more than seven years for the government to clean up the financial mess and recapitalize the banking system. If another episode of a bursting asset bubble were to happen in China, the damage to the banking sector could be rather severe. History has repeatedly shown that credit inflation begets asset bubbles and, almost by definition, a bursting asset bubble always leads to a banking crisis and severe credit contraction. In China's case, bank credit is the lifeline for large state-owned companies, and a credit crunch could choke off growth of these enterprises quickly. The big difference between today's situation and the early 1990s, however, is that the Chinese authorities have accumulated vast reserves. China also runs a huge current account surplus. In the early 1990s, China's reserves had dwindled to almost nothing and the current account was in massive deficit, as a real estate meltdown led to a collapse in the Chinese currency in 1992-93. In other words, Beijing today has a lot of resources at its disposal to stimulate the economy or to recapitalize the banking system, whenever necessary. Therefore, the impact of a bursting bubble on growth could be very sharp and even severe, but it would be short-lived because of support from public sector spending. A bursting China bubble would also be felt acutely in commodity prices. The commodity story has been built around the China story. Naturally, a bursting China bubble would deal a devastating blow to the commodities as well as commodity producers such as Latin America, Australia, and Canada, among others. Asia as a whole, and Japan in particular, would also be acutely affected by a "growth recession" in China. The economic integration between China and the rest of Asia is well documented but it is important to note that there has been virtually no domestic spending in Japan in recent years and the country's economic growth has been leveraged almost entirely on exports to China. A bursting China bubble could seriously impair Japan's economic and asset market performance. Finally, a bursting China bubble would be a massive deflationary shock to the world economy. With China in growth recession, global saving excesses could surge and world aggregate demand would be vastly deficient. Bond yields could move to new lows and stocks would drop, probably precipitously - in short, investors would face very bleak and frightening prospects.

2NC US Econ I/L


Chinese growth turns the case --- strong Chinese technological
power forms linkages with US companies --- drives growth of
US companies
NRC 10 National Research Council The Dragon and the Elephant: Understanding the Development of
Innovation Capacity in China and India: Summary of a Conference www.nap.edu/openbook.php?
record_id=12873&page=13

Wadhwa found in his surveys that companies go offshore for reasons of cost and where the
markets are. Meanwhile, Asian immigrants are driving enterprise growth in the United States. Twenty-five
percent of technology and engineering firms launched in the last decade and 52% of Silicon Valley startups
had immigrant founders. Indian immigrants accounted for one-quarter of these. Among America's new immigrant entrepreneurs, more than 74 percent have a master's or a PhD degree. Yet the backlog of U.S.
immigration applications puts this stream of talent in limbo. One million skilled immigrants are
waiting for the annual quota of 120,000 visas, with caps of 8,400 per country. This is causing a reverse

brain drain from the United States back to countries of origin, the majority to India and China.
This endangers U.S. innovation and economic growth. There is a high likelihood, however, that
returning skilled talent will create new linkages to U.S. companies , as they are
doing within General Electric, IBM, and other companies. Jai Menon of IBM Corporation began his
survey of IBM's view of global talent recruitment by suggesting that IBM pursues growth of its operations
as a global entity. There are 372,000 IBMers in 172 countries; 123,000 of these are in the Asia-Pacific region.
Eighty percent of the firms R&D activity is still based in the United States. IBM supports open standards
development and networked business models to facilitate global collaboration. Three factors drive the firms
decisions on staff placement and location of recruitment -- economics, skills and environment. IBM India has
grown its staff tenfold in five years; its $6 billion investment in three years represents a tripling of resources in
people, infrastructure and capital. Increasingly, as Vivek Wadhwa suggested, people get degrees in the United
States and return to India for their first jobs. IBM follows a comparable approach in China, with
10,000+ IBM employees involved in R&D, services and sales. In 2006, for the first time the number
of service workers overtook the number of agricultural laborers worldwide. Thus the needs of a service
economy comprise an issue looming for world leaders.

CCP collapse hurts US economy


Karabell 13
(Zachary. American author, historian, money manager and economist. Karabell is President of River
Twice Research, where he analyzes economic and political trends. He is also a Senior Advisor for
Business for Social Responsibility. Previously, he was Executive Vice President, Head of Marketing and
Chief Economist at Fred Alger Management, a New York-based investment firm, and President of Fred
Alger and Company, as well as Portfolio Manager of the China-US Growth Fund, which won both a
Lipper Award for top performance and a 5-star designation from Morningstar, Inc.. He was also
Executive Vice President of Alger's Spectra Funds, a no-load family of mutual funds that launched the
$30 million Spectra Green Fund, which was based on the idea that profit and sustainability are linked.
At Alger, he oversaw the creation, launch and marketing of several funds, led corporate strategy for
acquisitions, and represented the firm at public forums and in the media. Educated at Columbia,
Oxford, and Harvard, where he received his Ph.D., he is the author of several books. "The U.S. can't afford a Chinese economic collapse." Reuters. http://blogs.reuters.com/edgy-optimist/2013/03/07/the-u-s-cant-afford-a-chinese-economic-collapse/)
Is China about to collapse? That question has been front and center in the past weeks as the country completes its
leadership transition and after the exposure of its various real estate bubbles during a widely watched 60 Minutes
expos this past weekend. Concerns about soaring property prices throughout China are hardly new, but they have
been given added weight by the government itself. Recognizing that a rapid implosion of the property market would
disrupt economic growth, the central government recently announced far-reaching measures designed to dent the
rampant speculation. Higher down payments, limiting the purchases of investment properties, and a capital gains
tax on real estate transactions designed to make flipping properties less lucrative were included. These measures,
in conjunction with the new government's announcing more modest growth targets of 7.5 percent a year, sent Chinese equities plunging and led to a slew of commentary in the United States saying China would be the next shoe to drop in the global system. Yet there is more here than simple alarm over the viability of China's economic
growth. There is the not-so-veiled undercurrent of rooting against China. It is difficult to find someone who explicitly
wants it to collapse, but the tone of much of the discourse suggests bloodlust. Given that China largely escaped the
crises that so afflicted the United States and the eurozone, the desire to see it stumble may be understandable. No

one really likes a global winner if that winner isn't you. The need to see China fail verges on jingoism. Americans
distrust the Chinese model, find that its business practices verge on the immoral and illegal, that its reporting and
accounting standards are sub-par at best and that its system is one of crony capitalism run by crony communists.
On Wall Street, the presumption usually seems to be that any Chinese company is a ponzi scheme masquerading as
a viable business. In various conversations and debates, I have rarely heard China's economic model mentioned without disdain. Take, as just one example, Gordon Chang in Forbes: Beijing's technocrats can postpone a reckoning, but they have not repealed the laws of economics. There will be a crash. The consequences of a Chinese collapse, however, would be severe for the United States and for the world. There could be no major Chinese contraction without a concomitant contraction in the United States. That would mean sharply curtailed Chinese purchases of U.S. Treasury bonds, far less revenue for companies like General Motors, Nike, KFC and Apple that have robust business in China (Apple made $6.83 billion in the fourth quarter of 2012, up from $4.08 billion a year prior), and far fewer Chinese imports of high-end goods from American and Asian companies. It would also mean a collapse of Chinese imports of materials such as copper, which would in turn harm economic growth in emerging countries that continue to be a prime market for American, Asian and European goods. China is now the world's second-largest economy, and property booms have been one aspect of its growth. Individual Chinese cannot invest outside of

the country, and the limited options of China's stock exchanges and almost nonexistent bond market mean that if
you are middle class and want to do more than keep your money in cash or low-yielding bank accounts, you buy
either luxury goods or apartments. That has meant a series of property bubbles over the past decade and a series
of measures by state and local officials to contain them. These recent measures are hardly the first, and they are
not likely to be the last. The past 10 years have seen wild swings in property prices, and as recently as 2011 the
government took steps to cool them; the number of transactions plummeted and prices slumped in hot markets like
Shanghai as much as 30, 40 and even 50 percent. You could go back year by year in the 2000s and see similar
bubbles forming and popping, as the government reacted to sharp run-ups with restrictions and then eased them
when the pendulum threatened to swing too far. China has had a series of property bubbles and a series of property
busts. It has also had massive urbanization that in time has absorbed the excess supply generated by massive
development. Today much of that supply is priced far above what workers flooding into China's cities can afford. But
that has always been true, and that housing has in time been purchased and used by Chinese families who are
moving up the income spectrum, much as U.S. suburbs evolved in the second half of the 20th century. More to the
point, all property bubbles are not created equal. The housing bubbles in the United States and Spain, for instance,
would never had been so disruptive without the massive amount of debt and the financial instruments and
derivatives based on them. A bursting housing bubble absent those would have been a hit to growth but not a
systemic crisis. In China, most buyers pay cash, and there is no derivative market around mortgages (at most
there's a small shadow market). Yes, there are all sorts of unofficial transactions with high-interest loans, but even
there, the consequences of busts are not the same as they were in the United States and Europe in recent years.
Two issues converge whenever China is discussed in the United States: fear of the next global crisis, and distrust
and dislike of the country. Concern is fine; we should always be attentive to possible risks. But China's property bubbles are an unlikely risk, because of the absence of derivatives and because the central government is clearly alert to the market's behavior. Suspicion and antipathy, however, are not constructive. They speak to the ongoing
difficulty China poses to Americans' sense of global economic dominance and to the belief in the superiority of free-market capitalism to China's state-managed capitalism. The U.S. system may prove to be more resilient over time; it has certainly proven successful to date. Its success does not require China's failure, nor will China's success invalidate the American model. For our own self-interest we should be rooting for their efforts, and not jingoistically wishing for them to fail.

2NC Impact UQ
Latest data show the Chinese economy is growing now - ignore stock market claims, which don't accurately reflect economic fundamentals
Miller and Charney 7/15
(Miller, Leland R. and Charney, Craig. Mr. Miller is president and Mr. Charney is research director of
China Beige Book International, a private economic survey. "China's Economy Is Recovering," Wall Street Journal, 7/15/2015. http://www.wsj.com/articles/chinas-economy-is-recovering-1436979092//ghs-kw)

China released second-quarter statistics Wednesday that showed the economy growing at 7%, the same real rate as the first quarter but with stronger nominal growth. That result, higher than expected and coming just after a stock-market panic, surprised some commentators and even aroused suspicion that the government cooked the numbers for political reasons. While official data is indeed unreliable, our firm's latest research confirms that the Chinese economy is improving after several disappointing quarters -- just not for the reasons given by Beijing. The China Beige Book (CBB), a private survey of more than 2,000 Chinese firms each quarter, frequently anticipates the official story. We documented the 2012 property rebound, the 2013 interbank credit crunch and the 2014 slowdown in capital expenditure before any of them showed up in official statistics. The modest but broad-based improvement in the Chinese economy that we tracked in the second quarter may seem at odds with the headlines of carnage in the country's financial markets. But stock prices in China have almost nothing to do with the economy's fundamentals. Our data show sales revenue, capital expenditure, new domestic orders, hiring, wages and profits were all better in the second quarter, making the improvement unmistakable -- albeit not outstanding in any one category. In the labor market, both employment and wage growth strengthened, and prospects for hiring look stable. This is not new: Our data have shown the labor market remarkably steady over the past year, despite the economy's overall deceleration. Inflation data are also a reason for optimism. Along with wages, input costs and sales prices grew faster in the second quarter. The rate is still slower than a year ago, but at least this is a break from the previously unstoppable tide of price deterioration. While it is just one quarter, our data suggest deflation may have peaked. With the explosive stock market run-up occupying all but the final weeks of the quarter, it might seem reasonable to conclude that this rally was the impetus behind the better results. Not so. Of all our indicators, capital expenditure should have responded most positively to a boom in equities prices, but the uptick was barely noticeable. The strength of the second-quarter performance is instead found in widespread expanding sales volumes, which firms were able to accomplish without sacrificing profit margins. The fact that stronger sales, rather than greater investment, was the driving force this quarter is itself an encouraging sign in light of China's longstanding problem of excess investment and inadequate consumption. These gains also track across sectors, highlighted by a welcome resurgence in both property and retail. Property saw its strongest results in 18 months, buoyed by stronger commercial and residential realty as well as transportation construction. Six of our eight regions were better than last quarter, led by the Southwest and North. The results were also an improvement over the second quarter of last year, if somewhat less so, with residential construction the sector's major remaining black eye. Retailers, meanwhile, reported a second consecutive quarter of improvement, both on-quarter and on-year, with growth accelerating. For the first time in 18 months, the retail sector also had faster growth than manufacturing, underscoring the danger of treating manufacturing as the bellwether for the economy.

China's economy is stabilizing now but it's fragile


AFP and Reuters 7/15

(Agence France-Presse and Reuters on Deutsche Welle. "China beats expectations on economic
growth," DW. 07-15-2015. http://www.dw.com/en/china-beats-expectations-on-economic-growth/a18584453//ghs-kw)
Slowing growth in key areas like foreign trade, state investment and domestic demand had prompted economists to predict a year-on-year GDP increase of just under 7 percent for the April-June quarter. The figure, released by the National Bureau of Statistics (NBS) on Wednesday, matched first-quarter growth in China exactly. The government has officially set 7 percent as its target for GDP growth this year. "We are aware that the domestic and external economic conditions are still complicated, the global economic recovery is slow and tortuous and the foundation for the stabilization of China's economy needs to be further consolidated," NBS spokesman Sheng Laiyun told reporters. However, "the major indicators of the second quarter showed that the growth was stabilized and ready to pick up, the economy developed with positive changes and the vitality of the economic development was strengthened," Sheng added. Industrial output, including production at factories, workshops and mines also rose by 6.8 percent in June compared to 6.1 percent in May, the NBS said. Tough transition, stock market fluctuating: The robust growth comes despite a difficult economic year for China. Figures released on Monday showed a dip in foreign trade in the first half of the year - with exports up slightly but imports well down. Public investment, for years the driver of double-digit percentage growth in China, is down as the government seeks to rely more on consumer demand - itself slow to pick up. In recent weeks, the Shanghai stock market has been falling sharply, albeit after a huge boom in months leading up to the crash.

Surveys prove China is experiencing growth now


Reuters 6/23
(Reuters. "Chinas Economy Appears to Be Stabilizing, Reports Show," International New York Times. 623-2015. http://www.nytimes.com/2015/06/24/business/international/chinas-economy-appears-to-bestabilizing-reports-show.html//ghs-kw)

SHANGHAI - China's factory activity showed signs of stabilizing in June, with two nongovernment surveys suggesting that the economy might be regaining some momentum, while many analysts expected further policy support to ensure a more sure-footed recovery. The preliminary purchasing managers index for China published by HSBC and compiled by Markit, a data analysis firm, edged up to 49.6 in June. It was the survey's highest level in three months but still below the 50 mark, which would have pointed to an expansion. The final reading for May was 49.2. The pickup in new orders, which returned to positive territory at 50.3 in June, was driven by a strong rise in the new export orders subcomponent, suggesting that foreign demand may finally be turning a corner, Capital Economics analysts wrote in a research note. Today's P.M.I. reading reinforces our

view that the economy has started to find its footing. But companies stepped up layoffs, the survey showed,
shedding jobs at the fastest pace in more than six years. Annabel Fiddes, an economist at Markit, said:
Manufacturers continued to cut staff. This suggests companies have relatively muted growth expectations. She said that she expected Beijing to step up their efforts to stimulate growth and job creation. A much rosier picture was painted by a separate survey, a quarterly report by China Beige Book International, a data analysis firm, describing a broad-based recovery in the second quarter, led primarily by China's interior provinces. Among major sectors, two developments stand out: a welcome resurgence in retail, which saw rising revenue growth despite a slip in prices, and a broad-based rebound in property, said the report's authors, Leland Miller and Craig Charney. Manufacturing, services, real estate, agriculture and mining all had year-on-year and quarterly gains, they said.

2NC US Heg I/L


Chinese growth is key to US hegemony

Yiwei 07 Wang Yiwei, Center for American Studies @ Fudan University, "China's Rise: An Unlikely Pillar of US Hegemony," Harvard International Review, Volume 29, Issue 1, Spring 2007, pp. 60-63.

Chinas rise is taking place in this context. That is to say, Chinese development is merely one facet of Asian
and developing states economic progress in general. Historically, the United States has provided the
dominant development paradigm for the world. But today, China has come up with development strategies
that are different from that of any other nation-state in history and are a consequence of the global migration
of industry along comparative advantage lines. Presently, the movement of light industry and consumer goods
production from advanced industrialized countries to China is nearly complete, but heavy industry is only
beginning to move. Developed countries dependence on China will be far more pronounced

following this movement. As global production migrates to China and other developing
countries, a feedback loop will emerge and indeed is already beginning to emerge. Where
globalization was once an engine fueled by Western muscle and steered by Western policy,
there is now more gas in the tank but there are also more hands on the steering wheel. In the
past, developing countries were often in a position only to respond to globalization, but now, developed
countries must respond as well. Previously the United States believed that globalization was synonymous with
Americanization, but todays world has witnessed a United States that is feeling the influence of the world as
well. In the past, a sneeze on Wall Street was followed by a downturn in world markets. But in February 2007,
Chinese stocks fell sharply and Wall Street responded with its steepest decline in several years. In this way,
the whirlpool of globalization is no longer spinning in one direction. Rather, it is generating feedback
mechanisms and is widening into an ellipse with two focal points: one located in the United States, the
historical leader of the developed world, and one in the China, the strongest country in the new developing
world power bloc. Combating Regionalization It is important to extend the discussion beyond platitudes
regarding US decline or the rise of China and the invective-laden debate over threats and security issues
that arises from these. We must step out of a narrowly national mindset and reconsider what Chinese
development means for the United States. One of the consequences of globalization has been that

countries such as China, which depend on exporting to US markets, have accumulated large
dollar reserves. This has been unavoidable for these countries, as they must purchase dollars in order to
keep the dollar strong and thus avoid massive losses. Thus, the United States is bound to bear a trade
deficit, and moreover, this deficit is inextricably tied to the dollars hegemony in todays
markets. The artificially high dollar and the US economy at large depend in a very real sense
on Chinas investment in the dollar. Low US inflation and interest rates similarly depend on
the thousands of Made in China labels distributed across the United States. As Paul Krugman
wrote in The New York Times, the situation is comparable to one in which the American sells the house but
the money to buy the house comes from China. Former US treasury secretary Lawrence Summers even
affirms that China and the United States may be in a kind of imprudent balance of financial terror. Today,
the US trade deficit with China is US$200 billion. China holds over US$1 trillion in foreign exchange reserves
and US$350 billion in US bonds. Together, the Chinese and US economies account for half of global economic
growth. Thus, a fantastic situation has arisen: Chinas rise is actually supporting US hegemony. Taking US

hegemony and Western preeminence as the starting point, many have concluded that the
rise of China presents a threat. The premise of this logic is that the international system predicated on
US hegemony and Western preeminence would be destabilized by the rise of a second major power. But this
view is inconsistent with the phenomenon of one-way globalization. The so-called process of
one-way globalization can more truly be called Westernization. Todays globalization is still in
large part driven by the West, inasmuch as it is tinged by Western unilateralism and entails the
dissemination of essentially Western standards and ideology. For example, Coca Cola has become a
Chinese cultural icon, Louis Vuitton stores crowd high-end shopping districts in Shanghai, and, as gender
equality progresses, Chinese women look to Western women for inspiration. In contrast, Haier,
the best-known Chinese brand in the United States, is still relatively unknown, and Wang Fei, who is widely
regarded in China as the pop star who was able to make it in the United States, has less name-recognition
there than a first-round American Idol cut.

2NC Growth Impacts


Chinese growth prevents global economic collapse, war over
Taiwan and CCP collapse
Lewis 08 [Dan, Research Director Economic Research Council, The
Nightmare of a Chinese Economic Collapse, World Finance, 5/13,
http://www.worldfinance.com/news/home/finalbell/article117.html]
In 2001, Gordon Chang authored a global bestseller "The Coming Collapse of China." To suggest that the worlds
largest nation of 1.3 billion people is on the brink of collapse is understandably for many, a deeply unnerving
theme. And many seasoned China Hands rejected Changs thesis outright. In a very real sense, they were of
course right. Chinas expansion has continued over the last six years without a hitch . After
notching up a staggering 10.7 percent growth last year, it is now the 4th largest economy in the world with a
nominal GDP of $2.68trn. Yet there are two Chinas that concern us here; the 800 million who live in the cities,
coastal and southern regions and the 500 million who live in the countryside and are mainly engaged in agriculture.
The latter which we in the West hear very little about are still very poor and much less happy. Their poverty and
misery do not necessarily spell an impending cataclysm after all, that is how they have always have been. But it
does illustrate the inequity of Chinese monetary policy. For many years, the Chinese yen has been held at an
artificially low value to boost manufacturing exports. This has clearly worked for one side of the economy, but not
for the purchasing power of consumers and the rural poor, some of who are getting even poorer. The central reason
for this has been the inability of Chinese monetary policy to adequately support both Chinas. Meanwhile, rural

unrest in China is on the rise fuelled not only by an accelerating income gap with the
coastal cities, but by an oft-reported appropriation of their land for little or no compensation
by the state. According to Professor David B. Smith, one of the Citys most accurate and respected economists in
recent years, potentially far more serious though is the impact that Chinese monetary policy could have on many
Western nations such as the UK. Quite simply, Chinas undervalued currency has enabled Western governments to
maintain artificially strong currencies, reduce inflation and keep interest rates lower than they might otherwise be.
We should therefore be very worried about how vulnerable Western economic growth is to an upward revaluation of
the Chinese yuan. Should that revaluation happen to appease Chinas rural poor, at a stroke, the dollar, sterling and
the euro would quickly depreciate, rates in those currencies would have to rise substantially and the yield on
government bonds would follow suit. This would add greatly to the debt servicing cost of budget deficits in the USA,
the UK and much of euro land. A reduction in demand for imported Chinese goods would quickly entail a decline in
Chinas economic growth rate. That is alarming. It has been calculated that to keep Chinas society

stable - ie to manage the transition from a rural to an urban society without devastating unemployment - the minimum growth rate is 7.2 percent. Anything less than that and unemployment will rise and the massive shift in population from the country to the cities becomes unsustainable. This is when real discontent with communist party rule becomes vocal and hard to ignore. It doesn't end there. That will at best bring a global recession. The crucial point is that communist authoritarian states have at least had some success in keeping a lid on ethnic tensions so far. But when multi-ethnic communist countries fall apart from economic stress and the implosion of central power, history suggests that they don't become successful democracies overnight. Far from it. There's a very real chance that China might go the way of Yugoslavia or the Soviet Union - chaos, civil unrest and internecine war. In the very worst case scenario, a Chinese government might seek to maintain national cohesion by going to war with Taiwan - whom America is pledged to defend.

Chinese economic growth prevents global nuclear war


Kaminski 7 (Antoni Z., Professor Institute of Political Studies, World
Order: The Mechanics of Threats (Central European Perspective), Polish
Quarterly of International Affairs, 1, p. 58)
As already argued, the economic advance of China has taken place with relatively few corresponding changes in the political system, although the operation of political and economic institutions has seen some major changes. Still, tools are missing that would allow the establishment of political and legal foundations for the modern economy, or they are too weak. The tools are efficient public administration, the rule of law, clearly defined ownership rights, efficient banking system, etc. For these reasons, many experts fear an economic crisis in China. Considering the importance of the state for the development of the global economy, the crisis would have serious global repercussions. Its political ramifications could be no less dramatic owing to the special position the military occupies in the Chinese political system, and the existence of many potential vexed issues in East Asia (disputes over islands in the China Sea and the Pacific). A potential hotbed of conflict is also Taiwan's status. Economic recession and the related destabilization of internal policies could lead to a political, or even military crisis. The likelihood of the global escalation of the conflict is high, as the interests of Russia, China, Japan, Australia and, first and foremost, the US clash in the region.

China's economic rise is good --- they're on the brink of


collapse --- causes CCP instability and lashout --- also tubes the
global economy, US primacy, and Sino relations
Mead 9 Walter Russell Mead, Henry A. Kissinger Senior Fellow in U.S.
Foreign Policy at the Council on Foreign Relations, Only Makes You
Stronger, The New Republic, 2/4/9, http://www.tnr.com/story_print.html?
id=571cbbb9-2887-4d81-8542-92e83915f5f8
The greatest danger both to U.S.-China relations and to American power itself is
probably not that China will rise too far, too fast; it is that the current crisis might end
China's growth miracle. In the worst-case scenario, the turmoil in the international economy will plunge
China into a major economic downturn. The Chinese financial system will implode
as loans to both state and private enterprises go bad. Millions or even tens of millions of Chinese will be
unemployed in a country without an effective social safety net. The collapse of asset
bubbles in the stock and property markets will wipe out the savings of a generation of the Chinese
middle class. The political consequences could include dangerous unrest--and a
bitter climate of anti-foreign feeling that blames others for China's woes. (Think of

Weimar Germany, when both Nazi and communist politicians blamed the West for Germany's economic
travails.) Worse, instability could lead to a vicious cycle , as nervous investors moved their
money out of the country, further slowing growth and, in turn, fomenting ever-greater
bitterness. Thanks to a generation of rapid economic growth, China has so far been able to manage
the stresses and conflicts of modernization and change; nobody knows what will happen if
the growth stops.

Growth decline threatens CCP rule---they'll start diversionary


wars in response
Shirk 7 Susan L. Shirk is an expert on Chinese politics and former Deputy
Assistant Secretary of State during the Clinton administration. She was in the
Bureau of East Asia and Pacific Affairs (People's Republic of China, Taiwan,
Hong Kong and Mongolia). She is currently a professor at the Graduate
School of International Relations and Pacific Studies at the University of
California, San Diego. She is also a Senior Director of Albright Stonebridge
Group, a global strategy firm, where she assists clients with issues related to
East Asia. China: Fragile Superpower, Book

By sustaining high rates of economic growth, China's leaders create new jobs and limit the number of unemployed workers who might go to the barricades. Binding the public to the Party through nationalism also helps preempt opposition. The trick is to find a foreign policy approach that can achieve both these vital objectives simultaneously. How long can it last? Viewed objectively, China's communist regime looks surprisingly resilient. It may be capable of surviving for years to come so long as the economy continues to grow and create jobs. Survey research in Beijing shows widespread support (over 80 percent) for the political system as a whole linked to sentiments of nationalism and acceptance of the CCP's argument about stability first.97 Without making any fundamental changes in the CCP-dominated political system - leaders from time to time have toyed with reform ideas such as local elections but in each instance have backed away for fear of losing control - the Party has bought itself time. As scholar Pei Minxin notes, the ability of communist regimes to use their patronage and coercion to hold on to power gives them little incentive to give up any of that power by introducing gradual democratization from above. Typically, only when communist systems implode do their political fundamentals change.98 As China's leaders well know, the greatest
political risk lying ahead of them is the possibility of an economic crash that throws
millions of workers out of their jobs or sends millions of depositors to withdraw their savings from the shaky banking system.
A massive environmental or public health disaster also could trigger regime collapse , especially if
peoples lives are endangered by a media cover-up imposed by Party authorities. Nationwide rebellion becomes a real
possibility when large numbers of people are upset about the same issue at the same time. Another
dangerous scenario is a domesticor international crisis in which the CCP leaders feel
compelled to lash out against Japan, Taiwan, or the United States because from
their point of view not lashing out might endanger Party rule .

Chinese Growth Key to Military Restraint on Taiwan- Decline of


Economic Influence Causes China to Resort to Military
Aggression
Lampton, 3 (David, Director Chinese Studies, Nixon Center, FDCH, 3/18)
The Chinese realize that power has different faces--military, economic, and
normative (ideological) power. Right now, China is finding that in the era of
globalization, economic power (and potential economic power) is the form of power it
has in greatest abundance and which it can use most effectively. As long as economic
influence continues to be effective for Beijing, as it now seems to be in dealing with Taiwan,
for example, China is unlikely to resort to military intimidation as its chief foreign policy
instrument.

Decline causes lashout- nationalists target the US and Taiwan


Friedberg, professor of IR at Princeton, 2011 (July/August, Aaron L., professor of
politics and international affairs at the Woodrow Wilson School at Princeton University, Hegemony with Chinese
Characteristics, The National Interest, lexis)

Such fears of aggression are heightened by an awareness that anxiety over a lack of legitimacy at home can cause nondemocratic governments to try to deflect popular frustration and discontent toward external enemies. Some Western observers worry, for example, that if China's economy falters its rulers will try to blame foreigners and even manufacture crises with Taiwan, Japan or the United States in order to rally their people and redirect the population's anger. Whatever Beijing's intent, such confrontations could easily spiral out of control. Democratic leaders are hardly immune to the

temptation of foreign adventures. However, because the stakes for them are so much lower (being voted out of
office rather than being overthrown and imprisoned, or worse), they are less likely to take extreme risks to retain
their hold on power.

2NC China-India War Impact


Economic collapse will crush party legitimacy and ignite social instability
Li 9 (Cheng, Dir. of Research, John L. Thornton China Center, "China's Team of Rivals," Brookings Foundation Article series, March, http://www.brookings.edu/articles/2009/03_china_li.aspx)
The two dozen senior politicians who walk the halls of Zhongnanhai, the compound of the Chinese Communist
Party's leadership in Beijing, are worried. What was inconceivable a year ago now threatens their rule: an economy in freefall. Exports, critical to China's searing economic growth, have plunged. Thousands of factories and businesses, especially those in the prosperous coastal regions, have closed. In the last six months of 2008, 10 million workers, plus 1 million new college graduates, joined the already gigantic ranks of the country's unemployed. During the same period, the Chinese stock market lost 65 percent of its value, equivalent to $3 trillion. The crisis, President Hu Jintao said recently, is a test of our ability to control a complex situation, and also a test of our party's governing ability. With this

rapid downturn, the Chinese Communist Party suddenly looks vulnerable. Since Deng Xiaoping
initiated economic reforms three decades ago, the partys legitimacy has relied upon its ability to

keep the economy running at breakneck pace. If China is no longer able to maintain a high
growth rate or provide jobs for its ever growing labor force, massive public dissatisfaction
and social unrest could erupt. No one realizes this possibility more than the handful of people who steer
Chinas massive economy. Double-digit growth has sheltered them through a SARS epidemic, massive earthquakes,
and contamination scandals. Now, the crucial question is whether they are equipped to handle an economic crisis of this magnitude and survive the political challenges it will bring. This year marks the
60th anniversary of the Peoples Republic, and the ruling party is no longer led by one strongman, like
Mao Zedong or Deng Xiaoping. Instead, the Politburo and its Standing Committee, Chinas most powerful
body, are run by two informal coalitions that compete against each other for power, influence, and control
over policy. Competition in the Communist Party is, of course, nothing new. But the jockeying today is no
longer a zero-sum game in which a winner takes all. It is worth remembering that when Jiang Zemin
handed the reins to his successor, Hu Jintao, in 2002, it marked the first time in the republics history that the
transfer of power didnt involve bloodshed or purges. Whats more, Hu was not a protg of Jiangs; they belonged

post-Deng China has


been run by a team of rivals. This internal competition was enshrined as party practice a little more
to competing factions. To borrow a phrase popular in Washington these days,

than a year ago. In October 2007, President Hu surprised many China watchers by abandoning the partys
normally straightforward succession procedure and designating not one but two heirs apparent. The Central
Committee named Xi Jinping and Li Keqiang, two very different leaders in their early 50s, to the
nine-member Politburo Standing Committee, where the rulers of China are groomed. The future roles of these two
men, who will essentially share power after the next party congress meets in 2012, have since been refined: Xi will
be the candidate to succeed the president, and Li will succeed Premier Wen Jiabao. The two rising stars
share little in terms of family background, political association, leadership skills, and policy orientation. But they are
each heavily involved in shaping economic policy, and they are expected to lead the two competing

coalitions that will be relied upon to craft Chinas political and economic trajectory in the
next decade and beyond.

Regime collapse causes China-India war


Cohen 02 (Stephen, Senior Fellow Brookings Institution, Nuclear Weapons and Nuclear War in South Asia: An
Unknowable Future, May, http://www.brookings.edu/dybdocroot/views/speeches/cohens20020501.pdf)

A similar argument may be made with respect to China. China is a country that has had its share of upheavals in
the past. While there is no expectation today of renewed internal turmoil, it is important to remember that closed
authoritarian societies are subject to deep crisis in moments of sudden change. The breakup of the Soviet Union

and Yugoslavia, and the turmoil that has ravaged many members of the former communist bloc are examples of
what could happen to China. A severe economic crisis, rebellions in Tibet and Xinjiang, a reborn democracy
movement and a party torn by factions could be the ingredients of an unstable situation. A vulnerable Chinese
leadership determined to bolster its shaky position by an aggressive policy toward India or the United States or both
might become involved in a major crisis with India, perhaps engage in nuclear saber-rattling. That would encourage
India to adopt a stronger nuclear posture, possibly with American assistance.

Causes nuclear use

Jonathan S. Landay, National Security and Intelligence Correspondent, 2K [Top Administration Officials Warn
Stakes for U.S. Are High in Asian Conflicts, Knight Ridder/Tribune News Service, March 10, p. Lexis]
Few if any experts think China and Taiwan, North Korea and South Korea, or India and Pakistan

are spoiling to fight. But even a minor miscalculation by any of them could destabilize Asia,
jolt the global economy and even start a nuclear war. India, Pakistan and China all have
nuclear weapons, and North Korea may have a few, too. Asia lacks the kinds of organizations,
negotiations and diplomatic relationships that helped keep an uneasy peace for five decades
in Cold War Europe. Nowhere else on Earth are the stakes as high and relationships so fragile, said Bates Gill,
director of northeast Asian policy studies at the Brookings Institution, a Washington think tank. We see the
convergence of great power interest overlaid with lingering confrontations with no institutionalized security
mechanism in place. There are elements for potential disaster. In an effort to cool the regions tempers, President
Clinton, Defense Secretary William S. Cohen and National Security Adviser Samuel R. Berger all will hopscotch
Asias capitals this month. For America, the stakes could hardly be higher. There are 100,000 U.S. troops in Asia
committed to defending Taiwan, Japan and South Korea, and the United States would instantly become embroiled if
Beijing moved against Taiwan or North Korea attacked South Korea. While Washington has no defense

commitments to either India or Pakistan, a conflict between the two could end the global
taboo against using nuclear weapons and demolish the already shaky international nonproliferation regime. In addition, globalization has made a stable Asia - with its massive markets, cheap labor, exports and resources - indispensable to the U.S. economy. Numerous U.S. firms and millions of American
jobs depend on trade with Asia that totaled $600 billion last year, according to the Commerce Department.

2NC Bioweapons Impact


The CCP would lash out for power, and they would use
bioweapons
Renxin 05 (San Renxin, Journalist, 8-3-2005, "CCP Gambles Insanely to Avoid Death," Epoch Times, www.theepochtimes.com/news/5-8-3/30931.html)

Since the Partys life is above all else, it would not be surprising if the CCP resorts
to the use of biological, chemical, and nuclear weapons in its attempt to postpone
its life. The CCP,that disregards human life, would not hesitate to kill two hundred million
Americans, coupled with seven or eight hundred million Chinese, to achieve its
ends. The speech, free of all disguises, lets the public see the CCP for what it really is: with evil filling its every
cell, the CCP intends to fight all of mankind in its desperate attempt to cling to life. And

that is the theme of the speech. The theme is murderous and utterly evil. We did witness in China beggars who
demanded money from people by threatening to stab themselves with knives or prick their throats on long nails.
But we have never, until now, seen a rogue who blackmails the world to die with it by wielding biological, chemical,
and nuclear weapons. Anyhow, the bloody confession affirmed the CCPs bloodiness: a monstrous murderer, who
has killed 80 million Chinese people, now plans to hold one billion people hostage and gamble with their lives. As
the CCP is known to be a clique with a closed system, it is extraordinary for it to reveal its top secret on its own.
One might ask: what is the CCPs purpose to make public its gambling plan on its deathbed? The answer is: the
speech would have the effect of killing three birds with one stone. Its intentions are the following: Expressing the
CCPs resolve that it not be buried by either heaven or earth (direct quote from the speech). But then, isnt the
CCP opposed to the universe if it claims not to be buried by heaven and earth? Feeling the urgent need to harden
its image as a soft egg in the face of the Nine Commentaries. Preparing publicity for its final battle with mankind by
threatening war and trumpeting violence. So, strictly speaking, what the CCP has leaked out is more of an attempt
to clutch at straws to save its life rather than to launch a trial balloon. Of course, the way the speech was
presented had been carefully prepared. It did not have a usual opening or ending, and the audience, time, place,
and background related to the speech were all kept unidentified. One may speculate or imagine as one may, but
never verify. The aim was obviously to create a mysterious setting. In short, the speech came out as something
one finds difficult to tell whether it is false or true.

Outweighs and causes extinction


Ochs 2 - Past president of the Aberdeen Proving Ground Superfund Citizens Coalition, Member of the Depleted Uranium Task Force of the Military Toxics Project, and Member of the Chemical Weapons Working Group [Richard Ochs, June 9, 2002, "Biological Weapons Must Be Abolished Immediately," http://www.freefromterror.net/other_articles/abolish.html]

Of all the weapons of mass destruction, the genetically engineered biological weapons, many without a known cure or vaccine, are an extreme danger to the continued survival of life on earth. Any perceived military value or deterrence pales in comparison to the great risk these weapons pose just sitting in vials in laboratories. While a nuclear winter, resulting from a massive exchange of nuclear weapons, could also kill off most of life on earth and severely compromise the health of future generations, they are easier to control. Biological weapons, on the other hand, can get out of control very easily, as the recent anthrax attacks has demonstrated. There is no way to guarantee the security of these doomsday weapons because very tiny amounts can be stolen or accidentally released and then grow or be grown to horrendous proportions. The Black Death of the Middle Ages would be

small in comparison to the potential damage bioweapons could cause. Abolition of chemical weapons is less of a
priority because, while they can also kill millions of people outright, their persistence in the environment would be
less than nuclear or biological agents or more localized. Hence, chemical weapons would have a lesser effect on
future generations of innocent people and the natural environment. Like the Holocaust, once a localized chemical
extermination is over, it is over. With nuclear and biological weapons, the killing will probably never end.
Radioactive elements last tens of thousands of years and will keep causing cancers virtually forever. Potentially

agents by the hundreds with no known cure could wreck


even greater calamity on the human race than could persistent radiation. AIDS and ebola viruses
worse than that, bio-engineered

are just a small example of recently emerging plagues with no known cure or vaccine. Can we imagine hundreds of
such plagues? HUMAN EXTINCTION IS NOW POSSIBLE. Ironically, the Bush administration has just
changed the U.S. nuclear doctrine to allow nuclear retaliation against threats upon allies by conventional weapons.
The past doctrine allowed such use only as a last resort when our nations survival was at stake. Will the new policy
also allow easier use of US bioweapons? How slippery is this slope?

2NC AT Collapse Good


Reject their collapse good arguments - they're racist and incoherent - Chinese collapse decimates the U.S. for several reasons
Karabell, 13 - PhD @ Harvard, President of River Twice Research (Zachary, "The U.S. can't afford a Chinese economic collapse," The Edgy Optimist, a Reuters blog run by Karabell, March 7, http://blogs.reuters.com/edgy-optimist/2013/03/07/the-u-s-cant-afford-a-chinese-economic-collapse/ --BR)
Is China about to collapse? That question has been front and center in the past weeks as
the country completes its leadership transition and after the exposure of its various real estate bubbles during a widely watched 60 Minutes exposé this past weekend. Concerns about soaring property prices throughout China are hardly new, but they have been given added weight by the government itself.

Recognizing that a rapid implosion of the property market would disrupt economic growth, the central government
recently announced far-reaching measures designed to dent the rampant speculation. Higher down payments,
limiting the purchases of investment properties, and a capital gains tax on real estate transactions designed to
make flipping properties less lucrative were included. These measures, in conjunction with the new governments
announcing more modest growth targets of 7.5 percent a year, sent Chinese equities plunging and led to a slew of commentary in the United States saying China would be the next shoe to drop in the global system. Yet there is more here than simple alarm over the viability of China's economic growth. There is
the not-so-veiled undercurrent of rooting against China . It is difficult to find
someone who explicitly wants it to collapse, but the tone of much of the discourse
suggests bloodlust. Given that China largely escaped the crises that so afflicted the
United States and the eurozone, the desire to see it stumble may be
understandable. No one really likes a global winner if that winner isnt you. The
need to see China fail verges on jingoism. Americans distrust the Chinese
model, find that its business practices verge on the immoral and illegal, that its
reporting and accounting standards are sub-par at best and that its system is one of
crony capitalism run by crony communists. On Wall Street, the presumption usually
seems to be that any Chinese company is a ponzi scheme masquerading as a viable
business. In various conversations and debates, I have rarely heard Chinas
economic model mentioned without disdain. Take, as just one example, Gordon
Chang in Forbes: Beijings technocrats can postpone a reckoning, but they have not
repealed the laws of economics. There will be a crash. The consequences of a
Chinese collapse, however, would be severe for the United States and for the
world. There could be no major Chinese contraction without a concomitant
contraction in the United States. That would mean sharply curtailed Chinese
purchases of U.S. Treasury bonds, far less revenue for companies like General
Motors, Nike, KFC and Apple that have robust business in China (Apple made $6.83 billion in
the fourth quarter of 2012, up from $4.08 billion a year prior), and far fewer Chinese imports of high-end goods from American and Asian companies. It would also mean a collapse of
Chinese imports of materials such as copper, which would in turn harm economic
growth in emerging countries that continue to be a prime market for American,
Asian and European goods. China is now the world's second-largest economy, and property booms have

been one aspect of its growth. Individual Chinese cannot invest outside of the country, and the limited options of
Chinas stock exchanges and almost nonexistent bond market mean that if you are middle class and want to do
more than keep your money in cash or low-yielding bank accounts, you buy either luxury goods or apartments. That
has meant a series of property bubbles over the past decade and a series of measures by state and local officials to
contain them. These recent measures are hardly the first, and they are not likely to be the last. The past 10 years
have seen wild swings in property prices, and as recently as 2011 the government took steps to cool them; the
number of transactions plummeted and prices slumped in hot markets like Shanghai as much as 30, 40 and even
50 percent. You could go back year by year in the 2000s and see similar bubbles forming and popping, as the

government reacted to sharp run-ups with restrictions and then eased them when the pendulum threatened to
swing too far. China has had a series of property bubbles and a series of property busts. It has also had massive
urbanization that in time has absorbed the excess supply generated by massive development. Today much of that
supply is priced far above what workers flooding into Chinas cities can afford. But that has always been true, and
that housing has in time been purchased and used by Chinese families who are moving up the income spectrum,
much as U.S. suburbs evolved in the second half of the 20th century. More to the point, all property bubbles are not
created equal. The housing bubbles in the United States and Spain, for instance, would never had been so
disruptive without the massive amount of debt and the financial instruments and derivatives based on them. A
bursting housing bubble absent those would have been a hit to growth but not a systemic crisis. In China, most
buyers pay cash, and there is no derivative market around mortgages (at most theres a small shadow market). Yes,
there are all sorts of unofficial transactions with high-interest loans, but even there, the consequences of busts are not the same as they were in the United States and Europe in recent years. Two issues converge


whenever China is discussed in the United States: fear of the next global crisis, and
distrust and dislike of the country. Concern is fine; we should always be attentive to
possible risks. But Chinas property bubbles are an unlikely risk, because of the absence of derivatives and
because the central government is clearly alert to the markets behavior. Suspicion and antipathy,
however, are not constructive. They speak to the ongoing difficulty China poses to
Americans sense of global economic dominance and to the belief in the superiority
of free-market capitalism to Chinas state-managed capitalism. The U.S. system
may prove to be more resilient over time; it has certainly proven successful to date.
Its success does not require Chinas failure, nor will Chinas success invalidate
the American model. For our own self-interest we should be rooting for their
efforts, and not jingoistically wishing for them to fail.

2NC AT Collapse Inevitable


Status quo isn't sufficient to trigger collapse because the US is
lagging behind
Forbes, 7/9/2014
US Finance/Economics News Report Service
(John Kerry In Beijing: Four Good Reasons Why The Chinese View American
Leaders As Empty
Suits," http://www.forbes.com/sites/eamonnfingleton/2014/07/09/john-kerry-in-beijing-four-good-reasons-why-the-chinese-treat-american-leaders-as-jackasses/)
2. American policymakers have procrastinated in meeting the Chinese
challenge because they have constantly for more than a decade now
been misled by siren American voices predicting an imminent Chinese
financial collapse. China is a big economy and large financial collapses are
not inconceivable. But even the most disastrous such collapse would be
unlikely to stop the Chinese export drive in its tracks. American policymakers
have failed to pay sufficient attention to the central objective of Chinese
policy, which is to take over from the United States, Japan and Germany as
the worlds premier source of advanced manufactured products.

Consensus exists and the best markers point to a slow decline,


and the worst markers make sense in the context of China
Huang, 2/11, a senior associate in the Carnegie Asia Program, where his
research focuses on Chinas economic development and its impact on Asia
and the global economy (Yukon, Do Not Fear a Chinese Property Bubble,
Carnegie Endowment for International Peace,
http://carnegieendowment.org/2014/02/11/do-not-fear-chinese-property-bubble/h0oz)
Yet when analysts drill into the balance sheets of borrowers and banks, they find
little evidence of impending disaster. Government debt ratios are not high by global
standards and are backed by valuable assets at the local level. Household debt is a fraction
of what it is in the west, and it is supported by savings and rising incomes. The profits and cash
positions of most firms for which data are available have not deteriorated significantly
while sovereign guarantees cushion the more vulnerable state enterprises. The consensus, therefore, is
that Chinas debt situation has weakened but is manageable. Why are the views from

detailed sector analysis so different from the red flags signalled by the broader macro debt indicators? The answer
lies in the role that land values play in shaping these trends. Take the two most pressing concerns: rising debt
levels as a share of gross domestic product and weakening links between credit expansion and GDP growth. The
first relates to the surge in the ratio of total credit to GDP by about 50-60 percentage points over the past five
years, which is viewed as a strong predictor of an impending crash. Fitch, a rating agency, is among those who see
this as the fallout from irresponsible shadow-banking which is being channelled into property development, creating
a bubble. The second concern is that the credit impulse to growth has diminished, meaning that more and more
credit is needed to generate the same amount of GDP, which reduces prospects for future deleveraging. Linking
these two concerns is the price of land including related mark-ups levied by officials and developers. But its
significance is not well understood because Chinas property market emerged only in the late 1990s, when the
decision was made to privatise housing. A functioning resale market only began to form around the middle of the
last decade. That is why the large stimulus programme in response to the Asia financial crisis more than a decade
ago did not manifest itself in a property price surge, whereas the 2008-9 stimulus did. Over the past decade, no
other factor has been as important as rising property values in influencing growth patterns and perceptions of
financial risks. The weakening impact of credit on growth is largely explained by the divergence between fixed asset
investment (FAI) and gross fixed capital formation (GFCF). Both are measures of investment. FAI measures
investment in physical assets including land while GFCF measures investment in new equipment and structures,

excluding the value of land and existing assets. This latter feeds directly into GDP, while only a portion of FAI shows
up in GDP accounts. Until recently, the difference between the two measures did not matter in interpreting
economic trends: both were increasing at the same rate and reached about 35 per cent of GDP by 2002-03. Since
then, however, they have diverged and GFCF now stands at 45 per cent of GDP while the share of FAI has jumped to
70 per cent. Overall credit levels have increased in line with the rapid growth in FAI rather than the more modest
growth in GFCF. Most of the difference between the ratios is explained by rising asset prices. Thus a large share of
the surge in credit is financing property related transactions which explains why the growth impact of credit has declined. Is the increase in property and underlying land prices sustainable, or is it a bubble? Part of the explanation is unique to China. Land in China is an asset whose market value went largely

unrecognised when it was totally controlled by the State. Once a private property market was created, the process
of discovering land's intrinsic value began, but establishing such values takes time in a rapidly changing economy. The Wharton/NUS/Tsinghua Land Price Index indicates that from 2004-2012, land prices have increased approximately fourfold nationally, with more dramatic increases in major cities such as
Beijing balanced by modest rises in secondary cities. Although this may seem excessive, such growth rates are
similar to what happened in Russia after it privatised its housing stock. Once the economy stabilised, housing prices

Could investors have overshot the mark in China?


Possibly, but the land values should be high given Chinas large population, its
in Moscow increased six fold in just six years.

shortage of plots that are suitable for construction and its rapid economic growth. Nationally, the ratio of incomes to
housing prices has improved and is now comparable to the levels found in Australia, Taiwan and the UK. In Beijing and Shanghai prices are similar to or lower than Delhi, Singapore and Hong Kong. Much of the recent surge in the credit to GDP ratio is actually evidence of financial deepening rather than financial instability as China moves toward more market-based asset values. If so, the higher credit ratios are fully consistent with the less alarming impressions that come from scrutiny of sector specific financial indicators.

2NC AT Stocks
China's stock market is loosely tied to its economy - structural
factors are fine and stock declines don't accurately reflect
growth
Rapoza 7/9
(Kenneth Rapoza. Contributing Editor at Forbes. "Don't Mistake China's Stock Market For China's
Economy," Forbes. 7-9-2015. http://www.forbes.com/sites/kenrapoza/2015/07/09/dont-mistake-chinasstock-market-for-chinas-economy///ghs-kw)

Chinas A-share market is rebounding, but whether or not it has hit bottom is beside
the point. What matters is this: the equity market in China is more or less a
gambling den dominated by retail investors who make their investment
decisions based on what they read in investor newsletters. Its a herd
mentality. And more importantly, their trading habits do not reflect
economic fundamentals. The countrys stock market plays a smaller role in its
economy than the U.S. stock market does in ours, and has fewer linkages to the rest
of the economy, says Bill Adams, PNC Financials senior international economist in Pittsburgh. The fact
that the two are unhinged limits the potential for Chinas equity correction or a
bubble to trigger widespread economic distress. The recent 25% decline in the
Deutsche X-Trackers China A-Shares (ASHR) fund, partially recuperated on Thursday, is not a
signal of an impending Chinese recession. PNCs baseline forecast for Chinese
real GDP growth in 2015 remains unchanged at 6.8% despite the correction , a
correction which has been heralded by the bears as the beginning of the end for Chinas capitalist experiment.

Chinas economy, like its market, is transforming. China is moving away from being a
low-cost producer and exporter, to becoming a consumer driven society. It wants to
professionalize its financial services sector, and build a green-tech economy to help
eliminate its pollution problems. Its slowly opening its capital account and taking
steps to reforming its financial markets. There will be errors and surprises, and anyone who thinks
otherwise will be disappointed. Over the last four weeks, the Chinese government misplayed its
hand when it decided to use tools for the economy mainly an interest rate
reduction and reserve ratio requirement cuts for banks in an effort to provide the market with more
liquidity. It worked for a little while, and recent moves to change rules on margin, and even utilize a circuit-breaker
mechanism to temporarily delist fast-tanking companies from the mainland stock market, might have worked if the

The timing was terrible. And it pushed people into


panic selling, turning China into the biggest financial market headline this side of Athens. For better or for
worse, Beijing now has no choice but to go all-in to defend equities, some investors told FORBES. But Chinas
real economy is doing much better than the Shanghai and Shenzhen
exchanges suggest. According to China Beige Book, the Chinese economy actually
recovered last quarter. Markets are focusing on equities and PMI indicators from
the state and HSBC as a gauge, but it should become clear in the coming weeks
that China's stock market is not a reflection of the fundamentals. The Good, The Bad and the Ugly: To get a more detailed picture of what is driving China's growth slowdown, it is necessary to look at a
broader array of economic and financial indicators. The epicenter of Chinas problems are the industrial and
property sectors. Shares of the Shanghai Construction Group, one of the largest developers listed on the Shanghai
stock exchange, is down 42.6% in the past four weeks, two times worse than the Shanghai Composite Index. China
Railway Group is down 33%, also an underperformer. Growth in real industrial output has declined from 14% in mid2011 to 5.9% in April, growth in fixed-asset investment declined 50% over the same period and electricity
consumption by primary and secondary industries is in decline. Chinas trade with the outside world is also falling,
though this data does not always match up with other countries trade figures. Real estate is in decline as Beijing
has put the breaks on its housing bubble. Only the east coast cities are still seeing price increases, but construction
is not booming in Shanghai anymore. The two main components of that have prevented a deeper downturn in
activity are private spending on services, particularly financial services, and government-led increases in
transportation infrastructure like road and rail. Retail sales, especially e-commerce sales that have benefited the
likes of Alibaba and Tencent, both of which have outperformed the index, have been growing faster than the overall
economy. Electricity consumption in the services sector is expanding strongly. Growth in household incomes is

outpacing GDP growth. China has begun the necessary rebalancing towards a more sustainable, consumption-led
growth model, says Jeremy Lawson, chief economist at Standard Life Investments in the U.K. He warns that its
still too early to claim success. Since 2011, developed markets led by the S&P 500 have performed better than
China, but for one reason and one reason only: The central banks of Europe, the U.K., Japan and of course the U.S.
have bought up assets in unprecedented volumes using printed money, or outright buying securities like the Feds
purchase of bonds and mortgage backed securities. Why bemoan Chinas state intervention when central bank
intervention has been what kept southern Europe afloat, and the U.S. stock market on fire since March 2009?

Companies in China are still making money. I think people have no clue on China,
says Jan Dehn, head of research at Ashmore in London, a $70 billion emerging market fund manager with money at work in mainland China securities. They don't see the big picture. And they forget it is still
an emerging market. The Chinese make mistakes and will continue to make
mistakes like all governments. However, they will learn from their mistakes. The
magnitude of most problems are not such that they lead to systematic meltdown.
Each time the market freaks out, value often deep value starts to
emerge. Long term, these volatile episodes are mere blips . They will not change the course
of internationalization and maturing of the market, Dehn told FORBES. China is still building markets . It
has a large environmental problem that will bode well for green tech firms like BYD. Its middle class is not
shrinking. Its billionaires are growing in numbers. They are reforming all the time.

And in the long term, China is going to win. Markets are impatient and love a good drama. But investing is not a
soap opera. Its not Keeping up with the Kardashians youre buying, youre buying the worlds No. 2 economy, the
biggest commodity consumer in the world, and home to 1.4 billion people, many of which have been steadily
earning more than ever. China's transition will cause temporary weakness in growth and volatility, maybe even crazy volatility. But you have to break eggs to make an omelette, says Dehn.

Why The Stock Market


Correction Won't Hurt China: The Chinese equity correction is healthy and unlikely to
have major adverse real economy consequences for several reasons: First, Chinas
A-shares are still up 79% over the past 12 months. A reversal of fortunes was a
shoo-in to occur. Second, Chinese banks are basically not involved in providing
leverage and show no signs of stress. The total leverage in Chinese financial
markets is about four trillion yuan ($600 billion). Stock market leverage is concentrated in the

informal sector with trust funds and brokerages accounting for a little over half of the leverage. Margin financing
via brokerages is down from 2.4 trillion yuan to 1.9 trillion yuan and let's not forget that Chinese GDP is about 70 trillion yuan.

Third, there is very little evidence that the moves in the stock market will
have a major impact on the real economy and consumption via portfolio loss. Stocks
comprise only 15% of total wealth. Official sector institutions are large holders of
stocks and their spending is under control of the government. As for the retail
investor, they spend far less of their wealth than other countries. China has a 49%
savings rate. Even if they lost half of it, they would be saving more than Americans,
the highly indebted consumer society the world loves to love. During the rally over the past
twelve months, the stock market bubble did not trigger a boost in consumption
indicating that higher equity gains didnt impact spending habits too much. The
Chinese stock market is only 5% of total social financing in China. Stock markets
only finance 2% of Chinese fixed asset investment. Only 1% of company loans have
been put up with stocks as collateral, so the impact on corporate activity is going to
be limited. The rapid rally and the violent correction illustrate the challenges of capital account liberalization,

the need for a long-term institutional investor base, index inclusion and deeper financial markets, including foreign
institutional investors, Dehn says. The A-shares correction is likely to encourage deeper financial reforms, not a
reversal.

Plan Flaw

1NCs

1NC CT
Counterplan text: The United States federal government
should neither mandate the creation of surveillance backdoors
in products nor request privacy keys and should terminate
current backdoors created either by government mandates or
government requested keys.
Three arguments here:
4. A. Mandate means to make required
Merriam-Websters Dictionary of Law 96
(Merriam-Websters Dictionary of Law, 1996,
http://dictionary.findlaw.com/definition/mandate.html//ghs-kw)

mandate n [Latin mandatum , from neuter of mandatus , past participle of mandare to entrust, enjoin,

probably irregularly from manus hand + -dere to put] 1 a : a formal communication from a reviewing court
notifying the court below of its judgment and directing the lower court to act accordingly b : mandamus 2
in the civil law of Louisiana : an act by which a person gives another person the power to transact for him
or her one or several affairs 3 a : an authoritative command : a clear authorization or direction [the of the
full faith and credit clause "National Law Journal "] b : the authorization to act given by a constituency to
its elected representative vt mandated mandating : to make mandatory or required
[the Pennsylvania Constitution s a criminal defendant's right to confrontation "National Law Journal "]

B. Circumvention: companies including those under PRISM


agree to provide data because the government pays them
Timberg and Gellman 13
(Timberg, Craig and Gellman, Barton. Reporters for the Washington Post, citing government
budgets and internal documents. NSA paying U.S. companies for access to communications
networks, Washington Post. 8/29/2013. https://www.washingtonpost.com/world/national-security/nsa-paying-us-companies-for-access-to-communications-networks/2013/08/29/5641a4b6-10c2-11e3-bdf6-e4fc677d94a1_story.html//ghs-kw)

The National Security Agency is paying hundreds of millions of dollars a year to U.S. companies for clandestine access to their communications networks, filtering vast traffic flows for foreign targets in a process that also sweeps in large volumes of American telephone calls, e-mails and instant messages. The bulk of the spending, detailed in a multi-volume intelligence budget obtained by The Washington Post, goes to participants in a Corporate Partner Access Project for major U.S. telecommunications providers. The documents open an important window into surveillance operations on U.S. territory that have been the subject of debate since they were revealed by The Post and Britain's Guardian newspaper in June. New details of the corporate-partner project, which falls under the NSA's Special Source Operations, confirm that the agency taps into "high volume circuit and packet-switched networks," according to the spending blueprint for fiscal 2013. The program was expected to cost $278 million in the current fiscal year, down nearly one-third from its peak of $394 million in 2011. Voluntary cooperation from the "backbone" providers of global communications dates to the 1970s under the cover name BLARNEY, according to documents provided by former NSA contractor Edward Snowden. These relationships long predate the PRISM program disclosed in June, under which American technology companies hand over customer data after receiving orders from the Foreign Intelligence Surveillance Court. In briefing slides, the NSA described BLARNEY and three other corporate projects, OAKSTAR, FAIRVIEW and STORMBREW, under the heading of "passive" or "upstream" collection. They capture data as they move across fiber-optic cables and the gateways that direct global communications traffic. The documents offer a rare view of a secret surveillance economy in which government officials set financial terms for programs capable of peering into the lives of almost anyone who uses a phone, computer or other device connected to the Internet. Although the companies are required to comply with lawful surveillance orders, privacy advocates say the multimillion-dollar payments could create a profit motive to offer more than the required assistance. "It turns surveillance into a revenue stream, and that's not the way it's supposed to work," said Marc Rotenberg, executive director of the Electronic Privacy Information Center, a Washington-based research and advocacy group. "The fact that the government is paying money to telephone companies to turn over information that they are compelled to turn over is very troubling." Verizon, AT&T and other major telecommunications companies declined to comment for this article, although several industry officials noted that government surveillance laws explicitly call for companies to receive reasonable reimbursement for their costs. Previous news reports have made clear that companies frequently seek such payments, but never before has their overall scale been disclosed. The budget documents do not list individual companies, although they do break down spending among several NSA programs, listed by their code names. There is no record in the documents obtained by The Post of money set aside to pay technology companies that provide information to the NSA's PRISM program. That program is the source of 91 percent of the 250 million Internet communications collected through Section 702 of the FISA Amendments Act, which authorizes PRISM and the upstream programs, according to a 2011 opinion and order by the Foreign Intelligence Surveillance Court. Several of the companies that provide information to PRISM, including Apple, Facebook and Google, say they take no payments from the government when they comply with national security requests. Others say they do take payments in some circumstances. The Guardian reported last week that the NSA had covered "millions of dollars" in costs that some technology companies incurred to comply with government demands for information. Telecommunications companies generally do charge to comply with surveillance requests, which come from state, local and federal law enforcement officials as well as intelligence agencies. Former telecommunications executive Paul Kouroupas, a security officer who worked at Global Crossing for 12 years, said that some companies welcome the revenue and enter into contracts in which the government makes higher payments than otherwise available to firms receiving reimbursement for complying with surveillance orders. These contractual payments, he said, could cover the cost of buying and installing new equipment, along with a reasonable profit. These voluntary agreements simplify the government's access to surveillance, he said. "It certainly lubricates the [surveillance] infrastructure," Kouroupas said. He declined to say whether Global Crossing, which operated a fiber-optic network spanning several continents and was bought by Level 3 Communications in 2011, had such a contract. A spokesman for Level 3 Communications declined to comment.
a contract. A spokesman for Level 3 Communications declined to comment.

5. Plan flaw: the plan mandates that we stop surveilling backdoors, request public encryption keys, and close existing backdoors; that guts solvency because the government can still create backdoors with encryption keys
6. Presumption: we don't mandate back doors in the status quo; all their ev is in the context of a bill that would require backdoors in the future, so the AFF does nothing

1NC KQ
The Secure Data Act of 2015 states that no agency may
mandate backdoors
Secure Data Act of 2015
(Wyden, Ron. Senator, D-OR. S. 135, known as the Secure Data Act of 2015, introduced in Congress
1/8/2015. https://www.congress.gov/bill/114th-congress/senate-bill/135/text//ghs-kw)
SEC. 2. PROHIBITION ON DATA SECURITY VULNERABILITY MANDATES. (a) In General. Except as provided in subsection (b), no agency may mandate that a manufacturer, developer, or seller of covered products design or alter the security functions in its product or service to allow the surveillance of any user of such product or service, or to allow the physical search of such product, by any agency.

Mandate means to make required


Merriam-Websters Dictionary of Law 96
(Merriam-Websters Dictionary of Law, 1996,
http://dictionary.findlaw.com/definition/mandate.html//ghs-kw)

mandate n [Latin mandatum, from neuter of mandatus, past participle of mandare to entrust, enjoin, probably irregularly from manus hand + -dere to put] 1 a : a formal communication from a reviewing court notifying the court below of its judgment and directing the lower court to act accordingly b : mandamus 2 in the civil law of Louisiana : an act by which a person gives another person the power to transact for him or her one or several affairs 3 a : an authoritative command : a clear authorization or direction [the of the full faith and credit clause "National Law Journal"] b : the authorization to act given by a constituency to its elected representative vt mandated mandating : to make mandatory or required [the Pennsylvania Constitution s a criminal defendant's right to confrontation "National Law Journal"]

Circumvention: companies including those under PRISM agree to provide data because the government pays them
Timberg and Gellman 13
(Timberg, Craig and Gellman, Barton. Reporters for the Washington Post, citing government budgets
and internal documents. NSA paying U.S. companies for access to communications networks,
Washington Post. 8/29/2013. https://www.washingtonpost.com/world/national-security/nsa-paying-uscompanies-for-access-to-communications-networks/2013/08/29/5641a4b6-10c2-11e3-bdf6e4fc677d94a1_story.html//ghs-kw)

The National Security Agency is paying hundreds of millions of dollars a year to U.S. companies for clandestine access to their communications networks, filtering vast traffic flows for foreign targets in a process that also sweeps in large volumes of American telephone calls, e-mails and instant messages. The bulk of the spending, detailed in a multi-volume intelligence budget obtained by The Washington Post, goes to participants in a Corporate Partner Access Project for major U.S. telecommunications providers. The documents open an important window into surveillance operations on U.S. territory that have been the subject of debate since they were revealed by The Post and Britain's Guardian newspaper in June. New details of the corporate-partner project, which falls under the NSA's Special Source Operations, confirm that the agency taps into "high volume circuit and packet-switched networks," according to the spending blueprint for fiscal 2013. The program was expected to cost $278 million in the current fiscal year, down nearly one-third from its peak of $394 million in 2011. Voluntary cooperation from the "backbone" providers of global communications dates to the 1970s under the cover name BLARNEY, according to documents provided by former NSA contractor Edward Snowden. These relationships long predate the PRISM program disclosed in June, under which American technology companies hand over customer data after receiving orders from the Foreign Intelligence Surveillance Court. In briefing slides, the NSA described BLARNEY and three other corporate projects, OAKSTAR, FAIRVIEW and STORMBREW, under the heading of "passive" or "upstream" collection. They capture data as they move across fiber-optic cables and the gateways that direct global communications traffic. The documents offer a rare view of a secret surveillance economy in which government officials set financial terms for programs capable of peering into the lives of almost anyone who uses a phone, computer or other device connected to the Internet. Although the companies are required to comply with lawful surveillance orders, privacy advocates say the multimillion-dollar payments could create a profit motive to offer more than the required assistance. "It turns surveillance into a revenue stream, and that's not the way it's supposed to work," said Marc Rotenberg, executive director of the Electronic Privacy Information Center, a Washington-based research and advocacy group. "The fact that the government is paying money to telephone companies to turn over information that they are compelled to turn over is very troubling." Verizon, AT&T and other major telecommunications companies declined to comment for this article, although several industry officials noted that government surveillance laws explicitly call for companies to receive reasonable reimbursement for their costs. Previous news reports have made clear that companies frequently seek such payments, but never before has their overall scale been disclosed. The budget documents do not list individual companies, although they do break down spending among several NSA programs, listed by their code names. There is no record in the documents obtained by The Post of money set aside to pay technology companies that provide information to the NSA's PRISM program. That program is the source of 91 percent of the 250 million Internet communications collected through Section 702 of the FISA Amendments Act, which authorizes PRISM and the upstream programs, according to a 2011 opinion and order by the Foreign Intelligence Surveillance Court. Several of the companies that provide information to PRISM, including Apple, Facebook and Google, say they take no payments from the government when they comply with national security requests. Others say they do take payments in some circumstances. The Guardian reported last week that the NSA had covered "millions of dollars" in costs that some technology companies incurred to comply with government demands for information. Telecommunications companies generally do charge to comply with surveillance requests, which come from state, local and federal law enforcement officials as well as intelligence agencies. Former telecommunications executive Paul Kouroupas, a security officer who worked at Global Crossing for 12 years, said that some companies welcome the revenue and enter into contracts in which the government makes higher payments than otherwise available to firms receiving reimbursement for complying with surveillance orders. These contractual payments, he said, could cover the cost of buying and installing new equipment, along with a reasonable profit. These voluntary agreements simplify the government's access to surveillance, he said. "It certainly lubricates the [surveillance] infrastructure," Kouroupas said. He declined to say whether Global Crossing, which operated a fiber-optic network spanning several continents and was bought by Level 3 Communications in 2011, had such a contract. A spokesman for Level 3 Communications declined to comment.

2NC

2NC Mandate
Mandate is an order or requirement
The People's Law Dictionary 02
(Hill, Gerald and Kathleen. Gerald Hill holds a J.D. from Hastings College of the Law of the University of
California. He was Executive Director of the California Governor's Housing Commission, has drafted
legislation, taught at Golden Gate University Law School, served as an arbitrator and pro tem judge,
edited and co-authored Housing in California, was an elected trustee of a public hospital, and has
testified before Congressional committees. Kathleen Hill holds an M.A. in political psychology from
California State University, Sonoma. She was also a Fellow in Public Affairs with the prestigious Coro
Foundation, earned a Certificat from the Sorbonne in Paris, France, headed the Peace Corps Speakers'
Bureau in Washington, D.C., worked in the White House for President Kennedy, and was Executive
Coordinator of the 25th Anniversary of the United Nations. Kathleen has served on a Grand Jury,
chaired two city commissions and has developed programs for the Institute of Governmental Studies
of the University of California. The Peoples Law Dictionary, 2002.
http://dictionary.law.com/Default.aspx?selected=1204//ghs-kw)

mandate n. 1) any mandatory order or requirement under statute, regulation, or by a public agency. 2) order of an appeals court to a lower court (usually the original trial court in the case) to comply with an appeals court's ruling, such as holding a new trial, dismissing the case or releasing a prisoner whose conviction has been overturned. 3) same as the writ of mandamus, which orders a public official or public body to comply with the law.

2NC Circumvention
NSA enters into mutually agreed upon contracts for back doors
Reuters 13
(Menn, Joseph. Exclusive: Secret contract tied NSA and security industry pioneer, Reuters.
12/20/2013. http://www.reuters.com/article/2013NC/12/21/us-usa-security-rsaidUSBRE9BJ1C220131221//ghs-kw)
As a key part of a campaign to embed encryption software that it could crack into widely used computer products, the U.S. National Security Agency arranged a secret $10 million contract with RSA, one of the most influential firms in the computer security industry, Reuters has learned. Documents leaked by former NSA contractor Edward Snowden show that the NSA created and promulgated a flawed formula for generating random numbers to create a "back door" in encryption products, the New York Times reported in September. Reuters later reported that RSA became the most important distributor of that formula by rolling it into a software tool called Bsafe that is used to enhance security in personal computers and many other products. Undisclosed until now was that RSA received $10 million in a deal that set the NSA formula as the preferred, or default, method for number generation in the BSafe software, according to two sources familiar with the contract. Although that sum might seem paltry, it represented more than a third of the revenue that the relevant division at RSA had taken in during the entire previous year, securities filings show. The earlier disclosures of RSA's entanglement with the NSA already had shocked some in the close-knit world of computer security experts. The company had a long history of championing privacy and security, and it played a leading role in blocking a 1990s effort by the NSA to require a special chip to enable spying on a wide range of computer and communications products. RSA, now a subsidiary of computer storage giant EMC Corp, urged customers to stop using the NSA formula after the Snowden disclosures revealed its weakness. RSA and EMC declined to answer questions for this story, but RSA said in a statement: "RSA always acts in the best interest of its customers and under no circumstances does RSA design or enable any back doors in our products. Decisions about the features and functionality of RSA products are our own." The NSA declined to comment. The RSA deal shows one way the NSA carried out what Snowden's documents describe as a key strategy for enhancing surveillance: the systematic erosion of security tools. NSA documents released in recent months called for using "commercial relationships" to advance that goal, but did not name any security companies as collaborators. The NSA came under attack this week in a landmark report from a White House panel appointed to review U.S. surveillance policy. The panel noted that "encryption is an essential basis for trust on the Internet," and called for a halt to any NSA efforts to undermine it. Most of the dozen current and former RSA employees interviewed said that the company erred in agreeing to such a contract, and many cited RSA's corporate evolution away from pure cryptography products as one of the reasons it occurred.

Case

Economy Adv

Notes
This advantage makes NO sense. The Venezia ev doesn't say the Internet would collapse, just that there'd be a bunch of identity theft, etc. This has no bearing on backdoors' effects on physical infrastructure.
30-second explainer: backdoors collapse the internet (not true), internet k2 the economy b/c new industries and faster growth, econ collapse = ext b/c Harris and Burrows

CX Questions
Venezia doesn't say the internet would be eliminated, just that data would be decrypted and that there would be mass identity theft. Where's the ev on Internet collapse?

1NC No Tech Damage


Surveillance doesn't harm US tech and the tech sector is high; their ev is speculation and only we have hard data
Insider Surveillance 14
(Insider Surveillance. Insider Surveillance is the most widely read source of information on
surveillance technologies for law enforcement, government agencies, military intelligence,
communications companies and technology leaders who together safeguard national security and
protect the public from criminals and terrorists. The publication reflects the expertise of the
intelligence, law enforcement and public policy communities on surveillance and is followed by
members in over 130 nations from Washington, D.C. to London, Paris, Beijing, Moscow, Rome,
Madrid, Berlin, Tokyo, Lahore, Delhi, Abu Dhabi, Rio de Janeiro, Mexico City, Seoul and thousands of
places in between. "De-Bunking the Myth of U.S. Tech Sales Lost Due to NSA," . 9-24-2014.
https://insidersurveillance.com/de-bunking-myth-of-u-s-tech-sales-lost-due-nsa///ghs-kw)

Flashback to October 2013. The sky is falling! The sky is falling! Customers worldwide are furious about NSA spying. That means imminent doom for the U.S. tech industry. Offshore sales will plummet as buyers drop U.S. tech products/services and buy local instead. The end is nigh! News flash for Chicken Little: The sky's still up there. It's shining bright over a U.S. tech market that in the past year has experienced almost unprecedented growth, largely thanks to foreign sales. As to impending Armageddon for the tech sector, to date no one has positively identified a single nickel of tech industry revenue or profit lost due to foreign customers' purported anger over the NSA. On the contrary, the U.S. technology and aligned sectors in defense have enjoyed a banner year. A few points to consider: U.S. tech stocks are near an all-time high. The Morgan Stanley High-Technology Index 35, which includes Amazon, Apple, Google, Microsoft and Netflix (among the most vociferous Internet and cloud companies blaming NSA for lost profits), today stands 23.4% higher than its 52-week low one year ago, when anti-surveillance furor reached its peak. In recent weeks the index has stood as high as 25% above the October 2013 low point. Not too shabby for a sector supposedly on the ropes. Foreign sales lead the march to U.S. tech profits. According to an AP story posted after 2Q2014 earnings: "Technology trendsetters Apple Inc., Google Inc., Facebook Inc. and Netflix Inc. all mined foreign countries to produce earnings or revenue that exceeded analysts' projections in their latest quarters." In the second quarter, Google generated 58% of its revenue outside the U.S. Facebook continued to draw 55% of revenue from overseas. Netflix added 1.1 million new foreign subscribers, double the number won in the U.S. and Canada during the second quarter. Apple reported soaring sales of its iPhone in China, Russia, India and Brazil, offsetting tepid growth in the U.S. Net net, the industry's biggest gains came in the very markets that tech leaders last year cited as being at risk. U.S. defense contractors fare best offshore. Faced with dwindling U.S. Defense Department purchases (the U.S. hasn't purchased a single new F-16 in the last 10 years), defense suppliers' decision to pursue foreign buyers has fueled a bonanza. Sales to Israel, Oman and Iraq keep Lockheed Martin's plant humming in F-16 production. Over at Sikorsky Aircraft, makers of the Black Hawk helicopter, the company late last year reported a 30-year contract with Taiwan valued at over US$1.0 billion. International sales at Boeing's defense division comprise 24% of the company's $US33 billion in defense sales. To be sure, the defense market is a tough one. However, when U.S. sales are lost it's not because a foreign buyer was angry over NSA and decided to buy weapons systems in-country. More often the answer is far simpler: competition from a major non-U.S. player. Example: Turkey's decision to dis Raytheon's bid for a long range air defense system was a simple dollars and cents matter: China, not exactly a bastion of human rights, won the contract. Russian and European companies were also among the losers. No one uttered a peep about the NSA. Defense executives don't sit around fretting about foreign sales supposedly lost due to U.S. spying. Their real worry is China, an increasingly aggressive player in the defense systems market. The story of the U.S. tech and defense industries' rampage of profits over the last year, much and sometimes most of it driven by foreign buyers, is borne out by the numbers: more sales, higher revenues and equities prices. Those are all hard numbers, too, not guess work. The same can't be said of the tech leaders who don sackcloth and ashes when bewailing the imagined impact of the NSA scandal on offshore sales while growing rich in the same markets. Where, one might well ask, is the documentation supporting the doom-mongers' forecasts? Let's travel back in time. The Open Technology Institute Paper: Beginning with a meeting of some 30 tech industry leaders with President Barack Obama last December, the cascade of warnings gained mass. Soon after, pollsters and financial analysts chimed in, pointing to public surveys showing widespread global anger over the NSA and threats that foreign buyers would keep their tech wallets at home. The poster child of the complaints: cloud computing. Tech companies expressed grave concern that foreign customers would cease to use U.S. cloud companies, many of which operate offshore data centers that would seem easy targets for the NSA. As proof that this trend already had wings, analysts pointed to a Swiss cloud company, Artmotion, which in June 2013 touted a sudden 45% surge in revenue, supposedly due to non-U.S. customers exiting more vulnerable services provided by American companies, in favor of Artmotion. [More about Artmotion in a moment.] Similar charges dribbled into the media during the first half of 2014. But the crowning blow, if one wants to call it such, came in late July with the publication of "Surveillance Costs: The NSA's Impact on the Economy, Internet Freedom and Cybersecurity," a 35-page policy paper by the Open Technology Institute (OTI). Why the seven-month delay? One would presume that OTI wanted to take sufficient time to amass evidence of the disastrous impact of the NSA on U.S. technology and the economy. The question is: Did they succeed? Frankly, the end result of all OTI's effort is lame at best, scurrilous at worst. In the policy paper's discussion of "Direct Economic Costs to American Companies," the authors quote widely from TIME Magazine, The Washington Post and Congressional testimony on how "NSA Spying Could Cost U.S. Tech Giants Billions." Cloud computing is presented as the immediate victim. The OTI paper cites the example of Dropbox, Amazon Web Services (AWS) and Microsoft Azure suffering severe losses in foreign cloud sales to Switzerland's Artmotion due to foreign anger over the NSA. The source: an article published in The International Business Times, "Companies Turn to Switzerland for Cloud Storage," on July 4, 2013, three weeks after the first NSA revelations by Edward Snowden. Describing Artmotion as Switzerland's biggest offshore hosting company, the article quotes the company's CEO Mateo Meier claiming a 45% jump in revenue during that period. Aspects of the original article and the policy paper show how easily speculation is presented as fact by sloppy authors eager to make a point without bothering to check their facts: Nowhere in the International Business Times story is any evidence produced showing AWS, Dropbox or Azure losing business. Nor is any concrete number on losses presented. The closest the reporter can come is to aver: "However now services like Dropbox, AWS and Azure are seen as potentially insecure . . ." Seen by whom? The IBT doesn't say. The OTI policy paper cites the IBT article as the source for an assertion that companies like Dropbox and Amazon were beginning to lose business to overseas business. Remember: the IBT didn't cite any losses by these companies; it merely said they were "seen" [by unnamed sources] as potentially insecure. It's anybody's guess whether Artmotion is Switzerland's biggest (or smallest) offshore hosting company. Artmotion is a privately held company. It does not provide any public data on finances, numbers of employees or clients, or any other information that could be used to determine the company's size. A 45% revenue gain in three weeks would defy the odds for a large enterprise, so to borrow the practice of speculating from our subject, it is most likely that Artmotion is a smaller entrepreneurial venture led by a CEO who had the savvy to capitalize on the NSA scandal. Large enterprise customers, who took years to trust the idea of handing over their data to third party cloud providers, are notoriously slow to embrace change. The likelihood of FTSE1000 companies shifting cloud service providers in three weeks(!) is preposterous. Even Mom and Pop cloud customers would scarcely be apt to change their minds and shift all their cloud-stored data that quickly. Even assuming that the overnight 45% revenue boost claim is true, where is the proof tying this cash surge to non-U.S. customers defecting from Amazon, Dropbox or Google to Artmotion? Answer: There is no proof. It's pure hearsay. If we're picking on Artmotion overmuch, it's for good cause. This case study is the most substantial proof in the entire OTI paper. From there it degenerates into even more dubious assessments by analysts and industry think tanks. Of these, one of the better studies is by the International Technology and Innovation Foundation (ITIF), generally hailed as a non-partisan group. Published in August 2013, the report honestly states that at that early date, "the data are still thin; clearly this is a developing story and perceptions will likely evolve." But even ITIF resorts to maybes versus facts. Example: a projection that U.S. cloud computing companies might lose US $21.5 billion by 2016, presuming that 10% of foreign customers flee, or up to US$ 35 billion assuming a 20% attrition rate. The basis for these assumptions: a survey by yet another think tank, The Cloud Security Alliance, which found 10% of non-U.S. respondents saying they had cancelled a project with a U.S. cloud communications provider. And so it goes with the OTI study. The authors leap from speculation to fact, or quote studies based on assumptions by one group that hinge on conclusions of yet another organization. All sources tend to be very early days, when emotions on the NSA ran high. If the current numbers exist bearing out the case for NSA spying damaging U.S. tech companies' foreign sales, then why doesn't OTI quote them? Instead, the farther one progresses into the OTI policy paper, the more infatuated its authors become with wildly exaggerated projections of tech industry losses. Within a few paragraphs of ITIF's claims of cloud losses reaching $US 35 billion, we find a truly astounding quote from Forrester Research. Not to be outdone by a mere think tank, the famous industry analyst group forecasts U.S. cloud company losses of $US 180 billion by 2016. That's a good trick for an industry whose total growth was projected to reach just $US 210 billion, also by the year 2016 and also by Forrester, just a few months earlier.

2NC No Tech Damage


Companies won't leave the US; the market is too large
Corn 7/13
(Corn, Geoffrey S. * Presidential Research Professor of Law, South Texas College of Law; Lieutenant
Colonel (Retired), U.S. Army Judge Advocate Generals Corps. Prior to joining the faculty at South
Texas, Professor Corn served in a variety of military assignments, including as the Armys Senior Law
of War Advisor, Supervisory Defense Counsel for the Western United States, Chief of International Law
for U.S. Army Europe, and as a Tactical Intelligence Officer in Panama. Averting the Inherent Dangers
of 'Going Dark': Why Congress Must Require a Locked Front Door to Encrypted Data, SSRN. 07-132015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes//ghs-kw)
The risks related to going dark are real. When the President of the United States,60 the Prime Minister of the
United Kingdom,61 and the Director of the FBI62 all publically express deep concerns about how this phenomenon
will endanger their respective nations, it is difficult to ignore. Today, encryption technologies that are making it
increasingly easy for individual users to prevent even lawful government access to potentially vital information
related to crimes or other national security threats. This evolution of individual encryption capabilities represents a
fundamental distortion of the balance between government surveillance authority and individual liberty central to
the Fourth Amendment. And balance is the operative word. The right of The People to be secure against
unreasonable government intrusions into those places and things protected by the Fourth Amendment must be
vehemently protected. Reasonable searches, however, should not only be permitted, but they should be mandated where necessary. Congress has the authority to ensure that such searches are possible. While some argue that this could cause American manufacturers to suffer, saddled as they will appear to be by the Snowden Effect, the rules will apply equally to any manufacturer that wishes to do business in the United States. Considering that the United States economy is the largest in the world, it is highly unlikely that foreign manufacturers will forego access to our market in order to avoid having to create CALEA-like solutions to allow for lawful access to encrypted data. Just as foreign cellular telephone providers, such as T-Mobile, are active in the United States, so too will foreign device manufacturers and other communications services adjust their technology to comply with our laws and regulations. This will put American and foreign companies on an equal playing field while encouraging ingenuity and competition. Most importantly, the right of the people to be secure in their persons, houses, papers, and effects will be protected not only against unreasonable searches and seizures, but also against attacks by criminals and terrorists. And is not this, in essence, the primary purpose of government?

1NC Tech High


US tech leadership is strong now, despite the rise in Asian science that their ev describes
Segal 4
(Adam, director of the Program on Digital and Cyberspace Policy at the
Council on Foreign Relations (CFR), An expert on security issues, technology
development, November/December 2004 Issue, Is America Losing Its Edge,
https://www.foreignaffairs.com/articles/united-states/2004-11-01/americalosing-its-edge, BC)
The United States' global primacy depends in large part on its ability to develop new technologies and industries faster than anyone else. For the last five decades, U.S. scientific innovation and technological entrepreneurship have ensured the country's economic prosperity and military power. It was Americans who invented and commercialized the semiconductor, the personal computer, and the Internet; other countries merely followed the U.S. lead. Today, however, this technological edge-so long taken for granted-may be slipping, and the most serious challenge is coming from Asia. Through competitive tax policies, increased investment in research and development (R&D), and preferential policies for science and technology (S&T) personnel, Asian governments are improving the quality of their science and ensuring the exploitation of future innovations. The percentage of patents issued to and science journal articles published by scientists in China, Singapore, South Korea, and Taiwan is rising. Indian companies are quickly becoming the second-largest producers of application services in the world, developing, supplying, and managing database and other types of software for clients around the world. South Korea has rapidly eaten away at the U.S. advantage in the manufacture of computer chips and telecommunications software. And even China has made impressive gains in advanced technologies such as lasers, biotechnology, and advanced materials used in semiconductors, aerospace, and many other types of manufacturing. Although the United States' technical dominance remains solid, the globalization of research and development is exerting considerable pressures on the American system. Indeed, as the United States is learning, globalization cuts both ways: it is both a potent catalyst of U.S. technological innovation and a significant threat to it. The United States will never be able to prevent rivals from developing new technologies; it can remain dominant only by continuing to innovate faster than everyone else. But this won't be easy; to keep its privileged position in the world, the United States must get better at fostering technological entrepreneurship at home.

2NC Tech High


Tech sector is growing
Grisham 2/10 (Preston Grisham, United States Tech Industry Employs 6.5

Million in 2014, February 10th, 2015, https://www.comptia.org/aboutus/newsroom/press-releases/2015/02/10/united-states-tech-industryemploys-6.5-million-in-2014)


Washington, D.C., February 10, 2015 The U.S. tech industry added 129,600 net
jobs between 2013 and 2014, for a total of nearly 6.5 million jobs in the U.S .,
according to Cyberstates 2015: The Definitive State-by-State Analysis of the
U.S. Tech Industry published by CompTIA. The report represents a
comprehensive look at tech employment, wages, and other key economic
factors nationally and state-by-state, covering all 50 states, the District of
Columbia, and Puerto Rico. This years edition shows that tech industry jobs
account for 5.7 percent of the entire private sector workforce . Tech industry
employment grew at the same rate as the overall private sector, 2 percent,
between 2013-2014. Growth was led by the IT services sector which added

63,300 jobs between 2013 and 2014 and the R&D, testing, and engineering
services sector that added 50,700 jobs. The U.S. tech industry continues to
make significant contributions to our economy, said Todd Thibodeaux, president
and CEO, CompTIA. The tech industry accounts for 7.1 percent of the overall U.S.
GDP and 11.4 percent of the total U.S. private sector payroll. With annual average
wages that are more than double that of the private sector, we should be doing all
we can to encourage the growth and vitality of our nations tech industry.

Tech spending increasing now despite projections


Seitz 1/30/15
(Patrick, 1/30/15, Investors Business Daily, Software apps to continue
dominating cloud sales, http://news.investors.com/technology-click/013015736967-software-as-a-service-gets-lions-share-of-public-cloud-revenue.htm,
7/13/15, SM)
Public cloud computing services are a bright spot in the otherwise stagnant corporate information technology
market, and software-as-a-service (SaaS) vendors are seen benefiting disproportionately in the years ahead.

Public cloud spending reached $67 billion in 2014 and is expected to hit $113 billion
in 2018, Technology Business Research said in a report Wednesday. "While the vast majority of IT companies
remain plagued by low-single-digit revenue growth rates at best, investments in public cloud from
software-centric vendors such as Microsoft and SAP are moving the corporate
needle," TBR analyst Jillian Mirandi said in a statement. Microsoft (NASDAQ:MSFT) is pushing the cloud

development platform Azure and migrating Office customers to the cloud-based Office 365. SAP (NYSE:SAP) got a
late start to the public cloud but has acquired SuccessFactors and Ariba to accelerate its efforts. The second half of
2014 was marked by partnerships and integration of services from different vendors in the software-as-a-service sector. SaaS vendors like Salesforce.com (NYSE:CRM) and Workday (NYSE:WDAY) have also added cloud-based analytics applications, which have increased their appeal to business users, Mirandi said. Software-as-a-service accounted for 62% of public cloud spending last year, and the percentage will decline only modestly in the years ahead. Technology Business Research estimates that SaaS will be 59.5% of public cloud spending in 2018. Infrastructure-as-a-service (IaaS) is the second-largest category of public cloud spending, at 28.5% in 2014, but climbing to 30.5% in 2018. IaaS vendors include Amazon.com's (NASDAQ:AMZN) Amazon Web Services, Microsoft and Google (NASDAQ:GOOGL). Platform-as-a-service (PaaS) is the third category, accounting for 9.5% of spending last year and projected to be 10% in 2018, TBR says. PaaS vendors include Google, Microsoft and Salesforce.com.

Tech industry spending high now


Columbus 14
(Louis, 2/24/14, Forbes, The Best Cloud Computing Companies And CEOs To
Work For In 2014,
http://www.forbes.com/sites/louiscolumbus/2014/02/24/the-best-cloudcomputing-companies-and-ceos-to-work-for-in-2014/, 7/17/15, SM)
IT decision makers spending on security technologies will increase 46% in 2015, with cloud
computing increasing 42% and business analytics investments up 38% . . Enterprise

investments in storage will increase 36%, and for wireless & mobile, 35%. Cloud computing initiatives are the most
important project for the majority of IT departments today (16%) and are expected to cause the most disruption in
the future. IDG predicts the majority of cloud computings disruption will be focused on improving service and
generating new revenue streams. These and other key take-aways are from recent IDG Enterprise research titled
Computerworld Forecast Study 2015. The goal of the study was to determine IT priorities for 2015 in areas such as
spending, staffing and technology. Computerworld spoke with 194 respondents, 55% of which are from the
executive IT roles. 19% from mid-level IT, 16% in IT professional roles and 7% in business management. You can
find the results and methodology of the study here. Additional key take-aways from the study include: Enterprises
are predicting they will increase their spending on security technologies by 46%, cloud computing by 42% with the

greatest growth in enterprises with over 1,000 employees (52% ), 38% in business analytics,
36% for storage solutions and 35% for wireless & mobile. The following graphic provides an overview of the top five
tech spending increases in 2015:

Tech spending is through the roof now


Holland 1/26 (Simon Holland, Marketing technology industry set for
explosive revenue gains, 1/26/15
http://www.marketingtechnews.net/news/2015/jan/26/marketing-technologyindustry-set-explosive-revenue-gains/)

Companies investing in marketing technology will continue to raise their budgets ,


with global vendor revenue forecasted to touch $32.2 billion by 2018. The
projections, part of an IDC webinar on the marketing software revolution, reveal a compound annual
growth rate (CAGR) of 12.4% and total spend of $130 billion across the five-year stretch
between 2014 and 2015. Customer relationship management software is a sizable
growth sector of marketing, with projections from IDCs software tracker predicting CRM application
revenue will reach $31.7 billion by 2018, a CAGR of 6.9%. A MaaS revival Most marketing solutions are
available in the cloud, but some large businesses are acquiring these point solutions, investing in them and then
turning them into a marketing as a service platform. The MaaS, an industry segment bundling a tech platform,
creative services and the IT services to run it, is making a comeback after economic uncertainty stunted investment
in this area for so many years. IDCs view on marketing as a service platforms is that it will blend global media and
marketing tech expenditure. There may have been little or no budget being attributed to this type of product in 2014, but IDC has forecasted increases in the run up to 2018. Getting the investment in early can set a company up for a similar or larger return later down the road, a fact demonstrated by IDC that puts spend from digital marketing leaders at $14 million while achievers and contenders set aside $4.2 million and $3.1 million respectively.

The tech sector is growing now; employment numbers prove it


Snyder 2/5 (Bill Snyder, The best jobs are in tech, and so is the job

growth, Febuary 5th, 2015, http://www.infoworld.com/article/2879051/itcareers/the-best-jobs-are-in-tech-and-so-is-the-job-growth.html)


In 2014, IT employment grew by 2.4 percent. Although that doesn't sound like much, it represents more than 100,000 jobs. If the projections by CompTIA and others hold up, the
economy will add even more this year. Tech dominates the best jobs in America A
separate report by Glassdoor, a large job board that includes employee-written reviews of companies and top
managers, singled out 25 of the best jobs in America, and 10 of those were in IT. Judged by a combination of
factors -- including earnings potential, career opportunities, and the number of current job listings -- the highestrated tech job was software engineer, with an average base salary of $98,074. In the last three months, employers
have posted 104,828 openings for software engineers and developers on the Glassdoor job site, though many are
no longer current. (Glassdoor combines the titles of software developers and software engineers, so we don't know
how many of those positions were just for engineers.) The highest-paid tech occupation listed on Glassdoor is
solutions architect, with an average base pay of $121,657. Looked at more broadly, the hottest tech occupation in
the United States last year was Web developer, for which available jobs grew by 4 percent to a total of 235,043 jobs
-- a substantial chunk of the 4.88 million employed tech workers, according to the U.S. Bureau of Labor Statistics.

As for tech support, jobs in that occupation increased by 2.5 percent to 853,256, which is a bit more than overall tech job growth of 2.4 percent. Taken together, the two new reports provide more evidence that we can expect at least another year of buoyant employment prospects in IT -- and give rough guidelines of the skills you need to get a great job and the potential employers you might contact. Hiring across the economy: "Most striking is the shift in employer attitudes over the last year or two," says Tim Herbert, CompTIA's vice president of research. "There's less concern about the bottom dropping out," he said. Even worst-case estimates by employers are not at all bad, he adds. The survey found that 43 percent of the companies say they are understaffed, and 68 percent say they expect filling those positions will be challenging or very challenging. If that's the case, supply and demand should push salaries even higher. One of the most positive trends in last year's employment picture is the broad wave of IT hiring stretching across different sectors of the economy. Companies that posted the largest number of online ads for IT-related jobs were Accenture, Deloitte, Oracle, General Dynamics, Amazon.com, JP Morgan, United Health, and Best Buy, according to Burning Glass Technologies Labor Insights, which tracks online advertising. "Information technology now pervades the entire economy," says CompTIA's Herbert. What's more, technologies like cloud computing and software as a service are cheap enough and stable enough for small and medium-sized businesses to adopt, which in turn creates even more job opportunities, he notes.

1NC No Localization
No localization; the trade restrictions it creates are too costly
Marel, Lee-Makiyama, and Bauer in 15
(Erik van der Marel, Hosuk Lee-Makiyama, and Matthias Bauer, ECIPE, published May 2014, accessed 7-15-2015, "The Costs of Data Localisation: A Friendly Fire on Economic Recovery," http://www.ecipe.org/publications/dataloc/)
This paper aims to quantify the losses that result from data localisation requirements and related data privacy and security laws that discriminate against foreign suppliers of data, and downstream goods and services providers. The study looks at the effects of recently proposed or enacted legislation in seven jurisdictions, namely Brazil, China, the European Union (EU), India, Indonesia, South Korea and Vietnam. Access to foreign markets and globalised supply chains are the major sources of growth, jobs and new investments in particular for developing economies. Manufacturing and exports are also dependent on having access to a broad range of services at competitive prices, which depend on secure and efficient access to data. Data localisation potentially affects any business that uses the internet to produce, deliver, and receive payments for their work, or to pay their salaries and taxes. The impact of recently proposed or enacted legislation on GDP is substantial in all seven countries: Brazil (-0.2%), China (-1.1%), EU (-0.4%), India (-0.1%), Indonesia (-0.5%), Korea (-0.4%) and Vietnam (-1.7%). These changes significantly affect post-crisis economic recovery and can undo the productivity increases from major trade agreements, while economic growth is often instrumental to social stability. If these countries would also introduce economy-wide data localisation requirements that apply across all sectors of the economy, GDP losses would be even higher: Brazil (-0.8%), the EU (using GTAP8, -1.1%), India (-0.8%), Indonesia (-0.7%), Korea (-1.1%). The impact on overall domestic investments is also considerable: Brazil (-4.2%), China (-1.8%), the EU (-3.9%), India (-1.4%), Indonesia (-2.3%), Korea (-0.5%) and Vietnam (-3.1%). Exports of China and Indonesia also decrease by -1.7% as a consequence of direct loss of competitiveness. Welfare losses (expressed as actual economic losses by the citizens) amount to up to $63 bn for China and $193 bn for the EU. For India, the loss per worker is equivalent to 11% of the average month salary, and almost 13 percent in China and around 20% in Korea and Brazil. The findings show that the negative impact of disrupting cross-border data flows should not be ignored. The globalised economy has made unilateral trade restrictions a counterproductive strategy that puts the country at a relative loss to others, with no possibilities to mitigate the negative impact in the long run. Forced localisation is often the product of poor or one-sided economic analysis, with the surreptitious objective of keeping foreign competitors out. Any gains stemming from data localisation are too small to outweigh losses in terms of welfare and output in the general economy.

Localization does not kill the Internet or the economy; their evidence concedes that foreign countries still want access to the US market
Richard Adhikari, 7-16-2015, "The Fallout From the NSA's Backdoors
Mandate," No Publication,
http://www.ecommercetimes.com/story/81530.html)//GV
The United States National Security Agency (NSA)

is widely believed to have mandated high-tech


vendors build backdoors into their hardware and software. Reactions from foreign
governments to the news are harming American businesses and, some contend, may result in

the breakup of the Internet. For example, Russia is moving to paper and typewriters in
some cases to move certain types of information, Private .me COO Robert Neivert told the ECommerce Times. Governments are pushing to enact laws to force the localization of data -generally meaning they won't allow data to be stored outside their borders to
protect citizens against NSA-type surveillance -- a move that's of particular concern
to American businesses, according to a Lawfare Research paper. That's because they deem U.S. firms untrustworthy
for having provided the NSA with access to the data of their users. Revisiting the Tower of Babel? "There's an increased use of
networks on behalf of Europe and other allies that do not pass through U.S. companies or U.S.-controlled networks," Neivert said.

Some countries are even proposing to break up the Internet . However, "people
who say these things threaten the Internet itself are misunderstanding
things," Jonathan Sander, strategy & research officer of Stealthbits Technologies, told the E-Commerce Times. "The
Internet produces too much wealth for too many people and organizations
for anyone, including the U.S., to threaten it." The U.S. economy "is one of
the best weapons we have in the technology war," Sander continued. The
U.S. market "is too big for foreign governments to ignore," which is why
foreign companies continue doing business with the U.S. Concern has
been expressed about invasions of privacy through surveillance, but this
issue is "a matter of policy" and there are differences in how citizens of
different countries approach it, Sander pointed out. "In the EU and, to a lesser extent [Australia and New
Zealand], privacy is an issue at the ballot box so there are laws reflecting that." In the U.S., however, privacy "has yet to seriously
break through as an issue, so there has been less motion," Sander remarked. Massive Cost to U.S. Businesses In August of last year,
the German government reportedly warned that Windows 8 could act as a Trojan when combined with version 2.0 of the Trusted
Platform Module (TPM), a specification for a secure cryptoprocessor. The TPM is included in many laptops and tablets, and the
concern is that TPM 2.0 makes trusted computing functions mandatory rather than opt-in as before, meaning it can't be disabled.
Further, it can let Microsoft establish a backdoor into the device it's in. Microsoft's response was that OEMs can turn off the TPM in
x86 computers. The German government will end its contract with Verizon; Brazil has decided to replace its fighter jets with ones
made by Sweden's Saab instead of Boeing; and Web hosting firm Servint Corp. reported a 30 percent decline in overseas business
since the NSA leaks first made news in June 2013. "There is both diplomatic and economic backlash against these tactics," Robyn
Greene, policy counsel at New America's Open Technology Institute, told the E-Commerce Times. It's difficult to establish an exact
dollar amount, but "experts

have estimated that losses to the U.S. cloud industry alone


could reach (US)$180 billion over the next three years," Greene said. "Additionally,
major U.S. tech companies like Cisco and IBM have lost nearly one-fifth of their
business in emerging markets because of a loss of trust." Foreign companies are
using their non-U.S. status to advertise themselves as more secure or protective of
privacy, Greene remarked. The Other Side of the Story On the other hand, Cisco's share of the
service provider router and carrier Ethernet market bounced back strongly
after an unusually weak Q2, primarily because of a strong performance in
the Asia-Pacific and the EMEA regions, SRG Research reported. "Cisco is in a league
of its own, with a global presence, credibility and product range that
cannot be matched by its competitors," John Dinsdale, managing director and chief analyst at SRG, told
the E-Commerce Times. "When demand increases, there is only a rather short list of
vendors who can satisfy it, and Cisco clearly has the strongest story to
tell." In addition, the allegations that U.S. high-tech firms built backdoors into
their products are not true, contended Philip Lieberman, president of Lieberman Software. "I have never
seen any cooperation between U.S.-owned software or hardware manufacturers to
insert backdoors into their products for the use of the NSA ," Lieberman told the E-Commerce
Times. "The damage that such an inclusion would cause to the company that did so would be catastrophic and probably

With its backdoors, the NSA "broke the foundational


element of trust, and that's something very difficult to recover from . [It has] in effect
destroyed the trusted and secure reputation of U.S. companies ," said Neivert. "More and
more we will see U.S. tech companies focusing on distinguishing their
products and services with heightened security offerings and working to
achieve legislative reforms that would rein in [surveillance practices]. That's
unrecoverable." Rebuilding Faith and Trust

the case with the Reform Government Surveillance Coalition and tech industry trade associations that represent thousands of
companies," New America's Open Technology Institute's Greene added

2NC No Localization
No localization; the transition costs are prohibitive
Lee-Makiyama in 14 (Hosuk Lee-Makiyama, director of European Centre for
International Political Economy (ECIPE) and a leading author on trade
diplomacy, EU-Far East relations and the digital economy. ECIPE, 7-16-2015,
"The costs of data localization," http://www.ecipe.org/blog/the-costs-of-datalocalization/)
In the aftermath of recent revelations on mass-scale electronic
surveillance, there has been a widespread proliferation of internet
restrictions. One of the most drastic, yet a common policy response to the
problem has been the mandatory requirement on storing critical data on
servers physically located inside the country. This policy of data
localisation has been considered by a number of countries including Brazil,
where the multistakeholder summit NetMundial 2014 is taking place this
week.What Brazil and other countries like China, the European Union, India, Indonesia,
Korea and Vietnam (who all have considered similar strategies) fail or choose to fail to see, is
that information security is not a function of where data is physically
stored or processed. A forthcoming [released on May 15th] study by ECIPE economists shows that cross-border data flow is essential for developing economies to secure access to foreign markets and participation in global supply chains. Thus, data is a major source of growth, jobs and new investments. Manufacturing and exports are highly dependent on access to support services at competitive prices, services that depend on secure and efficient access to data. Forced data localisation affects any business that uses the internet to produce, deliver, and receive payments for their work, or to pay their salaries and taxes. The results of our study show that even the current language of Brazil's Marco Civil da Internet (without

mandatory data localization) results in a GDP loss of -0.2%; EU GDPR results in -0.4%. Other results include China (1.1%), India (-0.1%), Indonesia (-0.5%), Korea (-0.4%) and Vietnam (-1.7%) for their internet policies.An economywide data localisation requirement (or discriminatory barriers to that effect) would substantially increase the GDP
loss if they are enforced: Brazil (-0.8%), the EU (-1.1%), India (-0.8%), Indonesia (-0.7%), Korea (-1.1%). Even
conservative estimates are sufficient to eradicate all post-crisis economic recovery, benefits from all their currently
negotiated trade agreements, or may even cause social unrest in some countries.Impact on investments is also
considerable: Brazil (at least -4.2%), China (-1.8%), the EU (-3.9%), India (-1.4%), Indonesia (-2.3%), Korea (-0.5%)
and Vietnam (-3.1%). Exports of China and Indonesia also decrease by -1.7% as a direct consequence of loss of
competitiveness. Welfare losses (expressed as actual financial loss by its citizens) are up to 63bn USD for China
and 193 bn USD for the EU. The findings show that the negative impact from disrupting data should not be ignored.
The globalised economy has made unilateral trade restrictions a counterproductive strategy that puts the country
at a relative loss to others, with no possibilities to mitigate the negative impact in the long term. Forced localisation
is often the product of poor, one-sided economic analysis, often with the surreptitious objective of keeping foreign
competitors out although economic and security gains are too small to outweigh losses in terms of jobs and
output in the general economy.

Big tech companies are not going to localize their data


Miller in 14 (Claire Cain Miller, 1-24-2014, "Google Pushes Back
Against Data Localization," Bits Blog,
http://bits.blogs.nytimes.com/2014/01/24/google-pushes-back-against-datalocalization/, NB)
The big tech companies have put forth a united front when it comes to
pushing back against the government after revelations of mass surveillance. But
their cooperation goes only so far. Microsoft this week suggested that it would deepen its existing efforts to allow
customers to store their data near them and outside the United States. Google, for its part, has been fighting this
notion of so-called data localization. If

data localization and other efforts are successful,

then what we will face is the effective Balkanization of the Internet and
the creation of a splinternet broken up into smaller national and regional pieces, with barriers
around each of the splintered Internets to replace the global Internet we know today, Richard Salgado, Googles
director of law enforcement and information security, told a congressional panel in November. Data crisscrosses the
globe among data centers, and companies often store redundant copies of data in different places in case of natural
disaster or technical failure. In most cases, companies cannot even pinpoint precisely where certain data is

located. At the same time, the United States government is tapping the
fiber-optic network that connects data centers worldwide, according to
leaked documents. So even if data is stored outside the United States, it
could be intercepted during its travels. Still, Microsoft and other tech
companies are trying to prevent foreign customers from switching to
services outside the United States. In the next three years, the cloud
computing industry could lose $180 billion, 25 percent of its revenue,
because of such defections, according to Forrester, a research company.
Yet even though Google faces these same risks and requests from foreign
customers, its policy position is for surveillance reform instead of data
localization, according to a person briefed on Googles policy who would speak only anonymously. Though
Google at one time tried to offer customers the ability to store their data in one location in response to requests, it does not offer that feature now because it determined it was illogical, the person said. Google decided data is more secure if it is stored in multiple locations and that storing it in one location slows Google services and makes accessing the data less convenient for customers, the person said. Mr. Salgado said a proposed law in Brazil that would

require all data of Brazilian citizens and companies to be stored in the country would be so difficult to comply with
that Google could

be barred from doing business in one of the worlds most


significant markets. For a great many around the globe, the Snowden disclosures
revealed a disturbing relationship between the major U.S. technology
firms and the American national security establishment . Specifically, the disclosures
showed that Yahoo, Google, and other large American tech companies had
provided the NSA with access to the data of the users of their services .
Although there were many programs that tied the major American firms to the NSA , three in particular
drew special ire: the much-discussed PRISM7 program, a collaborative
effort between the NSA and the FBI which compelled Internet companies
to hand over data held within servers located on U.S. soil in response a
subpoena issued by a special intelligence court, and two programs known
as MUSCULAR and TEMPORA,89 both of which allowed the NSA (in
partnership with Britains signals intelligence agency, the GCHQ) to access
information transmitted through the data communication links of
American-owned firms located outside the U.S., where statutory
limitations on data collection are far less stringent.10 The fact that American companies provided the U.S. government with information and access to data (knowingly in some cases, apparently unwittingly in others) has led many foreign leaders to conclude that only domestic firms, or at least non-American firms operating exclusively within local jurisdictions, can be trusted to host the data of their citizens. Prominent political

voices around the globe have been anything but subtle in their articulation of this assessment. Following the publication of the PRISM program in the Guardian newspaper, German Interior Minister Hans-Peter Friedrich declared that, whoever fears their communication is being intercepted in any way should use services that don't go through American servers.11 France's Minister for the Digital Economy similarly insisted that it was now

necessary to locate datacenters and servers in [French] national territory in order to better ensure data
security.12 Brazilian President Dilma Rousseff agreed, insisting that, "there

is a serious problem of
storage databases abroad. That certain situation we will no longer
accept."13 Unsurprisingly, these declarations from government officials at
the ministerial level and higher, and the policy responses those

declarations suggest, are profoundly troubling to American technology


companies. U.S. firms have issued dire warnings in response ,14 predicting that
they could lose tens of billions of dollars in revenue abroad as distrustful foreign governments and customers move -- either by choice or by legal mandate -- to non-U.S. alternatives. Firms fear that the anti-American backlash and potentially resulting data localization laws (depending on the specifics of the rules enacted) will mean that they will be forced out of certain markets, or forced to build expensive and oftentimes unnecessarily redundant data centers abroad. Analysts are suggesting the fallout could mirror what happened to Huawei and ZTE, the Chinese technology and telecommunications firms that were forced to abandon some U.S. contracts when American lawmakers accused the companies of planting in their products coding backdoors for the Chinese People's Liberation Army and intelligence services.15 A much-cited estimate16 by the Information Technology and

1NC Alt Causes


Unilateralism fails and there are alt causes to econ slowdown -- their evidence
Richard N. Haass 13, President of the Council on Foreign Relations, 4/30/13,
The World Without America, http://www.project-syndicate.org/commentary/repairing-the-roots-of-american-power-by-richard-n--haass
Let me posit a radical idea: The most critical threat facing the United States now and for the foreseeable future is not a rising China, a reckless North Korea, a nuclear Iran, modern terrorism, or climate change. Although all of these constitute potential or actual threats, the biggest challenges facing the US are its burgeoning debt, crumbling infrastructure, second-rate primary and secondary schools, outdated immigration system, and slow economic growth -- in short, the domestic foundations of American power. Readers in other countries may be tempted to react to this judgment with a dose of schadenfreude, finding more than a little satisfaction in America's difficulties. Such a response should not be surprising. The US and those representing it have been guilty of hubris (the US may often be the indispensable nation, but it would be better if others pointed this out), and examples of inconsistency between America's practices and its principles understandably provoke charges of hypocrisy. When America does not adhere to the principles that it preaches to others, it breeds resentment. But, like most temptations, the urge to gloat at America's imperfections and struggles ought to be resisted. People around the globe should be careful what they wish for. America's failure to deal with its internal challenges would come at a steep price. Indeed, the rest of the world's stake in American success is nearly as large as that of the US
itself. Part of the reason is economic. The US economy still accounts for about one-quarter of global output. If US growth accelerates, Americas
capacity to consume other countries goods and services will increase , thereby boosting
growth around the world. At a time when Europe is drifting and Asia is slowing , only the US
(or, more broadly, North America) has the potential to drive global economic recovery . The US
remains a unique source of innovation. Most of the worlds citizens communicate with mobile devices based on technology
developed in Silicon Valley; likewise, the Internet was made in America. More recently, new technologies developed in the US
greatly increase the ability to extract oil and natural gas from
underground formations. This technology is now making its way around
the globe, allowing other societies to increase their energy production and
decrease both their reliance on costly imports and their carbon emissions.
The US is also an invaluable source of ideas. Its world-class universities
educate a significant percentage of future world leaders. More fundamentally, the US
has long been a leading example of what market economies and democratic politics
can accomplish. People and governments around the world are far more likely to become
more open if the American model is perceived to be succeeding. Finally, the world faces
many serious challenges, ranging from the need to halt the spread of weapons of mass destruction,
fight climate change, and maintain a functioning world economic order that promotes trade
and investment to regulating practices in cyberspace, improving global health, and
preventing armed conflicts. These problems will not simply go away or sort
themselves out . While Adam Smiths invisible hand may ensure the success of free markets, it is powerless in the
world of geopolitics . Order requires the visible hand of leadership to formulate and
realize global responses to global challenges. Dont get me wrong: None of this is meant to suggest that the US can deal
effectively with the worlds problems on its own. Unilateralism rarely works. It is not just that the US
lacks the means; the very nature of contemporary global problems
suggests that only collective responses stand a good chance of
succeeding. But multilateralism is much easier to advocate than to design and implement. Right
now there is only one candidate for this role: the US. No other country has the
necessary combination of capability and outlook . This brings me back to the argument that the US must put
its house in order economically, physically, socially, and politically if it is to have the resources
needed to promote order in the world. Everyone should hope that it does: The alternative to a world led by the
US is not a world led by China, Europe, Russia, Japan, India, or any other country, but rather a world that is not led at all. Such a world would almost certainly be characterized by chronic crisis and conflict. That would be bad not just for Americans, but for the vast majority of the planet's inhabitants.

Alt cause -- loss of foreign investment is because the NSA surveils foreign suspects; the Aff can only resolve domestic surveillance
Benner 14
(Katie, 12/19/14, BloombergView, Microsoft and Google in a Post-Snowden World, Katie
Benner is a columnist @ BloombergView reporting on companies, culture, and technology,
http://www.bloombergview.com/articles/2014-12-19/microsoft-and-google-in-a-post-snowden-world, 7/13/15, SM)
His documents revealed myriad NSA spy programs that hoovered up information on foreign
suspects as well as U.S. citizens. The agency had also pressured telecom companies like Verizon and
Internet giants like Google to feed customer data into the government's vast surveillance operation. As the
Snowden revelations showed, the U.S. government was also actively exploiting corporate security flaws to take
whatever it wanted from those companies. In the wake of all of that, tech firms immediately tried to
distance themselves from the NSA, even as the Snowden revelations tarnished their reputations with
corporate clients, consumers and governments worldwide. Companies warned that fallout from the Snowden
revelations would hurt their future earnings and, anecdotally, it seemed that global customers started to look for
alternatives to U.S. tech suppliers.

2NC Alt Causes


Their evidence says there are alt causes to the econ -- tax, trade, and investment policies
Area Development December, 2012, "Participation in the Global Economy
Keeps U.S. Economy Growing," Area Development,
http://www.areadevelopment.com/BusinessGlobalization/December2012/global-participation-grows-US-economy-1259168.shtml)//GV

A study recently released by The Business Roundtable and the United States Council
for International Business comes to the conclusion that the success of U.S.
companies in the global economy directly relates to economic growth and job
creation here at home. Authored by Matthew Slaughter, Ph.D., of the Tuck School of Business at Dartmouth, American
Companies and Global Supply Networks: Driving U.S. Economic Growth and Jobs by Connecting with the World, shows how
globally engaged U.S. companies, their international operations, and supply
networks are linked to U.S. economic growth and employment . Despite ongoing economic
uncertainties, this study underscores the fact that millions of good American jobs are created when
companies engage in growing global markets via international trade and
investment, Slaughter says. The benefits of global engagement impact all levels of our
economy, not just those companies engaged in international commerce . The study, which
profiled Dow Chemical Co., Coca-Cola Co., ExxonMobil, FedEx Corp., IBM, Procter & Gamble, and Siemens, comes to the following conclusions: (1) globally engaged U.S. companies are the driving force in U.S. capital investment, R&D, and international trade, which in turn foster U.S. economic growth and the creation of well-paying jobs; (2) in order to access new customers and innovative ideas and continue to grow, these companies must participate in the global economy; and (3) global growth supports further hiring, investment, and R&D at these companies' U.S. facilities, while also creating jobs at other U.S. companies, often small- and medium-sized, within their global supply chains. The success of globally engaged U.S. companies has a direct and very positive impact on Main Street USA, says John Engler, president of Business Roundtable. To encourage and enable our companies to seek new markets and succeed anywhere in the world, we need tax, trade, and investment policies that reflect today's competitive global economy. The benefits at home are enormous.

1NC No Protectionism
Protectionism won't happen -- institutional and legal factors
Molinuevo 10 - trade policy expert [Martin, Protectionism in services
during the global crisis -- a (trade) war in shallow trenches? United Nations
Economic and Social Commission for Asia and the Pacific 2010]
The analysis of the measures taken during the 2008-2009 global economic crisis suggests
that, when it comes to international trade and investment in services, the scenario of a global trade
war or restrictive measures, has not really ever become a real one. Clearly, the
crisis seems to have granted the opportunity to some countries to give in to
protectionist pressures, particularly in industries where such pressures are
traditionally strong, such as automobiles, machinery industries and agriculture. To that end,
Governments have resorted to measures that tend to be poorly covered by the
international legal framework, such as subsidies schemes and government
procurement. With regard to international trade and investment in services, the analysis suggests that
a number of economic, legal and institutional factors complement each other to
create strong incentives against a general surge of protectionism . These elements,
indeed, de facto eliminate from the domestic regulatory capacity a number of
instruments that would allow Governments to protect domestic industries and
isolate them from the global economy. In such a legal, economic and institutional
context, a trade war seems unlikely. The above findings confirm the general perception that
international trade in services remains an area which is less accessible to direct governmental intervention. While in
the area of trade in goods, the Governments have a number of instruments to affect particular, chosen goods, at their disposal. When it comes to trade in services, regulatory action for individual sectors tends to be more costly and less readily available, which acts as a disincentive for the introduction of protectionist measures. National policymakers are better

equipped to focus on the development of general legal frameworks, leaving sector-specific matters to be developed
by specialized agencies with expertise in the individual sector. In the negotiating context, this translates into a need
for trade and foreign ministries to maintain close contacts with specific regulatory agencies. Another implication
relates to the strengthening of the multilateral trading system, and highlights apparent contradictions between
negotiations and actual policy needs. The above observations suggest that services generate less protectionist
pressures than trade in goods. Yet, at the multilateral level, a number of developing countries seem reluctant to
advance in international commitments in this area. This may in part be due to particular regulatory concerns

more active discussions on trade and


investment in services in multilateral negotiations would sustain the international
trading rules and would enhance coherence of the system , in particular vis--vis the
associated with certain services industries. However,

proliferation of preferential trade agreements. The regulatory developments on trade and investment in services
observed during the crisis also have strong implications for two matters on the multilateral agenda on services
disciplines. Some Asian WTO Members have devoted significant efforts to gather support for the introduction of a
special safeguard mechanism under GATS, with limited success. Such an instrument seems to offer few advantages
for regulators for the defense of domestic services in emergency situations. Indeed, no measure taken during the
economic crisis was aimed in that direction, not even in the financial sector. Trade negotiators would hence be well
advised to consider whether an emergency mechanism, that does not seem to attract major interest from their own
regulators in times of economic crisis, is worth investing such negotiating capital in. Conversely, the most popular
emergency measure resorted to during the crisis, subsidies, has received little interest at the multilateral table.
However, the GATS disciplines on non-discrimination do apply to state aid measures. The regulatory practice during
the global crisis has shown that emergency subsidies, temporary in nature, can prove a valuable instrument in times of crisis (i.e. promoting trade and investment rather than restricting it). WTO Members may draw on this experience in developing joint rules that would ensure that subsidies remain a valuable tool in the policy options for Governments in times of crises, while setting limits to the discriminatory and distortive effects that they may bring about.

2NC No Protectionism
No protectionism -- trade is universally popular
Stokes, 14 director of global economic attitudes at the Pew Research
Center (Bruce, U.S. isolationism isn't protectionism, CNN, 1/14/2014,
http://globalpublicsquare.blogs.cnn.com/2014/01/14/u-s-isolationism-isnt-protectionism/) //RGP
Isolationism is not protectionism. And confusing the two can create a false impression of the trajectory of U.S. global engagement in the year ahead. New polling data showing that the American public is turning inward, preoccupied with domestic affairs and less interested in international engagement, is not evidence of a rise in U.S. economic protectionism, with its grave consequences for global business. Indeed, even as their doubts grow over the future U.S. geopolitical role, Americans say that the benefits from U.S. participation in the global economy outweigh the risks. And even as

they harbor doubts about the impact of trade agreements on wages and jobs, public support for closer trade and business ties with other nations stands at its highest point in more than a decade. The Obama administration's disengagement from Iraq and Afghanistan, its leading from behind in Libya and its reluctance to become involved in the Syrian civil war all reflect a broad public reassessment of America's future security role in the world. But the White House's pursuit of the Trans-Pacific Partnership and the Transatlantic Trade and Investment Partnership, two unprecedented trade deals, equally reflect Americans' newfound acceptance of the importance -- or at least inevitability -- of U.S. economic integration with the rest of the world. Still, 2014 could well prove to be a year when the United States is less globally engaged geopolitically, even while it is more engaged economically. Americans say that

the country does too much to solve world problems, and increasingly they want their leaders to pay more attention
to problems at home. About half the public see the United States as overextended abroad, according to a recent
Pew Research Center survey. When asked to describe why they feel this way, nearly half cite problems at home,
including the economy, which they say should get more attention instead. More from GPS: Americans see declining
U.S. prestige And such skepticism about international engagement has increased. Currently, about half the public
says the United States should mind its own business internationally and let other countries get along the best they
can on their own. Such public international ennui has waxed and waned at various times in recent history, most
notably after the Vietnam War. But this is the most lopsided split in favor of the U.S. minding its own business in the nearly 50 years this question has been asked. Yet the American public expresses no such reluctance about U.S. involvement in the global economy. About three-in-four Americans say that growing trade and business ties between the United States and other countries are good for the nation. And such support for increased trade and business connections has increased 24 percentage points since 2008. Moreover, at a time of deep partisan divides on many issues, Americans are united in endorsing global economic engagement. Solid majorities of Republicans, Democrats and independents describe increased international trade and business ties as good for the U.S. By more than two-to-one, Americans also see more benefits than risks from greater involvement in the global economy. This includes large majorities across education and income categories, as well as most Republicans,

Democrats and independents. To be sure, the public has worries about globalization. A 2010 Pew Research Center
survey found that a majority of the public said free trade agreements lead to job losses. And a plurality said that
free trade deals lower wages. Nevertheless, majorities wanted to increase trade with both Europe and Japan the
principal participants in the current transatlantic and transpacific trade negotiations. And Americans are of two
minds about foreign investment. Amid forecasts of massive new Chinese investment in the United States over the
next decade, a majority says that more foreign companies setting up operations in the United States would mostly
help the economy. But nearly three-quarters think that the economy would be hurt if more U.S. companies move their operations abroad. So what does all this suggest? The prophecies of America's retreat from the world are premature. Americans may want a less forward-leaning geopolitical posture in the world, but they still support greater U.S. global economic engagement with it.

1NC Protectionism=/=War
No impact to protectionism
Kanellos, 11 staff writer (Michael, Why Protectionism Works, Greentech,
1/12/2011, http://www.greentechmedia.com/articles/read/why-protectionism-works) //RGP
Politicians and some trade groups have begun to lobby the U.S. to pursue trade sanctions against China. At the same time, they warn that U.S. retaliatory trade barriers will cause the price of solar panels and other technologies to rise, create economic inefficiencies and slow innovation. If you don't believe in free trade and the market system, then you can look to North

Korea for a roadmap for economic development, one person told me. On the other hand, executives like Bill
Watkins, CEO of LED manufacturer Bridgelux, says Buy American provisions remain one of the best and simplest ways to grow the market. Think of it for a second. What if the federal government created a fund that would pay communities to upgrade their streetlights to LEDs with a caveat that 60 percent of the equipment came from U.S.-based facilities? Energy consumption would be reduced, municipalities would see their power bills decline, and the fund could be paid in part through the tax revenues coming from those booming lighting and construction firms. Everybody, potentially, wins. This debate is further fueled by the fact that nearly everyone's opinion gets colored by a subjective worldview.

Is the government an encroaching evil or the people that keep rat droppings out of burger meat? See the comments: the issue arouses emotions. Personally, I come down on the side of experimental protectionism. We should try Buy American and Buy Local standards, and in a few years' time, pick which ones work best, if any. We have some with the ARRA and the DoD has imposed some already. Oil and coal get subsidies. What's the worse that could happen? An uptick to 9.8 percent unemployment? Here are the traditional bullet points why: 1. Everyone Else Does It. Does China play fair under WTO rules? Does Europe? Does anyone reading this enjoy a thriving export business in Japan? Look around the globe and you will find policies directed at building economies through industrial subsidies and limiting contracts to local suppliers that arguably run afoul of trade agreements. "In Michigan, we will give you the

whole factory, land included," Michael Eckhart of ACORE said in October. "In 2004, 2005 and 2006, we asked
[China] to become more efficient. The darn thing is, is that they did it and we got left in the dust. When I am
wearing my U.S. hat, I say this is a threat. But when I wear my renewables hat, I have to say this is the best thing
that ever happened." We already give out stimuli. Buy American provisions can further level the playing field. 2. We
Need Middle-Class Job Growth. A few years back, a well-known investor suggested to me that displaced IT workers
could find new careers in elder care. It reminded me of a summer job a friend once had. It involved applying
Tucks Medicated Pads (for cooling relief!) to a woman in a terminal care facility. Is this really the kind of opportunity
you want to bequeath to the upcoming generation? 3. IP Jobs Can't Employ Everyone. The oft-heard prescription for economic recovery is that Americans should aspire to high value jobs with a heavy emphasis on intellectual property. Unfortunately, not everyone is smart enough, or has the time, to get a PhD in biochemistry. IP firms don't employ massive numbers, either. The countries held up as paragons of the IP model -- Finland, Israel, Singapore, Taiwan -- have somewhat small populations. 4. Buyers Have Freedom. Most of these laws do not put restrictions on individuals or private organizations. They only require federal, state or local agencies buy a certain percentage of their products and services from U.S. suppliers. Why can't

the Department of Defense choose its vendors? Free trade advocates mistakenly imply that price should be the only
determining factor when picking a vendor. Instead, buyers can use any criteria -- good of the community, real
estate tax benefits, longstanding relationships -- they like this side of outright bribery. If the DoD thinks a sound
economy begets national security, so be it. 5. It's Not That Mystifying. Late last year, Suntech Power Holdings, the large Chinese solar maker, opened a module assembly facility in Arizona. Why? To qualify under Buy in America statutes. The company also selects U.S. polysilicon, one of our strongest exports, to do the same. Meanwhile, A-Power Technologies, a Chinese wind turbine maker, is building a factory in Nevada for the same reasons. Dialight hailed a decision this week that limits lighting projects sponsored by ARRA money to U.S. manufacturers. Dialight comes from the U.K. but has a U.S. subsidiary. People can figure out how to make it work. 6. Wall Street Will Comply. Investors and venture capitalists, when faced with more government restrictions, will walk away from energy, some claim.

2NC Protectionism=/=War
2008 proves trade war won't happen
Zappone 12 staff writer [Chris, 'Murky protectionism' on the rise - but
no trade war Sydney Morning Herald 1/10/12
<http://www.smh.com.au/business/world-business/murky-protectionism-on-the-rise--but-no-trade-war-20120110-1pt3t.html>]
At the outset of the global financial crisis, the world's leaders pledged to resist calls to shield their local economies in order to prevent a trade war that could further damage global growth. Four years on, with China slowing, Europe heading into recession and a political environment soured by successive financial crises, the question arises: how long will policymakers be able to resist those calls for more protectionism? Free trade is going to be under pressure, said Lowy Institute international economy program director Mark Thirlwell. Since 2007-08 the

case for moving to greater trade liberalisation has got tougher and the demands for protection have increased.
Only last week, China, which is grappling with a slowdown, raised the prospect of a trade war with the European
Union in response to the EU's implementation of a carbon emissions tax on air travel to and from Europe. Earlier
last month China imposed tariffs up to 21 per cent on US-made cars, affecting about $US4 billion imports a year.
Advertisement Across the Pacific, US politicians in the throes of an election year with 8.5 per cent unemployment
have issued more strident calls for China to play by the rules and allow the yuan to appreciate faster against the
US dollar. The US has also asked the World Trade Organisation to probe China's support for its solar panel industry
and the restrictions Beijing has placed on US poultry imports. In fact ,

the most recent WTO data shows


that the number of trade restrictive measures enacted by members rose 53 per
cent to 339 occurrences over the year to October. Yet the WTO admits that the motives behind

the spate of actions arent always simply to protect local jobs. Not all measures categorised as trade restrictive
may have been adopted with such an intention, the body said. In Brazil, for example, the steep rise in the value of
its currency, the real, has sparked a torrent of car imports into the country - similar to the online-overseas shopping
boom in Australia. Brazil has in turn put a one-year provisional 30 per cent increase on auto imports, to
counterbalance the effects of their strong currency. In the US, China and Australia, infrastructure spending
measures contain buy local requirements to stoke domestic growth, not necessary punish foreign businesses. The
federal government in September streamlined its anti-dumping system that eases the way for companies to ask for
investigations into imported goods that come in below market value to Australia. Again, well within the rules .

What weve seen is a gradual ratcheting up of trade intervention, said Mr


Thirlwell, amounting to what he calls murky protectionism or government
intervention through support for industries or complaints to global trade authorities.
To date, observers such as Mr Thirlwell say most countries have remained remarkably resistant
to throwing up significant trade barriers. For example, in November, the US, Australia and
seven other Asian-Pacific nations including Japan, outlined the plan for an ambitious
multilateral Trans-Pacific Partnership trade block worth 40 per cent of the worlds
trade, in an effort to increase the flow of cross-border goods and investment. Japan, China and South Korea are
also in the later stages of negotiation over a free trade deal between those three nations . Australian
National University international trade lecturer John Tang doesnt believe the world
is on the edge of a new round of protectionism. I don't see a general sea change
towards protectionism for major trading blocks but that may be because so much of
the industrialised world is relying on developing countries to sustain their exports, he said.
Nevertheless, a shift in the political reality of the US, China or elsewhere could change that, he said. Washington
DC-based Brookings Institution fellow Joshua Meltzer said that if the euro zone broke up, elevating the crisis to a new stage, nations may switch to much more protective measures. I wouldn't go so far to say the global economy is so integrated that we could never have anything that would approach a trade war, said Washington DC-based Brookings Institution fellow Joshua Meltzer. But I don't think we're on that track.

1NC Econ =/= War


International norms maintain economic stability
***Zero empirical data supports their theory -- the only financial crisis of the new liberal order experienced zero uptick in violence or challenges to the central factions governed by the US that check inter-state violence -- they have no theoretical foundation for proving causality
Barnett, 9 senior managing director of Enterra Solutions LLC (Thomas, The
New Rules: Security Remains Stable Amid Financial Crisis, 25 August 2009,
http://www.aprodex.com/the-new-rules--security-remains-stable-amidfinancial-crisis-398-bl.aspx)
When the global financial crisis struck roughly a year ago, the blogosphere was ablaze
with all sorts of scary predictions of, and commentary regarding, ensuing conflict and wars -- a
rerun of the Great Depression leading to world war, as it were. Now, as global economic news brightens and
recovery -- surprisingly led by China and emerging markets -- is the talk of the day, it's interesting to look back over

globalization's first truly worldwide recession has had


virtually no impact whatsoever on the international security landscape. None of the more
than three-dozen ongoing conflicts listed by GlobalSecurity.org can be clearly attributed
to the global recession. Indeed, the last new entry (civil conflict between Hamas and Fatah
in the Palestine) predates the economic crisis by a year, and three quarters of the chronic struggles began
in the last century. Ditto for the 15 low-intensity conflicts listed by Wikipedia (where the latest
the past year and realize how

entry is the Mexican "drug war" begun in 2006). Certainly, the Russia-Georgia conflict last August was specifically
timed, but by most accounts the opening ceremony of the Beijing Olympics was the most important external trigger
(followed by the U.S. presidential campaign) for that sudden spike in an almost two-decade long struggle between

we see a most
familiar picture: the usual mix of civil conflicts, insurgencies, and liberationthemed terrorist movements. Besides the recent Russia-Georgia dust-up, the only two
potential state-on-state wars (North v. South Korea, Israel v. Iran) are both tied to one side acquiring
a nuclear weapon capacity -- a process wholly unrelated to global economic trends. And with the
Georgia and its two breakaway regions. Looking over the various databases, then,

United States effectively tied down by its two ongoing major interventions (Iraq and Afghanistan-bleeding-into-

our involvement elsewhere around the planet has been quite modest, both
leading up to and following the onset of the economic crisis: e.g., the usual counter-drug efforts in Latin
Pakistan),

America, the usual military exercises with allies across Asia, mixing it up with pirates off Somalia's coast).
Everywhere else we find serious instability we pretty much let it burn, occasionally pressing the Chinese -unsuccessfully -- to do something. Our new Africa Command, for example, hasn't led us to anything beyond
advising and training local forces. So, to sum up: No significant uptick in mass violence or unrest
(remember the smattering of urban riots last year in places like Greece, Moldova and Latvia?); The usual
frequency maintained in civil conflicts (in all the usual places); Not a single state-on-state war directly caused (and
no great-power-on-great-power crises even triggered); No

great improvement or disruption in great-

power cooperation regarding the emergence of new nuclear powers (despite all that diplomacy); A
modest scaling back of international policing efforts by the system's acknowledged Leviathan power (inevitable
given the strain); and No

serious efforts by any rising great power to challenge that


Leviathan or supplant its role. (The worst things we can cite are Moscow's occasional deployments of strategic
assets to the Western hemisphere and its weak efforts to outbid the United States on basing rights in Kyrgyzstan;
but the best include China and India stepping up their aid and investments in Afghanistan and Iraq.) Sure, we've
finally seen global defense spending surpass the previous world record set in the late 1980s, but even that's likely
to wane given the stress on public budgets created by all this unprecedented "stimulus" spending. If anything, the

friendly cooperation on such stimulus packaging was the most notable greatpower dynamic caused by the crisis. Can we say that the world has suffered a distinct shift to
political radicalism as a result of the economic crisis? Indeed, no. The world's major economies remain
governed by center-left or center-right political factions that remain decidedly friendly to

both markets and trade. In the short run, there were attempts across the board to insulate economies from
immediate damage (in effect, as much protectionism as allowed under current trade rules), but there was no great
slide into "trade wars." Instead, the World Trade Organization is functioning as it was designed to function, and
regional efforts toward free-trade agreements have not slowed. Can we say Islamic radicalism was inflamed by the
economic crisis? If it was, that shift was clearly overwhelmed by the Islamic world's growing disenchantment with
the brutality displayed by violent extremist groups such as al-Qaida. And looking forward, austere economic times
are just as likely to breed connecting evangelicalism as disconnecting fundamentalism. At the end of the day, the
economic crisis did not prove to be sufficiently frightening to provoke major economies into establishing global
regulatory schemes, even as it has sparked a spirited -- and much needed, as I argued last week -- discussion of the
continuing viability of the U.S. dollar as the world's primary reserve currency. Naturally, plenty of experts and
pundits have attached great significance to this debate, seeing in it the beginning of "economic warfare" and the
like between "fading" America and "rising" China. And yet, in a world of globally integrated production chains and
interconnected financial markets, such "diverging interests" hardly constitute signposts for wars up ahead. Frankly, I
don't welcome a world in which America's fiscal profligacy goes undisciplined, so bring it on -- please! Add it all up and it's fair to say that this global financial crisis has proven the great resilience of America's post-World War II international liberal trade order.

2NC Econ =/= War


Aggregate data proves interstate violence doesn't result from economic decline
Drezner, 12 --- The Fletcher School of Law and Diplomacy at Tufts University
(October 2012, Daniel W., The Irony of Global Economic Governance: The
System Worked,
www.globaleconomicgovernance.org/wp-content/uploads/IR-ColloquiumMT12-Week-5_The-Irony-of-Global-Economic-Governance.pdf)
The final outcome addresses a dog that hasn't barked: the effect of the Great Recession on cross-border conflict and violence. During the initial stages of the crisis, multiple analysts asserted that the financial crisis would lead states to increase their use of force as a tool for staying in power.37 Whether through greater internal repression, diversionary wars, arms races, or a ratcheting up of great power conflict, there were genuine concerns that the global economic downturn would lead to an increase in conflict. Violence in the Middle East, border disputes in the South China Sea, and even the disruptions of the Occupy movement fuel impressions of a surge in global public disorder.

The aggregate data suggests otherwise, however. The Institute for Economics and
Peace has constructed a Global Peace Index annually since 2007. A key conclusion
they draw from the 2012 report is that The average level of peacefulness in 2012 is
approximately the same as it was in 2007.38 Interstate violence in particular has
declined since the start of the financial crisis as have military expenditures in most sampled
countries. Other studies confirm that the Great Recession has not triggered any
increase in violent conflict; the secular decline in violence that started with the end of the Cold War has
not been reversed.39 Rogers Brubaker concludes, the crisis has not to date generated the
surge in protectionist nationalism or ethnic exclusion that might have been expected.40
None of these data suggest that the global economy is operating swimmingly. Growth remains unbalanced and
fragile, and has clearly slowed in 2012. Transnational capital flows remain depressed compared to pre-crisis levels,
primarily due to a drying up of cross-border interbank lending in Europe. Currency volatility remains an ongoing
concern. Compared to the aftermath of other postwar recessions, growth in output, investment, and employment in
the developed world have all lagged behind. But the Great Recession is not like other postwar recessions in either scope or kind; expecting a standard V-shaped recovery was unreasonable. One financial analyst characterized the post-2008 global economy as in a state of contained depression.41 The key word is contained, however. Given the severity, reach and depth of the 2008 financial crisis, the proper comparison is with the Great Depression. And by that standard, the outcome variables look impressive. As Carmen Reinhart and Kenneth Rogoff concluded in This Time is Different: that its macroeconomic outcome has been only the most severe global recession since World War II and not even worse must be regarded as fortunate.42

Most rigorous historical analysis proves


Miller, 2K economist, adjunct professor in the University of Ottawas
Faculty of Administration, consultant on international development issues,
former Executive Director and Senior Economist at the World Bank, (Morris,
Poverty as a cause of wars?, Winter, Interdisciplinary Science Reviews, Vol.
25, Iss. 4, p. Proquest)
Perhaps one should ask, as some scholars do, whether it is not poverty as such but some
dramatic event or sequence of such events leading to the exacerbation of poverty that is
the factor that contributes in a significant way to the denouement of war. This calls for
addressing the question: do wars spring from a popular reaction to an economic

crisis that exacerbates poverty and/or from a heightened awareness of the poor
of the wide and growing disparities in wealth and incomes that diminishes their

tolerance to poverty? It seems reasonable to believe that a powerful "shock" factor


might act as a catalyst for a violent reaction on the part of the people or on the part of
the political leadership. The leadership, finding that this sudden adverse economic

and social impact destabilizing, would possibly be tempted to seek a diversion by


finding or, if need be, fabricating an enemy and setting in train the process
leading to war. There would not appear to be any merit in this hypothesis
according to a study undertaken by Minxin Pei and Ariel Adesnik of the Carnegie
Endowment for International Peace. After studying 93 episodes of economic crisis
in 22 countries in Latin America and Asia in the years since World War II they
concluded that Much of the conventional wisdom about the political impact of
economic crises may be wrong... The severity of economic crisis -- as measured in terms of inflation and negative growth -- bore no relationship to the collapse of regimes... (or, in democratic states, rarely) to an outbreak of violence... In the
cases of dictatorships and semi-democracies, the ruling elites responded to crises
by increasing repression (thereby using one form of violence to abort another.)

Cybersecurity Adv

Notes
30 second explainer: zero-day vulnerabilities (basically flaws in software that are unknown to the vendor, so there have been zero days to patch them) put nuke power plants at risk, cyber-terror causes nuke meltdowns, extinction, retaliation, nuke war, yadayadayada

CX Questions

1NC Cyber Inev


Cybersecurity vulnerabilities are inevitable
Corn 7/13
(Corn, Geoffrey S. * Presidential Research Professor of Law, South Texas College of Law; Lieutenant
Colonel (Retired), U.S. Army Judge Advocate Generals Corps. Prior to joining the faculty at South
Texas, Professor Corn served in a variety of military assignments, including as the Armys Senior Law
of War Advisor, Supervisory Defense Counsel for the Western United States, Chief of International Law
for U.S. Army Europe, and as a Tactical Intelligence Officer in Panama. Averting the Inherent Dangers
of 'Going Dark': Why Congress Must Require a Locked Front Door to Encrypted Data, SSRN. 07-13-2015. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2630361&download=yes//ghs-kw)
Like CALEA, a statutory obligation along the lines proposed herein will inevitably trigger criticisms and generate concerns.

One obvious criticism is that the creation of an escrow key or the maintenance
of a duplicate key by a manufacturer would introduce an unacceptable risk of
compromise for the device. This argument presupposes that the risk is significant, that
the costs of its exploitation are large, and that the benefit is not worth the risk. Yet
manufacturers, product developers, service providers and users
constantly introduce such risks. Nearly every feature or bit of code added
to a device introduces a risk, some greater than others. The vulnerabilities
that have been introduced to computers by software such as Flash, ActiveX
controls, Java, and web browsers are well documented .51 The ubiquitous SQL
database, while extremely effective at helping web designers create effective data
driven websites, is notorious for its vulnerability to SQL injection attacks. 52 The
adding of microphones to electronic devices opened the door to aural interceptions.
Similarly, the introduction of cameras has resulted in unauthorized video surveillance
of users. Consumers accept all of these risks, however, since we, as individual users
and as a society, have concluded that they are worth the cost. Some will inevitably
argue that no new possible vulnerabilities should be introduced into devices to allow
the government to execute reasonable, and therefore lawful, searches for unique and
otherwise unavailable evidence. However, this argument implicitly asserts that
there is no, or insignificant, value to society of such a feature. And herein lies the
Achilles heel to opponents of mandated front-door access: the conclusion is entirely at odds with
the inherent balance between individual liberty and collective security central to the
Fourth Amendment itself. Nor should lawmakers be deluded into believing that the
currently existing vulnerabilities that we live with on a daily basis are less significant
in scope than the possibility of obtaining complete access to the encrypted contents
of a device. Various malware variants that are so widespread as to be almost
omnipresent in our online community achieve just such access through what would
seem like minor cracks in the defense of systems. 53 One example is the Zeus
malware strain, which has been tied to the unlawful online theft of hundreds of
millions of dollars from U.S. companies and citizens and gives its operator complete
access to and control over any computer it infects .54 It can be installed on a machine through
the simple mistake of viewing an infected website or email, or clicking on an otherwise innocuous link.55 The
malware is designed to not only bypass malware detection software, but to
deactivate the software's ability to detect it.56 Zeus and the many other variants of malware that are freely available to purchasers on dark-net websites and forums are responsible for the theft of funds from
countless online bank accounts (the credentials having been stolen by the malwares key-logger features), the theft
of credit card information, and innumerable personal identifiers.57
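Analyst note -- the Corn evidence above lists SQL injection as one of the vulnerabilities we already live with. If you need to explain the concept in CX, the sketch below is a minimal, hypothetical illustration in Python (the table, values, and input string are invented for illustration and are not from the card): the unsafe query pastes attacker-controlled text directly into the SQL, while the parameterized version treats it as plain data.

# Hypothetical sketch only -- illustrates the class of flaw the Corn card references.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "alice' OR '1'='1"  # attacker-controlled string

# Vulnerable pattern: the input is spliced into the SQL text, so the
# OR '1'='1' clause makes the query match every row in the table.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safer pattern: a parameterized query treats the input as data, not SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # returns rows it should not
print(safe)        # returns nothing

The point for the debate is simply that this class of flaw is ubiquitous and tolerated, which is the card's argument about risks consumers already accept.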

2NC Cyber Inev


Security issues are inevitable
Wittes 15
(Benjamin Wittes. Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance
Studies at the Brookings Institution. He is the author of several books and a member of the Hoover
Institution's Task Force on National Security and Law. "Thoughts on Encryption and Going Dark, Part II:
The Debate on the Merits," Lawfare. 7-22-2015. http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-ii-debate-merits//ghs-kw)
On Thursday, I described the surprisingly warm reception FBI Director James Comey got in the Senate this week
with his warning that the FBI was "going dark" because of end-to-end encryption . In this post,
I want to take on the merits of the renewed encryption debate, which seem to me complicated and multi-faceted

two distinct
sets of questions: One is the conceptual question of whether a world of end-to-end
strong encryption is an attractive idea. The other is whether assuming it is not an attractive
idea and that one wants to ensure that authorities retain the ability to intercept decrypted signal an
extraordinary access scheme is technically possible without eroding other essential
security and privacy objectives. These questions often get mashed together, both because tech
and not all pushing in the same direction. Let me start by breaking the encryption debate into

companies are keen to market themselves as the defenders of their users' privacy interests and because of the

the questions are not the same, and it's


worth considering them separately. Consider the conceptual question first. Would it
be a good idea to have a world-wide communications infrastructure that is , as Bruce
Schneier has aptly put it, secure from all attackers? That is, if we could snap our fingers and make all
libertarian ethos of the tech community more generally. But

device-to-device communications perfectly secure against interception from the Chinese, from hackers, from the
FSB but also from the FBI even wielding lawful process, would that be desirable? Or, in the alternative, do we want
to create an internet as secure as possible from everyone except government investigators exercising their legal
authorities with the understanding that other countries may do the same? Conceptually speaking, I am with Comey

the matter does not seem to me an especially close call. The


belief in principle in creating a giant world-wide network on which
surveillance is technically impossible is really an argument for the
creation of the world's largest ungoverned space. I understand why
techno-anarchists find this idea so appealing. I can't imagine for moment ,
however, why anyone else would. Consider the comparable argument in physical
space: the creation of a city in which authorities are entirely dependent on citizen
reporting of bad conduct but have no direct visibility onto what happens on the
streets and no ability to conduct search warrants (even with court orders) or to
patrol parks or street corners. Would you want to live in that city? The idea that
ungoverned spaces really suck is not controversial when you're talking
about Yemen or Somalia. I see nothing more attractive about the creation of a
worldwide architecture in which it is technically impossible to intercept and read ISIS
communications with followers or to follow child predators into chatrooms where
they go after kids. The trouble is that this conceptual position does not answer the entirety of the policy
on this questionand

question before us. The reason is that the case against preserving some form of law enforcement access to

It is also a
series of arguments about the costsincluding the security costsof maintaining
the capacity to decrypt captured signal. Consider the report issued this past week by a group of
decrypted signal is not only a conceptual embrace of the technological obsolescence of surveillance.

computer security experts (including Lawfare contributing editors Bruce Schneier and Susan Landau), entitled "Keys
Under Doormats: Mandating Insecurity By Requiring Government Access to All Data and Communications." The
report does not make an in-principle argument or a conceptual argument against extraordinary access. It argues,
rather, that the effort to build such a system risks eroding cybersecurity in ways far more important than the
problems it would solve. The authors, to summarize, make three claims in support of the broad claim that any

What are those


"grave security risks"? "[P]roviding exceptional access to communications would
force a U-turn from the best practices now being deployed to make the Internet
more secure. These practices include forward secrecy where decryption keys are deleted
exceptional access system would "pose . . . grave security risks [and] imperil innovation."

immediately after use, so that stealing the encryption key used by a communications server would not compromise
earlier or later communications. A related technique, authenticated encryption, uses the same temporary key to

"[B]uilding in
exceptional access would substantially increase system complexity" and
"complexity is the enemy of security." Adding code to systems increases that system's attack surface,
guarantee confidentiality and to verify that the message has not been forged or tampered with."

and a certain number of additional vulnerabilities come with every marginal increase in system complexity. So by
requiring a potentially complicated new system to be developed and implemented, we'd be effectively

"[E]xceptional access would create


concentrated targets that could attract bad actors." If we require tech companies to retain some
guaranteeing more vulnerabilities for malicious actors to hit.

means of accessing user communications, those keys have to stored somewhere, and that storage then becomes
an unusually high-stakes target for malicious attack. Their theft then compromises, as did the OPM hack, large
numbers of users. The strong implication of the report is that these issues are not resolvable,
though the report never quite says that. But at a minimum, the authors raise a series of important questions about
whether such a system would, in practice, create an insecure internet in generalrather than one whose general
security has the technical capacity to make security exceptions to comply with the law. There is some reason, in my

the picture may not be quite as stark as the computer scientists


make it seem. After all, the big tech companies increase the complexity of their
software products all the time, and they generally regard the increased attack
surface of the software they create as a result as a mitigatable problem. Similarly,
there are lots of high-value intelligence targets that we have to secure and would
have big security implications if we could not do so successfully. And when it really counts,
view, to suspect that

that task is not hopeless. Google and Apple and Facebook are not without tools in the cybersecurity department.

The real question, in my view, is whether a system of the sort Comey imagines could be built in
fashion in which the security gain it would provide would exceed the heightened
security risks the extraordinary access would involve. As Herb Lin puts it in his excellent, and
admirably brief, Senate testimony the other day, this is ultimately a question without an answer in the absence of a
lot of new research. "One side says [the] access [Comey is seeking] inevitably weakens the security of a system and
will eventually be compromised by a bad guy; the other side says it doesnt weaken security and wont be
compromised. Neither side can prove its case, and we see a theological clash of absolutes." Only when someone
actually does the research and development and tries actually to produce a system that meets Comey's criteria are
we going to find out whether it's doable or not. And therein lies the rub, and the real meat of the policy problem, in
my view: Who's going to do this research? Who's going to conduct the sustained investment in trying to imagine a
system that secures communications except from government when and only government has a warrant to
intercept those communications? The assumption of the computer scientists in their report is that the burden of
that research lies with the government. "Absent a concrete technical proposal," they write, "and without answers to
the questions raised in this report, legislators should reject out of hand any proposal to return to the failed
cryptography control policy of the 1990s." Indeed, their most central recommendation is that the burden of
development is on Comey. "Our strong recommendation is that anyone proposing regulations should first present
concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses
and for hidden costs." In his testimony, Herb supports this call, though he acknowledges that it is not the inevitable
route: "the government has not yet provided any specifics, arguing that private vendors should do it. At the same time, the vendors won't do it, because [their] customers aren't demanding such features. Indeed, many customers would see such features as a reason to avoid a given vendor. Without specifics, there will be no progress. I believe the government is afraid that any specific proposal will be subject to enormous criticism -- and that's true -- but the
government is the party that wants . . . access, and rather than running away from such criticism, it should
embrace any resulting criticism as an opportunity to improve upon its initial designs." Herb might also have
mentioned that lots of people in the academic tech community who would be natural candidates to help develop
such an access system are much more interested in developing encryption systems to keep the feds out than to
-- under any circumstances -- let them in. The tech community has spent a lot more time and energy arguing against the plausibility and desirability of implementing what Comey is seeking than it has spent in trying to develop
systems that deliver it while mitigating the risks such a system might pose. For both industry and the tech
communities, more broadly, this is government's problem, not their problem. Yet reviving the Clipper Chip model -- in which government develops a fully-formed system and then puts it out publicly for the community to shoot down --
is clearly not what Comey has in mind. He is talking in very different language: the language of performance requirements. He wants to leave the development task to Silicon Valley to figure out how to implement government's requirements. He wants to describe what he needs -- decrypted signal when he has a warrant -- and leave the companies to figure out how to deliver it while still providing secure communications in other circumstances to their customers. The advantage to this approach is that it potentially lets a thousand flowers bloom. Each company might do it differently. They would compete to provide the most security consistent with the performance standard. They could learn from each other. And government would not be in the position of developing and promoting specific algorithms. It wouldn't even need to know how the task was being done.

1NC No Cyber
Their impacts are all hype -- no cyberattack
Walt 10 Stephen M. Walt 10 is the Robert and Rene Belfer Professor of
international relations at Harvard University "Is the cyber threat overblown?"
March 30
walt.foreignpolicy.com/posts/2010/03/30/is_the_cyber_threat_overblown
Am I the only person -- well, besides Glenn Greenwald and Kevin Poulson -- who thinks the "cyber-warfare" business may be overblown? It's clear the U.S. national security establishment is paying a lot more attention to the issue, and colleagues of mine -- including some pretty serious and level-headed people -- are increasingly worried by the danger of some sort of "cyber-Katrina." I don't dismiss it entirely, but this sure looks to me like a classic opportunity for threat-inflation. Mind you, I'm not saying that there aren't a lot of shenanigans going on in cyber-space, or that various forms of cyber-warfare don't have military potential. So I'm not arguing for complete head-in-the-sand complacency. But here's what makes me worry that the threat is being overstated. First, the whole issue is highly esoteric -- you really need to know a great deal about computer networks, software, encryption, etc., to know how serious the danger might be. Unfortunately, details about a number of the alleged incidents that are being invoked to demonstrate the risk of a "cyber-Katrina," or a cyber-9/11, remain classified, which makes it hard for us lay-persons to gauge just how serious the problem really was or is. Moreover, even when we hear about computers being penetrated by hackers, or parts of the internet crashing, etc., it's hard to know how much valuable information was stolen or how much actual damage was done. And as with other specialized areas of technology and/or military affairs, a lot of the experts have a clear vested interest in hyping the threat, so as to create greater demand for their services. Plus, we already seem to have politicians leaping on the issue as a way to grab some pork for their states. Second, there are lots of different problems being lumped under a single banner, whether the

label is "cyber-terror" or "cyber-war." One issue is the use of various computer tools to degrade an enemys military capabilities (e.g., by disrupting
communications nets, spoofing sensors, etc.). A second issue is the alleged threat that bad guys would penetrate computer networks and shut down
power grids, air traffic control, traffic lights, and other important elements of infrastructure, the way that internet terrorists (led by a disgruntled
computer expert) did in the movie Live Free and Die Hard. A third problem is web-based criminal activity, including identity theft or simple fraud (e.g.,
those emails we all get from someone in Nigeria announcing that they have millions to give us once we send them some account information). A
fourth potential threat is cyber-espionage; i.e., clever foreign hackers penetrate Pentagon or defense contractors computers and download
valuable classified information. And then there are annoying activities like viruses, denial-of-service attacks, and other things that affect the stability

This sounds like a rich menu of


potential trouble, and putting the phrase "cyber" in front of almost any noun makes it
sound trendy and a bit more frightening. But notice too that these are all somewhat different problems of quite different
importance, and the appropriate response to each is likely to be different too. Some issues -- such as the danger of
cyber-espionage -- may not require elaborate technical fixes but simply more rigorous
security procedures to isolate classified material from the web. Other problems may not require big federal
programs to address, in part because both individuals and the private sector
of web-based activities and disrupt commerce (and my ability to send posts into FP).

have incentives to protect themselves (e.g., via firewalls or by backing up critical data). And as Greenwald
warns, there may be real costs to civil liberties if concerns about vague cyber dangers lead us to grant the NSA or some other government agency

greater control over the Internet. Third, this is another issue that cries out for some comparative cost-benefit analysis. Is the danger that some malign hacker crashes a power grid greater than the likelihood that a blizzard would do the same thing? Is the risk of cyber-espionage greater than the potential danger from more traditional forms of spying? Without a comparative assessment of different risks and the costs of mitigating each

one, we will allocate resources on the basis of hype rather than analysis. In short, my fear is not that we won't take reasonable precautions against a
potential set of dangers; my concern is that we will spend tens of billions of dollars protecting ourselves against a set of threats that are not as
dangerous as we are currently being told they are.

2NC No Cyber
No cyber impact
Healey 3/20 Jason, Director of the Cyber Statecraft Initiative at the Atlantic
Council, "No, Cyberwarfare Isn't as Dangerous as Nuclear War", 2013,
www.usnews.com/opinion/blogs/world-report/2013/03/20/cyber-attacks-notyet-an-existential-threat-to-the-us
America does not face an existential cyberthreat today, despite recent
warnings. Our cybervulnerabilities are undoubtedly grave and the threats we face are severe

but far from comparable to nuclear war. The most recent alarms come in a Defense Science
Board report on how to make military cybersystems more resilient against advanced threats (in short, Russia or
China). It warned that the "cyber threat is serious, with potential consequences similar in some ways to the nuclear
threat of the Cold War." Such fears were also expressed by Adm. Mike Mullen, then chairman of the Joint Chiefs of
Staff, in 2011. He called cyber "The single biggest existential threat that's out there" because "cyber actually more
than theoretically, can attack our infrastructure, our financial systems."

While it is true that cyber

attacks might do these things, it is also true they have not only never
happened but are far more difficult to accomplish than mainstream
thinking believes. The consequences from cyber threats may be similar in some

ways to nuclear, as the Science Board concluded, but mostly, they are incredibly
dissimilar. Eighty years ago, the generals of the U.S. Army Air Corps were sure that their bombers would
easily topple other countries and cause their populations to panic, claims which did not stand up to reality. A
study of the 25-year history of cyber conflict, by the Atlantic Council and Cyber Conflict
Studies Association, has shown a similar dynamic where the impact of disruptive
cyberattacks has been consistently overestimated. Rather than theorizing about

future cyberwars or extrapolating from today's concerns, the history of cyberconflict that have actually been fought,
shows that cyber incidents have so far tended to have effects that are either widespread but fleeting or persistent but narrowly focused. No attacks, so far, have been both widespread and persistent. There have been no authenticated cases of anyone dying from a cyberattack. Any widespread disruptions, even the 2007 disruption against Estonia, have been short-lived causing no significant GDP loss. Moreover, as with conflict in other domains, cyberattacks can

take down many targets but keeping them down over time in the face of determined defenses has so far been out
of the range of all but the most dangerous adversaries such as Russia and China. Of course, if the United States is
in a conflict with those nations, cyber will be the least important of the existential threats policymakers should be
worrying about. Plutonium trumps bytes in a shooting war. This is not all good news.
Policymakers have recognized the problems since at least 1998 with little significant progress. Worse, the threats
and vulnerabilities are getting steadily more worrying.

Still, experts have been warning of a cyber Pearl Harbor for 20 of the 70 years since the actual Pearl Harbor. The transfer of U.S. trade secrets through Chinese cyber espionage could someday accumulate into an existential threat. But it doesn't seem so just yet, with only handwaving estimates of annual losses of 0.1 to 0.5 percent to the total U.S. GDP of around $15 trillion. That's bad, but it doesn't add up to an existential crisis or "economic cyberwar."

No impact to cyberterror
Green 2 editor of The Washington Monthly (Joshua, 11/11, The Myth of
Cyberterrorism,
http://www.washingtonmonthly.com/features/2001/0211.green.html, AG)
There's just one problem:

There is no such thing as cyberterrorism--no instance of

anyone ever having been killed by a terrorist (or anyone else) using a computer.
Nor is there compelling evidence that al Qaeda or any other terrorist
organization has resorted to computers for any sort of serious destructive activity. What's more,
outside of a Tom Clancy novel, computer security specialists believe it is virtually
impossible to use the Internet to inflict death on a large scale, and many scoff at the
notion that terrorists would bother trying. "I don't lie awake at night worrying about cyberattacks ruining my life,"

says Dorothy Denning, a computer science professor at Georgetown University and


one of the country's foremost cybersecurity experts. "Not only does
[cyberterrorism] not rank alongside chemical, biological, or nuclear weapons, but it is not anywhere
near as serious as other potential physical threats like car bombs or suicide bombers." Which
is not to say that cybersecurity isn't a serious problem--it's just not one that involves terrorists. Interviews with
terrorism and computer security experts, and current and former government and military officials, yielded near
unanimous agreement that the real danger is from the criminals and other hackers who did $15 billion in damage to
the global economy last year using viruses, worms, and other readily available tools. That figure is sure to balloon if
more isn't done to protect vulnerable computer systems, the vast majority of which are in the private sector. Yet
when it comes to imposing the tough measures on business necessary to protect against the real cyberthreats, the

Bush administration has balked. Crushing BlackBerrys When ordinary people imagine cyberterrorism, they tend to think along Hollywood plot lines, doomsday scenarios in which terrorists hijack nuclear weapons, airliners, or military computers from halfway around the world.

Given the colorful history of federal boondoggles--billion-dollar weapons systems that misfire, $600 toilet seats--that's an understandable concern. But, with few exceptions, it's not one that applies to preparedness for a
cyberattack. "The government is miles ahead of the private sector when it comes to cybersecurity," says Michael
Cheek, director of intelligence for iDefense, a Virginia-based computer security company with government and
private-sector clients. "Particularly the most sensitive military systems." Serious effort and plain good fortune have
combined to bring this about. Take nuclear weapons. The biggest fallacy about their vulnerability, promoted in
action thrillers like WarGames, is that they're designed for remote operation. "[The movie] is premised on the
assumption that there's a modem bank hanging on the side of the computer that controls the missiles," says Martin
Libicki, a defense analyst at the RAND Corporation. "I assure you, there isn't." Rather, nuclear weapons and other
sensitive military systems enjoy the most basic form of Internet security: they're "air-gapped," meaning that they're
not physically connected to the Internet and are therefore inaccessible to outside hackers. (Nuclear weapons also
contain "permissive action links," mechanisms to prevent weapons from being armed without inputting codes
carried by the president.) A retired military official was somewhat indignant at the mere suggestion: "As a general
principle, we've been looking at this thing for 20 years. What cave have you been living in if you haven't considered

this [threat]?" When it comes to cyberthreats, the Defense Department has been particularly vigilant to protect key systems by isolating them from the Net and

even from the Pentagon's internal network. All new software must be submitted to the National Security Agency for
security testing. "Terrorists

could not gain control of our spacecraft, nuclear

weapons, or any

other type of high-consequence asset," says Air Force Chief Information


Officer John Gilligan. For more than a year, Pentagon CIO John Stenbit has enforced a moratorium on new wireless
networks, which are often easy to hack into, as well as common wireless devices such as PDAs, BlackBerrys, and
even wireless or infrared copiers and faxes. The September 11 hijackings led to an outcry that airliners are
particularly susceptible to cyberterrorism. Earlier this year, for instance, Sen. Charles Schumer (D-N.Y.) described
"the absolute havoc and devastation that would result if cyberterrorists suddenly shut down our air traffic control
system, with thousands of planes in mid-flight." In fact, cybersecurity experts give some of their highest marks to
the FAA, which reasonably separates its administrative and air traffic control systems and strictly air-gaps the latter.

And there's a reason the 9/11 hijackers used box-cutters instead of keyboards: It's impossible to hijack a plane remotely, which eliminates the possibility of a high-tech 9/11 scenario in which planes are used as weapons. Another source of concern is terrorist infiltration of our intelligence agencies. But here, too, the risk is slim. The CIA's classified computers are also air-gapped, as is the FBI's entire computer system. "They've been paranoid about this forever," says Libicki, adding that

paranoia is a sound governing principle when it comes to cybersecurity. Such concerns are manifesting themselves
in broader policy terms as well. One notable characteristic of last year's Quadrennial Defense Review was how
strongly it focused on protecting information systems.

Cyberattacks impossible -- empirics and defenses solve


Rid 12 (Thomas Rid, reader in war studies at King's College London, is

author of "Cyber War Will Not Take Place" and co-author of "CyberWeapons.", March/April 2012, Think Again: Cyberwar,
http://www.foreignpolicy.com/articles/2012/02/27/cyberwar?page=full)
"Cyberwar Is Already Upon Us." No way. "Cyberwar

is coming!" John Arquilla and David Ronfeldt predicted in


a celebrated Rand paper back in 1993. Since then, it seems to have arrived -- at least by the account of
the U.S. military establishment, which is busy competing over who should get what share of the fight. Cyberspace is
"a domain in which the Air Force flies and fights," Air Force Secretary Michael Wynne claimed in 2006. By 2012,
William J. Lynn III, the deputy defense secretary at the time, was writing that

cyberwar is "just as critical

to military operations as land, sea, air, and space ." In January, the Defense Department vowed to
equip the U.S. armed forces for "conducting a combined arms campaign across all domains -- land, air, maritime,
space, and cyberspace." Meanwhile, growing piles of books and articles explore the threats of cyberwarfare,

cyberterrorism, and how to survive them. Time for a reality check: Cyberwar is still more hype than hazard. Consider the definition of an act of war: It has to be potentially violent, it has to be purposeful, and it has to be political. The cyberattacks we've seen so far, from Estonia to the Stuxnet virus, simply don't meet these criteria. Take the dubious story of a Soviet pipeline explosion back in 1982, much cited by cyberwar's true believers as the most destructive cyberattack ever. The account goes like this: In June 1982, a Siberian pipeline that the CIA had virtually booby-trapped with a

so-called "logic bomb" exploded in a monumental fireball that could be seen from space. The U.S. Air Force
estimated the explosion at 3 kilotons, equivalent to a small nuclear device. Targeting a Soviet pipeline linking gas
fields in Siberia to European markets, the operation sabotaged the pipeline's control systems with software from a

Canadian firm that the CIA had doctored with malicious code. No one died, according to Thomas Reed, a U.S. National Security Council aide at the time who revealed the incident in his 2004 book, At the Abyss; the only harm came to the Soviet economy. But did it really happen? After Reed's account came out, Vasily Pchelintsev, a former KGB head of the Tyumen region, where the alleged explosion supposedly took place, denied the story. There are also no media reports from 1982 that confirm such an

explosion, though accidents and pipeline explosions in the Soviet Union were regularly reported in the early 1980s.
Something likely did happen, but Reed's book is the only public mention of the incident and his account relied on a
single document. Even after the CIA declassified a redacted version of Reed's source, a note on the so-called
Farewell Dossier that describes the effort to provide the Soviet Union with defective technology, the agency did not
confirm that such an explosion occurred. The available evidence on the Siberian pipeline blast is so thin that it
shouldn't be counted as a proven case of a successful cyberattack. Most other commonly cited cases of cyberwar
are even less remarkable. Take the attacks on Estonia in April 2007, which came in response to the controversial
relocation of a Soviet war memorial, the Bronze Soldier. The well-wired country found itself at the receiving end of a
massive distributed denial-of-service attack that emanated from up to 85,000 hijacked computers and lasted three
weeks. The attacks reached a peak on May 9, when 58 Estonian websites were attacked at once and the online
services of Estonia's largest bank were taken down. "What's the difference between a blockade of harbors or
airports of sovereign states and the blockade of government institutions and newspaper websites?" asked Estonian
Prime Minister Andrus Ansip. Despite his analogies, the attack was no act of war. It was certainly a nuisance and an
emotional strike on the country, but the bank's actual network was not even penetrated; it went down for 90
minutes one day and two hours the next. The attack was not violent, it wasn't purposefully aimed at changing
Estonia's behavior, and no political entity took credit for it. The same is true for the vast majority of cyberattacks on

record. Indeed, there is no known cyberattack that has caused the loss of human life. No cyberoffense has ever injured a person or damaged a building. And if an act is not at least potentially violent, it's not an act of war. Separating war from physical

violence makes it a metaphorical notion; it would mean that there is no way to distinguish between World War II,
say, and the "wars" on obesity and cancer. Yet those ailments, unlike past examples of cyber "war," actually do kill
people. "A Digital Pearl Harbor Is Only a Matter of Time ." Keep waiting. U.S. Defense
Secretary Leon Panetta delivered a stark warning last summer: "We could face a cyberattack that could be the

alarmist predictions have been ricocheting inside the


Beltway for the past two decades, and some scaremongers have even upped the
ante by raising the alarm about a cyber 9/11. In his 2010 book, Cyber War, former White House
equivalent of Pearl Harbor." Such

counterterrorism czar Richard Clarke invokes the specter of nationwide power blackouts, planes falling out of the
sky, trains derailing, refineries burning, pipelines exploding, poisonous gas clouds wafting, and satellites spinning

out of orbit -- events that would make the 2001 attacks pale in comparison. But the empirical record is less hair-raising, even by the standards of the most drastic example available. Gen. Keith Alexander, head of U.S. Cyber Command (established in 2010 and now boasting a budget of more

than $3 billion), shared his worst fears in an April 2011 speech at the University of Rhode Island: "What I'm
concerned about are destructive attacks," Alexander said, "those that are coming." He then invoked a remarkable
accident at Russia's Sayano-Shushenskaya hydroelectric plant to highlight the kind of damage a cyberattack might
be able to cause. Shortly after midnight on Aug. 17, 2009, a 900-ton turbine was ripped out of its seat by a socalled "water hammer," a sudden surge in water pressure that then caused a transformer explosion. The turbine's
unusually high vibrations had worn down the bolts that kept its cover in place, and an offline sensor failed to detect
the malfunction. Seventy-five people died in the accident, energy prices in Russia rose, and rebuilding the plant is
slated to cost $1.3 billion. Tough luck for the Russians, but here's what the head of Cyber Command didn't say: The
ill-fated turbine had been malfunctioning for some time, and the plant's management was notoriously poor. On top
of that, the key event that ultimately triggered the catastrophe seems to have been a fire at Bratsk power station,
about 500 miles away. Because the energy supply from Bratsk dropped, authorities remotely increased the burden
on the Sayano-Shushenskaya plant. The sudden spike overwhelmed the turbine, which was two months shy of

reaching the end of its 30-year life cycle, sparking the catastrophe. If anything, the Sayano-Shushenskaya incident highlights how difficult a devastating attack would be to mount. The plant's washout was an accident at the end of a complicated and unique chain of events. Anticipating such vulnerabilities in advance is extraordinarily difficult even for insiders; creating comparable coincidences from cyberspace would be a daunting challenge at best for outsiders. If this is the most drastic incident Cyber Command can conjure up, perhaps it's time for everyone to take a deep breath. "Cyberattacks Are Becoming Easier." Just the opposite. U.S. Director of National Intelligence James R. Clapper warned last year that the volume of malicious software on American networks had more than tripled since 2009 and that more than 60,000 pieces of malware are now discovered every day. The United States, he said, is undergoing "a phenomenon known as 'convergence,' which amplifies the opportunity for disruptive cyberattacks, including against physical infrastructures." ("Digital

convergence" is a snazzy term for a simple thing: more and more devices able to talk to each other, and formerly

Just because there's more malware, however,


doesn't mean that attacks are becoming easier. In fact, potentially damaging or
life-threatening cyberattacks should be more difficult to pull off . Why? Sensitive
systems generally have built-in redundancy and safety systems, meaning
an attacker's likely objective will not be to shut down a system , since merely
forcing the shutdown of one control system, say a power plant, could trigger a backup
and cause operators to start looking for the bug. To work as an effective weapon,
malware would have to influence an active process -- but not bring it to a screeching
halt. If the malicious activity extends over a lengthy period, it has to remain
stealthy. That's a more difficult trick than hitting the virtual off-button. Take Stuxnet,
the worm that sabotaged Iran's nuclear program in 2010. It didn't just crudely shut down the
centrifuges at the Natanz nuclear facility; rather, the worm subtly manipulated the
system. Stuxnet stealthily infiltrated the plant's networks, then hopped onto the protected control systems,

intercepted input values from sensors, recorded these data, and then provided the legitimate controller code with
pre-recorded fake input signals, according to researchers who have studied the worm. Its objective was not just to
fool operators in a control room, but also to circumvent digital safety and monitoring systems so it could secretly

manipulate the actual processes. Building and deploying Stuxnet required extremely detailed intelligence about the systems it was supposed to compromise, and the same will be true for other dangerous cyberweapons. Yes, "convergence," standardization, and sloppy defense of control-systems software could increase the risk of generic attacks, but the same trend has also caused defenses against the most coveted targets to improve steadily and has made reprogramming highly specific installations on legacy systems more complex, not less.

1NC No Miscalc
Empirics and technology disprove miscalculation.
Quinlan, 9 (Michael, Former Permanent Under-Sec. State UK Ministry of
Defense, Thinking about Nuclear Weapons: Principles, Problems, Prospects,
p. 63-69)
Even if initial nuclear use did not quickly end the fighting, the supposition of
inexorable momentum in a developing exchange, with each side rushing to
overreaction amid confusion and uncertainty, is implausible . It fails to consider what the

situation of the decisionmakers would really be. Neither side could want escalation. Both would be appalled at what
was going on. Both would be desperately looking for signs that the other was ready to call a halt. Both, given the
capacity for evasion or concealment which modem delivery platforms and vehicles can possess, could have in
reserve significant forces invulnerable enough not to entail use-or-lose pressures. (It may be more open to question,
as noted earlier, whether newer nuclear-weapon possessors can be immediately in that position; but it is within
reach of any substantial state with advanced technological capabilities, and attaining it is certain to be a high
priority in the development of forces.) As a result, neither side can have any predisposition to suppose, in an
ambiguous situation of fearful risk, that the right course when in doubt is to go on copiously launching weapons.
And none of this analysis rests on any presumption of highly subtle or pre-concerted rationality. The rationality
required is plain. The argument is reinforced if we consider the possible reasoning of an aggressor at a more

dispassionate level. Any substantial nuclear armoury can inflict destruction outweighing any possible prize that aggression could hope to seize. A state attacking the possessor of such

an armoury must therefore be doing so (once given that it cannot count upon destroying the armoury preemptively) on a judgement that the possessor would be found lacking in the will to use it. If the attacked possessor
used nuclear weapons, whether first or in response to the aggressor's own first use, this judgement would begin to
look dangerously precarious. There must be at least a substantial possibility of the aggressor leaders' concluding
that their initial judgement had been mistaken -- that the risks were after all greater than whatever prize they had been seeking, and that for their own country's survival they must call off the aggression. Deterrence planning such
as that of NATO was directed in the first place to preventing the initial misjudgement and in the second, if it were
nevertheless made, to compelling such a reappraisal. The former aim had to have primacy, because it could not be
taken for granted that the latter was certain to work. But there was no ground for assuming in advance, for all
possible scenarios, that the chance of its working must be negligible. An aggressor state would itself be at huge risk
if nuclear war developed, as its leaders would know. It may be argued that a policy which abandons hope of
physically defeating the enemy and simply hopes to get him to desist is pure gamble, a matter of who blinks first;
and that the political and moral nature of most likely aggressors, almost ex hypothesi, makes them the less likely to
blink. One response to this is to ask what is the alternativeit can only be surrender. But a more positive and
hopeful answer lies in the fact that the criticism is posed in a political vacuum. Real-life conflict would have a
political context. The context which concerned NATO during the cold war, for example, was one of defending vital
interests against a postlated aggressor whose own vital interests would not be engaged, or would be less engaged.
Certainty is not possible, but a clear asymmetry of vital interest is a legitimate basis for expecting an asymmetry,
credible to both sides, of resolve in conflict. That places upon statesmen, as page 23 has noted, the key task in
deterrence of building up in advance a clear and shared grasp of where limits lie. That was plainly achieved in cold-war Europe. If vital interests have been defined in a way that is clear, and also clearly not overlapping or
incompatible with those of the adversary, a credible basis has been laid for the likelihood of greater resolve in
resistance. It was also sometimes suggested by critics that whatever might be indicated by theoretical discussion of
political will and interests, the military environment of nuclear warfare -- particularly difficulties of communication and control -- would drive escalation with overwhelming probability to the limit. But it is obscure why matters should be regarded as inevitably so for every possible level and setting of action. Even if the history of war suggested (as
it scarcely does) that military decision-makers are mostly apt to work on the principle 'When in doubt, lash out', the
nuclear revolution creates an utterly new situation. The pervasive reality, always plain to both sides during the cold
war, is `If this goes on to the end, we are all ruined'. Given that inexorable escalation would mean catastrophe for
both, it would be perverse to suppose them permanently incapable of framing arrangements which avoid it. As
page 16 has noted, NATO gave its military commanders no widespread delegated authority, in peace or war, to
launch nuclear weapons without specific political direction. Many types of weapon moreover had physical
safeguards such as PALs incorporated to reinforce organizational ones. There were multiple communication and
control systems for passing information, orders, and prohibitions. Such systems could not be totally guaranteed
against disruption if at a fairly intense level of strategic exchangewhich was only one of many possible levels of
conflict an adversary judged it to be in his interest to weaken political control. It was far from clear why he
necessarily should so judge. Even then, however, it

remained possible to operate on a general


fail-safe presumption: no authorization, no use. That was the basis on which NATO
operated. If it is feared that the arrangements which a nuclear-weapon possessor

has in place do not meet such standards in some respects, the logical course is to
continue to improve them rather than to assume escalation to be certain and
uncontrollable, with all the enormous inferences that would have to flow from such an assumption. The
likelihood of escalation can never be 100 per cent, and never zero. Where between those two extremes it may lie
can never be precisely calculable in advance; and even were it so calculable, it would not be uniquely fixedit
would stand to vary hugely with circumstances. That there should be any risk at all of escalation to widespread
nuclear war must be deeply disturbing, and decision-makers would always have to weigh it most anxiously. But a
pair of key truths about it need to be recognized. The first is that the risk of escalation to large-scale nuclear war is
inescapably present in any significant armed conflict between nuclear-capable powers, whoever may have started
the conflict and whoever may first have used any particular category of weapon. The initiator of the conflict will
always have physically available to him options for applying more force if he meets effective resistance. If the risk
of escalation, whatever its degree of probability, is to be regarded as absolutely unacceptable, the necessary
inference is that a state attacked by a substantial nuclear power must forgo military resistance. It must surrender,
even if it has a nuclear armoury of its own. But the companion truth is that, as page 47 has noted, the risk of
escalation is an inescapable burden also upon the aggressor. The exploitation of that burden is the crucial route, if
conflict does break out, for managing it, to a tolerable outcome--the only route, indeed, intermediate between
surrender and holocaust, and so the necessary basis for deterrence beforehand. The working out of plans to exploit
escalation risk most effectively in deterring potential aggression entails further and complex issues. It is for
example plainly desirable, wherever geography, politics, and available resources so permit without triggering arms
races, to make provisions and dispositions that are likely to place the onus of making the bigger, and more
evidently dangerous steps in escalation upon the aggressor who wishes to maintain his attack, rather than upon
the defender. (The customary shorthand for this desirable posture used to be 'escalation dominance'.) These issues
are not further discussed here. But addressing them needs to start from acknowledgement that there are in any
event no certainties or absolutes available, no options guaranteed to be risk-free and cost-free. Deterrence is not
possible without escalation risk; and its presence can point to no automatic policy conclusion save for those who
espouse outright pacifism and accept its consequences. Accident and Miscalculation Ensuring the safety and
security of nuclear weapons plainly needs to be taken most seriously. Detailed information is understandably not
published, but such direct evidence as there is suggests that it always has been so taken in every possessor state,
with the inevitable occasional failures to follow strict procedures dealt with rigorously. Critics have nevertheless
from time to time argued that the possibility of accident involving nuclear weapons is so substantial that it must
weigh heavily in the entire evaluation of whether war-prevention structures entailing their existence should be
tolerated at all. Two sorts of scenario are usually in question. The first is that of a single grave event involving an
unintended nuclear explosion -- a technical disaster at a storage site, for example, or the accidental or unauthorized launch of a delivery system with a live nuclear warhead. The second is that of some event -- perhaps such an explosion or launch, or some other mishap such as malfunction or misinterpretation of radar signals or computer systems -- initiating a sequence of response and counter-response that culminated in a nuclear exchange which no
one had truly intended. No event that is physically possible can be said to be of absolutely zero probability (just as
at an opposite extreme it is absurd to claim, as has been heard from distinguished figures, that nuclear-weapon use
can be guaranteed to happen within some finite future span despite not having happened for over sixty years). But

We have to assess
levels between those theoretical limits and weigh their reality and implications
against other factors, in security planning as in everyday life. There have certainly
been, across the decades since 1945, many known accidents involving nuclear weapons,
from transporters skidding off roads to bomber aircraft crashing with or accidentally
dropping the weapons they carried (in past days when such carriage was a frequent feature of
human affairs cannot be managed to the standard of either zero or total probability.

readiness arrangements -- it no longer is). A few of these accidents may have released into the nearby environment highly toxic material. None however has entailed a nuclear detonation. Some


commentators suggest that this reflects bizarrely good fortune amid such massive
activity and deployment over so many years. A more rational deduction from the
facts of this long experience would however be that the probability of any accident
triggering a nuclear explosion is extremely low. It might be further noted that the
mechanisms needed to set off such an explosion are technically demanding, and
that in a large number of ways the past sixty years have seen extensive
improvements in safety arrangements for both the design and the handling of
weapons. It is undoubtedly possible to see respects in which, after the cold war, some of the factors bearing
upon risk may be new or more adverse; but some are now plainly less so. The years which the world has
come through entirely without accidental or unauthorized detonation have included
early decades in which knowledge was sketchier, precautions were less developed,
and weapon designs were less ultra-safe than they later became, as well as
substantial periods in which weapon numbers were larger, deployments more

widespread and diverse, movements more frequent, and several aspects of doctrine
and readiness arrangements more tense. Similar considerations apply to the
hypothesis of nuclear war being mistakenly triggered by false alarm. Critics again
point to the fact, as it is understood, of numerous occasions when initial steps in alert
sequences for US nuclear forces were embarked upon, or at least called for, by indicators mistaken or misconstrued. In none of these instances, it is accepted, did matters get at
all near to nuclear launch--extraordinary good fortune again, critics have suggested. But the rival and
more logical inference from hundreds of events stretching over sixty years of experience presents itself once more:
that the probability of initial misinterpretation leading far towards mistaken launch is remote. Precisely because any
nuclear-weapon possessor recognizes the vast gravity of any launch, release sequences have many steps, and
human decision is repeatedly interposed as well as capping the sequences. To convey that because a first step was
prompted the world somehow came close to accidental nuclear war is wild hyperbole, rather like asserting, when a

tennis champion has lost his opening service game, that he was nearly beaten in straight sets. History anyway scarcely offers any ready example of major war started by accident even before the nuclear revolution imposed an order-of-magnitude increase in caution. It was occasionally conjectured that nuclear war might be triggered by the real but accidental or unauthorized launch of a strategic nuclear-weapon delivery system in the direction of a potential adversary. No such launch is known to have occurred in over sixty years. The probability of it is therefore very low. But even if it did happen, the further hypothesis of it initiating a general nuclear exchange is far-fetched. It fails

to consider the real situation of decision-makers as pages 63-4 have brought out. The notion that cosmic holocaust
might be mistakenly precipitated in this way belongs to science fiction.

2NC No Miscalc
Technical barriers and de-targeting solve miscalculation.
Slocombe 9 (Walter, senior advisor for the Coalition Provisional Authority in
Baghdad and a former Under Secretary of Defense for Policy, he is a four-time recipient of an award for Distinguished Public Service and a member of
the Council on Foreign Relations, De-Alerting: Diagnoses, Prescriptions, and
Side-Effects, Presented at the seminar on Re-framing De-Alert: Decreasing
the Operational Readiness of Nuclear Weapons Systems in the US-Russia
Context in Yverdon, Switzerland, June 21-23)
Let's start with Technical Failure -- the focus of a great deal of the advocacy, or at least of stress on past incidents of failures of safety and control mechanisms.4 Much
of the de-alerting literature points to a succession of failures to follow proper
procedures and draw from that history the inference that a relatively simple
procedural failure could produce a nuclear detonation. The argument is essentially that nuclear
weapons systems are sufficiently susceptible of pure accident (including human error or failure at operational/field
level) that it is essential to take measures that have the effect of making it necessary to undertake a prolonged
reconfiguration of the elements of the nuclear weapons force for a launch or detonation to be physically possible.
Specific measures said to serve this objective include separating the weapons from their launchers, burying silo
doors, removal of fuzing or launching mechanisms, deliberate avoidance of maintenance measures need to permit
rapid firing, and the like. My view is that this line of action is unnecessary in its own terms and highly
problematic from the point of view of other aspects of the problem and that there is a far better option that is

largely already in place, at least in the US force -- the requirement of external information -- a code not held by the operators -- to arm the weapons. Advocates of other, more physical, measures often describe the current arrangement as nuclear weapons being on a hair trigger. That is -- at least with respect to US weapons -- a highly misleading characterization. The hair trigger figure of speech confuses alert status -- readiness to act quickly on orders -- with susceptibility to inadvertent
action. The hair trigger image implies that a minor mistake akin to jostling a gun will
fire the weapon. The US StratCom commander had a more accurate metaphor when he recently said that US
nuclear weapons are less a pistol with a hair trigger than like a pistol in a holster
with the safety turned on and he might have added that in the case of nuclear weapons
the safety is locked in place by a combination lock that can only be opened and
firing made possible if the soldier carrying the pistol receives a message from his
chain of command giving him the combination. Whatever other problems the current nuclear
posture of the US nuclear force may present, it cannot reasonably be said to be on a hair trigger. Since the
1960s the US has taken a series of measures to insure that US nuclear weapons
cannot be detonated without the receipt of both external information and properly
authenticated authorization to use that information. These devices -- generically Permissive Action Links or PALs -- are in effect combination locks that keep the weapons locked and incapable of detonation
unless and until the weapons firing mechanisms have been unlocked following receipt of a series of numbers
communicated to the operators from higher authority. Equally important in the context of a military organization,
launch of nuclear weapons (including insertion of the combinations) is permitted only where properly authorized by

an authenticated order. This combination of reliance on discipline and procedure and on receipt of an unlocking code not held by the military personnel in charge of the launch operation is designed to insure that the system is fail safe, i.e., that whatever mistakes occur, the result will not be a nuclear explosion. Moreover, in recent years, both the US and Russia, as well as Britain and China, have modified their procedures so that even if a nuclear-armed missile were launched, it would go not to a real target in another country but -- at least in the US case -- to empty ocean. In addition to the basic

advantage of insuring against a nuclear detonation in a populated area, the fact that a missile launched in error
would be on a flight path that diverged from a plausible attacking trajectory should be detectable by either the US or

the Russian warning systems, reducing the possibility of the accident being perceived as a deliberate attack. De-

targeting, therefore, provides a significant protection against technical error.

These
arrangements -- PALs and their equivalents coupled with continued observance of the agreement made in the mid-90s on de-targeting -- do not eliminate the possibility of technical or operator-level failures, but they come very
close to providing absolute assurance that such errors cannot lead to a nuclear explosion or be interpreted as the
start of a deliberate nuclear attack.6 The advantage of such requirements for external information to activate
weapons is of course that the weapons remain available for authorized use but not susceptible of appropriation or
mistaken use.

Miscalculation and accidental launch are impossible.


Bolkcom et al 6 Christopher Bolkcom, Foreign Affairs, Defense, and
Trade Division of the Congressional Research Service, et al., August 11, 2006,
U.S. Conventional Forces and Nuclear Deterrence: A China Case Study,
online: http://www.au.af.mil/au/awc/awcgate/crs/rl33607.pdf
Once a conflict begins, participants can feel pressure to act quickly, to control events and to manage the crisis in a
way that meets its interests. This, in turn, can make the crisis escalate quickly and unpredictably. For example, if its
command and control systems were protected from attack and offered redundant capabilities, and its forces were
not vulnerable to an early strike by the adversary, then a nation could delay its response, await further information,

and possibly seek alternate means to resolve the conflict. On the other hand, if a country's command and control infrastructure and its key forces were vulnerable to attack early in a conflict, then it might feel compelled to act quickly, using those forces before it lost them to attack, and before it had complete information about the intent and capabilities of its adversary in pursuing the conflict. Preferably, the capabilities or posture of a nation's conventional and nuclear forces would not inherently add to this instability. Specific U.S. crisis stability objectives in these scenarios may include fielding forces that 1) are not vulnerable, and do not make Chinese forces vulnerable to use it or lose it pressures, and 2) do not appear to be either vulnerable to or capable of political or military decapitation. Both the United States and China have currently deployed their long-range nuclear

forces in ways that would not leave them vulnerable to a first strike, and therefore, appear unlikely to undermine

stability in a crisis. Chinese forces lack the accuracy to attack U.S. land-based forces and cannot effectively track and engage U.S. submarines that carry ballistic missiles (called SSBNs). Chinese long-range missiles are deployed in deeply buried silos, protected by rough terrain and mountains, or deployed on mobile launchers. Therefore,

neither the United States nor China would experience pressure to use these weapons before losing them. Early
warning and command and control systems, could, however, still be vulnerable to disruption on both sides.
Therefore, efforts to disrupt these assets, or other factors, such as a desire to achieve tactical surprise, could
stimulate prompt or accelerated responses as soon as a crisis unfolds.

No risk of miscalculation -- deterrence checks


Madsen et al 2010 - bachelor student at Global Studies and
Public Administration at Roskilde University
[Tina Søndergård, Maia Juel Giorgio, Mark Westh and Jakob
Wiegersma, Autumn 2010, Nuclear Deterrence in South
Asia Global Studies,
http://dspace.ruc.dk/bitstream/1800/6041/1/Project%20GS-BA,
%20Autumn%202010.pdf]//SGarg
We have now assessed Waltz's three requirements for effective deterrence
between India and Pakistan. In regards to survivability of their nuclear arsenals, both meet
this requirement as they have clearly stated that their nuclear weapons

should be able to perform a retaliatory strike. Furthermore, India and Pakistan


are procuring various delivery vehicles, such as nuclear submarines, which enhance
second-strike capability. The second requirement, no early firing as a result of false alarm, is difficult
to test

empirically, as it is nearly impossible to find out the reactions of soldiers who believe they are under nuclear attack. However, by the fact that weapon components are separated from fissile cores, time is given for commanders to react appropriately, which reduces the risk of being subjected to false alarms. With regards to the third requirement, India has an effective command and control system with regards to the notion of force-in-being. As components take time to be assembled, unauthorized use or accidents are not likely to occur. Even though there are insurgent elements that might disrupt the Pakistani nuclear security in the future, it seems that currently, Pakistan, as well as India, is not prone to immediate theft and accidents of its nuclear weapons. Through their doctrines and statements, both countries believe in retaliatory action. Therefore, it is possible to conclude, through Waltz's framework, that both countries are deterred by each other, and thus that deterrence is in effect.

Espionage Advantage

Notes
Hell, their internal link is based off CHINA putting backdoors in Huawei
(Chinese tech giant) and China exploiting those in US customers. This is
ridiculous because A. the plan doesn't stop China from mandating backdoors
and B. the US has already functionally banned sales of Huawei products in
the US

CX

1NC Rels Resilient


Relations are improving now and are resilient -- increased
commitment to bilateral relations by both Presidents Obama
and Xi Jinping
Podesta et. al. 14 - John Podesta currently serves as Counselor to President
Barack Obama. At the time of this reports writing, he was chair of the Center
for American Progress, which he founded in 2003. Podesta previously served
as White House chief of staff from 1998 to 2001 under President Bill Clinton
and was co-chairman of the Obama transition team in 2008. Tung Chee Hwa
is the Founding Chairman of the China-U.S. Exchange Foundation and the
Vice Chairman of the Eleventh National Committee of the People's Political
Consultative Conference. He previously served as the first chief executive of
the Hong Kong Special Administrative Region from 1997 to 2005. Samuel R.
Berger is Chair of the Albright Stonebridge Group. He served as national
security advisor to President Clinton from 1997 to 2001. Prior to his service in
the Clinton Administration, Berger spent 16 years at the Washington law firm
Hogan & Hartson. Wang Jisi is President of the Institute of International and
Strategic Studies and professor at the School of International Studies at
Peking University. Professor Wang is a member of the Foreign Policy Advisory
Committee of the Foreign Ministry of China and is president of the Chinese
Association of American Studies. (John, Tung Chee Hwa, Samuel Berger,
Wang Jisi, February 2014, US/China Relations: Toward a New Model of Major
Power Relationship Center for American Progress,
http://www.fas.org/sgp/crs/misc/RL34511.pdf)
In February 2012, during a Washington, D.C., visit, then Chinese Vice President Xi Jinping raised the prospect of a new type of relationship between major countries in the 21st century.1 As State Councilor Dai Bingguo said about the concept, China and the U.S. must create the possibility that countries with different political institutions, cultural traditions and different economic systems can respect and cooperate with each other.2 A year later, President Barack Obama and President Xi Jinping conducted an informal, shirt-sleeve summit in southern California to establish a solid working relationship between the two presidents. Then National Security Adviser Tom Donilon described the challenge facing President Obama

and President Xi at the summit as turning the aspiration of charting a new course for our relationship into a reality
and to build out the new model of relations between great powers.3 We have been interested in the idea of a
new model of major power relations ever since we attended the lunch in Washington when then Vice President Xi
first raised it. We, along with our respective institutionsthe Center for American Progress in Washington and the
China-U.S. Exchange Foundation in Hong Konghad already been engaged in track II high-level dialogue between
Chinese and American scholars for several years by then. We were quite familiar with the challenge, as then

Secretary of State Hillary Clinton put it, to write a new answer to the age-old
question of what happens when an established power and a rising power meet .4 In
conjunction with the initiative of the two presidents, we proposed that our track II focus on
the very topic that engaged the leaders: building a new model of major power relations
between the United States and China. To prepare for the dialogue, experts in Washington, California,
Beijing, Shanghai, and Hong Kong drafted and exchanged papers, printed in this volume, on the U.S. and Chinese
perspectives on what a new model of major power relations would look like in practice; how the bilateral
relationship fits into regional and international structures; what governing principles for the relationship could be;
and how to take steps towards a positive, constructive relationship. The two sides discussed their approaches and
findings in a series of video conference calls through the spring and summer of 2013. In September 2013, we
convened a distinguished group of American and Chinese experts to discuss the concepts raised in the papers. The
group is listed with their affiliations at the beginning of this volume.

2NC Rels Resilient


No risk of a US-China war -- relations are resilient and
interdependence checks
Glaser 11 - , Professor of Political Science and International Affairs and
Director of the Institute for Security and Conflict Studies at the Elliott School
of International Affairs at George Washington University, (CHARLES GLASER,
March/April, Will China's Rise Lead to War?, Foreign Affairs,)
So far, the China debate among international relations theorists has pitted optimistic liberals against pessimistic
realists. The liberals argue that because the current international order is defined by economic and political
openness, it can accommodate China's rise peacefully. The United States and other leading powers, this argument
runs, can and will make clear that China is welcome to join the existing order and prosper within it, and China is
likely to do so rather than launch a costly and dangerous struggle to overturn the system and establish an order more to its own liking. The standard realist view, in contrast, predicts intense competition. China's growing strength, most realists argue, will lead it to pursue its interests more assertively, which will in turn lead the United States and other countries to balance against it. This cycle will generate at the least a parallel to the Cold War standoff between the United States and the Soviet Union, and perhaps even a hegemonic war. Adherents of this view point to China's recent harder line on its maritime claims in the East China and South China seas and to the increasingly close relations between the United States and India as signs that the cycle of assertiveness and balancing has already begun. In fact, however, a more nuanced version of realism provides grounds for optimism. China's rise need not be nearly as competitive and dangerous as the standard realist argument suggests, because the structural forces driving major powers into conflict will be relatively weak. The dangers that do exist, moreover, are not

the ones predicted by sweeping theories of the international system in general but instead
stem from secondary disputes particular to Northeast Asia--and the security prevalent in the
international system at large should make these disputes easier for the United States
and China to manage. In the end, therefore, the outcome of China's rise will depend less on the pressures
generated by the international system than on how well U.S. and Chinese leaders manage the situation. Conflict is
not predetermined--and if the United States can adjust to the new international conditions, making some
uncomfortable concessions and not exaggerating the dangers, a major clash might well be avoided. A GOOD KIND
OF SECURITY DILEMMA STRUCTURAL REALISM explains states' actions in terms of the pressures and opportunities
created by the international system. One need not look to domestic factors to explain international conflict, in this
view, because the routine actions of independent states trying to maintain their security in an anarchic world can
result in war. This does not happen all the time, of course, and explaining how security-seeking states find
themselves at war is actually something of a puzzle, since they might be expected to choose cooperation and the
benefits of peace instead. The solution to the puzzle lies in the concept of the security dilemma--a situation in which
one state's efforts to increase its own security reduce the security of others. The intensity of the security dilemma
depends, in part, on the ease of attack and coercion. When attacking is easy, even small increases in one state's
forces will significantly decrease the security of others, fueling a spiral of fear and arming. When defending and
deterring are easy, in contrast, changes in one state's military forces will not necessarily threaten others, and the
possibility of maintaining good political relations among the players in the system will increase. The intensity of the
security dilemma also depends on states' beliefs about one another's motives and goals. For example, if a state
believes that its adversary is driven only by a quest for security--rather than, say, an inherent desire to dominate
the system--then it should find increases in the adversary's military forces less troubling and not feel the need to
respond in kind, thus preventing the spiral of political and military escalation. The possibility of variation in the
intensity of the security dilemma has dramatic implications for structural realist theory, making its predictions less
consistently bleak than often assumed. When the security dilemma is severe, competition will indeed be intense
and war more likely. These are the classic behaviors predicted by realist pessimism. But when the security dilemma
is mild, a structural realist will see that the international system creates opportunities for restraint and peace.
Properly understood, moreover, the security dilemma suggests that a state will be more secure when its adversary
is more secure--because insecurity can pressure an adversary to adopt competitive and threatening policies. This
dynamic creates incentives for restraint and cooperation. If an adversary can be persuaded that all one wants is
security (as opposed to domination), the adversary may itself relax. What does all this imply about the rise of

China? At the broadest level, the news is good. Current international conditions should enable both the United States and China to protect their vital interests without posing large threats to each other. Nuclear weapons make it relatively easy for major powers to maintain highly effective deterrent forces. Even if Chinese power were to greatly exceed U.S. power somewhere down the road, the United States
would still be able to maintain nuclear forces that could survive any Chinese attack
and threaten massive damage in retaliation. Large-scale conventional attacks by
China against the U.S. homeland, meanwhile, are virtually impossible because the
United States and China are separated by the vast expanse of the Pacific Ocean, across which it would be
difficult to attack. No foreseeable increase in China's power would be large enough to overcome these twin
advantages of defense for the United States. The same defensive advantages, moreover, apply to China as well.
Although China is currently much weaker than the United States militarily, it will soon be able to build a nuclear force that meets its requirements for deterrence. And China should not find the United States' massive conventional capabilities especially threatening, because the bulk of U.S. forces, logistics, and support lie across the Pacific. The overall effect of these conditions is to greatly moderate the security dilemma. Both the United States and China will be able to maintain high levels of security now and through any potential rise of China to superpower status. This should help Washington and Beijing avoid truly strained geopolitical relations, which should in turn help ensure that the security dilemma stays moderate, thereby facilitating cooperation. The United States, for example, will have the

option to forego responding to China's modernization of its nuclear force. This restraint will help reassure China that
the United States does not want to threaten its security--and thus help head off a downward political spiral fueled
by nuclear competition.

1NC Relations Decline Inev


Relations fail inevitably
Friedberg 9/2012 - Professor of Politics and International Affairs at the
Woodrow Wilson School of Public and International Affairs at Princeton
University and the author of A Contest for Supremacy: China, America, and
the Struggle for Mastery in Asia (Aaron, Bucking Beijing, Foreign Affairs,
http://www.foreignaffairs.com/articles/138032/aaron-l-friedberg/buckingbeijing)
Recent events have raised serious doubts about both elements of this strategy. Decades of trade and
talk have not hastened China's political liberalization. Indeed, the last few years have
been marked by an intensified crackdown on domestic dissent. At the same time, the much-touted
economic relationship between the two Pacific powers has become a major source of
friction. And despite hopes for enhanced cooperation, Beijing has actually done
very little to help Washington solve pressing international problems, such as North
Korea's acquisition of nuclear weapons or Iran's attempts to develop them. Finally, far from accepting the status quo, China's leaders have become more forceful in attempting to control the waters and
resources off their country's coasts. As for balancing, the continued buildup of China's military
capabilities, coupled with impending cuts in U.S. defense spending, suggests that
the regional distribution of power is set to shift sharply in Beijing's favor. WHY WE CAN'T ALL
JUST GET ALONG Today, China's ruling elites are both arrogant and insecure. In their
view, continued rule by the Chinese Communist Party (CCP) is essential to China's stability,
prosperity, and prestige; it is also, not coincidentally, vital to their own safety and comfort. Although
they have largely accepted some form of capitalism in the economic sphere, they remain committed to
preserving their hold on political power. The CCP'S determination to maintain control informs the
regime's threat perceptions, goals, and policies. Anxious about their legitimacy, China's
rulers are eager to portray themselves as defenders of the national honor .
Although they believe China is on track to become a world power on par with the United States, they remain
deeply fearful of encirclement and ideological subversion . And despite
Washington's attempts to reassure them of its benign intentions, Chinese
leaders are convinced that the United States aims to block China's rise and,
ultimately, undermine its one-party system of government. Like the United States, since the

end of the Cold War, China has pursued an essentially constant approach toward its greatest external challenger.
For the most part, Beijing has sought to avoid outright confrontation with the United States while pursuing
economic growth and building up all the elements of its "comprehensive national power," a Chinese strategic
concept that encompasses military strength, technological prowess, and diplomatic influence. Even as they remain

on the defensive, however, Chinese officials have not been content to remain passive. They have sought incremental advances, slowly expanding China's sphere of influence and strengthening its position in Asia while working quietly to erode that of the United States. Although they are careful never to say so directly, they seek to have China displace the United States in the long run and

to restore China to what they regard as its rightful place as the preponderant regional power. Chinese strategists do
not believe that they can achieve this objective quickly or through a frontal assault. Instead,

they seek to

reassure their neighbors, relying on the attractive force of China's massive economy to counter nascent
balancing efforts against it. Following the advice of the ancient military strategist Sun-tzu, Beijing aims to
"win without fighting," gradually creating a situation in which overt resistance to its wishes will appear
futile. The failure to date to achieve a genuine entente between the United States and
China is the result not of a lack of effort but of a fundamental divergence of interests.
Although limited cooperation on specific issues might be possible, the

ideological gap between the two nations is simply too great, and the level of trust
between them too low, to permit a stable modus vivendi. What China's current leaders
ultimately want -- regional hegemony -- is not something their counterparts in
Washington are willing to give. That would run counter to an axiomatic goal of U.S. grand strategy,
which has remained constant for decades: to prevent the domination of either end of the Eurasian landmass by one

or more potentially hostile powers. The reasons for this goal involve a mix of strategic, economic, and ideological considerations that will continue to be valid into the foreseeable future.

2NC Relations Decline Inev


Relations are impossible
Zhang 12/13/2012 - Editor-in-Chief, Co-Founder at WiseLit (Henry, US-China Relations: Why Obama's 'Asia Pivot' Strategy Could Lead to Disaster,
Policy Mic, http://www.policymic.com/articles/20675/us-china-relations-whyobama-s-asia-pivot-strategy-could-lead-to-disaster)
This American response is due in part to the surprising advancements made by the
Chinese military, such as the successful developments of its aircraft carrier,

advanced jet fighters, and more cost-effective drones. China-U.S. relations


expert Wu Xinbo advises the U.S. not to just focus on China's rising
capabilities, but also to "pay attention to how China will use its military
power." It is not surprising that China wants to catch up militarily, as it is a
dominant economic power that has the means to do so. However, the Chinese
Communist Party and the People's Liberation Army may not necessarily want to
undermine U.S. global military preeminence, but rather wish to assert their
country's sovereignty in regional disputes involving territories in the East and South
China seas. The Chinese might threaten U.S. dominance in these regions
insofar as they see American forces as encroachments that they must
guard against. Conversely, Washington sees itself as an important player in the
Pacific, with certain obligations and diplomatic interests to which it must attend.

Notable strategic maneuvers stemming from this perception include the


stationing of 250 U.S. Marines in Australia, and military drills with Japan.

Alt causes to US-China relations


Haenle 1/15/2014 - director of the Carnegie-Tsinghua Center, adjunct
professor at Tsinghua University on International Relations, formerly served
on the national Security Council on East Asia (Paul, What Does a New Type
of Great-Power Relations Mean for the United States and China? Carnegie
Endowment, http://carnegieendowment.org/2014/01/15/us-china-relations2013-new-model-of-major-power-relations-in-theory-and-in-practice/gyjm?
reloadFlag=1)
First, the U.S. and China need to start actively cooperating together on global challenges where we have mutual

interests. In the past, our countries have focused on bilateral issues. Today, however, the major challenges and opportunities for the U.S.-China relationship will come in working together to address critical global challenges such as nuclear proliferation, energy and food security, terrorism, climate change, Middle East instability, cyber security, and global financial reform and recovery. The need to find tangible ways to work together

constructively on global challenges was evident at Sunnylands. Obama and Xi concluded their discussions with an
announcement to enhance cooperation on combating nuclear proliferation by continuing to apply pressure on
Pyongyang, and to work together to combat climate change by discussing ways to reduce emissions of
hydrofluorocarbons. If we can engage in more effort together that produce real benefits for our peoples as well as
the rest of the world, this will be an important step toward making the new model of major country relations a
reality. Second, Chinese and American leaders will need to resist the expectation that either side will change the
other sides views on long-standing and historical areas of disagreement between our two countriessuch as

Many in Washington are concerned that the new model of


great power relations concept is an effort by China to compel the U.S. to respect
Chinas core interests, create Chinese spheres of influence, and get the U.S. to
accommodate China's interests on Beijings terms. This type of approach will not
work, and making this a starting point for discussions on the new paradigm will only set this exercise up for
Taiwan or human rights overnight.

failure. On many of these issues, including Taiwan, the United States and China have agreed to disagree since their
first communiqué in 1972. But in a more positive context of greater cooperation on global issues, our leaders will be in a better place to work on and reduce these areas of long-standing disagreement. Third, our countries have new areas of tensions in the relationship that exacerbate mistrust and that we need to address with urgency. In 2013, these issues included revelations of Chinese cyber hacking of American commercial and military secrets to dangerous risks deriving from regional territorial disputes in the South and East China Sea, including Beijing's recent announcement of a new Air Defense Identification Zone. These

challenges, especially the latter, hold the potential for confrontation between our militaries if we do not renew our
military to military efforts to increase transparency and cooperation. These important issues must be addressed
head-on, not sidestepped. But as we work vigorously through these current disagreements, we should not allow

these areas of friction to define or overwhelm our broad and robust relationship. But we must address them with urgency and find ways to reduce these disagreements and enhance trust if we are to achieve a new type of major country relations.

Add-Ons

1NC Meltdown =/= Extinction


Coal plants disprove the impact - they emit way more radiation than a global
meltdown

Worstall 13 Forbes Contributor focusing on business and technology (Tim


Worstall, 8/10/13, The Fukushima Radiation Leak Is Equal to 76 Million
Bananas, http://www.forbes.com/sites/timworstall/2013/08/10/thefukushima-radiation-leak-is-equal-to-76-million-bananas/)//twonily
Not that Greenpeace is ever going to say anything other than that nuclear power is the work of the very devil of
course. And the headlines do indeed seem alarming: Radioactive Fukushima groundwater rises above
barrier Up to 40 trillion becquerels released into Pacific ocean so far Storage for radioactive water running out.

Or: Tepco admitted on Friday that a cumulative 20 trillion to 40 trillion becquerels of radioactive tritium may have leaked into the sea since the disaster. Most of us haven't a clue what that means of course. We don't instinctively understand what a becquerel is in the same way that we do pound, pint or gallons, and certainly trillions of anything sounds hideous. But don't forget that trillions of picogrammes of dihydrogen monoxide is also the major ingredient in a glass of beer. So what we really want to know is whether 20 trillion becquerels of radiation is actually an important number. To which the answer is no, it isn't. This is actually around and about (perhaps a little over) the amount of radiation the plant was allowed to dump into the environment before the disaster. Now there are indeed those who insist that any amount of radiation kills us all stone dead while we sleep in our beds but I'm afraid that this is incorrect. We're all exposed to radiation all the time and we all seem to survive long enough to be killed by something else so radiation isn't as

dangerous as all that. At which point we can offer a comparison. Something to try and give us a sense of
perspective about whether 20 trillion nasties of radiation is something to get all concerned about or not. That
comparison being that the radiation leakage from Fukushima appears to be about the same as that from 76 million
bananas. Which is a lot of bananas I agree, but again we can put that into some sort of perspective. Lets start from
the beginning with the banana equivalent dose, the BED. Bananas contain potassium, some portion of potassium is
always radioactive, thus bananas contain some radioactivity. This gets into the human body as we digest the lovely
fruit (OK, bananas are an herb but still): Since a typical banana contains about half a gram of potassium, it will
have an activity of roughly 15 Bq. Excellent, we now have a unit that we can grasp, one that the human mind can
use to give a sense of proportion to these claims about radioactivity. We know that bananas are good for us on
balance, thus this amount of radioactivity isnt all that much of a burden on us. We
also have that claim of 20 trillion becquerels of radiation having been dumped into the Pacific Ocean in the past
couple of years. 20 trillion divided by two years by 365 days by 24 hours gives us an hourly rate of 1,141,552,511
becquerels per hour. Divide that by our 15 Bq per banana and we can see that the radiation spillage from
Fukushima is running at 76 million bananas per hour. Which is, as I say above, a lot of bananas. But its not actually
that many bananas. World production of them is some 145 million tonnes a year. Theres a thousand kilos in a
tonne, say a banana is 100 grammes (sounds about right, four bananas to the pound, ten to the kilo) or 1.45 trillion
bananas a year eaten around the world. Divide again by 365 and 24 to get the hourly consumption rate and we get
165 million bananas consumed per hour. We can do this slightly differently and say that the 1.45 trillion bananas
consumed each year have those 15 Bq giving us around 22 trillion Bq each year. The Fukushima leak is 20 trillion Bq

over two years: thus our two calculations agree. The current leak is just under half that exposure that we all get from the global consumption of bananas. Except even that's overstating it. For the banana consumption does indeed get into our bodies: the Fukushima leak is getting into the Pacific Ocean where it's obviously far less dangerous. And don't forget that all that radiation in the
bananas ends up in the oceans as well, given that we do in fact urinate it out and no, its not something that the
sewage treatment plants particularly keep out of the rivers. There are some who are viewing this radiation leak very
differently: Arnold Gundersen, Fairewinds Associates: [...] we are contaminating the Pacific Ocean which is
extraordinarily serious. Evgeny Sukhoi: Is there anything that can be done with that, I mean with the ocean?
Gundersen: Frankly, I dont believe so. I think we will continue to release radioactive material into the ocean for 20
or 30 years at least. They have to pump the water out of the areas surrounding the nuclear reactor. But frankly, this
water is the most radioactive water I've ever experienced. I have to admit that I simply don't agree. I'm not actually arguing that radiation is good for us but I really don't think that half the radiation of the world's banana crop being diluted into the Pacific Ocean is all that much to worry about. And why we really shouldn't worry about it

all that much. The radiation that fossil fuel plants spew into the environment each year is around 0.1

EBq. That's ExaBecquerel, or 10 to the power of 18. Fukushima is pumping out 10 trillion becquerels a year at present. Or 10 TBq, or 10 times 10 to the power of 12. Or, if you prefer, one ten thousandth of the amount that the world's coal plants are doing. Or even, given that
there are only about 2,500 coal plants in the world, Fukushima is, in this disaster,
pumping out around one quarter of the radiation that a coal plant does in
normal operation. You can worry about it if you want but it's not something that's likely
to have any real measurable effect on anyone or anything.
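The arithmetic quoted in the card can be cross-checked in a few lines. The short Python sketch below is only an illustrative check of the card's own figures (20 trillion Bq spread over two years, 15 Bq per banana, 145 million tonnes of bananas per year at roughly 100 g each, 0.1 EBq per year from fossil fuel plants split across about 2,500 coal plants); it is not independent radiological analysis, and every number in it comes from the card itself.

# Cross-check of the banana-equivalent-dose arithmetic quoted above.
# All inputs are the card's own figures; this is an illustrative sketch only.
LEAK_BQ = 20e12            # claimed tritium leak: 20 trillion Bq
LEAK_YEARS = 2             # spread over roughly two years
BQ_PER_BANANA = 15         # activity of one banana (~0.5 g potassium)
HOURS_PER_YEAR = 365 * 24

leak_per_hour = LEAK_BQ / (LEAK_YEARS * HOURS_PER_YEAR)
print(f"Leak rate: {leak_per_hour:,.0f} Bq/hour")                               # ~1.14 billion
print(f"Banana equivalent: {leak_per_hour / BQ_PER_BANANA:,.0f} bananas/hour")  # ~76 million

bananas_per_year = 145e6 * 1000 / 0.1   # 145m tonnes/yr at ~100 g per banana -> ~1.45 trillion
print(f"Bananas eaten per hour: {bananas_per_year / HOURS_PER_YEAR:,.0f}")      # ~165 million
print(f"Annual banana activity: {bananas_per_year * BQ_PER_BANANA:.2e} Bq")     # ~2.2e13 (~22 trillion)

coal_total_bq_per_year = 0.1e18         # ~0.1 EBq/year from fossil fuel plants
leak_per_year = LEAK_BQ / LEAK_YEARS
print(f"Leak vs all coal plants: {leak_per_year / coal_total_bq_per_year:.0e}")                      # ~1e-04
print(f"Leak vs one coal plant (of ~2,500): {leak_per_year / (coal_total_bq_per_year / 2500):.2f}")  # ~0.25

Running it reproduces the card's 76-million-bananas-per-hour figure and the claim that the leak is roughly one quarter of a single coal plant's routine emissions.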

2NC Meltdown =/= Extinction


No impact empirics
Marder, 11 staff writer (Jenny, Mechanics of a Nuclear Meltdown
Explained, PBS, 3/15/2011,
http://www.pbs.org/newshour/rundown/mechanics-of-a-meltdownexplained/) //RGP

After a powerful explosion on Tuesday, Japanese workers are still struggling to regain control of an earthquake and tsunami-damaged nuclear power plant amid worsening fears of a full meltdown. Which raises the questions: What exactly is a nuclear meltdown? And what is a partial meltdown? This term meltdown is being bandied about, and I think people think that you get the fuel hot and things start melting and become liquid, said Charles Ferguson, physicist and president of the Federation of American Scientists. But there are different steps along the way. Inside the core of the boiling water reactors at Japan's Fukushima Dai-ichi facility are thousands of zirconium metal fuel rods, each
stacked with ceramic pellets the size of pencil erasers. These pellets contain uranium dioxide. Under normal
circumstances, energy is generated by harnessing the heat produced through an atom-splitting process called
nuclear fission. As uranium atoms split, they produce heat, while creating whats known as fission products. These
are radioactive fragments, such as barium, iodine and Cesium-137. In a working nuclear reactor, water gets
pumped into the reactors heated core, boils, turns into steam and powers a turbine, generating electricity.

Basically, each uranium atom splits into two parts, and you get a whole soup of elements in the middle of the periodic table, said Arjun Makhijani, a nuclear engineer and

president of the Institute for Energy and Environmental Research. A reactor is like a pressure cooker. It contains

boiling water and steam, and as temperature rises, so does pressure, since the steam can't escape. In the event of a cooling failure, water gets injected to cool the fuel rods, and pressure builds. This superheated core must be cooled with water to prevent overheating and an excessive buildup of steam, which can cause an explosion. In Japan, they've been relieving pressure by releasing steam through pressure valves. But it's a trade-off, as there's no way to do this without also releasing some radioactive material. A nuclear meltdown is an accident resulting

from severe heating and a lack of sufficient cooling at the reactor core, and it occurs in different stages. As the core
heats, the zirconium metal reacts with steam to become zirconium oxide. This oxidation process releases additional
heat, further increasing the temperature inside the core. High temperatures cause the zirconium coating that
covers the surface of the fuel rods to blister and balloon. In time, that ultra-hot zirconium metal starts to melt.
Exposed parts of the fuel rods eventually become liquid, sink down into the coolant and solidify. And thats just the
beginning of a potentially catastrophic event. This can clog and prevent the flow of more coolant, Ferguson said.
And that can become a vicious cycle. Partial melting can solidify and block cooling channels, leading to more
melting and higher temperatures if adequate cooling isnt present. A full meltdown would involve all of the fuel in
that core melting and a mass of molten material falling and settling at the bottom of the reactor vessel. If the vessel

is ruptured, the material could flow into the larger containment building surrounding it. That containment is shielded by protective layers of steel and concrete. But if that containment is ruptured, then

potentially a lot of material could go into the environment, Ferguson said. Meltdown can also occur in the pools
containing spent fuel rods. Used fuel rods are removed from the reactor and submerged in whats called a spent
fuel pool, which cools and shields the radioactive material. Overheating

of the spent fuel pools could


cause the water containing and cooling the rods to evaporate . Without coolant, the fuel rods
become highly vulnerable to catching fire and spontaneously combusting, releasing dangerous levels of radiation
into the atmosphere. Water not only provides cooling, but it provides shielding , said Robert
Alvarez, a nuclear expert and a senior scholar at the Institute for Policy Studies. [Radiation] dose rates coming off
from spent fuel at distances of 50 to 100 yards could be life-threatening. Since spent fuel is less radioactive than
fuel in the reactor core, these pools are easier to control, said Peter Caracappa, a professor and radiation safety
officer at Rensselaer Polytechnic Institute. But theyre also less contained. If material is released, it has a greater
potential to spread because theres no primary containment, he said. Most of the problems with the backup
generators were caused by the tsunami flooding them. But Makhijani suspects that unseen damage from the
earthquake may be adding another challenge. I think because the earthquake was so severe, theres probably a lot
of damage becoming apparent now, he said. Valves might have become displaced, and there may be cracked
pipes. We cant know, because theres no way to suspect. Yesterday, they had trouble releasing a valve. And
theyve had trouble maintaining coolant inside, which means leaks.

No extinction - empirics - reactors leak literally all the time


Nichols 13 columnist @ Veterans Today (Bob Nichols, 4/6/13, All Nuclear

Reactors Leak All of the Time, http://www.veteranstoday.com/2013/04/06/allreactors-leak-all-the-time/)//twonily


(San Francisco) Reportedly Americans widely believe in God and lead the world in the percentage of citizens in
prison and on parole. That is actual reality from an imaginary character in a TV show. The Gallup Poll also says it is

true and has been for years. Most Americans believe that nuke reactors are safe and quite sound, too. Wonder why they do that? Most people at one time in their lives watched as steam escapes from a pressure cooker and accept it as real and true. A

reactor is very much the same thing . The cooks, called Operators, even take the lid
off from time to time too. A nuclear reactor is just an expensive, overly complicated
way to heat water to make steam. Of course all reactors leak ! All nuclear reactors also
actually manufacture more than 1,946 dangerous and known radioactive metals, gases and aerosols. Many
isotopes, such as radioactive hydrogen, simply cannot be contained . So,
they barely even try. It is mostly just a show for the rubes.[1]

Even explosions don't cause leaks - empirics


Bellona News 11 (9/12/11, Breaking: Explosion rocks French nuclear
facility; no radiation leaks apparent, http://bellona.org/news/nuclearissues/accidents-and-incidents/2011-09-breaking-explosion-rocks-frenchnuclear-facility-no-radiation-leaks-apparent)//twonily
There is no immediate evidence of a radioactive leak after a blast at the
southern French nuclear facility of Marcoule near Nimes which killed one person and
injured four others, one seriously, French media have reported and safety officials have confirmed.
There was no risk of a radioactive leak after the blast , caused by a fire near a
furnace in the Centraco radioactive waste storage site, said officials according to various media reports. The plant's owner, national electricity provider EDF, said it had been an industrial accident, not a nuclear accident. For the time being nothing has made it outside, said one spokesman for France's Atomic Energy Commission who spoke anonymously to the BBC. The Centraco treatment
centre, which has been operational since February of 1999, belongs to a subsidiary of EDF. It produces MOX fuel,
which recycles plutonium from nuclear weapons. [Marcoule] is French version of Sellafield. It is difficult to evaluate
right now how serious the situation is based on the information we have at the moment. But it can develop further,
said Bellona nuclear physicist Nils Bøhmer. The local Midi Libre newspaper, on its web site, said an oven exploded at

the plant, killing one person and seriously injuring another. No radiation leak was reported, the report said, adding that no quarantine or evacuation orders were issued for neighboring towns. A security perimeter has been set up because of the risk of leakage. The explosion hit the site at 11:45 local time. The EDF spokesman said the furnace affected had been burning contaminated waste, including fuels, tools and clothing, which had been used in nuclear energy

production. The fire caused by the explosion was under control, he told the BBC. The International Atomic Energy
Agency (IAEA) said it was in touch with the French authorities to learn more about the nature of the explosion. IAEA
Director General Yukiya Amano said the organisations incident centre had been immediately activated, Reuters
reports. A statement issued by the Nuclear Safety Authority also said there have been no radiation leaks outside of
the plant. Staff at the plant reacted to the accident according to planned procedures, it said. Frances Nuclear
Safety Authority, however, is not noted for its transparency. Operational since 1956, the Marcoule plant is a major
site involved with the decommissioning of nuclear facilities, and operates a pressurised water reactor used to
produce tritium. The site is has also been used since 1995 by French nuclear giant Areva to produce MOX fuel at the
sites MELOX factory, which recycles plutonium from nuclear weapons. Part of the process involves firing
superheated plutonium and uranium pellets in an oven. The Marcoule plant is located in the Gard department in
Languedoc-Roussillon region, near France's Mediterranean coast. Marcoule: Sellafield's French brother. Its first major reactors generated the first plutonium for France's first nuclear weapons test in 1960. Its role upon opening was weapons production as France sought a place among nuclear nations. Its

reactor producing tritium as fuel for hydrogen as well as other weapons related reactors sprang up as the arms race
gained international traction. The site also houses an experimental Phenix fast-breeder reactor which since 1995
has combine fissile uranium and plutonium into mixed oxide or MOX fuel that can be used in civilian nuclear power

stations.

1NC No Korea War


No escalation to Korean conflict
David Kang (Professor of International Relations and Business and Director
of the Korean Studies Institute University of Southern California) December
31 2010 Korea's New Cold War,
http://nationalinterest.org/commentary/koreas-new-cold-war-4653)
However, despite dueling artillery barrages and the sinking of a warship, pledges of enormous retaliation, in-your-

face joint military exercises and urgent calls for talks, the risk of all-out war on the Korean


peninsula is less than it has been at any time in the past four decades. North Korea didn't blink, because it had no intention of actually starting a major war. Rather than signifying a new round of escalating tension between North and South Korea, the events of the past year point to something else - a new cold war between the two sides. In fact, one of my pet peeves is the analogies we use
to describe the situation between South and North Korea. We often call the situation a powder keg
or a tinderbox, implying a very unstable situation in which one small spark could
lead to a huge explosion. But the evidence actually leads to the opposite conclusion :
we have gone sixty years without a major war, despite numerous sparks
such as the skirmishing and shows of force that occurred over the past month . If one
believes the situation is a tinderbox, the only explanation for six decades without a
major war is that we have been extraordinarily lucky. I prefer the opposite explanation: deterrence is
quite stable because both sides know the costs of a major war, and both sides - rhetoric and muscle-flexing aside - keep smaller incidents in their proper
perspective. How can this be, when North Korea threatens to use massive retaliation and
mentions its nuclear weapons in its rhetoric, and when the South Korean leadership
and military is determined to "respond relentlessly" to meet any North Korean
provocation? Local skirmishing has stayed local for sixty years. The key issue is whether
a local fight could escalate into all-out war, such as North Korea shelling Seoul with artillery or

missiles. Such a decision would clearly have to be taken at the top of the North Korean leadership. Especially when
tensions are high, both militaries are on high alert and local commanders particularly careful with their actions.

Without a clear directive from the top, it is not likely that a commander one hundred kilometers away from the military exercises would make a decision on his own to start shooting at Seoul. For
their part, North Korean leaders have not made such a decision in sixty years, knowing that any major
attack on Seoul would cause a massive response from the South Korean and U.S.
forces and would carry the war into Pyongyang and beyond . After the fighting,
North Korea would cease to exist. Thus, while both North and South Korean leaders talk in grim
tones about war, both sides have kept the actual fighting to localized areas , and I have
seen no indication that this time the North Korean leadership plans to expand the
fighting into a general war.

2NC No Korea War


Even if North Korea attacks, South Korea won't retaliate - no full scale war
Kim 10 (6/16/10, Jack, Reuters, Q+A - How serious is the Korean crisis and
risk of war?
http://in.reuters.com/article/idINIndia-49340820100616)
Many analysts doubt there will be war, as long as South Korea holds its fire. North Korea's
obsolete conventional armed forces and military equipment mean quick and certain
defeat if it wages full-scale war and Pyongyang is well aware of its limits. South
Korea has made it clear it will not retaliate despite investigations that found a torpedo fired by a
North Korean submarine sank the corvette Cheonan in March. It knows the investment community will take fright if
it does attack. President Lee Myung-bak's government has taken the case to the Security Council, rather than take
the law into its own hands.

No Korean War - both sides would be willing to compromise.


Asian Correspondent 10, 12/9, Tensions high in Koreas, but all-out war
unlikely, http://asiancorrespondent.com/43494/tensions-high-in-koreas-butall-out-war-unlikely/,
Two weeks after North Korea shelled a South Korean island, the rivals are still trading threats of attacks and
counterattacks. Tensions remain at their highest in more than a decade, and though neither side is backing down,

all-out war is unlikely. The fact that both North and South are having to prove themselves militarily and
conduct live-fire tests very close to each others borders just increases the likelihood that there could be an errant
shell or just a war of nerves that could lead to crossing the line once again, said Peter Beck, a research fellow at
Keio University in Tokyo. Now that the North has done it once, its not going to surprise me if they do it again.

Still, the doomsday scenario of all-out war across the world's most militarized border is unlikely, he and other experts said. South Korea's moves to bolster its military readiness since the attack reduce the risk of the outbreak of a full-fledged war, said Daniel Pinkston, a Seoul-based analyst with the International Crisis Group think tank. North Korea is rich in manpower but poor in hardware and, he said, it knows that further provocation will come at a cost. The North's game has always been to provoke just enough to be able to extract what it needs from the South and the rest of the world.

Since 2003, Pyongyang had been engaged in negotiations with five other nations to dismantle its nuclear program
in exchange for fuel oil and other concessions. After backing out of that deal last year, North Korea struggling to
feed its people, slapped with sanctions has been looking for a way back to the negotiating table. Seoul and
Washington, however, say giving into Pyongyang's ploys would only reward bad behavior and have resisted restarting the talks. Complicating matters is that North Korea is handling a sensitive transfer of power from leader Kim Jong Il to his young, untested son. While that uncertainty may make North Korea more unpredictable, it also means it craves stability more than ever.

No war - escalation HIGHLY unlikely - deterrence checks


Rory Medcalf (Program Director - International Security at the Lowy
Institute for International Policy) 4/10/2013 Korean War II? Maybe, but not
likely, http://www.lowyinterpreter.org/post/2013/04/04/A-new-Korean-warMaybe-but-not-likely.aspx

I would put the analytical focus on a somewhat different place. Deterrence is alive and well and at home, for better or worse, in the Asian century. Yes, those warning of war have a point. An iconic act of limited aggression by the North is a real possibility. Kim Jong-un obviously feels he has lots to prove, and a fresh act of violence like the 2010 sinking of the Cheonan or the bombardment of

Yeonpyeong Island might just do the trick. Yes, the South has promised to respond forcefully to any future such
provocations, and the US and possibly others would feel compelled to back it up. Yes, the young Kim has thrown

fairly much every toy out the cot this time, and needs a face-saving way to quieten down. But I still assess, on balance, that the North Korean leadership is aware of the risks of a spiral into the war, which would seal its fate. Why else, after first promising nuclear attack, has Pyongyang

lurched back to rather less apocalyptic threats, such as restarting its Yongbyon reactor or obstructing South Korea
workers at a joint project? As for ordinary North Koreans, it's not clear that they think Armageddon is just around the
corner. The fate of North Korea is less likely to be about a high-definition replay of the 1950-1953 war than about
change from within and eventual regime failure leading to some seriously dangerous moments for US-China
diplomacy (as explored in Chapter 5 of this Lowy Institute report). So for the moment I would play down the war

talk. I put a small-scale North Korean attack in the 'possible' basket, an escalation to large-scale conventional conflict in the 'highly unlikely' basket, and the chance of nuclear escalation pretty much as remote as it has been for decades (which is not to say it is impossible). If the Korea crisis of recent weeks underscores one reality it is the central and continuing role of deterrence in Asia's security. It exposes in

plain sight - as plain as last week's much-publicised B-2 'stealth' bombing run - the unpleasant fact that the
security and prosperity of the Asian century still rests on the existence of American military power and a professed
willingness to use it.

No war - at worst miscalc causes small skirmishes but no full


scale war
Maplecroft News 4/10/13 War on Korean Peninsula unlikely, but further
escalation could spur capital flight from South new risk briefing,
http://maplecroft.com/about/news/country-risk-briefings-n-korea-april10.html
An on-going series of

provocative measures by North Korea since conducting its third nuclear test on

12 February 2013 have escalated tensions in the Korean peninsula and wider North-East Asia
region. Maplecrofts Country Risk Briefing for North Korea makes detailed assessment of Pyongyang's domestic
motives and foreign policy consideration behind these actions. In addition to providing general analysis on the
dynamics of this isolated and dynastic regime, the briefing also examines regional security implications covering all
important stakeholders, such as China, US and Japan. In particular, the briefing looks closely at the potential impact

the risk of
a full-scale war on the Korean Peninsula remains low. However, there is a moderate
risk of military miscalculations leading to limited skirmishes, particularly near the maritime
on neighbouring South Korea and its security and business environment. According to the briefing,

border in the East China Sea. The heightened risk of small-scale confrontations will pressure the US, South Korea
and Japan to continue to increase their missile defence capabilities. This will be unwelcome to China, despite its
own concerns over North Korean behaviour. Beijing will continue to implement UN sanctions more rigidly against
North Korea. It will also urge all sides to the conflict to resume six-party talks.

Deterrence Solves
Carlton Meyer (Editor G2 Military) 2003 The Mythical North Korean
Threat, http://www.g2mil.com/korea.htm
Even if North Korea employs a few crude nuclear weapons, using them would be
suicidal since it would invite instant retaliation from the United States . North Korea lacks
the technical know-how to build an Intercontinental Ballistic Missile, despite the hopes and lies from the National
Missile Defense proponents in the USA. North Korea's industrial production is almost zero, over two million people

have starved in recent years, and millions of homeless nomads threaten internal revolution. The US military ignores
this reality and retains old plans for the deployment of 450,000 GIs to help defend South Korea, even though the
superior South Korean military can halt any North Korean offensive without help from a single American soldier.

American forces are not even required for a counter-offensive. A North Korean
attack would stall after a few intense days and South Korean forces would soon be
in position to overrun North Korea. American air and naval power along with
logistical and intelligence support would ensure the rapid collapse of the North
Korean army.

Deterrence solves escalation.


David Kang (assoc. prof of govt and adjunct assoc prof at the Tuck School of
Business at Dartmouth) Summer 2003 The Avoidable Crisis in North Korea
Orbis, Volume 47, Issue 3 accessed via Science Direct
North Korea has not attacked South Korea for fifty years because deterrence
works. Despite the tension that has existed on the peninsula, the armistice line has
held. Neither side has attempted to mount a major military operation , nor has either
side attempted to challenge deterrence on the peninsula.6 Deterrence will continue
to hold even if North Korea develops and deploys a nuclear weapon. Deterrence
requires both sides to know that the other side can inflict unacceptable costs on it. Since 1953 North
Korea has faced both a determined South Korean military, and more important, U.S. military
deployments that at their height comprised 100,000 troops and nuclear-tipped Lance missiles and
even today include 38,000 troops, nuclear-capable airbases, and naval facilities that guarantee U.S.

involvement in any conflict on the peninsula. The result has not been surprising: although tension is high, the balance of power has been stable. Far from being an unstable powder keg, for five decades both sides have moved cautiously and avoided major military mobilizations that could spiral out of control. The balance of power has held because any war on the peninsula would have disastrous consequences for both sides. The capitals of Seoul and Pyongyang are less than 150 miles apart - closer than New York and

Baltimore. Seoul is 30 miles from the demilitarized zone that separates the North and the South
(DMZ), and easily within reach of North Koreas artillery tubes. U.S. General Gary Luck estimates that
a war on the Korean peninsula would cost the US$1 trillion in economic damage and result in one
million casualties, including 52,000 U.S. military casualties. The North, although it has numerically
larger armed forces, faces much more highly trained and capable U.S.-South Korean armed forces.

With the North growing continually weaker relative to the South, the chances for
war become even slimmer. North Korea never had the material capabilities to be a serious

contender to the U.S.-South Korean alliance, and it fell further behind early. So the real question has
not been whether North Korea would engage in a preventive attack as South Korea caught up, but why
North Korea might fight as it fell further and further behind. As the balance of power began to turn

against the North, the North deterred the U.S. from attempting to crush it through massive conventional military deployments along the DMZ. Especially because Seoul is

both vulnerable to air attack and the center of South Korean life, the South Korean government is

quite reluctant to escalate tensions too quickly. North Korea's military - both conventional and missile systems - exist to deter the South and the U.S. from becoming too adventurous. The peninsular situation is more an uneasy standoff than one of the North's being in a position to invade the South. Both sides are very careful, and neither wishes to provoke a war, knowing the destruction it would bring.

Zero risk of Korean conflict


Rowland 10 (Ashley Rowland, Stars and Stripes, Despite threats, war not likely in Korea, experts say,
http://www.stripes.com/news/despite-threats-war-not-likely-in-korea-experts-say-1.127344?localLinksEnabled=false,
December 3, 2010)

Despite increasingly belligerent threats to respond swiftly and strongly to military attacks, analysts
say there is one thing both North Korea and South Korea want to avoid: an
escalation into war. The latest promise to retaliate with violence came Friday, when South Koreas defense
minister-to-be said during a confirmation hearing that he supports airstrikes against North Korea in the case of
future provocations from the communist country. In case the enemy attacks our territory and people again, we will
thoroughly retaliate to ensure that the enemy cannot provoke again, Kim Kwan-jin said, according to The
Associated Press. The hearing was a formality because South Koreas National Assembly does not have the power to
reject South Korean president Lee Myung-baks appointment. Kims comments came 10 days after North Korea
bombarded South Koreas Yeonpyeong island near the maritime border, killing two marines and two civilians the
first North Korean attack against civilians since the Korean War. South Korea responded by firing 80 rounds, less
than half of the 170 fired by North Korea. It was the second deadly provocation from the North this year. In March, a
North Korean torpedo sank the South Korean warship Cheonan, killing 46 sailors, although North Korea has denied
involvement in the incident. The South launched a series of military exercises, some with U.S. participation,
intended to show its military strength following the attack. John Delury, a professor at Yonsei University in Seoul,
said South Korea is using textbook posturing to deter another attack by emphasizing that it is tough and firm. But

its hard to predict how the South would respond to another attack. The country
usually errs on the side of restraint , he said. I think theyre trying to send a very clear signal to North
Korea: Dont push us again, Delury said. For all of the criticism of the initial South Korean response that it was too

weak, in the end I think people don't want another hot conflict. I think the strategy is to rattle the sabers a bit to prevent another incident. Meanwhile, Yonhap News reported Friday that

North Korea recently added multiple-launch rockets that are capable of hitting Seoul, located about 31 miles from
the border. The report was based on comments from an unnamed South Korean military source who said the North
now has 5,200 multiple-launch rockets. A spokesman for South Koreas Joint Chiefs of Staff would not comment on

the accuracy of the report because of the sensitivity of the information. Experts say it is a question of when - not if - North Korea will launch another attack. But those experts doubt the situation will escalate into full-scale war. I think that it's certainly possible, but I think that what

North Korea wants, as well as South Korea, is to contain this, said Bruce Bechtol, author of Defiant Failed State:
The North Korean Threat to International Security and an associate professor of political science at Angelo State

University in Texas. He said North Korea typically launches small, surprise attacks that can be contained - not ones that are likely to escalate. Delury said both Koreas want to avoid war, and North Korea's leaders have a particular interest in avoiding conflict - they know the first people to be hit in a full-scale fight would be the elites.

Past 50 years disproves escalation


White 10 Masters in journalism from Columbia and IR degree from the London School of Economics, editor
for Business Insider and formerly wrote for MSNBC (3/26, Gregory, Business Insider, The Long, Long History Of
False Starts Of War Between South And North Korea, http://www.businessinsider.com/were-calling-it-this-is-not-thestart-the-restart-of-the-korean-war-2010-3)

History suggests that this sinking of a South Korean naval vessel off the coast of
the country will not be the restart of the Korean conflict. Since the end of open conflict between North
and South Korea, the North has consistently acted in an aggressive manner
towards its neighbor. During the 1960s, North Korea conducted military
operations into the south, culminating in 1968 when 600 of these raids were reported. In the
1970s, North Korea tried to assassinate key members of the South Korean
government, in an attempt to push the crisis forward. In 1999, two North Korean naval
ships were blown up killing 30. In 2002, a sea battle killed and unspecified amount of North Koreans
and 5 South Koreans. In November 2009, two military vessels exchanged fire (via HuffPo). In
January 2010, North Korea launched 30 shells into the country's no sail zone. This time won't be
different. Little will happen.

Solvency

1NC No Solvency
Aff is insufficient because it doesn't seek international commitments - their evidence
CCIA 12 (international not-for-profit membership organization dedicated to
innovation and enhancing society's access to information and
communications)
(Promoting Cross-Border Data Flows: Priorities for the Business Community,
http://www.ccianet.org/wpcontent/uploads/library/PromotingCrossBorderDataFlows.pdf)
The movement of electronic information across borders is critical to businesses around the world, but the
international rules governing flows of digital goods, services, data and infrastructure are incomplete. The global
trading system does not spell out a consistent, transparent framework for the treatment of cross border flows of
digital goods, services or information, leaving businesses and individuals to deal with a patchwork of national,
bilateral and global arrangements covering significant issues such as the storage, transfer, disclosure, retention and
protection of personal, commercial and financial data. Dealing with these issues is becoming even more important
as a new generation of networked technologies enables greater crossborder collaboration over the Internet, which
has the potential to stimulate economic development and job growth. Despite the widespread benefits of cross
border data flows to innovation and economic growth, and due in large part to gaps in global rules and inadequate
enforcement of existing commitments, digital protectionism is a growing threat around the world. A number of
countries have already enacted or are pursuing restrictive policies governing the provision of digital commercial and
financial services, technology products, or the treatment of information to favor domestic interests over
international competition. Even where policies are designed to support legitimate public interests such as national
security or law enforcement, businesses can suffer when those rules are unclear, arbitrary, unevenly applied or
more traderestrictive than necessary to achieve the underlying objective. Whats more, multiple governments may
assert jurisdiction over the same information, which may leave businesses subject to inconsistent or conflicting
rules. In response, the United States should drive the development and adoption of transparent and highquality
international rules, norms and best practices on crossborder flows of digital data and technologies while also
holding countries to existing international obligations. Such efforts must recognize and accommodate legitimate
differences in regulatory approaches to issues such as privacy and security between countries as well as across
sectors. They should also be grounded in key concepts such as nondiscrimination and national treatment that have
underpinned the trading system for decades.

The U.S. Government should seek international commitments on several key objectives, including: prohibiting measures that restrict legitimate cross-border data flows or link commercial benefit to local investment; addressing emerging legal and policy issues involving the digital economy; promoting industry-driven international standards, dialogues and best practices; and expanding trade in digital goods, services and infrastructure. U.S. efforts should ensure that trade agreements cover digital technologies that may be developed in the future. At the same time, the United States should work with governments around the world to pursue other policies that support cross-border data flows, including those endorsed in the Communiqué on Principles for Internet Policymaking related to intellectual property protection and limiting intermediary liability developed by the Organization for Economic Cooperation and Development (OECD) in June 2011. U.S. negotiators should pursue these issues in a variety of forums around the world, including the World Trade Organization (WTO), Asia Pacific Economic Cooperation (APEC) forum, OECD, and regional trade negotiations such as the Trans-Pacific Partnership as appropriate in each forum. In addition, the U.S. Government should solicit ideas and begin to develop a plurilateral framework to set a new global gold standard to improve innovation. Finally, the U.S. Government should identify and seek to resolve through WTO or bilateral consultations or other processes violations of current international rules concerning digital goods, services and information.
The importance of cross-border commercial and financial flows: Access to
computers, servers, routers and mobile devices, services such as cloud computing whereby remote data centers
host information and run applications over the Internet, and information is vital to the success of billions of
individuals, businesses and entire economies. In the United States alone, the goods, services and content flowing
through the Internet have been responsible for 15 percent of GDP growth over the past five years. Open, fair and
contestable international markets for information and communication technologies (ICT) and information are
important to electronic retailers, search engines, social networks, web hosting providers, registrars and the range of
technology infrastructure and service providers who rely directly on the Internet to create economic value. But they
are also critical to the much larger universe of manufacturers, retailers, wholesalers, financial services and logistics
firms, universities, labs, hospitals and other organizations which rely on hardware, software and reliable access to
the Internet to improve their productivity, extend their reach across the globe, and manage international networks
of customers, suppliers, and researchers. For example, financial institutions rely heavily on gathering, processing,
and analyzing customer information and will often process data in regional centers, which requires reliable and
secure access both to networked technologies and cross-border data flows. According to McKinsey, more than three-quarters of the value created by the Internet accrues to traditional industries that would exist without the
Internet. The overall impact of the Internet and information technologies on productivity may surpass the effect of
any other technology enabler in history, including electricity and the combustion engine, according to the OECD.
Networked technologies and data flows are particularly important to small businesses, nonprofits and
entrepreneurs. Thanks to the Internet and advances in technology, small companies, NGOs and individuals can
customize and rapidly scale their IT systems at a lower cost and collaborate globally by accessing online services
and platforms. Improved access to networked technologies also creates new opportunities for entrepreneurs and
innovators to design applications and to extend their reach internationally to the more than two billion people who
are now connected to the Internet. In fact, advances in networked technologies have led to the emergence of
entirely new business platforms. Kiva, a microlending service established in 2005, has used the Internet to
assemble a network of nearly 600,000 individuals who have lent over $200 million to entrepreneurs in markets
where access to traditional banking systems is limited. Millions of others use online advertising and platforms such
as eBay, Facebook, Google Docs, Hotmail, Skype and Twitter to reach customers, suppliers and partners around the
world. More broadly, economies that are open to international trade in ICT and information grow faster and are more productive. Limiting network access dramatically undermines the economic benefits of technology and can
slow growth across entire economies.

Backdoor reform is key to solve, not abolishment


Burger et al 14
(Eric, Research Professor of Computer Science at Georgetown, L. Jean Camp,
Associate professor at the Indiana University School of Information and
Computing, Dan Lubar, Emerging Standards Consultant at RelayServices, Jon
M Pesha, Carnegie Mellon University, Terry Davis, MicroSystems Automation
Group, "Risking It All: Unlocking the Backdoor to the Nation's Cybersecurity,"
IEEE USA, 7/20/2014, pg. 1-5, Social Science Research Network,
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2468604)//duncan
This paper addresses government policies that can influence commercial practices to weaken security in products and services sold on the commercial market. The debate on information surveillance for national security must include consideration of the potential cybersecurity risks and economic implications of the information collection strategies employed. As IEEE-USA, we write to comment on current discussions with respect to weakening standards, or altering commercial products and services for intelligence, or law enforcement. Any policy that seeks to weaken technology sold on the commercial market has many serious downsides, even if it temporarily advances the intelligence and law enforcement missions of facilitating legal and authorized government surveillance. Specifically, we define and address the risks of installing backdoors in commercial products, introducing malware and spyware into products, and weakening standards. We illustrate that these are practices that harm America's cybersecurity posture and put the resilience of American cyberinfrastructure at risk. We write as a technical society to clarify the potential harm should these strategies be adopted. Whether or not these strategies ever have been used in practice is outside the scope of this paper.

Individual computer users, large corporations and government agencies all depend on security features built into information technology products and services they buy on the commercial market. If the security features of these widely available products and services are weak, everyone is in greater danger. There recently have been allegations that U.S. government agencies (and some private entities) have engaged in a number of activities deliberately intended to weaken mass market, widely used technology. Weakening commercial products and services does have the benefit that it becomes easier for U.S. intelligence agencies to conduct surveillance on targets that use the weakened technology, and more information is available for law enforcement purposes. On the surface, it would appear these motivations would be reasonable. However, such strategies also inevitably make it easier for foreign powers, criminals and terrorists to infiltrate these systems for their own purposes. Moreover, everyone who uses backdoor technologies may be vulnerable, and not just the handful of surveillance targets for U.S. intelligence agencies. It is the opinion of IEEE-USA's Committee on Communications Policy that no entity should act to reduce the security of a product or service sold on the commercial market without first conducting a careful and methodical risk assessment. A complete risk assessment would consider the interests of the large swath of users of the technology who are not the intended targets of government surveillance. A methodical risk assessment would give proper weight to the asymmetric nature of cyberthreats, given that technology is equally advanced and ubiquitous in the United States, and the locales of many of our adversaries. Vulnerable products should be corrected, as needed, based on this assessment. The next section briefly describes some of the government policies and technical strategies that might have the undesired side effect of reducing security. The following section discusses why the effect of these practices may be a decrease, not an increase, in security.

Government policies can affect greatly the security of commercial products, either positively or negatively. There are a number of methods by which a government might affect security negatively as a means of facilitating legal government surveillance. One inexpensive method is to exploit pre-existing weaknesses that are already present in commercial software, while keeping these weaknesses a secret. Another method is to motivate the designer of a computer or communications system to make those systems easier for government agencies to access. Motivation may come from direct mandate or financial incentives. There are many ways that a designer can facilitate government access once so motivated. For example, the system may be equipped with a backdoor. The company that creates it and, presumably, the government agency that requests it would know the backdoor, but not the product's (or service's) purchaser(s). The hope is that the government agency will use this feature when it is given authority to do so, but no one else will. However, creating a backdoor introduces the risk that other parties will find the vulnerability, especially when capable adversaries, who are actively seeking security vulnerabilities, know how to leverage such weaknesses. History illustrates that secret backdoors do not remain secret and that the more widespread a backdoor, the more dangerous its existence. The 1988 Morris worm, the first widespread Internet attack, used a number of backdoors to infect systems and spread widely. The backdoors in that case were a set of secrets then known only by a small, highly technical community. A single, putatively innocent error resulted in a large-scale attack that disabled many systems. In recent years, Barracuda had a completely undocumented backdoor that allowed high levels of access from the Internet addresses assigned to Barracuda. However, when it was publicized, as almost inevitably happens, it became extremely unsafe, and Barracuda's customers rejected it. One example of how attackers can subvert backdoors placed into systems for benign reasons occurred in the network of the largest commercial cellular operator in Greece. Switches deployed in the system came equipped with built-in wiretapping features, intended only for authorized law enforcement agencies. Some unknown attacker was able to install software, and made use of these embedded wiretapping features to surreptitiously and illegally eavesdrop on calls from many cell phones, including phones belonging to the Prime Minister of Greece, a hundred high-ranking Greek dignitaries, and an employee of the U.S. Embassy in Greece, before the security breach finally was discovered. In essence, a backdoor created to fight crime was used to commit crime.
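Analytic note (not from the card): the backdoor mechanism Burger et al. describe is easy to see in miniature. The sketch below is hypothetical illustrative code written for this file; it is not material from IEEE-USA, Barracuda, or any real product, and every name in it (check_login, VENDOR_BACKDOOR_HASH, the sample passwords) is invented. It shows why a single hardcoded bypass, shipped identically in every unit, exposes all users the moment anyone recovers the constant, which is the core of the "secret backdoors do not remain secret" argument.

import hashlib
import hmac

# Hypothetical vendor "support backdoor" baked into an authentication routine.
# Shipped identically in every unit, so extracting it from one device's
# firmware (or learning it from a leak or an insider) unlocks every device
# in the field, not just the intended surveillance targets.
VENDOR_BACKDOOR_HASH = hashlib.sha256(b"supportpass123").hexdigest()

# Per-customer credential store, normally configured by the device owner.
USER_HASHES = {
    "admin": hashlib.sha256(b"owner-chosen-password").hexdigest(),
}

def check_login(username: str, password: str) -> bool:
    """Return True if the login attempt should be accepted."""
    offered = hashlib.sha256(password.encode()).hexdigest()

    # Normal path: compare against the owner's configured credential.
    stored = USER_HASHES.get(username)
    if stored and hmac.compare_digest(offered, stored):
        return True

    # Backdoor path: a fixed credential the vendor (and, presumably, an
    # authorized agency) knows. Anyone else who recovers the constant gets
    # the same unconditional access on every deployed unit.
    return hmac.compare_digest(offered, VENDOR_BACKDOOR_HASH)

if __name__ == "__main__":
    print(check_login("admin", "owner-chosen-password"))  # True: owner login
    print(check_login("anyone", "supportpass123"))        # True: universal backdoor

The point tracks the Barracuda and Greek-switch examples above: the weakness is a property of the product itself, so it cannot distinguish authorized from unauthorized users.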

2NC No Solvency
Aff doesn't solve their author

Kehl et al 14 (Danielle Kehl is a Policy Analyst at New America's Open Technology Institute (OTI). Kevin Bankston is the Policy Director at OTI, Robyn Greene is a Policy Counsel at OTI, and Robert Morgus is a Research Associate at OTI, New America's Open Technology Institute Policy Paper, "Surveillance Costs: The NSA's Impact on the Economy, Internet Freedom & Cybersecurity," July 2014// rck)
The U.S. government has already taken some limited steps to mitigate this damage and begin the slow, difficult
process of rebuilding trust in the United States as a responsible steward of the Internet. But the reform efforts to
date have been relatively narrow, focusing primarily on the surveillance programs' impact on the rights of U.S. citizens. Based on our findings, we recommend that the U.S. government take the following steps to address the broader concern that the NSA's programs are impacting our economy, our foreign relations, and our cybersecurity:
• Strengthen privacy protections for both Americans and non-Americans, within the United States and extraterritorially.
• Provide for increased transparency around government surveillance, both from the government and companies.
• Recommit to the Internet Freedom agenda in a way that directly addresses issues raised by NSA surveillance, including moving toward international human-rights based standards on surveillance.
• Begin the process of restoring trust in cryptography standards through the National Institute of Standards and Technology.
• Ensure that the U.S. government does not undermine cybersecurity by inserting surveillance backdoors into hardware or software products.
• Help to eliminate security vulnerabilities in software, rather than stockpile them.
• Develop clear policies about whether, when, and under what legal standards it is permissible for the government to secretly install malware on a computer or in a network.
• Separate the offensive and defensive functions of the NSA in order to minimize conflicts of interest.

1NC Circumvention
Circumvention: the NSA will force companies to build backdoors
Trevor Timm 15, Trevor Timm is a Guardian US columnist and executive director of the Freedom of the Press Foundation, a non-profit that supports and defends journalism dedicated to transparency and accountability. 3-4-2015, "Building backdoors into encryption isn't only bad for China, Mr President," Guardian, http://www.theguardian.com/commentisfree/2015/mar/04/backdoors-encryption-china-apple-google-nsa)//GV
Want to know why forcing tech companies to build backdoors into encryption is a terrible idea? Look no further than
President Obama's stark criticism of China's plan to do exactly that on Tuesday. If only he would tell the FBI and NSA the same thing. In a stunningly short-sighted move, the FBI - and more recently the NSA - have been pushing for a new US law that would force tech companies like Apple and Google to hand over the encryption keys or build backdoors into their products and tools so the government would always have access to our communications. It was only a matter
of time before other governments jumped on the bandwagon, and China wasted no time in demanding the same
from tech companies a few weeks ago. As President Obama himself described to Reuters, China has proposed an
expansive new anti-terrorism bill that would essentially force all foreign companies, including US companies, to
turn over to the Chinese government mechanisms where they can snoop and keep track of all the users of those
services. Obama continued: "Those kinds of restrictive practices I think would ironically hurt the Chinese economy over the long term because I don't think there is any US or European firm, any international firm, that could credibly get away with that wholesale turning over of data, personal data, over to a government." Bravo! Of course these are the exact arguments for why it would be a disaster for US government to force tech companies to do the same. (Somehow Obama left that part out.) As Yahoo's top security executive Alex Stamos told NSA director Mike Rogers
in a public confrontation last week, building backdoors into encryption is like drilling a hole into a windshield.
Even if it's technically possible to produce the flaw - and we, for some reason, trust the US government never to abuse it - other countries will inevitably demand access for themselves. Companies will no longer be in a position to say no, and even if they did, intelligence services would find the backdoor unilaterally - or just steal the keys outright. For an example on how this works, look no further than last week's Snowden revelation that the UK's intelligence service and the NSA stole the encryption keys for millions of Sim cards used by many of the world's most popular cell phone providers. It's happened many times before too. Security expert Bruce Schneier has documented with numerous examples, "Back-door access built for the good guys is routinely used by the bad guys." Stamos repeatedly (and commendably) pushed
the NSA director for an answer on what happens when China or Russia also demand backdoors from tech
companies, but Rogers didn't have an answer prepared at all. He just kept repeating "I think we can work through this." As Stamos insinuated, maybe Rogers should ask his own staff why we actually can't work through this,
because virtually every technologist agrees backdoors just cannot be secure in practice. (If you want to further
understand the details behind the encryption vs. backdoor debate and how what the NSA director is asking for is
quite literally impossible, read this excellent piece by surveillance expert Julian Sanchez.) It's downright bizarre that
the US government has been warning of the grave cybersecurity risks the country faces while, at the very same
time, arguing that we should pass a law that would weaken cybersecurity and put every single citizen at more risk
of having their private information stolen by criminals, foreign governments, and our own. Forcing backdoors will
also be disastrous for the US economy as it would be for China's. US tech companies - which already have suffered
billions of dollars of losses overseas because of consumer distrust over their relationships with the NSA - would lose
all credibility with users around the world if the FBI and NSA succeed with their plan. The White House is supposedly
coming out with an official policy on encryption sometime this month, according to the New York Times, but the
President can save himself a lot of time and just apply his comments about China to the US government. If he
knows backdoors in encryption are bad for cybersecurity, privacy, and the economy, why is there even a debate?
