
Broadband Access – Materials for the Practice Debates – Response Time Scenario

Aff materials

 New 1AC response time scenario p. 2-6

 New solvency for Link-up and Life-Line p. 7

 Additional Aff backline cards

o Quick Response Time = Lower Death Toll p. 8-9

o Starting with those in poverty = key p. 10

o A-to Mueller Terrorism Takeouts p. 11

 A-to States Cplan

o A-to States Cplan – 2AC p. 12-14

o 1AR – Extensions off “solvency deficit – Interlinking” p. 15-16

 A-to Neg’s impact defense

o A-to “Bioterror risk exaggerated – it’s a conspiracy to justify funding” p. 17

o A-to “Bioterror fails – weather/uncontrollable factors” p. 18

o A-to “Bioterror will fail – technical obstacles” p. 19

o A-to Terrorist d/n want large death tolls p. 20

Neg defense vs. Bioterror (Note: Gus loves “impact d”)

 A-to Bioterror Advantage – Frontline p. 21-22

 Extensions

o Extensions off # 1 “Risk exaggerated/Conspiracy” p. 23

o Extensions off # 2 “no dissemination/weather” p. 24

o Extensions off # 3 “Technical Obstacles” p. 25

1
Broadband Access – Aff – Response Time Advantage
Bioterror risk is high and neg takeouts are wrong – qualified experts confirm.

Deutch ’05 (John Deutch, qualified inside this piece of evidence, is now at The Massachusetts Institute of
Technology – Meeting the Bioterrorism Challenge: Testimony before U.S. Senate Committee on
Health, Education, Labor, & Pensions Subcommittee on Bioterrorism and Public Health
Preparedness -- May 11, 2005 --
http://web.mit.edu/chemistry/deutch/policy/72MeetingBioterroism2005.pdf)

I base my views on my experience as Director of Central Intelligence and Deputy Secretary of Defense in
the first Clinton administration, as a member of President George H.W. Bush’s Foreign Intelligence Advisory Board, as chairman
of the Commission on the Organization of the Government to Combat Weapons of Mass Destruction, and from the mid-
seventies, my service on many Defense Science Board and other government advisory committees, that addressed various aspects
of the weapons of mass destruction threat. My views align closely with most who have studied the threat of
bioterrorism and our biodefense preparedness. At the World Economic Conference this January I served on a panel with
Majority Leader Frist, a member of this subcommittee, that addressed bioterrorism and I believe our views on this important
subject are quite similar. My assessment of the threat is as follows:

o Terrorist groups with international reach, such as al Qaeda, have shown interest in biological
weapons. The technology for producing biological agents and dispersal mechanisms is
well known and easily within the capacity of terrorist organizations. Thus the threat is
real.

o We are fortunate that the United States, our allies, and our deployed military forces have not
yet been subject to a large-scale biological attack. The likelihood of an attack, our
vulnerability to an attack, and the need to prevent catastrophic consequences, means that
biodefense deserves to be a national priority.
o Despite the many warnings, and some progress by the various involved government agencies, including
Health and Human Services (HHS) and its Centers for Disease Control and Prevention (CDC) and National
Institutes of Health, (NIH), and the new Department of Homeland Security (DHS), our territory, citizens,
agriculture and livestock remain unacceptably vulnerable to a catastrophic biological agent
attack. State and local government cannot possibly deal with these events without significant technical and
financial help from the federal government.

o In the near term, the agents of greatest concern are anthrax and smallpox. In the longer term, it is entirely
possible that new classes of pathogens will be developed based on modern molecular biology
and biotechnology techniques that will be more virulent and more difficult to detect and to
treat.
o To my knowledge, no comprehensive multi-year program plan exists that integrates
the efforts of the various agencies required to improve our nation’s biodefense
posture.

2
Broadband Access – Aff – Response Time Advantage cont’

Minimizing the death toll is crucial – large casualties ensures a US response that escalates to nuclear war.

Conley ’03 (Lt Col Harry W. Conley is chief of the Systems Analysis Branch, Directorate of Requirements,
Headquarters Air Combat Command (ACC), Langley AFB, Virginia. Air & Space Power Journal -
Spring 2003 -- http://www.airpower.maxwell.af.mil/airchronicles/apj/apj03/spr03/conley.html)

The number of American casualties suffered due to a WMD attack may well be the most important
variable in determining the nature of the US reprisal. A key question here is how many Americans would have to be
killed to prompt a massive response by the United States. The bombing of marines in Lebanon, the Oklahoma City
bombing, and the downing of Pan Am Flight 103 each resulted in a casualty count of roughly the same
magnitude (150–300 deaths). Although these events caused anger and a desire for retaliation among the
American public, they prompted no serious call for massive or nuclear retaliation. The body count from a
single biological attack could easily be one or two orders of magnitude higher than the casualties caused by these events. Using
the rule of proportionality as a guide, one could justifiably debate whether the United States should use massive force in
responding to an event that resulted in only a few thousand deaths. However, what
if the casualty count was around
300,000? Such an unthinkable result from a single CBW incident is not beyond the realm of possibility:
“According to the U.S. Congress Office of Technology Assessment, 100 kg of anthrax spores delivered by an efficient
aerosol generator on a large urban target would be between two and six times as lethal as a one megaton
thermo-nuclear bomb.”46 Would the deaths of 300,000 Americans be enough to trigger a nuclear response? In this case,
proportionality does not rule out the use of nuclear weapons. Besides simply the total number of casualties, the types of
casualties- predominantly military versus civilian- will also affect the nature and scope of the US reprisal action. Military combat
entails known risks, and the emotions resulting from a significant number of military casualties are not likely to be as forceful as
they would be if the attack were against civilians. World War II provides perhaps the best examples for the kind of event or
circumstance that would have to take place to trigger a nuclear response. A CBW event that produced a shock and death toll
roughly equivalent to those arising from the attack on Pearl Harbor might be sufficient to prompt a nuclear retaliation. President
Harry Truman’s decision to drop atomic bombs on Hiroshima and Nagasaki- based upon a calculation that up to one million
casualties might be incurred in an invasion of the Japanese homeland47- is an example of the kind of thought process that would
have to occur prior to a nuclear response to a CBW event. Victor Utgoff suggests that “if nuclear retaliation is seen at the time to
offer the best prospects for suppressing further CB attacks and speeding the defeat of the aggressor, and if
the original
attacks had caused severe damage that had outraged American or allied publics, nuclear retaliation would
be more than just a possibility, whatever promises had been made.”48

3
Broadband Access – Aff – Response Time Advantage cont’
A National Broadband Emergency Network is necessary.

It should target low-income households and would solve for ubiquity, response time, and global modeling

Ramsey ‘9 (Rey Ramsey – Chief Executive Officer – One Economy Corporation, former director of the Oregon Housing and Community
Services Department. Ramsey also served two terms on the Habitat for Humanity International board of directors, elected as
chair in 2003. Comments before the FEDERAL COMMUNICATIONS COMMISSION in the Matter of “A National Broadband
Plan for Our Future”: Comments of the One Economy Corporation – GN Docket No. 09‐51 – June 8th –
https://www.neca.org/portal/server.pt/gateway/PTARGS_0_0_307_206_0_43/http
%3B/prodnet.www.neca.org/wawatch/wwpdf/68oneeconomy.pdf_)

For the country to fully utilize broadband with a purpose, we must implement a national prescription with two key elements. First is to mitigate
barriers and problems, the second is to maximize opportunities so that the nation can move from the notion of digital divide to the reality of a
digital opportunity. In developing the National Broadband Plan (National Plan), the FCC should create “Broadband with a Purpose and a Social Dividend,” a national plan that
harnesses market forces to advance an important public purpose and serve other national priorities. Spectrum and Universal Service Fund subsidies are both valuable public resources, and the
Commission has a responsibility to align these resources in a way that stimulates economic development, improves health outcomes and advances educational opportunities. Developing a
National Plan geared to these public purposes should yield an important social dividend, benefiting underserved and unserved sectors. In the case of broadband, this social
dividend must focus on bridging the digital divide for low‐income individuals and those left out of the first wave of broadband Internet expansion and
adoption. The creation of the National Broadband Plan is a landmark opportunity for the United States to aim for real global leadership in broadband. While Organization for Economic Co‐
ordination and Development (OECD) and other broadband rankings necessarily loom large as the Commission undertakes this proceeding, we urge the Commission to be intentional about their
goals and objectives and to consistently measure them against those benchmarks. Through
a bold yet focused assortment of incentives and policy directives, and
stimulating, but not usurping the private sector, the
FCC can play a dramatic role in reshaping the United States as a global broadband leader and
unleashing the unfulfilled promise of broadband in impacting employment, education, health, and GDP growth.

Broadband Deficit | Intentional Focus on Low‐Income Populations

Due to barriers to broadband adoption, low‐income individuals in underserved and unserved communities were most frequently left behind in the first wave
of broadband deployment in the United States. For those with annual incomes under $20,000, just 25% access broadband in the
home. Additionally, only 42% of those with a yearly income under $30,000 have access to broadband. In stark contrast to both these numbers, 82% of families earning more than $75,000
each year have accessed broadband in the home1. For 92% of Americans, one broadband option is available in addition to satellite, yet 57% have accessed broadband in the home2,3. To
significantly increase broadband penetration in the U.S., this 35% gap between availability and adoption – the Broadband Deficit – must
be overcome. Free Market Principles | Focus on Supply AND Demand Increasing the demand for broadband is as important, if not more so, than increasing the supply. We can overcome
the 35% Broadband Deficit by concentrating on the three “A’s”: • Availability (Supply) – Sufficient, desirable and competitive broadband options • Affordability (Supply and Demand) – Where
price is compatible with a person’s ability to pay • Adoption (Demand) – Sustainable usage and uptake of broadband as spurred by the following five elements: o Affordable broadband
connections o Affordable hardware choices o Awareness of broadband options and benefits o Promotion of digital literacy o Prevalence of relevant content Leapfrogging | Next‐ Generation
Networks for Underserved and Unserved Whenever possible, we should incentivize the installation of next‐generation, high‐speed networks at affordable
prices in underserved and unserved communities. This infrastructure investment, a direct deposit on the potential social dividend, could have a profound impact on the delivery and utilization
of applications for education, employment, healthcare and economic development. A 2009 study by Leonard Waverman, of the Haskayne School of Business at the University of Calgary, found
that adding ten more broadband lines per 100 individuals across the U.S. (~30 million new broadband lines) would raise U.S.
GDP by over $110 billion (Connectivity Scorecard 2009)4. Additionally, the Information Technology and Innovation Foundation (ITIF) forecasts expenditures on IT to have three
to five times more impact on productivity than other capital expenditures. By “leapfrogging” older generation technologies and installing updated services for unserved
and underserved populations, the Administration’s National Plan can maximize the benefits of this National Strategy and set a model for the
rest of the world. The North Star | Government’s Role as a Free Market Stimulus The first role of government should be to establish national goals and interim benchmarks, setting the
North Star for U.S. progress in broadband. We recommend the creation of a Broadband Progress Board to establish a five‐year plan with transparent benchmarks and annual performance
measurements. In addition to addressing speed, affordability, availability, and adoption, these benchmarks should also include demand principles, as outlined above, and national priorities such
as: • Healthcare: Tele‐Health, Health record Management, and Aging in Place • Education: E‐Learning, Education in the Classroom, After‐School, and In the Home • Economic Development and
Employment: Job Training and Re‐Training, Career Coaching, and Job Growth • Rural Economic Development • Home‐based Access to Broadband • Digital Literacy The
government
should establish a system of incentives and policy directives to increase supply and demand, promote public‐private partnerships, drive
innovation, and ensure affordability for low‐income people. These incentives will be incorporated to spur private sector investment and personal adoption, and
thereby stimulate the market and meet the public test of creating a social dividend. This approach, rather than burdensome regulation, should neither be a means nor an unintended consequence of this National Plan.
The government should also create a National Emergency Network, a meet‐you‐where‐you‐are digital framework and delivery
system for natural and man‐made emergencies. This Network must have an intentional focus on the poor, as they are most often
deprived of information and resources that are critical in coping with an emergency, most evident in the events leading up to and the aftermath of
Hurricane Katrina.

4
Broadband Access – Aff – Response Time Advantage cont’
Only the Federal Government can solve

The States literally cannot interlink a public-private National Emergency Network

Carafano ‘6 (James Jay Carafano, Ph.D., is Senior Research Fellow for National Security and Homeland Security in the
Douglas and Sarah Allison Center for Foreign Policy Studies, a division of the Kathryn and Shelby Cullom Davis
Institute for International Studies, at The Heritage Foundation. Talking Through Disasters: The Federal Role in
Emergency Communications – Backgrounder #1951—July 17th –
http://www.heritage.org/research/nationalsecurity/bg1951.cfm)

Addressing the most serious problems requires more sophisticated solutions than simply demanding vast amounts of federal tax dollars for interoperable
communications, and deciding how the federal government can best address communications shortfalls requires understanding Washington’s proper
role. Responding to emergencies is primarily a state and local government mission.[7] The federal government should therefore
focus on the tasks that only Washington can perform. Only the federal government can integrate the efforts of local, state,
regional, and private-sector assets into a national response system that enables the nation as a whole to support local
communities in the event of a disaster. It is Washington’s job to ensure the means and capacity for all jurisdictions to “plug” into a national system.
Additionally, the federal government should concentrate on responding to catastrophic disasters that put tens of thousands of lives and billions of dollars in property at
risk—dangers that would overwhelm the capacity of any state or local government. With regard to emergency management communications, creating a national
response network and responding to catastrophic disasters should define where Washington puts its priority effort. There are three aspects to emergency
management communications:

* Responding to everyday demands (the fires, criminal acts, and accidents that happen in communities routinely);

* Establishing regional and national communications so that local, state, and federal public and private assets can be
coordinated; and

* Operating under severe conditions when infrastructure is degraded (a widespread blackout, for example) or overwhelmed by
a surge in demand (such as when the New York 911 system crashed after the World Trade Center collapsed).

Clearly, Washington should focus on the second two, which are consistent with the federal mandate of creating a national system and responding to
catastrophic disasters. What Are the Best Policies? Federal emergency management communications effort should be focused exclusively on the highest federal
priorities—building the capacity for jurisdictions across the country to share critical information, act in a collaborative manner, and operate even when normal
telecommunications systems are wiped out or overwhelmed. Even with the right priorities, however, it will be difficult for the federal government to enhance the role
it plays unless it adopts policies that address the major obstacles to building better capabilities. These policies include the following. Policy #1: Put First Things First
Wireless communications will form the backbone of any emergency communications system. In a wireless system, information is transmitted over parts of the
electromagnetic spectrum rather than through wire lines or cables. This is important because in a disaster, infrastructure such as phone lines or switching trunks might
be disrupted. Additionally, responders may need information in places where there are no fixed communications systems available. In these cases, the federal govern-
ment plays a significant role. The electromagnetic spectrum that carries wireless communications is managed by the federal government. Some is auctioned for
commercial use. Other spectrum is allocated for public purposes. Current federal policies do not facilitate creating a national emergency network or building the
capacity for responding to catastrophic disasters. Federal, state, and local public safety agencies already have a large allocation of spectrum for emergency responders.
The problem is that the allocation is scattered throughout the frequency band, which is grossly inefficient. Compared to the commercial use of the spectrum,
emergency response networks carry a much smaller number of transactions with only an intermittent surge in demand. As a result, bandwidth is significantly
underutilized. In turn, local jurisdictions manage their spectrum by breaking allocations into smaller pools of channels for each individual agency (such as giving fire
departments in neighboring communities their own dedicated channels). Further splitting the spectrum exacerbates the inefficiency of underutilization. In many cases,
federal, state, and local responders do not even have the capacity to share spectrum when they are all working in the same region and responding to the same crisis.[8]
The commercial space uses the spectrum about 20 times more efficiently than governments use it.[9] The spectrum licensed to
federal, state, and local public safety users supports fewer than 3 million users across the U.S. In contrast, commercial operators (such as
Sprint and T-Mobile) support about 80 million users in a comparable amount of spectrum. Additionally, the commercial networks provide both voice and high-speed
data. Most public safety networks carry voice service only. With a relatively small number of users, the emergency management spectrum holds little attraction for
private-sector service providers. There is virtually no incentive for private-sector investment. Economies of scale cannot be used to spur investments, to innovate, and
to reduce costs. However, that could change if federal policies created commercial opportunities. Policy #2: Open Emergency Management Frequencies as Dual-Use
Spectrum The government should provide the private sector with opportunities to offer commercial services in bandwidth that
currently is reserved for public safety agencies. In turn, the private sector could invest in building up capacity for emergency
services to operate within the spectrum and provide state-of-the-art, low-cost, secure services and guaranteed access during
disaster situations. Prohibitions against sharing the public safety spectrum should be eliminated, and federal agencies should have greater flexibility in deciding
how to share, sell, or barter spectrum to obtain the emergency communications services they need from the private sector.

5
Broadband Access – Aff – Response Time Advantage cont’
Quick response time minimizes death tolls

CCVM ‘7 (Center for Creative Voices in Media – quoting Professor Jon M. Peha of Carnegie Mellon
University, THE CASE FOR UNIVERSAL BROADBAND IN AMERICA NOW – October 1st –
www.creativevoices.us/cgi-upload)

Professor Jon M. Peha of Carnegie Mellon University, an expert on public safety communications systems, recently testified before Congress about
the need for a national broadband infrastructure: When public safety communication systems fail, people can die. We have seen
this occur after the 9/11 attacks, after Hurricane Katrina, and in countless large and small emergencies throughout the country. Many of these tragic failures
are avoidable. In addition to suffering from much-discussed interoperability problems, the communication systems used by public safety are less
dependable than they should be, less secure than they should be, and less spectrally efficient than they should be. Ironically, they are
also more expensive than they should be, which means taxpayers pay extra for systems that are unnecessarily prone to failure. Instead, Peha told Congress: First
responders should have a single nationwide broadband communications system with technology that is based on open standards.
This requires federal leadership. Today, the federal government needs to exert the same kind of leadership that enabled
America to build a superhighway system that is the envy of the world. In 1956, in the National Interstate and Defense Highways Act, the federal
government committed to building a nationwide network of high speed interstate superhighways to better provide for homeland security and national defense. In
the same way that these highways went on to spur economic development nationwide, the Hermiston and Edmonds experiences demonstrate
that deployment today of broadband technologies that provide for America’s homeland security and public needs can also have a
tremendously beneficial impact on economic growth and job creation.

6
Broadband Access – Aff – Life Line and Link-Up Solvency

Expanding the Life Line and Link-Up services solves and would not require new allocation of money

Womack ‘9 (Ryan Womack, BroadbandCensus, a Washington D.C. based publication with embedded reporters and writers
from inside the beltway, dedicated to covering the issues in and around broadband access and deployment. He is
internally quoting former FCC Chair Kevin Martin – Silicon Angle – June 10th – http://siliconangle.com/ver2/?
tag=fcc)

“With high unemployment levels, foreclosures across the nation and everyone’s household budgets being stretched thin, we call on the FCC to reduce the proposed
hike in the Universal
Service Fund’s contribution level,” NASUCA President David Springe said in a statement. The proposed increase would bring the USF
contribution to 12.9 percent of a user’s bill, compared with the previous high of 11.4 percent. Every telephone user in the country pays into the USF
whether they know it or not. Not a tax, the monies are deposited in a trust fund used to maintain and subsidize rural telephone service to places where it would
otherwise be prohibitively expensive. The USF is controlled by the Universal Service Administrative Company, and overseen by a joint board consisting of FCC and
state-level commissioners. The fund has four main goals: ensure reasonable rates for all consumers, assist low-income families in
telecommunications, enable rural health care companies telecommunication capabilities, and assist eligible schools and libraries in providing low-cost internet
access. A pilot proposal floated last year by then-FCC Chairman Kevin Martin would expand two USF programs to provide
broadband internet access to selected homes, using a $30 million trial system based on the current Life Line and Link Up programs. The
proposal was endorsed by the National Association of Regulatory Utility Commissioners at its Winter meeting earlier this year. With over $7 billion in the
USF trust fund according to some estimates, NASUCA says the joint board can easily expend these available funds rather than
increase carrier contributions.

7
Broadband Access – Aff – Response Time Adv – Quick Response Time = Lower Death Toll
Quick response time minimizes death tolls

Larson ‘4 (Richard C. Larson, Massachusetts Institute of Technology Engineering Systems Division and
Department of Civil and Environmental Engineering Room – Decision Models for Emergency
Response Planning – Sept 28th – http://create.usc.edu/research/50755.pdf)

Carefully planned detection of and response to any bio-terrorism attack is crucial in terms of saving lives. This new area of concern has only
recently been the focus of O.R. analyses. But the work has been widely reported and has had major national impact. The developed models provide a consistent
framework for considering operations following a bio-attack. The work has changed our national policies with regard to immunizations and medications following a
bio-terrorist attack. With regard to a possible anthrax attack, the co-authors Lawrence Wein and Edward Kaplan state, Two
pounds of weapons-grade
anthrax dropped on a large American city could result in more than 100,000 deaths, even if early cases were successfully diagnosed,
antibiotics were distributed broadly and drug adherence was high. The reason for the catastrophic death toll: Not enough people would receive antibiotics quickly
enough to prevent symptoms from developing, and those who developed symptoms would overwhelm the medical facilities. Any
plan to cope with this
scenario must include (1) immediate intervention, (2) rapid distribution of antibiotics to everyone in the affected region, (3) aggressive
education to ensure adherence to the full course of treatment and (4) creation of "surge capacity" to treat the sudden influx of patients. [43] Their conclusions, together
with their colleague David Craft, were based on a highly sophisticated set of mathematical models that included an airborne anthrax dispersion model, an age-
dependent dose-response model, a disease progression model, and a set of spatially distributed two-stage queueing systems consisting of antibiotic distribution and
hospital care [42]. One of their most controversial recommendations is to have non-professionals disperse antibiotics very soon after an attack and/or have those
antibiotics in the hands of citizens at all times – pre-positioned at the points of need in case of such an attack [15]. Based on these recommendations, the US Postal
Service has announced that its mail carriers will help to distribute antibiotics if a large attack occurs in the Washington D.C. area4.
The same three co-authors also used O.R. methods to study response to smallpox attack [16]. The initial federal policy had been to isolate the symptomatic
victims, trace and vaccinate their contacts, quarantine others, and hope that the spread of disease could be limited by these measures. The O.R. analysis, again based
on a highly complex but compelling set of models, indicated that the initially selected policy would result in many deaths. Instead, the analysis suggested a
different response: as soon as the attack is recognized, undertake mass vaccination across the entire population. This
recommendation caused quite a stir nationally, in the press, among physicians and with policy makers, but now has been adopted as official US
policy. O.R. is playing major roles in other aspects of medical response to major emergencies as well. For instance, Linda Green has shown how usual efficiency
measures defined in terms of bed occupancy in hospitals cause large queueing delays for beds even in the presence of routine demand; demands caused by major
emergency events would overwhelm such hospitals [11, 12]. Bravata et al. extend the policy conclusions of the anthrax and smallpox work described above to
examine regionalized or local stockpiling of drugs and response to bio-terrorism events [8, 44]. One can see here the need for additional O.R. research on optimal
locations of drug and equipment stockpiling. Traditional location theory seeks global optimal solutions that minimize some measure of total system travel time or
distance [29, Chapter 6]. Usually one or a small number of carefully positioned facilities accomplish this travel time minimization goal. Within an environment of a
major emergency, the traditional formulation of the facility location problem may be highly inappropriate. Instead, one has to consider that one or more of the
stockpiled facilities may be destroyed by the emergency event and/or travel paths leading from them may be damaged or inaccessible. In such cases, one may want to
position more than the usual number of facilities, each containing fewer medications and supplies, in order to increase the probability of survivability of the drug and
supply distribution system. This version of the problem is somewhat similar to the so-called p-dispersion location problem, where p is the number of facilities
being dispersed. These issues are addressed in new papers by Gong et al. [9] and Berman et al. [4].
Should a major bio-terrorism event occur at one
identified location or limited region, getting timely appropriate medical care to those exposed is critical for their survival. One can
imagine scenarios in which victims are first triaged, those identified as needing immediate transport are taken to nearby hospitals or other medical
facilities, initial treatments are administered, and then many patients at the nearby hospital are moved out to more distant locations. For if
such outward movements are not done, the nearby hospitals become queueing choke points in the system, with their own limited
resources totally overwhelmed. The cascading wave-like movement of patients out of nearby facilities to more distant ones reminds one of the reverse of
NYCRI’s fire relocation model. Creating such hospital “surge capacity” (in the words of Kaplan and Wein) certainly warrants further research.

8
Broadband Access – Aff – Response Time Adv – Quick Response Time = Lower Death Toll
Quick response time vital to minimizing death tolls

Lee ‘9 (Young M. – IBM T.J. Watson Research Center – The T.J. Watson Research Center is the centerpiece of IBM's globally
integrated approach to innovation – “Simulating Distribution of Emergency Relief Supplies for Disaster Response
Operations –
http://domino.watson.ibm.com/library/cyberdig.nsf/papers/F41126CC5A974D11852575E1006061F2/$File/rc24813.pdf)

Recent natural and man-made disasters such as Hurricane Katrina in 2005, Hurricane Gustav in 2008, flooding in Iowa in 2008, flooding in North Dakota in
2009, earthquake in the Sichuan Province of China in 2008, U.S. anthrax attack of 2001, and the possibility of a pandemic H1N1 influenza in 2009 made us
realize how important it is to have effective disaster preparedness and response planning. Larson et al (2006) provide a historical review of
five major disasters – the Oklahoma City bombing in 1995, the crash of United Flight 232 in 1989, the Sarin attack in the Tokyo subway in 1995, Hurricane Floyd in
1999, and Hurricane Charley in 2004 – and stress the need for operations research models to improve preparedness for and response to major emergencies. One of the
responsibilities of federal and local governments is to distribute emergency relief supplies such as water, meals, blankets, generators, tarps, and medicine to disaster
victims in the event of various natural and man-made disasters such as hurricane, earthquake, flood and terrorism. Emergency relief operations may need
to cover millions of people in a short period of time. For example, it is desired that water and meals [be distributed] within three days to prevent serious health hazard
and death. For a wide-spread smallpox attack, the vaccination of all in potential contact is recommended within 4 days of exposure,
and in the event of an anthrax outbreak, the distribution of antibiotics is recommended within two days of the event (AHRQ – Agency
for Healthcare Research and Quality, 2004). Adverse consequences of ineffective distribution planning can include death, sickness, and
social disorder. For example, the confirmed death toll for hurricane Katrina is over 1,300 victims, in addition to $200 billion of damages. A better response
plan would have reduced the death toll (Iqbal 2007). Therefore, careful planning of distribution of emergency supplies considering various risk factors
and uncertainty is important because it will influence the lives of many people. The task of providing immediate disaster relief also requires coordination between
local and the federal government (Iqbal 2007).

9
Broadband Access – Aff – Starting with those in poverty = key
( ) Expanding Broadband Access to lower-income households is key -- vital to disaster response times

Lloyd ‘8 (Mark Lloyd is a Senior Fellow at the Center for American Progress. “Ubiquity Requires
Redundancy: The Case for Federal Investment in Broadband” – Science Progress – January 18th –
http://www.scienceprogress.org/2008/01/ubiquity-requires-redundancy/)

In small rural towns, in the crowded barrios and ghettos of urban U.S. cities, in those places where financial institutions are not yet convinced they can
get an adequate return on investment, Americans do not have access to the communications networks they will need to keep them
safe in the future.[5] It is no coincidence that these same places hold our nation’s toxic waste dumps, our chemical plants, and our seaports and airports, yet
we do not have the ability to communicate most effectively where we are most vulnerable. The Department of Defense has long been
provided almost all the communication resources it needed to protect American interests overseas. What has been too often forgotten is the importance
of equipping all Americans with the ability to participate effectively in the national defense effort at home. Americans take pride in
assisting when their communities are under attack or threatened by a natural disaster. A concerted effort must be made to equip all Americans so
they are able to communicate effectively when confronted by catastrophe. President Eisenhower understood the value of a robust transportation
system at home to sustain national unity and to promote defense needs. In announcing the new interstate highway system, Eisenhower called the effort “the National
Defense Highway System.” In addition to some direct experience with a problem-laden military convoy from Washington, D.C. to San Francisco he took in 1919,
Eisenhower was also impressed with the German autobahn. “The old convoy,” Eisenhower said, “had started me thinking about good, two-lane highways, but
Germany had made me see the wisdom of broader ribbons across the land.” Despite the squabbles of some local government and business leaders who fought against
a federal highway system, Eisenhower was convinced that America could do better. As Richard Weingroff reports in his excellent history of the interstate system,
when Vice President Richard M. Nixon delivered an address before a 1954 conference of state governors at Lake George, NY, reading from Eisenhower’s detailed
notes, he declared that the U.S. “highway network is inadequate locally, and obsolete as a national system.” Nixon then recounted Eisenhower’s convoy and then
cited five “penalties” of the nation’s obsolete highway network: the annual death and injury toll, the waste of billions of dollars in detours and traffic jams, the
clogging of the nation’s courts with highway-related suits, the inefficiency in the transportation of goods, and “the appalling inadequacies to meet the demands of
catastrophe or defense, should an atomic war come.”[6] If America is to be ready “to meet the demands of catastrophe or defense,” all Americans need
access to advanced telecommunications services in the 21st century, just as they needed access to an advanced highway system in the 20th century. But
as the 9/11 Commission noted in its report, the United States is not ready for a national emergency. And as every comprehensive analysis of the
tragedy of Hurricane Katrina revealed, we are not prepared to handle a major natural disaster. Both of these experiences highlight the
importance and the multiple failures of U.S. communications services as warning systems or as systems to allow for the
coordination of first responders.[7]

10
Broadband Access – Aff – Response Time Adv – A-to Mueller Terrorism Takeouts

And, Mueller is alone – prefer consensus of experts

Allison ‘7 (Graham Allison, Director – Belfer Center for Science and International Affairs, Professor of Government, and Faculty Chair of
the Dubai Initiative – Harvard University’s Kennedy School of Government, “Symposium: Apocalypse When?”, The National
Interest, November / December 2007, Lexis)

MUELLER IS entitled to his opinion that the threat of nuclear proliferation and nuclear terrorism is "exaggerated" and "overwrought." But
analysts of various political persuasions, in and out of government, are virtually unanimous in their judgment to the contrary. As the
national-security community learned during the Cold War, risk = likelihood x consequences. Thus, even when the likelihood of nuclear Armageddon was small, the
consequences were so catastrophic that prudent policymakers felt a categorical imperative to do everything that feasibly could be done to prevent that war. Today, a
single nuclear bomb exploding in just one city would change our world. Given such consequences, differences between a 1 percent and a 20 percent likelihood of such
an attack are relatively insignificant when considering how we should respond to the threat. Richard Garwin, a designer of the hydrogen bomb who Enrico Fermi once
called "the only true genius I had ever met", told Congress in March that he estimated a "20 percent per year probability [of a nuclear explosion-not just a
contaminated, dirty bomb-a nuclear explosion] with American cities and European cities included." My Harvard colleague Matthew Bunn has created a model in the
Annals of the American Academy of Political and Social Science that estimates the probability of a nuclear terrorist attack over a ten-year period to be 29 percent-
identical to the average estimate from a poll of security experts commissioned by Senator Richard Lugar in 2005. My book, Nuclear Terrorism, states my own best
judgment that, on the current trend line, the chances of a nuclear terrorist attack in the next decade are greater than 50 percent. Former Secretary of Defense William
Perry has expressed his own view that my work may even underestimate the risk. Warren Buffett, the world's most successful investor and legendary
odds-maker in pricing insurance policies for unlikely but catastrophic events, concluded that nuclear terrorism is "inevitable." He stated,
"I don't see any way that it won't happen." To assess the threat one must answer five core questions: who, what, where, when and how? Who could be
planning a nuclear terrorist attack? Al-Qaeda remains the leading candidate. According to the most recent National Intelligence Estimate (NIE), Al-Qaeda has
been substantially reconstituted-but with its leadership having moved from a medieval Afghanistan to Pakistan-a nation that actually has nuclear
weapons. As former CIA Director George J. Tenet's memoir reports, Al-Qaeda's leadership has remained "singularly focused on acquiring
WMDs" and that "the main threat is the nuclear one." Tenet concluded, "I am convinced that this is where [Osama bin Laden] and his operatives want
to go." What nuclear weapons could terrorists use? A ready-made weapon from the arsenal of one of the nuclear-weapons states or an
elementary nuclear bomb constructed from highly enriched uranium made by a state remain most likely. As John Foster, a
leading U.S. bomb-maker and former director of the Lawrence Livermore National Laboratory, wrote a quarter of a century ago,
"If the essential nuclear materials are at hand, it is possible to make an atomic bomb using information that is available in the open
literature." Where could terrorists acquire a nuclear bomb? If a nuclear attack occurs, Russia will be the most likely source of the weapon or material.
A close second, however, is North Korea, which now has ten bombs worth of plutonium, or Pakistan with sixty nuclear bombs. Finally, research
reactors in forty developing and transitional countries still hold the essential ingredient for nuclear weapons. When could terrorists launch the first nuclear attack?
If terrorists bought or stole a nuclear weapon in good working condition, they could explode it today. If terrorists acquired one hundred pounds of highly enriched
uranium, they could make a working elementary nuclear bomb in less than a year. How could terrorists deliver a nuclear weapon to its target? In the same way that
illegal items come to our cities every day. As one of my former colleagues has quipped, if you have any doubt about the ability of terrorists to deliver a weapon to an
American target, remember: They could hide it in a bale of marijuana.

11
Broadband Access – Aff – Response Time Adv – A-to States Cplan
(insert Aff theory objections as necessary)

( ) perm – do both

( ) Solvency deficit – public-private partnerships

a) These partnerships are vital for fast response – that’s Carafano. Here’s more ev:

Carafano ‘6 (James Jay Carafano, Ph.D., is Senior Research Fellow for National Security and Homeland Security in the
Douglas and Sarah Allison Center for Foreign Policy Studies, a division of the Kathryn and Shelby Cullom Davis
Institute for International Studies, at The Heritage Foundation. Talking Through Disasters: The Federal Role in
Emergency Communications – Backgrounder #1951—July 17th –
http://www.heritage.org/research/nationalsecurity/bg1951.cfm)

From September 11, 2001, to Hurricane Katrina in 2005, Congress and the Bush Administration have wrestled with the challenge of improving
emergency management communications. An unprecedented federal spending spree has yielded scant progress, however, and
Washington’s programs should be scrapped. It is unlikely that they will ever be able to achieve, either efficiently or effectively, the goal of creating the kind of
emergency communication systems the nation needs to respond to national disasters.
The right approach would include adhering to a set of policies
that promote effective public–private sharing of the emergency management electromagnetic spectrum, create a national capability to
deploy a wide-area emergency management communications network for catastrophic disasters, and establish coherent national leadership for
emergency response communications.

b) Only Federal Signal creates the investor perception for these partnerships

Extend our 1AC Atkinson ev. Here’s more ev

Rintels ‘8 (Jonathan Rintels is the Executive Director of the Center for Creative Voices in Media, a nonprofit
organization – An Action Plan for America Using Technology and innovation to address our
nation’s critical challenges
https://www.policyarchive.org/bitstream/handle/10207/11811/Benton_Foundation_Action_Plan.pdf?sequence=1)

By promoting both the supply of and the demand for broadband, a well-conceived NBS will establish a “virtuous circle” in which an increased supply
of robust and affordable broadband stimulates creation of applications that produce wide-ranging, valuable social benefits that then cause citizens to demand even
more robust and affordable broadband; which in turn stimulates greater investment in more robust broadband; which then stimulates the creation of even more
beneficial applications that cause citizens to demand even more robust and affordable broadband. Strong federal leadership, expressed in a
comprehensive NBS, is crucial to ending the stand-off between those ready to invest in the deployment of robust broadband
when great technologies and applications emerge to take advantage of it, and those ready to invest in transforming technologies
and applications and who are waiting for robust broadband to be built out.

(Note: NBS stands for “National Broadband Strategy”)

12
Broadband Access – Aff – Response Time Adv – A-to States Cplan cont’
( ) Solvency Deficit – Reporting requirements

Having to report to 50 different regulators would slow the network -- jacking the industry.

Goldberg ‘9 (Neal M. Goldberg, Comments Before the FEDERAL COMMUNICATIONS COMMISSION in the Matter of “A
National Broadband Plan for Our Future” -- GN Docket No. 09-51 -- National Cable & Telecommunications
Association -- June 8th)

Shaped by market forces rather than state and local regulatory requirements, the deployment of broadband has spread at a remarkable pace, demonstrating that the
absence of regulation can and will serve consumer welfare. Nevertheless, state and local governments continue to propose a wide range of
regulations – from billing rules, collections requirements, speed warranties, customer service requirements, local privacy rules, filing and notice
provisions, network architecture requirements, pricing and promotional requirements as well as additional taxes and fees – that would form a patchwork of
unmanageable rules. Congress, the Commission and the courts have consistently confirmed that the Communications Act prohibits the imposition of local
franchising and fee requirements, or any other state or local regulation of the provision of information services without explicit Commission authority,108 limiting
localities’ involvement to the management of facilities in the public rights-of-way. Nevertheless,
each time a new broadband service is introduced –
Wi-Fi being a recent example – there are numerous state and local governments that seek to require new and separate
authorizations from providers – and the payment of new fees – as a condition of offering these services.109 As the Commission has
recognized,110 the interstate character of broadband services makes complying with numerous state or local regulatory regimes
impracticable, if not impossible. Broadband networks often are not designed to follow state boundaries, and engineering them to
meet different requirements on a state-by-state basis is not always possible. Complying with inconsistent regulatory schemes, such
as varying requirements for service quality and reliability, also could require the installation of additional equipment locally, and perhaps additional personnel. All of
these changes would undermine the efficiency of the network, making the service less valuable to the public. The imposition of inconsistent state
regulatory regimes also would interfere with or even prevent providers from efficiently providing various capabilities without regard to location, because tailoring
them to meet the particular requirements of each state would be impossible, given their accessibility via the Internet. And any such requirements inevitably
will raise costs. Frequently, such regulations are imposed on only a select group of broadband providers, using a particular platform or technology, leading to
higher costs for some competitors and depriving consumers of a more meaningful choice among them. Even if the state or local governments are not
successful in imposing those requirements, the time and expense involved in addressing and resolving these requests
greatly slows broadband network deployment and the roll-out of new broadband services to consumers. For all of these reasons, the Commission
should affirm its exclusive jurisdiction over all broadband services, and explicitly preempt state and local regulation, except for
generally applicable consumer protection laws to which any business operating in a state is subject.

13
Broadband Access – Aff – Response Time Adv – A-to States Cplan cont’
( ) Solvency Deficit: Interlinking

Only the Federal Government can do it – that’s Carafano

And, even if their cplan interlinks today – it’s impossible for it to stay uniform over time

Peha ‘7 (Jon – Professor of Electrical Engineering and Public Policy, and Associate Director of the Center for Wireless
& Broadband Networking @ Carnegie Mellon University – Hearing on Oversight of NTIA and Innovations in
Interoperability – March –
http://www.ece.cmu.edu/~peha/Peha_testimony_public_safety_comm_March2007.pdf)

I applaud you for holding a hearing on this important topic. The communications infrastructure used today by American first responders is
disgracefully inadequate, especially in view of threats to homeland security since 9/11. Congress could change that. When public safety
communications systems fail, people can die. We have seen this occur after the 9/11 attacks, after Hurricane Katrina, and in countless large and small emergencies
throughout the country. Many of these tragic failures are avoidable. In addition to suffering from much-discussed interoperability problems, the
communications systems used by public safety are less dependable than they should be, less secure than they should be, and less
spectrally efficient than they should be. Ironically, they are also more expensive than they should be, which means tax-payers pay extra for systems that are
unnecessarily prone to failure [1]. The fact that public safety’s spectrum use is far less efficient than commercial cellular has prompted some to argue that public
safety should get no more spectrum. However, until the federal government addresses the cause of these inefficiencies, it must feed public safety’s inevitable growing
hunger for spectrum. Addressing the cause may involve allocating more spectrum, establishing policies so the new spectrum is used efficiently, and later reclaiming
some existing allocations. The basic problem is that decisions about public safety communications are left to tens of thousands of independent
local public safety agencies. Despite the many bright and dedicated professionals working for these agencies, it simply is not
possible to build a dependable, cost-effective system this way. First responders should have a single nationwide broadband
communications system [2] with technology that is based on open standards. This requires federal leadership.

14
Broadband Access – Response Time Adv – A-to States Cplan – Extensions off “solvency
deficit – Interlinking”
( ) States will close their emergency networks over time – this will create interoperability issues

Lloyd ‘8 (Mark Lloyd is a Senior Fellow at the Center for American Progress. “Ubiquity Requires Redundancy: The
Case for Federal Investment in Broadband” – Science Progress – January 18th –
http://www.scienceprogress.org/2008/01/ubiquity-requires-redundancy/)

In addition to redundancy, it is vital that the different systems and the equipment operating over these communications systems be
interoperable. One unfortunate result of relying on private competition is the tendency of competitors to develop systems which do not permit interoperability. A
key failing of emergency response after 9/11 and Katrina was the lack of interoperable communications equipment.[15] Many of
the problems of interoperability are the result of turf wars and not equipment limitations. Federal policies to override local turf
wars are essential. The Department of Homeland Security has made it a priority to solve the range of problems related to interoperability.[16] But again,
interoperability must not be limited to operation over one infrastructure, but must cross all relevant communications platforms. Phones and computers must operate
over wireline and wireless infrastructure, including competing wireline and wireless networks. Interoperability is a vital component of emergency
service and a modern communications network. Closed “private” broadband networks stifle not only innovation and service competition,
they also limit the ability of all Americans to participate effectively in response to natural disaster and terrorist attack. If the
United States is to compete effectively in a global economy and defend itself against global terrorist threats, then it must take advantage of
the unique opportunities only possible with an open network.

( ) Over time, States will attempt to improve their networks – hurting standardization

Rintels ‘8 (Jonathan Rintels is the Executive Director of the Center for Creative Voices in Media, a nonprofit organization
– An Action Plan for America Using Technology and innovation to address our nation’s critical challenges
https://www.policyarchive.org/bitstream/handle/10207/11811/Benton_Foundation_Action_Plan.pdf?sequence=1)

Strong Federal Leadership Is Necessary to Implement a National Broadband Strategy That Will Enhance Public Safety and
Protect Homeland Security Professor Jon M. Peha of Carnegie Mellon University, an expert on public safety communications systems, recently testified
before Congress about the compelling public safety and homeland security rationale for a national broadband infrastructure: When public safety
communication systems fail, people can die. We had seen this occur after the 9/11 attacks, after Hurricane Katrina, and in countless large and small
emergencies throughout the country. Many of these tragic failures are avoidable. In addition to suffering from much-discussed
interoperability problems, the communication systems used by public safety are less dependable than they should be, less secure than they
should be, and less spectrally efficient than they should be. Ironically, they are also more expensive than they should be, which means taxpayers pay extra
for systems that are unnecessarily prone to failure.125 Instead, Peha told Congress: “First responders should have a single nationwide broadband
communications system with technology that is based on open standards. This requires federal leadership.”126 The kind of leadership needed
today was on display in 1956 when the federal government, in the National Interstate and Defense Highways Act, signed enthusiastically into law by President
Eisenhower, committed to building a nationwide network of world-class, high-speed interstate superhighways to better provide for public safety and homeland
security.127 Today, in the Digital Age, for those same reasons, the federal government must exert that same kind of leadership to ensure the standards, shared
services, and connections to a new world-class infrastructure of 21st-century telecommunications networks. “All Americans need access to advanced
telecommunications services in the 21st century,” Lloyd writes, “just as they needed access to an advanced highway system in the 20th century.” This is particularly
true for all emergency organizations meeting critical public needs. Just as we connected schools to broadband at the end of the last century, we need to hook up the
more than 100,000 emergency agencies in the nation. “Katrina and 9/11 remind us that access to advanced telecommunications service is a public need. We need
national leadership to remind us of this, and insist on policies that address public needs.”128 The right applications for the right networks Achieving integrated and
interoperable emergency response systems requires that 1) emergency organizations have access to broadband, 2) the networks serving this balkanized field
interconnect, and 3) most importantly, the right data and applications can be transmitted over Internet networks.129 However, there’s no one government agency
charged with taking a comprehensive view of public safety and emergency response. And, too often, the agencies charged with different aspects of the
emergency response focus too much on building networks, not the needed standardization of data and applications that must run over
them.

15
Broadband Access – Response Time Adv – A-to States Cplan – Extensions off “solvency
deficit – Interlinking” cont’
( ) Over time, de-centralized action would grow slower and balkanized – Federal Action is key

Lloyd ‘8 (Mark Lloyd is a Senior Fellow at the Center for American Progress. “Ubiquity Requires Redundancy: The
Case for Federal Investment in Broadband” – Science Progress – January 18th –
http://www.scienceprogress.org/2008/01/ubiquity-requires-redundancy/)

The result of this regulatory protection of different bits of the telecommunications industry leaves the United States with
balkanized communications capabilities. If the prevention or response to the terrorist attacks on 9/11—when New York City police, fire,
and rescue workers could not communicate with each other amid the chaos and carnage of that awful day—or the prevention or response to the failed levees
overwhelmed by hurricane Katrina demonstrated anything, they demonstrated the need for better command and control.[9]
Indeed, in the debate over communications policy, the term “command and control” is little more than a right-wing slogan. Outside of military operations this phrase
has never accurately described either the policymaking process or the execution of policy in the United States. Even the federal highway system so important to
Presidents Roosevelt, Truman, and Eisenhower for military purposes, was the product of a contentious federal-state partnership. Still, there is no question
about the importance of federal vision and leadership and funding.[10] The importance of strong federal engagement in the development of the
national highway system is beyond dispute. The same can be said of the importance of federal leadership in the U.S. space program, which led to the U.S. satellite
industry, as well as federal leadership in the Defense Advanced Research Projects Agency, which spurred the research behind the Internet. Perhaps the most
direct corollary to the national highway system in the U.S. telecommunications arena is the National Communications System.
The NCS began after the Cuban missile crisis. Communications problems between and among the United States, the Soviet Union, and other nations helped to create
the crisis. President Kennedy ordered an investigation of national security communications, and the
National Security Council recommended forming
a single unified communications system to connect and extend the communications network serving federal agencies, with a
focus on interconnectivity and survivability. The NCS oversees wireline (Government Emergency Telecommunications Service) and cellular service
(Wireless Priority Service).[11] The NCS is now part of the Department of Homeland Security’s Preparedness Directorate, and despite the increased attention to the
communication needs of first responders on September 11, 2001, NCS failures and inadequacies were made obvious after Katrina.[12] In New Orleans, police officers
were forced to use a single frequency on their patrol radios, which “posed some problems with people talking over each other,” explained Deputy Police Chief Warren
Riley at the time. “We probably have 20 agencies on one channel right now.” And with little power to recharge batteries, some of those radios were soon useless. In
southern Mississippi, the National Guard couldn’t even count on radios. “We’ve got runners running from commander to
commander,” said Maj. Gen. Harold Cross of the Mississippi National Guard. “In other words, we’re going to the sound of gunfire, as we used to say during the
Revolutionary War.”[13] As Sen. John Kerry (D-MA) said: “This is a further demonstration of our inadequate response to the 9/11
Commission’s recommendations and other warnings about the failures in our first responders’ communications systems.”[14]

16
A-to “Bioterror risk exaggerated – it’s a conspiracy to justify funding”

( ) Prefer our ev – our Deutch card is more qualified and he is not running a program that would financially profit from
exaggerating the bioterror risk.

( ) Leitenberg is wrong – the bulk of top experts think he’s wrong.

National Journal ’05 (April 23, 2005 – lexis)

Other experts are less generous. Milton Leitenberg, a biological-weapons expert and a scientist at the University of Maryland's
School of Public Policy, speaks of the "huckstering of an imminent biological-warfare threat." Since 9/11, "high-
end scenarios, if not science-fiction ones, were the rule in studies prepared for U.S. government agencies, even by competent
contractors," he wrote in The Problem of Biological Weapons, published last year. Portraying the Threat The perspective of
Leitenberg and other critics is controversial, because it challenges views strongly held and repeated
throughout the federal government and the biological-defense community. "This is a very dangerous
question," says Dr. Tara O'Toole, the chief executive officer of the Center for Biosecurity at the University
of Pittsburgh Medical Center, the co-developer of Atlantic Storm and Dark Winter, and a leading, if unnamed, target of the
critics. "I think biological-weapons attacks -- specifically, covert bioterror attacks -- are the single most terrifying
security threat facing the world, not just the United States, far more frightening and potentially damaging than a
nuke going off in American cities," she said. Warned retired Air Force Col. Randall Larsen, a prominent biodefense
advocate who co-developed the Atlantic Storm and Dark Winter exercises, "We could lose a million people in a week
in this country from a sophisticated biological attack."

( ) their “conspiracy theory” claim is just silly – Madrid, London, 9-11, and Japan subway all prove that terror episodes
are likely.

17
A-to “Bioterror fails – weather/uncontrollable factors”
( ) Some pathogens do not rely on weather and terrorists would release many strains to maximize death toll

Wheelis ’02 (Mark Wheelis works in the Section of Microbiology, University of California, Davis, CA 95616.
His research interests are in the history of biological warfare, especially in the First World War, and
the scientific aspects of biological and chemical arms control. BioScience – July 1st – lexis)

Dissemination of many introduced pathogens likewise requires relatively little expertise. Animal virus preparations could be
diluted and disseminated with a simple atomizer in close proximity to target animals, or the preparation smeared directly on the
nostrils or mouths of a small number of animals. This could be done from rural roads with essentially no chance of detection.
Dissemination of animal diseases could also be done surreptitiously at an animal auction or near barns where animals are densely
penned (as in chicken houses or piggeries). For plant diseases, simply exposing a mass of sporulating fungi to the air immediately
upwind of a target field could be effective, if environmental conditions were favorable for infection. The biggest challenge
of introducing a plant pathogen is probably timing the release with the appropriate weather conditions
(Campbell and Madden 1990). If pathogens are released immediately before the start of a dry period, few, if any, infections are
likely to result. However, if released at the start of a rainy period, these pathogens could cause a major
epidemic. The technical ease of introducing many agricultural pathogens makes it more likely that terrorists or
criminals would release pathogens in several locations in an attempt to initiate multiple, simultaneous
outbreaks. This would ensure that trade sanctions would be imposed, because it would undermine any argument that the
outbreaks are localized and do not jeopardize importing countries. It would also be more likely to overwhelm the response
capacity and lead to the uncontrollable spread of disease. This is the principal way in which a bioterrorist attack would differ
from a natural disease introduction, and it raises the question whether a system designed to respond to natural introductions can
deal effectively with sudden, multifocal outbreaks.

( ) Their “weather” argument is wrong – terrorists will still try no matter what.

The Record ’01 (The Record (Bergen County, NJ) October 7, 2001 – lexis)

To produce mass casualties, airborne delivery would be the preferred method for unleashing all types of chemical or biological
weapons. The poisoning of water supplies is unlikely, experts say, because the amount of toxic agent required would be
prohibitively large. Airborne delivery, however, is fraught with problems. There's an optimal size for particles to be
inhaled into the lungs. Many terrorism experts are reluctant to discuss this topic, although details are readily available from many
sources. In general, producing aerosols of the right size, either of liquids or powders, is extremely difficult or impossible without
special equipment and expertise. Crop dusting sprayers, for instance, are designed to produce droplets many times larger than
ideal. Weather conditions can also make a huge difference. Efficiency, though, may be low on a terrorist's
list of concerns. That fact alone raises the chance that some group may eventually attempt an act of terror
using biological or chemical means.

18
A-to “Bioterror will fail – technical obstacles”
( ) Terrorists can circumvent tech barriers – they can buy off scientists.

National Journal ’05 (April 23, 2005 – lexis)

One way that terrorists might get around the challenges, Danzig wrote, is to tap the expertise of scientists from
state-sponsored weapons programs, through bribery perhaps, or by otherwise recruiting from the thousands of scientists
and technicians worldwide who have relevant training but no specific biological-weapons experience. He estimated that
up to 10,000 people on the planet have experience in state weapons programs, and that perhaps more than a million
scientists have the relevant training.

( ) Terrorists can easily develop bioweapons – critics are just wrong.

National Journal ’05 (April 23, 2005 – lexis)

The crux of the debate is whether terrorists could develop, build, and effectively use a catastrophic
bioweapon. O'Toole and Larsen are in the "yes" camp. Both cited a once-secret Defense Department experiment
called Project Bacchus, which was conducted in the late 1990s to assess whether terrorists could create a biological terror weapon
using commercially available equipment. The project "demonstrated quite persuasively that about four people,
only one of whom had any biological training at all -- and that was not with the U.S. weapons program, that was a
degree in biology -- could, using materials bought through the Internet, set up shop, undiscovered, and create
a Bacillus anthracis look-alike," O'Toole said. "It wasn't until I saw the Bacchus program, about how easy it
was to do, that I said, 'Holy shit, this is just a matter of time,' " said Larsen. Technologies for effective dissemination
of substances by spraying are becoming increasingly available for benign commercial purposes, O'Toole said. "Technology has
moved ahead very significantly, particularly in the last few years, mostly for agricultural purposes... We've figured out ways to
cover particles that make it much more likely they could go deeper into the lung to be absorbed into the blood -- very helpful if
you're building an anthrax weapon," she said. Richard Danzig wrote in an unpublished paper last year, "For an adequately
weaponized agent (especially a powder), many sources in the open literature suggest that simple backpack sprayers will perform
this function." Danzig is a former Navy secretary who is now a Pentagon bioterrorism consultant and the Sam Nunn Prize fellow
in international security at the Center for Strategic and International Studies in Washington. "From the standpoint of
proliferation, the fundamentals of this knowledge are already widespread and legitimately proliferating,"
Danzig wrote. "They are the basis of pharmaceutical, biotech, medical, and agricultural progress." "It seems likely that,
over a period between a few months and a few years, broadly skilled individuals equipped with modest
laboratory equipment can develop biological weapons. They can do this in state programs, as members of
terrorist groups, or simply as individuals," he added.
(Note: Dr. Tara O'Toole is the chief executive officer of the Center for Biosecurity at the University of Pittsburgh
Medical Center, and retired Air Force Col. Randall Larsen is a prominent biodefense advocate.)

19
A-to Terrorist d/n want large death tolls
( ) The death toll could accidentally be larger than anticipated

Our whole thesis is that biological agents mutate and spiral out of control.

( ) History disproves their claim

Anderson 98 (Ph.D -- Microbes and Mass Casualties: Defending America Against Bioterrorism – Heritage
Foundation Reports -- Backgrounder #1182 --- May 26th --
http://www.heritage.org/research/Homelanddefense/BG1182.cfm)

Throughout human history, the threat of mass contagion has evoked primal fear. Natural pestilence periodically has
ravaged cities, states, and even entire civilizations. Rapid advances in genetic engineering in the past few decades have
increased the likelihood that disease-causing microbes could overwhelm the U.S. public health system and wreak horrific
destruction. Today, the United States faces the nightmarish possibility that terrorist groups would seek to cause mass
casualties by unleashing biological agents on U.S. soil. Biological agents, on an equal-weight basis, are the most lethal
substances known to mankind. According to a 1997 U.S. Department of Defense report on proliferation, the "most lethal
biological toxins are hundreds to thousands of times more lethal per unit than the most lethal chemical warfare agents." 2
They can be targeted against people, animals, or crops using a variety of means of delivery, from aerial bombs and spray
tanks to ballistic missile warheads. Until recently, the intelligence community generally has downplayed the capability of
terrorists to effect mass casualties using biological agents, noting that the impact of an attack is difficult to predict,
considering the sensitivity of microorganisms to meteorological conditions. Most analysts agreed with the view that
terrorists only "want a lot of people watching, not a lot of people dead." 4 But the 1993 World Trade Center
bombing, the 1995 sarin attack in Tokyo`s subway system, and the 1996 Oklahoma City bombing shattered that
conventional wisdom. These attacks indicate an important threshold has been breached; clearly, some terrorist
groups want a lot of people watching and a lot of civilians dead. Belatedly, senior defense and law enforcement
officials are recognizing the growing danger of bioterrorism. Gordon Oehler, then director of the Nonproliferation Center
of the Central Intelligence Agency (CIA), testified before Congress in March 1996 that extremist groups worldwide are
increasingly learning how to manufacture chemical and biological agents, and the potential for additional chemical and
biological attacks by such groups continues to grow. In January 1998, Defense Intelligence Agency chief Lieutenant
General Patrick Hughes testified that chemical and biological weapons have a "high probability of being used over the
next two decades." 6

( ) Recent developments are on our side:

9-11, Madrid, London, etc. – death tolls from terror attacks are obviously rising.

20
A-to Bioterror Advantage – Frontline

( ) Bioterror risk exaggerated – it’s a conspiracy to justify funding for counter-measures.

Leitenberg ‘06 (MILTON LEITENBERG is a senior research scholar at the Univ. of Maryland and is the author of "Assessing
the Biological Weapons and Bioterrorism Threat." LA Times – Feb 17th – lexis)

A pandemic flu outbreak of the kind the world witnessed in 1918-19 could kill hundreds of millions of people. The only lethal
biological attack in the United States -- the anthrax mailings -- killed five. But the annual budget for
combating bioterror is more than $7 billion, while Congress just passed a $3.8-billion emergency package to prepare for
a flu outbreak. The exaggeration of the bioterror threat began more than a decade ago after the Japanese Aum
Shinrikyo group released sarin gas in the Tokyo subways in 1995. The scaremongering has grown more acute since
9/11 and the mailing of anthrax-laced letters to Congress and media outlets in the fall of 2001. Now an edifice of
institutes, programs and publicists with a vested interest in hyping the bioterror threat has grown, funded
by the government and by foundations. Last year, for example, Senate Majority Leader Bill Frist described
bioterrorism as "the greatest existential threat we have in the world today." But how could he justify such
a claim? Is bioterrorism a greater existential threat than global climate change, global poverty levels, wars and
conflicts, nuclear proliferation, ocean-quality deterioration, deforestation, desertification, depletion of freshwater aquifers or the
balancing of population growth and food production? Is it likely to kill more people than the more mundane scourges of AIDS,
tuberculosis, malaria, measles and cholera, which kill more than 11 million people each year? So what substantiates the alarm
and the massive federal spending on bioterrorism? There are two main sources of bioterrorism threats: first, from countries
developing bioweapons, and second, from terrorist groups that might buy, steal or manufacture them. The first threat is
declining. U.S. intelligence estimates say the number of countries that conduct offensive bioweapons
programs has fallen in the last 15 years from 13 to nine, as South Africa, Libya, Iraq and Cuba were dropped. There is no
publicly available evidence that even the most hostile of the nine remaining countries -- Syria and Iran -- are ramping up their
programs. And, despite the fear that a hostile nation could help terrorists get biological weapons, no country
has ever done so -- even nations known to have trained terrorists.

( ) Bioterror will not work – weather blocks success

21
A-to Bioterror Advantage – Frontline cont’

( ) Technical barriers make bio-terror impossible – our ev cites 5 specific obstacles.

National Journal ’05 (April 23rd – lexis)

Technical Challenges

On the other hand, critics argue that some experts have oversimplified the significant technical challenges to
building catastrophic biological weapons and have overestimated the abilities of terrorist groups to
overcome them. "How do you kill a lot of people? There, you've got to get involved with airborne, deadly pathogens such as Bacillus anthracis spores, and
that's fairly technically demanding to do," Zilinskas said. Potential difficulties, experts say, include obtaining proper
equipment and an appropriate strain of pathogen; storing and handling the pathogen properly; growing it
to produce a greater quantity; processing it to develop the desirable characteristics; testing it; and dispersing it. A terrorist
group would need to have suitably educated and knowledgeable people, and sufficient time and freedom from government
scrutiny, to do the work, they say. Potentially the toughest challenge, experts say, is "weaponization" --
processing an agent to the point that it can resist environmental stresses, survive dissemination, and increase its
ability to infect (pathogenicity) and to harm (toxicity). This is particularly true if the terrorists want to spray the agent, which is a
more effective approach for a mass attack than spreading an agent through human-to-human contact. "While collection and
purification knowledge is widespread among ordinary scientists, weaponization is obviously a military subject, and much of the
knowledge that surrounds it is classified," wrote Danzig, who believes that terrorists nevertheless might be able to develop
catastrophic biological weapons. The key difficulty for producing an aerosolized weapon, Danzig said, "would be to
produce a pathogen formulation in sizes that would be within the human respiratory range and that could be reliably
stored, handled, and spread as a stable aerosol rather than clump and fall to the ground. Mastering these somewhat
contradictory requirements is tricky... The challenge becomes greater as attackers seek higher concentrations of agent
and higher efficiency in dissemination." Stanford's Chyba agrees on the difficulties of weaponization. "Aerosolization
is clearly [a] serious hurdle. I just find it hard, currently, to imagine a Qaeda offshoot -- or, for that matter, any of the current
non-state groups that I have read about -- being technically proficient in that."
(Note: Danzig is a former Navy secretary who is now a Pentagon bioterrorism consultant and the
Sam Nunn Prize fellow in international security at the Center for Strategic and International Studies in Washington)

( ) Terrorists won’t use bioweapons – pursuing death tolls that large hurts their agenda

Stern ‘99 (Jessica Stern, Council on Foreign Relations, “The Prospect of Domestic Bioterrorism,” CDC Emerging
Infectious Diseases, Vol. 5, No. 4, July, http://www.cdc.gov/ncidod/EID/vol5no4/stern.htm)

Would domestic terrorists use biological weapons?1 The conventional wisdom among experts has been that terrorists "want a lot of people
watching, not a lot of people dead" and are unlikely to turn to weapons of mass destruction.2 A new school of thought
proposes that improved technology has made biological attacks resulting in hundreds of thousands or millions of deaths
all but inevitable. While terrorists are increasingly interested in weapons of mass destruction, proponents of the latter view exaggerate
the threat. Using biological weapons to create mass casualties would require more than having biological agents in hand. The terrorists would need to
disseminate the agent, which presents technical and organizational obstacles that few domestic groups could surmount. In addition, relatively few
terrorists would want to kill millions of people, even if they could. For most terrorists, the costs of escalation to
biological weapons would seem to outweigh the benefits. Most modern terrorists have had substantively rational goals,
such as attaining national autonomy or establishing a government purportedly more representative of the people's will.
Escalating to such frightening weapons would result in a massive government crackdown and could alienate the group's
supporters. Biological weapons are also dangerous to produce. A number of Aum Shinrikyo members reportedly damaged their own health while
working on biological agents. Additionally, some terrorists may perceive moral constraints.3

22
Bioterrorism Takeouts – Extensions off # 1 “Risk exaggerated/Conspiracy”

Extend that the bioterror risk is exaggerated. Prefer our evidence for two reasons:

 Qualifications -- Leitenberg is a senior research scholar at the University of Maryland.


 Our conspiracy theory argument indicts the underlying assumption of their impact -- their impacts are cooked-up
to secure funding for counter-measures.

And – here’s more ev – the bioterror risk is exaggerated – it’s just a conspiracy

Birch ’06 (Douglas -- Sun foreign correspondent-- Baltimore Sun – June 18th – lexis)

Despite the concern of many scientists, some bioweapons experts say the fears are overblown. In a book last year, Assessing the
Biological Weapons and Bioterrorism Threat, Milton Leitenberg, a biowarfare expert at the University of Maryland, College
Park, wrote that the threat of bioterror "has been systematically and deliberately exaggerated" by an "edifice" of government-
funded institutes and experts who run programs and conferences. Germ weapons need to be carefully cultured, transported, stored
and effectively disseminated, said Raymond Zilinskas, a policy expert and biologist at the Center for Nonproliferation Studies.
Groups like al-Qaida and Japan's Aum Shinrikyo attempted, and abandoned, efforts to make germ weapons because the task was
too difficult.

Statistics that account for the funding-conspiracy effect prove that the risk is actually quite small.

Leitenberg ‘06 (Milton Leitenberg, a senior research scholar at the University of Maryland, is the author of
"Assessing the Biological Weapons and Bioterrorism Threat.” Los Angeles Times,
http://www.commondreams.org/views06/0217-27.htm)
The United States has spent at least $33 billion since 2002 to combat the threat of biological terrorism. The trouble is, the risk
that terrorists will use biological agents is being systematically and deliberately exaggerated. And the U.S. government has been
using most of its money to prepare for the wrong contingency. A pandemic flu outbreak of the kind the world witnessed in 1918-19 could kill hundreds of
millions of people. The only lethal biological attack in the United States — the anthrax mailings — killed five. But the annual budget for combating bioterror is more than $7 billion, while
Congress just passed a $3.8-billion emergency package to prepare for a flu outbreak. The exaggeration of the bioterror threat began more than a decade ago
after the Japanese Aum Shinrikyo group released sarin gas in the Tokyo subways in 1995. The scaremongering has grown more
acute since 9/11 and the mailing of anthrax-laced letters to Congress and media outlets in the fall of 2001. Now an edifice of
institutes, programs and publicists with a vested interest in hyping the bioterror threat has grown, funded by the government and
by foundations. Last year, for example, Senate Majority Leader Bill Frist described bioterrorism as "the greatest existential threat we have in the world today." But how could he justify
such a claim? Is bioterrorism a greater existential threat than global climate change, global poverty levels, wars and conflicts, nuclear proliferation, ocean-quality deterioration, deforestation,
desertification, depletion of freshwater aquifers or the balancing of population growth and food production? Is it likely to kill more people than the more mundane scourges of AIDS, tuberculosis,
malaria, measles and cholera, which kill more than 11 million people each year? So what substantiates the alarm and the massive federal spending on
bioterrorism? There are two main sources of bioterrorism threats: first, from countries developing bioweapons, and second, from
terrorist groups that might buy, steal or manufacture them. The first threat is declining. U.S. intelligence estimates say the
number of countries that conduct offensive bioweapons programs has fallen in the last 15 years from 13 to nine, as South Africa,
Libya, Iraq and Cuba were dropped. There is no publicly available evidence that even the most hostile of the nine remaining
countries — Syria and Iran — are ramping up their programs. And, despite the fear that a hostile nation could help terrorists get
biological weapons, no country has ever done so — even nations known to have trained terrorists. It's more difficult to assess the risk of
terrorists using bioweapons, especially because the perpetrators of the anthrax mailings have not been identified. If the perpetrators did not have access to assistance, materials or knowledge
derived from the U.S. biodefense program, but had developed such sophistication independently, that would change our view of what a terrorist group might be capable of. So far, however, the
history of terrorist experimentation with bioweapons has shown that killing large numbers of people isn't as easy as we've been led to believe. Followers of Bhagwan Shree Rajneesh succeeded
in culturing and distributing salmonella in Oregon in 1984, sickening 751 people. Aum Shinrikyo failed in its attempts to obtain, produce and disperse anthrax and botulinum toxin between 1990
and 1994. Al Qaeda tried to develop bioweapons from 1997 until the U.S. invasion of Afghanistan in 2001, but declassified documents found by U.S. forces outside Kandahar indicate the group
never obtained the necessary pathogens. At a conference in Tokyo this week, bioterrorism experts called for new programs to counter the
possibility that terrorists could genetically engineer new pathogens. Yet three of the leading scientists in the field have said there
is no likelihood at this time that a terrorist group could perform such a feat. The real problem is that a decade of widely broadcast discussion of what it
takes to produce a bioweapon has provided terrorists with at least a rough roadmap. Until now, no terrorist group has had professionals with the skills to exploit the information — but the
publicity may make it easier in the future. There is no military or strategic justification for imputing to real-world terrorist groups capabilities
that they do not possess. Yet no risk analysis was conducted before the $33 billion was spent. Some scientists and politicians
privately acknowledge that the threat of bioterror attacks is exaggerated, but they argue that spending on bioterrorism prevention
and response would be inadequate without it. But the persistent hype is not benign. It is almost certainly the single major factor in
provoking interest in bioweapons among terrorist groups. Bin Laden's deputy, the Egyptian doctor Ayman Zawahiri, wrote on a
captured floppy disk that "we only became aware of (bioweapons) when the enemy drew our attention to them by repeatedly
expressing concerns that they can be produced simply with easily available materials." We are creating our worst nightmare.

23
Bioterrorism Takeouts – Extensions off # 2 “no dissemination/weather”

Extend that bioweapon attacks would not kill that many people.

Our Laqueur card cites two warrants:

First – that weather conditions make attacks fail over 90% of the time.

Second – that the public would be alerted well in advance, minimizing the death toll.

Here’s more ev that the biological materials will die off before explosion.

You should prefer our ev – our warrants are more diverse, and Laqueur is an expert in the field.

24
Bioterrorism Takeouts -- Extensions off # 3 “Technical Obstacles”

Extend our 1NC National Journal ev – bioterrorism will never happen because of technical obstacles.

Our ev cites 5 warrants that they have not addressed – terrorists cannot:

o get the right equipment
o buy the right strains
o store the biological material
o test it, or
o make it an effective weapon

Prefer our ev – it cites Danzig, who used to work for the Pentagon.

Here’s more ev that acquisition is impossible:

25
