
Dominion Debate 2012-2013 [High Speed Rail 1AC]

[Rob Wimberly]

Contention 1 is Inherency
ZERO money for HSR

HURST 11/21/11, CQ Staff
[Nathan Hurst, CQ Weekly, Weekly Report: Appropriations, Nov. 21, 2011, p. 2448, "Obama Initiatives on Infrastructure, High-Speed Rail Are Zeroed Out"]

House and Senate conferees provided no funding for two of President Obama's signature initiatives: high-speed rail and a national infrastructure bank. Obama has requested $53 billion over six years for high-speed rail. The Senate Appropriations Committee agreed to set aside $100 million in fiscal 2012. Overall, the three-bill minibus (HR 2112) that the Senate cleared Nov. 17 would provide $109.4 billion for the departments of Transportation and Housing and Urban Development (HUD). Just $55.6 billion of that is discretionary appropriations; most of the funding in the bill comes from obligations for the Highway Trust Fund. (Minibus, p. 2445) The discretionary funding is virtually the same as in fiscal 2011, but it is $19.4 billion less than Obama sought. Senate Majority Whip Richard J. Durbin, D-Ill., said he was disappointed that the final bill did not include funding specifically for high-speed rail, but he expressed optimism that projects such as the one in his state aiming to provide faster rail links between Chicago and St. Louis would continue under a different grant program.

Thus the plan: The United States federal government will build a high-speed rail network connecting the major population centers of the contiguous United States. We reserve the right to clarify.

Contention 2 is Global warming


Global warming is real and anthropogenic: thousands of studies prove our point

Rahmstorf 08 (Stefan Rahmstorf, Potsdam Institute for Climate Impact Research, 2008, "Anthropogenic Climate Change: Revisiting the Facts")

It is time to turn to statement B: human activities are altering the climate. This can be broken into two parts. The first is as follows: global climate is warming. This is by now a generally undisputed point (except by novelist Michael Crichton), so we deal with it only briefly.32 The two leading compilations of data measured with thermometers are shown in figure 3-3, that of the National Aeronautics and Space Administration (NASA) and that of the British Hadley Centre for Climate Change. Although they differ in the details, due to the inclusion of different data sets and use of different spatial averaging and quality control procedures, they both show a consistent picture, with a global mean warming of 0.8°C since the late nineteenth century. Temperatures over the past ten years clearly were the warmest since measured records have been available. The year 1998 sticks out well above the long-term trend due to the occurrence of a major El Niño event that year (the last El Niño so far and one of the strongest on record). These events are examples of the largest natural climate variations on multiyear time scales and, by releasing heat from the ocean, generally cause positive anomalies in global mean temperature. It is remarkable that the year 2005 rivaled the heat of 1998 even though no El Niño event occurred that year. (A bizarre curiosity, perhaps worth mentioning, is that several prominent climate skeptics recently used the extreme year 1998 to claim in the media that global warming had ended. In Lindzen's words, "Indeed, the absence of any record breakers during the past seven years is statistical evidence that temperatures are not increasing.")33 In addition to the surface measurements, the more recent portion of the global warming trend (since 1979) is also documented by satellite data. It is not straightforward to derive a reliable surface temperature trend from satellites, as they measure radiation coming from throughout the atmosphere (not just near the surface), including the stratosphere, which has strongly cooled,34 and the records are not homogeneous due to the short life span of individual satellites, the problem of orbital decay, observations at different times of day, and drifts in instrument calibration. Current analyses of these satellite data show trends that are fully consistent with surface measurements and model simulations.35 If no reliable temperature measurements existed, could we be sure that the climate is warming? The "canaries in the coal mine" of climate change (as glaciologist Lonnie Thompson puts it) are mountain glaciers. We know, both from old photographs and from the position of the terminal moraines heaped up by the flowing ice, that mountain glaciers have been in retreat all over the world during the past century. There are precious few exceptions, and they are associated with a
strong increase in precipitation or local cooling.36 I have inspected examples of shrinking glaciers myself in field trips to Switzerland, Norway, and New Zealand. As glaciers respond sensitively to temperature changes, data on the extent of glaciers have been used to reconstruct a history of Northern Hemisphere temperature over the past four centuries (see figure 3-4).37 Cores drilled in tropical glaciers show signs of recent melting that is unprecedented at least throughout the Holocene, the past 10,000 years.38 [Figure 3-4: Temperature of the Northern Hemisphere during the Past Millennium. Reconstructed from proxy data (Mann, Bradley, and Hughes; Moberg and others; Oerlemans), with instrumental data from NASA GISS up to 2005; all curves are smoothed over twenty years, with values relative to the 1951-80 mean.] Another powerful sign of anthropogenic warming, visible clearly from satellites, is the shrinking Arctic sea ice cover (figure 3-5), which has declined 20 percent since satellite observations began in 1979. While climate clearly became warmer in the twentieth century, much discussion, particularly in the popular media, has focused on the question of how unusual this warming is in a longer-term context. While this is an interesting question, it has often been mixed incorrectly with the question of causation. Scientifically, how unusual recent warming is, say, compared to the past millennium, in itself contains little information about its cause. Even a highly unusual warming could have a natural cause (for example, an exceptional increase in solar activity). And even a warming within the bounds of past natural variations could have a predominantly anthropogenic cause. I come to the question of causation shortly, after briefly visiting the evidence for past natural climate variations. Records from the time before systematic temperature measurements were collected are based on proxy data, coming from tree rings, ice cores, corals, and other sources. These proxy data are generally linked to local temperatures in some way, but they may be influenced by other parameters as well (for example, precipitation), they may have a seasonal bias (for example, the growth season for tree rings), and high-quality long records are difficult to obtain and therefore few in number and geographic coverage. Therefore, there is still substantial uncertainty in the evolution of past global or hemispheric temperatures. (Comparing only local or regional temperature, as in Europe, is of limited value for our purposes, as regional variations can be much larger than global ones and can have many regional causes, unrelated to global-scale forcing and climate change.) The first quantitative reconstruction for the Northern Hemisphere temperature of the past millennium, including an error estimation, was presented by Mann, Bradley, and Hughes and rightly highlighted in the 2001 IPCC report as one of the major new findings since its 1995 report; it is shown in figure 3-6.39 The analysis suggests that, despite the large error bars, twentieth-century warming is indeed highly unusual and probably was unprecedented during the past millennium. This result, presumably because of its symbolic power, has attracted much criticism, to some extent in scientific journals, but even more so in the popular media. The "hockey stick"-shaped curve became a symbol for the IPCC, and criticizing this particular data analysis became an avenue for some to question the credibility of the IPCC. [Figure 3-6: Global Temperature Projections for the Twenty-First Century. Data from IPCC, Climate Change 2001; two example scenarios, B1 (relatively low emissions) and A2 (relatively high emissions), are shown together with the full range (shaded); the observed temperature rise since 1990 runs along the upper edge of the scenarios.] Three important things have been overlooked in much of the media coverage.
First, even if the scientific critics had been right, this would not have called into question the very cautious conclusion drawn by the IPCC from the reconstruction by Mann, Bradley, and Hughes: "New analyses of proxy data for the Northern Hemisphere indicate that the increase in temperature in the twentieth century is likely to have been the largest of any century during the past 1,000 years." This conclusion has since been supported further by every single one of close to a dozen new reconstructions (two of which are shown in figure 3-6). Second, by far the most serious scientific criticism raised against Mann, Hughes, and Bradley was simply based on a mistake.40 The prominent paper of von Storch and others, which claimed (based on a model test) that the method of Mann, Bradley, and Hughes systematically underestimated variability, was [itself] based on incorrect implementation of the reconstruction procedure.41 With correct implementation, climate field reconstruction procedures such as the one used by Mann, Bradley, and Hughes have been shown to perform well in similar model tests.42 Third, whether their reconstruction is accurate or not has no bearing on policy. If their analysis underestimated past natural climate variability, this would certainly not argue for a smaller climate sensitivity and thus a lesser concern about the consequences of our emissions. Some have argued that, in contrast, it would point to a larger climate sensitivity.43 While this is a valid point in principle, it does not apply in practice to the climate sensitivity estimates discussed herein or to the range given by IPCC, since these did not use the reconstruction of Mann, Hughes, and Bradley or any other proxy records of the past millennium. Media claims that "a pillar of the Kyoto Protocol" had been called into question were therefore misinformed. As an aside, the protocol was agreed in 1997, before the reconstruction in question even existed. The overheated public debate on this topic has, at least, helped to attract more researchers and funding to this area of paleoclimatology; its methodology has advanced significantly, and a number of new reconstructions have been presented in recent years. While the science has moved forward, the first seminal reconstruction by Mann, Hughes, and Bradley has held up remarkably well, with its main features reproduced by more recent work. Further progress probably will require substantial amounts of new proxy data, rather than further refinement of the statistical techniques pioneered by Mann, Hughes, and Bradley. Developing these data sets will require time and substantial effort. It is time to address the final statement: most of the observed warming over the past fifty years is anthropogenic. A large number of studies exist that have taken different approaches to analyze this issue, which is generally called the "attribution problem." I do not discuss the exact share of the anthropogenic contribution (although this is an interesting question). By "most" I simply mean more than 50 percent. The first and crucial piece of evidence is, of course, that the magnitude of the warming is what is expected from the anthropogenic perturbation of the radiation balance, so anthropogenic forcing is able to explain all of the temperature rise. As
discussed here, the rise in greenhouse gases alone corresponds to 2.6 W/m2 of forcing. This by itself, after subtraction of the observed 0.6 W/m2 of ocean heat uptake, would cause 1.6°C of warming since preindustrial times for medium climate sensitivity (3°C). With a current best-guess aerosol forcing of -1 W/m2, the expected warming is 0.8°C. The point here is not that it is possible to obtain the exact observed number (this is fortuitous, because the amount of aerosol forcing is still very uncertain) but that the expected magnitude is roughly right. There can be little doubt that the anthropogenic forcing is large enough to explain most of the warming. Depending on aerosol forcing and climate sensitivity, it could explain a large fraction of the warming, or all of it, or even more warming than has been observed (leaving room for natural processes to counteract some of the warming). The second important piece of evidence is clear: there is no viable alternative explanation. In the scientific literature, no serious alternative hypothesis has been proposed to explain the observed global warming. Other possible causes, such as solar activity, volcanic activity, cosmic rays, or orbital cycles, are well observed, but they do not show trends capable of explaining the observed warming. Since 1978, solar irradiance has been measured directly from satellites and shows the well-known eleven-year solar cycle, but no trend.44 There are various estimates of solar variability before this time, based on sunspot numbers, solar cycle length, the geomagnetic AA index, neutron monitor data, and carbon-14 data. These indicate that solar activity probably increased somewhat up to 1940. While there is disagreement about the variation in previous centuries, different authors agree that solar activity did not significantly increase during the last sixty-five years.45 Therefore, this cannot explain the warming, and neither can any of the other factors mentioned. Models driven by natural factors only, leaving the anthropogenic forcing aside, show a cooling in the second half of the twentieth century (for an example, see figure 2-2, panel a, in chapter 2 of this volume). The trend in the sum of natural forcings is downward.46 The only way out would be either some as yet undiscovered unknown forcing or a warming trend that arises by chance from an unforced internal variability in the climate system. The latter cannot be completely ruled out, but has to be considered highly unlikely. No evidence in the observed record, proxy data, or current models suggests that such internal variability could cause a sustained trend of global warming of the observed magnitude. As discussed, twentieth-century warming is unprecedented over the past 1,000 years (or even 2,000 years, as the few longer reconstructions available now suggest), which does not support the idea of large internal fluctuations.47 Also, those past variations correlate well with past forcing (solar variability, volcanic activity) and thus appear to be largely forced rather than due to unforced internal variability.48 And indeed, it would be difficult for a large and sustained unforced variability to satisfy the fundamental physical law of energy conservation. Natural internal variability generally shifts heat around different parts of the climate system: for example, the large El Niño event of 1998, which warmed the atmosphere by releasing heat stored in the ocean. This mechanism implies that the ocean heat content drops as the atmosphere warms.

For past decades, as discussed, we observed the atmosphere warming and the ocean heat content increasing, which rules out heat release from the ocean as a cause of surface warming. The heat content of the whole climate system is increasing, and there is no plausible source of this heat other than the heat trapped by greenhouse gases. A completely different approach to attribution is to analyze the spatial patterns of climate change. This is done in so-called fingerprint studies, which associate particular patterns or "fingerprints" with different forcings. It is plausible that the pattern of a solar-forced climate change differs from the pattern of a change caused by greenhouse gases. For example, a characteristic of greenhouse gases is that heat is trapped closer to the Earth's surface and that, unlike solar variability, greenhouse gases tend to warm more in winter and at night. Such studies have used different data sets and have been performed by different groups of researchers with different statistical methods. They consistently conclude that the observed spatial pattern of warming can only be explained by greenhouse gases.49 Overall, it has to be considered highly likely that the observed warming is indeed predominantly due to the human-caused increase in greenhouse gases.

Discussion and Consequences: This paper discussed the evidence for the anthropogenic increase in atmospheric CO2 concentration and the effect of CO2 on climate, finding that this anthropogenic increase is proven beyond reasonable doubt and that a mass of evidence points to a CO2 effect on climate of 3°C ± 1.5°C global warming for a doubling of concentration. (This is the classic IPCC range; my personal assessment is that, in the light of new studies since the IPCC Third Assessment Report, the uncertainty range can now be narrowed somewhat to 3°C ± 1°C.) This is based on consistent results from theory, models, and data analysis, and, even in the absence of any computer models, the same result would still hold based on physics and on data from climate history alone. Considering the plethora of consistent evidence, the chance that these conclusions are wrong has to be considered minute. If the preceding is accepted, then it follows logically and incontrovertibly that a further increase in CO2 concentration will lead to further warming. The magnitude of our emissions depends on human behavior, but the climatic response to various emissions scenarios can be computed from the information presented here. The result is the famous range of future global temperature scenarios shown in figure 3-6.50 Two additional steps are involved in these computations: the consideration of anthropogenic forcings other than CO2 (for example, other greenhouse gases and aerosols) and the computation of concentrations from the emissions. Other gases are not discussed here, although they are important to get quantitatively accurate results. CO2 is the largest and most important forcing. Concerning concentrations, the scenarios shown basically assume that ocean and biosphere take up a similar share of our emitted CO2 as in the past. This could turn out to be an optimistic assumption; some models indicate the possibility of a positive feedback, with the biosphere turning into a carbon source rather than a sink under growing climatic stress.51 It is clear that even in the more optimistic of the shown (non-mitigation) scenarios, global temperature would rise by 2-3°C above its preindustrial level by the end of this century. Even for a paleoclimatologist like myself, this is an extraordinarily high temperature, which is very likely unprecedented in at least the past 100,000 years. As far as the data show, we would have to go back about 3 million years, to the
Pliocene, for comparable temperatures. The rate of this warming (which is important for the ability of ecosystems to cope) is also highly unusual and unprecedented probably for an even longer time. The last major global warming trend occurred when the last great Ice Age ended between 15,000 and 10,000 years ago: this was a warming of about 5°C over 5,000 years, that is, a rate of only 0.1°C per century.52 The expected magnitude and rate of planetary warming is highly likely to come with major risks and impacts in terms of sea level rise (Pliocene sea level was 25-35 meters higher than now due to smaller Greenland and Antarctic ice sheets), extreme events (for example, hurricane activity is expected to increase in a warmer climate), and ecosystem loss.53 The second part of this paper examined the evidence for the current warming of the planet and discussed what is known about its causes. This part showed that global warming is already a measured and well-established fact, not a theory. Many different lines of evidence consistently show that most of the observed warming of the past fifty years was caused by human activity. Above all, this warming is exactly what would be expected given the anthropogenic rise in greenhouse gases, and no viable alternative explanation for this warming has been proposed in the scientific literature. Taken together, the very strong evidence, accumulated from thousands of independent studies, has over the past decades convinced virtually every climatologist around the world (many of whom were initially quite skeptical, including myself) that anthropogenic global warming is a reality with which we need to deal.
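
Analytic note: the forcing arithmetic in the Rahmstorf evidence can be checked directly. A minimal worked sketch, assuming the standard value of roughly $F_{2\times} \approx 3.7\ \mathrm{W/m^2}$ of forcing per doubling of CO2 (a constant the card itself does not state), with the card's medium climate sensitivity $S = 3\,^{\circ}\mathrm{C}$ per doubling:

$$\Delta T \approx S \cdot \frac{F_{\mathrm{GHG}} + F_{\mathrm{aerosol}} - F_{\mathrm{ocean}}}{F_{2\times}} = 3\,^{\circ}\mathrm{C} \times \frac{2.6 - 1.0 - 0.6}{3.7} \approx 0.8\,^{\circ}\mathrm{C}$$

Omitting the aerosol term gives $3 \times (2.6 - 0.6)/3.7 \approx 1.6\,^{\circ}\mathrm{C}$, matching both figures quoted in the card.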

Scientific consensus only goes one way


UCS 11 (Union of Concerned Scientists, the leading science-based nonprofit organization; citations are provided in the article. March 7, 2011, "Scientific Consensus on Global Warming," <http://www.ucsusa.org/ssi/climatechange/scientific-consensus-on.html>) Scientific Consensus on Global Warming: Scientific societies and scientists have released statements and studies showing the growing consensus on climate change science. A common objection to taking action to reduce our
heat-trapping emissions has been uncertainty within the scientific community on whether or not global warming is happening and if it is caused by humans. However, there is now an overwhelming scientific consensus that global warming is indeed happening and humans are contributing to it. Below are links to documents and statements attesting to this consensus. Scientific Societies: Statement on climate change from 18 scientific associations: "Observations throughout the world make it clear that climate change is occurring, and rigorous scientific research demonstrates that the greenhouse gases emitted by human activities are the primary driver." (October 2009) American Meteorological Society: Climate Change: An
Information Statement of the American Meteorological Society: "Indeed, strong observational evidence and results from modeling studies indicate that, at least over the last 50 years, human activities are a major contributor to climate change." (February 2007) American Physical Society: Statement on Climate Change: "The evidence is incontrovertible: Global warming is occurring. If no mitigating actions are taken, significant disruptions in the Earth's physical and ecological systems, social systems, security and human health are likely to occur. We must reduce emissions of greenhouse gases beginning now." (November 2007) American Geophysical Union: Human Impacts on Climate: "The Earth's climate is now clearly out of balance and is warming. Many components of the climate system, including the temperatures of the atmosphere, land and ocean, the extent of sea ice and mountain glaciers, the sea level, the distribution of precipitation, and the length of seasons are now changing at rates and in patterns that are not natural and are best explained by the increased atmospheric abundances of greenhouse gases and aerosols generated by human activity during the 20th century." (Adopted December 2003, Revised and Reaffirmed December 2007) American Association for the Advancement of Science: AAAS Board Statement on Climate Change: "The scientific evidence is clear: global climate change caused by human activities is occurring now, and it is a growing threat to society." (December 2006) Geological Society of America: Global Climate Change: "The Geological Society of America (GSA) supports the scientific conclusions that Earth's climate is changing; the climate changes are due in part to human activities; and the probable consequences of the climate changes will be significant and blind to geopolitical boundaries." (October 2006) American Chemical Society: Statement on Global Climate Change: "There is now general agreement among scientific experts that the recent warming trend is real (and particularly strong within the past 20 years), that most of the observed warming is likely due to increased atmospheric greenhouse gas concentrations, and that climate change could have serious adverse effects by the end of this century." (July 2004) National Science Academies: U.S. National Academy of Sciences: Understanding and Responding to Climate Change (pdf): "The scientific understanding of climate change is now sufficiently clear to justify taking steps to reduce the amount of greenhouse gases in the atmosphere." (2005) International academies: Joint science academies statement: Global response to climate change (pdf): "Climate change is real. There will always be uncertainty in understanding a system as complex as the world's climate. However there is now strong evidence that significant global warming is occurring." (2005, 11 national academies of science) International academies: The Science of Climate Change: "Despite increasing consensus on the science underpinning predictions of global climate change, doubts have been expressed recently about the need to mitigate the risks posed by global climate change. We do not consider such doubts justified." (2001, 16 national academies of science) Research: National Research Council of the National Academies, America's Climate Choices: "Most of the recent warming can be attributed to fossil fuel burning and other human activities that release carbon dioxide and other heat-trapping greenhouse gases into the atmosphere." (America's Climate Choices: Advancing the Science of Climate Change, 2010) U.S. Climate Change Research Program, Global Climate Change Impacts in the United States (2009): "Global warming is unequivocal and primarily human-induced. Global temperature has increased over the past 50 years. This observed increase is due primarily to human-induced emissions of heat-trapping gases." Examining the Scientific Consensus on Climate Change, Peter T. Doran and Maggie Kendall Zimmerman: "It seems that the debate on the authenticity of global warming and the role played by human activity is largely nonexistent among those who understand the nuances and scientific basis of long-term climate processes." Doran surveyed 10,257 Earth scientists. Thirty percent responded to the survey, which asked: 1. When compared with pre-1800s
levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant? and 2. Do you think human activity is a significant contributing factor in changing mean global temperatures? Beyond the Ivory Tower: The Scientific Consensus on Climate Change, Naomi Oreskes: "Oreskes analyzed 928 abstracts published in refereed scientific journals between 1993 and 2003 and listed in the ISI database with the keywords 'climate change.'... Of all the papers, 75 percent either explicitly or implicitly accepted the consensus view that global warming is happening and humans are contributing to it; 25 percent dealt with methods or ancient climates, taking no position on current anthropogenic [human-caused] climate change. Remarkably, none of the papers disagreed with the consensus position." Intergovernmental Panel on Climate Change: Climate Change 2007: The Physical Science Basis, IPCC, 2007. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change [Solomon, S., D. Qin, M. Manning, Z. Chen, M. Marquis, K.B. Averyt, M. Tignor and H.L. Miller (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA. "Warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice, and rising global average sea level. Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations." IPCC defines "very likely" as greater than 90% probability of occurrence.

Avoiding 450 PPM is key

Hansen 9 (Climate Scientist at the Goddard Institute for Space Studies at NASA and professor in the Department of Earth and Environmental Sciences at Columbia University, received the Carl-Gustaf Rossby Research Medal, which is the highest award for atmospheric research of the American Meteorological Society, named one of Time Magazine's Top 100 Most Influential People, Member of the National Academy of Sciences, PhD in Physics and Astronomy from the University of Iowa) (James, Storms of My Grandchildren, pages 158-160)
Now let us turn to the question of why atmospheric carbon dioxide changed during the past 65 million years. First, note that the carbon dioxide causing the large climate changes in the Cenozoic era necessarily came from the solid Earth reservoirs (rocks or fossil fuels; see figure 15 on page 118). The alternative, oscillation of carbon among its surface reservoirs, is important for glacial-interglacial climate change, as a climate feedback, but it alters atmospheric carbon dioxide by only about 100 ppm, not 1,000 ppm. The solid Earth is both a source of carbon dioxide for the surface reservoirs and a sink. The carbon dioxide source occurs at the edge of moving continental plates that subduct ocean crust. What does that mean? Continents are composed of relatively light material, typically granite. Ocean crust, that is, the solid Earth beneath the ocean water, is heavier rock, typically basalt. Both continents and ocean crust are lighter than material at greater depths, and they are slightly mobile because of convection deeper in the Earth. The energy that drives movement of the surface crust comes from the small amount of heat released by radioactive elements in Earth's interior. As continents move, commonly at a rate of several centimeters per year or an inch or two, they can ride over ocean crust. Intense heat and pressure due to the overriding continent cause melting and metamorphism of the ocean crust, producing carbon dioxide and methane from calcium carbonate and organic sediments on the ocean floor. The gases come to the surface in volcanic eruptions and at seltzer springs and gas vents. This is the main source of carbon dioxide from the solid Earth to surface reservoirs. The main carbon sink, that is, the return flow of carbon to the solid Earth, occurs via the weathering of rocks. Chemical reactions combine carbon dioxide and minerals, with the ingredients being carried by streams and rivers to the ocean and precipitated to the ocean floor as carbonate sediments. A smaller, but still important, carbon sink is the sedimentation of organic material in the ocean, lakes and bogs. Some of this organic material eventually forms fossil fuels and methane hydrates. A key point is that the solid Earth source and the solid Earth sink of carbon are not in general equal at a given time. The imbalance causes the atmospheric carbon dioxide amount to vary. The carbon dioxide source to the atmosphere is larger, for example, when continental drift is occurring over a region of carbon-rich ocean crust. A qualitative explanation for the large Cenozoic climate change, and a picture of the solid Earth's role in the Cenozoic carbon cycle, almost leaps out from figures 18 and 19. During the period between 60 and 50 million years ago, India was moving about 20 centimeters (8 inches) per year, which is unusually rapid for continental drift. India was heading north through an ocean region, now called the Indian Ocean, that had long been an area into which major rivers of the world had deposited carbon sediments. Undoubtedly, atmospheric carbon dioxide increased rapidly during that period as the carbon-rich sediments on that ocean floor were subducted beneath the Indian continental plate. Then, 50 million years ago, India crashed into Asia, with the Indian plate sliding under the Asian plate. The colliding continental plates began to push up the Himalayan mountains and Tibetan plateau, exposing a huge amount of fresh rock for weathering. With India's sojourn across the carbon-rich ocean completed, the carbon dioxide emissions declined and the planet began a long-term cooling trend. A quantitative analysis of the Cenozoic atmospheric carbon dioxide history is carried out in our Target CO2 paper described above.
We calculated the range of carbon dioxide histories that can match the observed temperature curve (figure 18), accounting for uncertainties in the relation between the deep ocean and surface temperature. We estimated maximum carbon dioxide 50 million years ago as 1,400 ppm, with an uncertainty of about 500 ppm. The carbon dioxide amount 34 million years ago, when Antarctica became cold enough to harbor a large ice sheet, was found to be 450 ppm with an uncertainty of 100 ppm. This calculated carbon dioxide history falls within the broad range of estimates based on several indirect ways of measuring past carbon dioxide levels, as I described in the Target CO2 paper.

A striking conclusion from this analysis is the value of carbon dioxide, only 450 ppm with estimated uncertainty of 100 ppm, at which the transition occurs from no large ice sheet to a glaciated Antarctica. This has a clear, strong implication for what constitutes a dangerous level of atmospheric carbon dioxide. If humanity burns most of the fossil fuels, doubling or tripling the preindustrial carbon dioxide level, Earth will surely head toward the ice-free condition with sea level 75 meters (250 feet) higher than today. It is difficult to say how long it will take for the melting to be complete, but once ice sheet disintegration gets well under way it will be impossible to stop. With carbon dioxide the dominant climate forcing, as it is today, it obviously would be exceedingly foolish and dangerous to allow carbon dioxide to approach 450 ppm. What does the
Cenozoic history tell us with regard to Administrator Griffin's assertion that natural climate changes exceed human-made change? Surely, nature changes carbon dioxide, and climate, by huge amounts. But we must look at time scales. The source of carbon dioxide emissions from the solid Earth to the surface reservoirs, when divided among the surface reservoirs, is a few ten-thousandths of a ppm per year. The natural sink, weathering, has a similar magnitude. The natural source and sink can be out of balance, as when India was cruising through the Indian Ocean, by typically one ten-thousandth of 1 ppm per year. In a million years such an imbalance changes atmospheric carbon dioxide by 100 ppm, a huge change. But humans, by burning fossil fuels, are now increasing atmospheric carbon dioxide by 2 ppm per year. In other words, the human climate forcing is four orders of magnitude, ten thousand times, more powerful than the natural forcing. Humans are now in control of future climate, although I use the phrase "in control" loosely here. Okay, I know, this is getting long, but for the sake of your children and grandchildren, let's look a little more closely at another story in figure 18, one that is vitally important. I refer to the PETM, the Paleocene-Eocene thermal maximum, the rapid warming of at least 5 degrees Celsius that occurred about 55 million years ago and caused a minor rash of extinctions, mainly of marine species. The PETM looks like an explosion in figure 18, and by paleoclimate standards it was explosively rapid. Carbon isotopes in the sediments deposited during the PETM show that there was a huge injection of light carbon into the atmosphere, about 3,000 gigatons of carbon, almost as much as the carbon in all of today's oil, gas, and coal. It was injected in two bursts, each no more than a thousand years in duration. The most likely source for such a rapid injection is methane hydrates. There is more than enough methane ice on continental shelves today to provide this amount of light carbon.

The methane hydrate explanation is now broadly accepted, but it leaves open a vital question: What instigated the release of this methane? Was it an external trigger or a climate feedback? The answer holds enormous consequences for the future of humanity. If the trigger for the methane hydrate release was external, such as the intrusion of hot magma from below or an asteroid crashing into the Arctic Ocean, then humans have no influence on whether the process will happen again. And the chances are remote that another such external event would happen in a time frame that most humans would care about. There have been several PETM-like rapid warming events in the past 200 million years. At that frequency, the chance of one beginning in the next hundred years is less than 0.00001 percent. On the other hand, if the PETM and PETM-like methane hydrate releases were feedbacks, that is, if a warming
climate caused the melting of frozen methane, then it is a whole different ball game. In that case, it is practically a dead certainty that business-as-usual exploitation of all fossil fuels would cause today's frozen methane to melt; it is only a question of how soon. Unfortunately, paleoclimate data now unambiguously point to the methane releases being a feedback. If the PETM were an isolated case, that interpretation would be less certain. But it has been found that several PETM-like events in the Jurassic and Paleocene eras were, as with the PETM, astronomically paced. Huh? That means the spikes in global warming and light-carbon sediments occurred simultaneously with the warm phase of climate oscillations caused by perturbations of Earth's orbit. In other words, the methane releases occurred at times of natural warming events. So, why do methane hydrates produce a huge amplifying feedback in a small number of cases, while most astronomical warmings show little or no evidence of methane hydrate amplification? That mercurial behavior, in fact, is exactly what is expected for methane hydrates. The largest volume of methane hydrates is on continental shelves, in the top several hundred meters of ocean sediments, although a smaller volume exists in continental tundra. The main methane hydrates form in coastal zones with high biologic productivity. A sufficient rain of organic material onto the ocean floor yields a low-oxygen environment in the sediments, which causes the bacterial degradation of organic matter to produce methane. If the temperature is right, the methane is frozen into hydrates. If a warming occurs that is large enough to melt methane hydrate, each liter of melted hydrate expands into 160 liters of
methane gas. Why ocean circulation changed is uncertain, but it is likely related to the global warming of 2 to 3 degrees Celsius that occurred just prior to the PETM event
(figure 18). One final mundane, but sobering, inference from the PETM: The recovery time from excess carbon in the air and ocean, and from the PETM global warming spike, was about 100,000 years. That is the recovery time predicted by carbon cycle models. The added carbon dioxide in the air increases the rate of weathering and carbon uptake, which is a negative, diminishing feedback. Confirmation of the recovery time is a useful verification of the models. It is also a reminder that if humans are so foolish as to burn all fossil fuels, the planet will not recover on any time scale that humans can imagine. Such was the state of PETM research, or at least
my perspective on it, in mid-2007, right around the time that Bill McKibben was asking me about 450 ppm, though the most startling revelation from the PETM was yet to come. I formally promised Bill that I would give him a number at the December 2007 American Geophysical Union meeting, when I would present a talk on the rationale for the suggested carbon dioxide target. In that talk, I emphasized carbon dioxide itself, not the carbon dioxide equivalent of all human-made gases. The perturbed carbon cycle will not recover for tens of thousands of years, and it is carbon dioxide that determines the magnitude of the perturbation. Other forcings are important and need to be minimized, and some may be easier than carbon dioxide to deal with, but policy makers must understand that they cannot avoid constraints on carbon dioxide via offsets from other constituents. In addition to paleoclimate data, my talk covered ongoing observations of five phenomena, all of which imply that an appropriate initial target should be no higher than 350 ppm. In brief, here are the five observations. (1) The area of Arctic sea ice has been declining faster than models predicted. The end-of-summer sea ice area was 40 percent less in 2007 than in the late 1970s when accurate satellite measurements began. Continued growth of atmospheric carbon dioxide surely will result in an ice-free end-of-summer Arctic within
several decades, with detrimental effects on wildlife and indigenous people. It is difficult to imagine how the Greenland ice sheet could survive if Arctic sea ice is lost entirely in the warm season. Retention of warm-season sea ice likely requires restoration of the planet's energy balance. At present our best estimate is that there is about 0.5 watt per square meter more energy coming into the planet than is being emitted to space as heat radiation. A reduction of carbon dioxide amount from the current 387 ppm to 350 ppm, all other things being unchanged, would increase outgoing radiation by 0.5 watt, restoring planetary energy balance. (2) Mountain glaciers are disappearing all over the world. If business-as-usual greenhouse gas emissions continue, most of the glaciers will be gone within fifty years. Rivers originating in glacier regions provide fresh water for billions of people. If the glaciers disappear, there will be heavy snowmelt and floods in the spring, but many dry rivers in the late summer and fall. The melting of glaciers is proceeding rapidly at current atmospheric composition. Probably the best we can hope is that restoration of the planet's energy balance may halt glacier recession. (3) The Greenland and West Antarctic ice sheets are each losing mass at more than 100 cubic kilometers per year, and sea level is rising at more than 3 centimeters per decade. Clearly the ice sheets are unstable with the present climate forcing. Ice shelves around Antarctica are melting rapidly. It is difficult to say how far carbon dioxide must be reduced to stabilize the ice sheets, but clearly 387 ppm is too much. (4) Data show that subtropical regions have expanded poleward by 4 degrees of latitude on average. Such expansion is an expected effect of global warming, but the change has been faster than predicted. Dry regions have expanded in the southern United States, the Mediterranean, and Australia. Fire frequency and area in the western United States have increased by 300 percent over the past several decades. Lake Powell and Lake Mead are now only half full. Climate change is a major cause of these regional shifts, although forest management practices and increased usage of freshwater aggravate the resulting problems. (5) Coral reefs, where a quarter of all marine biological species are located, are suffering from multiple stresses, with two of the most important
stresses, ocean acidification and warming surface water, caused by increasing carbon dioxide. As carbon dioxide in the air increases, the ocean dissolves some of the carbon dioxide, becoming more acidic. This makes it more difficult for animals with carbonate shells or skeletons to survive; indeed, sufficiently acidic water dissolves carbonates. Ongoing studies suggest that coral reefs would have a better chance of surviving modern stresses if carbon dioxide were reduced to less than 350 ppm. I am often asked: If we want to maintain Holocene-like climate, why should the target carbon dioxide not be close to the preindustrial amount, say 300 ppm or 280 ppm? The reason, in part, is that there are other climate forcings besides carbon dioxide, and we do not expect those to return to preindustrial levels. There is no plan to
remove all roadways, buildings, and other human-made effects on the planet's surface. Nor will we prevent all activities that produce aerosols. Until we know all forcings and understand their net effect, it is premature to be more specific than "less than 350 ppm," and it is unnecessary for policy purposes. It will take time to turn carbon dioxide around and for it to begin to approach 350 ppm. By then, if we have been making appropriate measurements, our knowledge should be much improved and we will have extensive empirical evidence on real-world changes. Also our best current estimate for the planet's mean energy imbalance over the past decade, thus averaged over the solar cycle, is about +0.5 watt per square meter. Reducing carbon dioxide to 350 ppm would increase emission to space 0.5 watt per square meter, restoring the planet's energy balance, to first approximation.
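
Analytic note: two of Hansen's rate claims can be spot-checked with simple arithmetic. This is a sketch assuming the common logarithmic approximation $\Delta F \approx 5.35 \ln(C/C_0)\ \mathrm{W/m^2}$ for CO2 forcing, which the card itself does not cite:

$$\frac{2\ \mathrm{ppm/yr}\ \text{(human emissions)}}{10^{-4}\ \mathrm{ppm/yr}\ \text{(natural imbalance)}} = 2 \times 10^{4}, \qquad \Delta F \approx 5.35 \ln\!\left(\frac{387}{350}\right) \approx 0.54\ \mathrm{W/m^2}$$

The first ratio is roughly four orders of magnitude, as the card says; the second is consistent with the claim that returning from 387 ppm to 350 ppm would increase outgoing radiation by about 0.5 watt per square meter.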

Warming causes extinction


Cummins and Allen 10 (International Director, Organic Consumers Association; Policy Advisor, Organic Consumers Association)
(Ronnie and Will, "Climate Catastrophe: Surviving the 21st Century," 14 February 2010, http://www.commondreams.org/view/2010/02/14-6)

The hour is late. Leading
climate scientists such as James Hansen are literally shouting at the top of their lungs that the world needs to reduce emissions by 20-40% as soon as possible, and 80-90% by the year 2050, if we are to avoid climate chaos, crop failures, endless wars, melting of the polar icecaps, and a disastrous rise in ocean levels. Either we radically reduce CO2 and carbon dioxide equivalent (CO2e, which includes all GHGs, not just CO2) pollutants (currently at 390 parts per million and rising 2 ppm per year) to 350 ppm, including agriculture-derived methane and nitrous oxide pollution, or else survival for the present and future generations is in jeopardy. As scientists warned at Copenhagen, business as usual and a corresponding 7-8.6 degree Fahrenheit rise in global temperatures means that the carrying capacity of the Earth in 2100 will be reduced to one billion people. Under this hellish scenario, billions will die of thirst, cold, heat, disease, war, and starvation.

Plan results in huge GHG reductions

CCAP 6 (Center for Clean Air Policy, "High Speed Rail and Greenhouse Gas Emissions in the U.S.," the article itself is a CCAP publication, January 2006, http://www.cnt.org/repository/HighSpeedRailEmissions.pdf)//AG

To estimate high speed rail's net emissions impact, we calculated the carbon dioxide (CO2) emissions saved from passengers switching to high speed rail from other modes (air, conventional rail, automobile and bus) and subtracted the estimated emissions generated by high speed rail. Our calculations were based on passenger projections and diversion rates for each corridor and typical emissions rates for each mode of travel, including several different high speed rail technologies. Current projections show that passengers would take 112 million trips on high speed rail in the U.S. in 2025, traveling more than 25 billion passenger miles. This would result in 29 million fewer automobile trips and nearly 500,000 fewer flights. We calculated a total emissions savings of 6 billion pounds of CO2 per year (2.7 MMTCO2) if all proposed high speed rail systems studied for this project are built. Savings from cancelled automobile and airplane trips are the primary sources of the emissions savings; together these two modes make up 80 percent of the estimated emissions savings from all modes. Our modeling shows that high speed rail, if built as planned, will generate substantial GHG savings in all regions. The total emissions savings vary greatly by corridor, however, as do the source of those savings. In some regions, such as the Midwest, the impact on air travel is likely to be modest; our analysis shows just a 7 percent decrease in flights from today's levels. In California, on the other hand, 19 million passengers are projected to switch from air, a volume that would result in 114 percent of today's 192 million annual direct flights in the corridor being cancelled. Such ridership levels may be an overestimate, or may be possible if projected growth in air travel and indirect flights, including those from outside the corridor, are included. To draw so many air passengers to rail will certainly require that high speed rail ticket prices be competitive with air and that service be as convenient and time-efficient. It is worth further study to see if such high levels of mode shifting are likely. In some respects, the California system, as it is currently planned, represents what will be the second generation of high speed rail in many of the other corridors. While areas like the Pacific Northwest may increase ridership sooner with an incremental approach to high speed rail that uses existing rail routes, the success of a new high speed rail system like California's could prove the value of faster trains with higher upfront capital costs.
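
Analytic note: the CCAP card's two headline numbers are mutually consistent. A quick unit check, using the standard pound-to-kilogram factor (not shown in the source):

$$6 \times 10^{9}\ \mathrm{lb\ CO_2/yr} \times 0.4536\ \mathrm{kg/lb} \approx 2.7 \times 10^{9}\ \mathrm{kg/yr} = 2.7\ \mathrm{MMTCO_2/yr}$$

which matches the stated 2.7 MMTCO2 of annual savings.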

Plan sends a key signal that solves warming globally.

Roberts, regular contributor to Harper's Magazine, 2004
(Paul, The End of Oil: On the Edge of a Perilous New World, pg 325-26, DOA: 5-23-12, ldg)

Politically, a new US energy policy would send a powerful message to the rest of the players in the global energy economy. Just as a carbon tax would signal the markets that a new competition had begun, so a progressive, aggressive American energy policy would give a warning to international businesses, many of which now regard the United States as a lucrative dumping ground for older high-carbon technology. It would signal energy producers, companies and states, that they would need to start making investments for a new energy business, with differing demands and product requirements. Above all, a progressive energy policy would not only show trade partners in Japan and Europe that the United States is serious about climate but would give the United States the leverage it needs to force much-needed changes in the Kyoto treaty. "With a carbon program and a serious commitment to improve efficiency and develop clean-energy technologies," says one US climate expert, "the United States could really shape a global climate policy. We could basically say to Europe, here is an American answer to climate that is far better than Kyoto. Here are the practical steps we're going to take to reduce emissions, far more effectively than your cockamamie Kyoto protocol." Similarly, the United States would finally have the moral credibility to win promises of cooperation from India and China. As James Mackenzie, the former White House energy analyst who now works on climate issues for the Washington-based World Resources Institute, told me, Chinese climate researchers and policymakers know precisely what China must do to begin to deal with emissions but have thus far been able to use US intransigence as an excuse for their own inaction. "Whenever you bring up the question of what the Chinese should be doing about climate, they just smile. They ask, 'Why should we in China listen to the United States and take all these steps to protect the climate, when the United States won't take the same steps itself?'"

The effects of warming greatly impact minorities around the world, and the US has an obligation to help these populations.

M.R.G.I 8

(Minority Rights Group International, April 2008 [author: Rachel Baird]. MRG is a non-governmental organization (NGO) working to secure the rights of ethnic, religious and linguistic minorities worldwide, and to promote cooperation and understanding between communities. http://www.ohchr.org/Documents/Issues/ClimateChange/Submissions/Minority_Rights_Group_International.pdf) Climate change is just beginning to be articulated as a human rights issue rather than a purely development or environmental crisis. Yet the effects of global warming described above go to the heart of minority rights, and the key issues of existence, identity, discrimination and participation. If marginalized communities experience systemic discrimination, then they are less likely to survive the upheavals of global warming. If a minority's lifestyle is being eroded by environmental changes, then their unique culture and language are also under threat. If they cannot participate in the decisions that will affect the outcomes of global warming, then the chances are that their needs will be ignored or sidelined. These outcomes are violations of a state's obligations to minorities under international human rights law, a fact already being utilized by some indigenous groups seeking to hold their government to account for climate change-related impacts on their communities. The most famous attempt to use human rights law so far has been the petition submitted to the Inter-American Commission on Human Rights on behalf of the Inuit in Alaska and Canada. This was submitted by Sheila Watt-Clouthier, supported by the Inuit Circumpolar Council (ICC), against the United States in 2005. Their 175-page petition argued that the Arctic was more severely affected by climate change than any other place on earth and that the US, as the world's largest greenhouse gas emitter, bore more responsibility for this than others.

Defer aff on irreversibility: even if the chance of solvency or warming existing is low, not being able to come back means you vote aff anyway

Cass R. Sunstein, Professor in the Department of Political Science and at the Law School of the University of Chicago, 2007, Worst-Case Scenarios

Most worst-case scenarios appear to have an element of irreversibility. Once a species is lost, it is lost forever. The special concern for endangered species stems from the permanence of their loss (outside of Jurassic Park). One of the most serious fears associated with genetically modified organisms is that they might lead to irreversible ecological harm. Because some greenhouse gases stay in the atmosphere for centuries, the problem of climate change may be irreversible, at least for all practical purposes. Transgenic crops can impose irreversible losses too, because they can make pests more resistant to pesticides. If we invest significant wealth in one source of energy and neglect others, we may be effectively stuck forever, or at least for a long time. One objection to capital punishment is that errors cannot be reversed. In ordinary life, our judgments about worst-case scenarios have everything to do with irreversibility. Of course an action may be hard but not impossible to undo, and so there may be a continuum of cases, with different degrees of difficulty in reversing. A marriage can be reversed, but divorce is rarely easy; having a child is very close to irreversible; moving from New York to Paris is reversible, but moving back may be difficult. People often take steps to avoid courses of action that are burdensome rather than literally impossible to reverse. In this light, we might identify an Irreversible Harm Precautionary Principle, applicable to a subset of risks. As a rough first approximation, the principle says this: Special steps should be taken to avoid irreversible harms, through precautions that go well beyond those that would be taken if irreversibility were not a problem. The general attitude here is "act, then learn," as opposed to the tempting alternative of "wait and learn." In the case of climate change, some people believe that research should be our first line of defense. In their view, we should refuse to commit substantial resources to the problem until evidence of serious harm is unmistakably clear. But even assuming that the evidence is not so clear, research without action allows greenhouse gas emissions to continue, which might produce risks that are irreversible, or at best difficult and expensive to reverse. For this reason, the best course of action might well be to take precautions now as a way of preserving flexibility for future generations. In the environmental context in general, this principle suggests that regulators should proceed with far more aggressive measures than would otherwise seem justified.


Ignore timeframe distinctions- even if warming has a long timeframe, solving now is the only way to avoid irreversible impacts
Rhett Butler 2009, Many global warming impacts may be irreversible in next 100 years, January 27th, http://www.climateark.org/shared/reader/welcome.aspx?linkid=116603
"It is sometimes imagined that slow processes such as climate changes pose small risks, on the basis of the assumption that a choice can always be made to quickly reduce emissions and thereby reverse any harm within a few years or decades," they write. "We have shown that this assumption is incorrect for carbon dioxide emissions, because of the longevity of the atmospheric CO2 perturbation and ocean warming. Irreversible climate changes due to carbon dioxide emissions have already taken place, and future carbon dioxide emissions would imply further irreversible effects on the planet, with attendant long legacies for choices made by contemporary society." "In this paper we have quantified how societal decisions regarding carbon dioxide concentrations that have already occurred or could occur in the coming century imply irreversible dangers relating to climate change for some illustrative populations and regions. These and other dangers pose substantial challenges to humanity and nature, with a magnitude that is directly linked to the peak level of carbon dioxide reached."

Contention 3 is Solvency
The federal government should invest in high-speed rail. Doing so creates a stable market, invigorates private investment, and overcomes current obstacles. Only a federal approach mobilizes a megaregional focus to complete rail corridors

TODOROVICH, SCHNED, & LANE 11

1. director of America 2050, a national urban planning initiative, assistant visiting professor at the Pratt Institute Graduate Center for Planning and the Environment, and a member of the Board of Advisors of the Eno Transportation Foundation; Master's in City and Regional Planning from the Bloustein School of Planning and Public Policy at Rutgers University 2. associate planner for America 2050 at Regional Plan Association; part-time lecturer at the Edward J. Bloustein School of Planning and Public Policy 3. senior fellow for urban design at Regional Plan Association and a founding principal of Plan & Process LLP; Loeb Fellow at the Harvard Graduate School of Design
[Petra Todorovich, Daniel Schned, and Robert Lane, High-Speed Rail: International Lessons for U.S. Policy Makers, September 2011, Lincoln Institute of Land Policy, Policy Focus Report]
U.S. Policy and Programs for High-Speed Rail Investment

Each country that has developed high-speed rail has done so with strong national government leadership. Prior to President Barack Obama's recent embrace of high-speed rail, federal government support had been a missing ingredient in U.S. passenger rail development. However, significant federal investments in high-speed rail in 2009-2010 put the federal High-Speed Intercity Passenger Rail (HSIPR) Program on a solid initial footing. Whether that commitment can be sustained in a difficult fiscal environment will determine whether high-speed rail in the United States can become a reality. The federal commitment to high-speed rail began in 2008, when Congress passed the Passenger Rail Investment and Improvement Act (PRIIA), which authorized funding for Amtrak and state-led efforts to develop high-speed rail corridors between 2009 and 2013. In February 2009, just months after PRIIA was signed into law at the end of 2008, the act became the vehicle for appropriating $8 billion for high-speed rail under the American Recovery and Reinvestment Act (ARRA). An additional $2.5 billion for high-speed rail was appropriated by Congress in the Fiscal Year (FY) 2010 budget (figure 8). These appropriations, totaling $10.5 billion for high-speed and passenger rail, transformed the preservation-focused program established by PRIIA into a highly visible high-speed rail initiative that later became the centerpiece of the Obama administration's infrastructure agenda. However, this sudden infusion of funding also revealed PRIIA's limitations and the challenges of creating an ambitious high-speed and intercity passenger rail program virtually overnight. The subsequent Congressional appropriation for FY 2011 stripped the program of any funding in 2011 and rescinded $400 million from the FY 2010 budget. This abrupt reversal underscores the program's
vulnerability to shifting political winds as long as it has to rely on annual Congressional appropriations for its funding.
THE CURRENT LEGISLATIVE AND FUNDING FRAMEWORK
The current federal policy framework for high-speed rail was shaped in response to both the history of unreliable and minimal federal contributions for passenger rail and the efforts of individual states acting on their own initiative and with their own funding to improve rail corridors. While PRIIA is an improvement over the previous lack of a U.S. passenger rail policy, it is not well-suited to a more ambitious, sustained federal commitment to building dedicated, multistate high-speed rail corridors. Unlike the U.S. highway and transit programs, which rely on dedicated revenue streams from the federal motor fuels tax, passenger rail has no dedicated source of revenue and thus relies on Congress for general fund appropriations. Prior to the passage of PRIIA, most passenger rail appropriations were made directly to Amtrak each year, but with no multiyear authorization since 2002. Numerous Amtrak officials have testified to Congress over the years that the uncertainty of these annual, often politicized, appropriations makes planning and operating the railroad difficult. In the absence of consistent federal support for passenger rail, states including California, North Carolina, Pennsylvania, and Washington have established dedicated funding streams to improve conventional passenger rail corridors operated by Amtrak. Other states, such as Illinois, Maine, and Vermont, have directed state general funds or flexible federal funds to subsidize and supplement their passenger rail service (U.S. GAO 2010). These state investments have led to the purchase of new rail cars in Washington, track upgrades for and re-electrification of the Keystone Corridor in Pennsylvania, and more frequent, reliable service and higher ridership on all state-sponsored lines. State funding for rail has come from various sources, including portions of state gas and diesel taxes, flexible funding from the federal Congestion Mitigation and Air Quality Improvement Program, state rental car taxes, and proceeds from specially branded Cash Train scratch lottery tickets in Washington state.
THE HIGH-SPEED INTERCITY PASSENGER RAIL PROGRAM
In recognition of these and other state initiatives, PRIIA established a competitive federal grant program to assist the states and Amtrak in making capital improvements to existing passenger rail corridors that could enhance service, relieve congestion, and develop new high-speed rail services on either existing or new rights-of-way (table 3). These statutes provide the basis for the HSIPR Program,
administered by the U.S. Department of Transportation. The program began in June 2009 with an announcement of funding availability and interim guidelines (U.S. GAO 2011). The FRA was charged with administering the program, selecting applicants, awarding grants, negotiating funding agreements, and writing a national passenger and freight rail plan. These new responsibilities required the FRA to increase its planning staff quickly, since most of its existing employees were focused on the traditional roles of the agency: safety and regulatory enforcement of freight and passenger rail services. The HSIPR Program has three funding categories to which states, groups of states, interstate compacts, public agencies, or Amtrak are eligible to apply. One of the programs, the High-Speed Rail Corridor Development Program, is restricted to the 11 federally designated high-speed rail corridors, although grants can be obtained through the other two funding categories for projects on other corridors. The sudden $10.1 billion in funding for high-speed rail in ARRA and the FY 2010 budget was welcomed with great enthusiasm by states nationwide, 39 of which applied for rail planning or construction grants. But it required a rapid increase in capacity at the federal level and within state transportation departments to administer and participate in the program. This new program relied on the states to submit applications for eligible projects. Given the previous lack of federal commitment to passenger rail, only a few states had staff capacity for rail planning or the expertise to develop proposals for Core Express high-speed rail. States with previous commitments to rail planning and funding generally were able to put together successful proposals in the 2009 and 2010 rounds of grant making (figure 9). The two states that had already developed plans for Core Express high-speed rail were the most successful in the competition for federal funding. California voters had passed a $9 billion bond act in 2008 to fund a Core Express high-speed rail project connecting Northern and Southern California, and the state was awarded federal grants of approximately $3.6 billion. Florida, which was able to resubmit its high-speed rail proposal from the 2000s, was awarded a total of $2.4 billion for the initial Tampa-Orlando segment of the statewide high-speed rail project. However, this project was cancelled in early 2011 by newly elected Governor Rick Scott. The remaining federal grant awards went to conventional rail projects, such as those in Washington and Illinois, for projects to increase the speed, reliability, and frequency of passenger rail services on shared passenger and freight corridors. By mid-2011, the distribution of grants largely reflected the status of rail planning efforts across the country, with some attention to geographic equity. The FRA's grantmaking process was criticized for a lack of transparency by Chairman John Mica of the House Transportation and Infrastructure Committee. However, a U.S. Government Accountability Office (GAO) report that he commissioned states: "The FRA established a fair and objective approach for distributing these funds and substantially followed recommended discretional grant award practices used throughout the government" (U.S. GAO 2011, 22). By August 2011, two years after the launch of the HSIPR Program, nearly 75 percent of the awarded funds had been released to 25 states, the District of Columbia, and Amtrak, allowing them to start work. However, the program continues to face criticisms, largely focused on the perceived high cost of rail investments; unimpressive trip time savings; and the lengthy timeline for rail planning, engineering, environmental review, and construction.
FEDERAL RAIL POLICY CHALLENGES
Even though PRIIA is authorized
through 2013, stakeholders in the rail industry, including one of the drafters of PRIIA, have remarked on the need to adjust federal rail policy to respond to current circumstances, including greater political instability in the Middle East and its implications for America's dependence on foreign oil; growing international and private sector interest in helping to finance high-speed rail in the United States; and the president's own ambitious proposals for a national high-speed rail network to give 80 percent of Americans access to high-speed rail over the next 25 years (Gardner 2011). Such a vision requires a stronger and more active federal commitment that must start with secure funding. The most recent setback of zero funding for high-speed rail in the FY 2011 budget underscores the need for a sustainable revenue source as reliable as funding for highway and transit programs in the past. President Obama's proposal to include a $53 billion, six-year high-speed rail program as part of the surface transportation bill would help to achieve this kind of equity among transportation modes. In conjunction with a funding strategy, the role of high-speed rail in America's larger transportation network needs to be better defined (U.S. GAO 2009). A sharper, more narrowly focused program directed at corridors that meet clearly articulated objectives for high-speed rail service would address criticisms that the program is diffuse, ineffective, and dependent on ongoing subsidies. Nationally available data could help to evaluate the most promising regions for attracting ridership and enhancing economic and other benefits. A phasing plan and funding allocation strategy could help develop the full build-out of a national network by helping states secure rights-of-way for high-speed rail corridors. Another challenge is to clarify the differences between conventional and high-speed rail corridors. PRIIA provides federal grants for both conventional passenger rail and new high-speed corridors, although the media has tended to focus on the high-speed program. Neither PRIIA nor ARRA specified the share of federal funding to be used for high-speed Core Express corridors versus conventional passenger rail. In fact, the dearth of high-speed rail projects in the planning pipeline means that grants will be shared among various types of rail projects. A more active role by the federal government could help clarify the respective roles of high-speed Core Express corridors and conventional Regional and Emerging/Feeder routes, including funding them through separate programs and clearly defining the objectives for each type of rail service. Funding for maintaining and upgrading existing rail corridors could be provided through formula funds based on passenger train movements, track miles, or ridership. President Obama's FY 2012 budget proposal for the Department of Transportation moved in this direction by establishing different competitive grant programs, including network development for constructing new corridors and system preservation for maintaining safety and reliability on existing corridors (White House 2011). The national high-speed rail program also must overcome a lack of effective institutions and administrative structures for building and operating multistate corridors. Public benefit corporations capable of entering into public-private partnerships could develop and maintain high-speed rail infrastructure across megaregional, multistate, and even binational territories. These corporations would be responsible for the tracks, while separate public and private entities would operate the trains. Federal legislation could be developed to enable the creation of these public infrastructure corporations. International examples of publicly chartered infrastructure corporations include the High Speed 1 (HS1) and High Speed 2 (HS2) companies in the United Kingdom, Spain's state-owned Administrator of Railway Infrastructures (Adif), and Réseau Ferré de France (RFF), the French Rail Network. Regional public benefit corporations could be created in the United States to develop and manage track infrastructure, receive federal high-speed rail grants, and enter into contracts with private consortia for design, construction, and maintenance.
SUMMARY
The PRIIA legislation enacted in 2008 provided a transition from an era with no federal partner for high-speed and passenger rail to a period of active federal partnership with the states. Thirty-two states, the District of Columbia, and Amtrak have been awarded funding through the HSIPR Program and are moving ahead to plan or build high-speed and conventional rail projects. Given the quick start-up nature of the program, the FRA did an admirable job of responding to many simultaneous new duties, but also faced challenges in both laying the groundwork for a foundational program and implementing it at the same time. The setbacks experienced in 2011, when several governors cancelled rail projects and Congress appropriated zero dollars for high-speed rail, provide an impetus to reset the program in a way that will better position it for long-term success. Federal policy initiatives could set the program on firmer footing for a long-term commitment and restore public confidence in an era of fiscal austerity.


Investing in rail would massively reduce fossil fuel emissions- that's key to combatting global climate change and avoiding the worst impacts. Only federal action matters

SCHWARTZ et al 08

1. President and CEO, Sam Schwartz Engineering (SSE), a multidisciplinary consulting firm specializing in traffic and transportation engineering; 2. Assistant Commissioner, Division of Traffic Management, New York City Department of Transportation; 3. Vice President, Director of Planning and Design, Sam Schwartz Engineering; 4. Senior Transportation Planner, Sam Schwartz Engineering
[Sam Schwartz, Gerard Soffian, Jee Mee Kim, and Annie Weinstock, Symposium: Breaking the Logjam: Environmental Reform for the New Congress and Administration: Panel V: Urban Issues: A Comprehensive Transportation Policy for the 21st Century: A Case Study of Congestion Pricing in New York City, New York University Environmental Law Journal, 2008, 17 N.Y.U. Envtl. L.J. 580]

Transportation funding at the Federal level plays a direct role in environmental protection as cars and other vehicles contribute significantly to urban air pollution by producing CO2, the primary pollutant attributed to global climate change. Pricing strategies that consider the true costs of travel, such as congestion pricing measures in urban areas, as well as increased aviation fees and rail investment, particularly between well-traveled metropolitan areas, are direct measures that could reduce VMT while funding transit and rail. To achieve reductions in VMT between metropolitan areas less than 500 miles apart, rail needs to become a more affordable and convenient alternative to flying. This is a significant challenge as the cost of flying has become cheaper and more affordable in recent years due to the rise of bargain airlines and shrinking rail subsidies. Despite the Federal trend steering some funding away from traditional highway projects, the table below shows that the annual lion's share of Federal funding is directed at highways ($ 34 billion), with air travel receiving a little less than half that amount ($ 13.8 billion) (see Table 5). Meanwhile, rail funding is just a meager $ 360 million, or 1 percent of highway allocation and 3 percent of air funding. Of the $ 13.8 billion in air travel funding, $ 2.4 billion was allocated towards infrastructure
development, capital improvements and efficiency. In fact, there are more than [*606] one hundred locales in the U.S. that receive federally subsidized airline service. n44 In contrast, funding for passenger rail in 2001 was at its lowest level in over ten years. Adjusted for inflation, passenger rail in 2003 received less than two-thirds of what it was getting twenty years ago, while funding for highways and aviation have doubled. n45 Air travelers contribute little to the cost of providing public services. Some critics have proposed imposing an aviation tax to offset some of these externalities. In fact, Britain's Department for Transport suggested in December 2000 that if these hidden costs were included, air travel demand would decrease by 3 to 5 percent, equal to a tax of about £1 billion. Further, the European Environment Agency has suggested that the total external cost of [*607] British aviation alone is about £6 billion per year. Advisor to the British government on the economics of climate change, Sir Nicholas Stern, has argued that if, for example, the environmental cost of each ton of CO2 emitted were priced at $ 85, one London-Miami return flight emitting approximately two tons of CO2 per passenger would need to add $ 170 to the current price. n46 Similar pricing strategies have been proposed (beyond congestion pricing) to account for the true cost of driving. Although it is impossible to calculate the precise cost of these externalities, some conservative estimates show them adding up to 22 cents for every mile Americans drive. At 22 cents per mile, a gas tax of $ 6.60 a gallon would be necessary to make drivers fully pay for the cost that car travel imposes on the economy. n47 To increase public usage of rail, Federal subsidies must increase, including investments to infrastructure, as well as the development of new high-speed rail service. To further institute a system where travel is more accurately priced to reflect its true cost, the cost of flying must increase. In recent years, Americans have become increasingly enlightened to the problems facing the environment and are likely to be more open than ever to changes in the functioning of their transportation system. In facing the lead-up to the 2009 reauthorization of the federal transportation bill, Congress now has the opportunity to provide leadership on a host of transportation reforms. Measures such as congestion pricing and an increased investment in regional rail could be instrumental in reducing overall VMT and, as a result, in decreasing emissions. Such steps are imperative in addressing global climate change and the long-term impacts of man on the environment.

It isn't about the types of rails or their completion- it is about investing at the federal level to help the environment and jobs. Investment overcomes obstacles

PRUM & CATZ 12

*Assistant Professor, Environmental Law, Florida State University ** Director, Center for Urban Infrastructure; Research Associate, Institute of Transportation Studies, University of California, Irvine


[Darren A. Prum & Sarah Catz, High Speed Rail in America: An Evaluation of the Regulatory, Real Property, and Environmental Obstacles a Project will Encounter, North Carolina Journal of Law & Technology, 2012]
After electric generation, transportation in the United States is the second largest as well as the second fastest growing source of greenhouse gas emissions.164 Smarter transportation policies could reduce congestion and emissions and help revitalize the economy jointly.165 As a result, HSR often receives mention as a solution to reducing congestion, increasing mobility, and helping clean up the environment through the reduction of greenhouse gas emissions; yet in most jurisdictions, transportation policies fail to take on this issue.166 Colin Peppard, the deputy director of Federal Transportation Policy at the Natural Resources Defense Council, echoed this sentiment when he stated, "Most states' transportation departments seem to be ignoring their important role in stopping climate change. If states considered all their transportation policy options, they could tap into tremendous potential to reduce carbon emissions, even with limited resources."167 Supporting this notion, a recent report released by Smart Growth America concluded that most states do not make any effort to connect transportation policy with climate change and energy goals; some even put in place systems that effectively sabotage these goals.168 The report found that current transportation policy in most states will likely worsen greenhouse gas emission trends in the United States.169 As such, if we want to strive for a better transportation system that can reduce carbon emissions at the same time, state and federal transportation policies cannot work at odds with carbon reduction efforts.170 Otherwise, states are at risk both environmentally and economically.171 Keeping these perspectives in mind, both direct and indirect economic and environmental benefits of HSR represent an important convergence of policy objectives and an opportunity to shift the terms of the debate by demonstrating how a transformative, large-scale infrastructure project would contribute favorably to both desired outcomes. A project's positive economic impact
deserves a more thorough analysis and understanding by not only regional planners and policymakers but also the public at large. While many of the states planning for HSR systems have run out of highway capacity and have seen their mobility almost completely diminish, creative solutions still exist; but they require ingenuity, flexibility, prospective outlook and, most importantly, political will to overcome the financial hesitancies. In order to gain and maintain political will, the HSR projects will need to develop a visionary strategy. The projects will also need to form collaborative partnerships with the business, environmental, and community leaders who will come forward in support of the goal. For example, a project will need to select a particular technology for use on its routes. Many factors will play a role in this decision, since maglev and steel wheel technology present different positives and negatives to each set of circumstances. Often, the steel wheel technology receives more consideration over maglev due to its ability to operate on existing track; however, the present rail infrastructure owned by the freight railways will not allow for the higher speeds. The existing track will need upgrades in order to allow for the equivalent speeds of the maglev system, which will erase many of the steel wheel advantages of using the existing infrastructure. With this premise in mind, the amount of development surrounding the rail line will shape the technological approach. Because the maglev system requires a dedicated guideway, the installation of track within less developed regions of the country or where more wide-open spaces occur correlates very similarly to that of the steel wheel technology, making the two options comparable. However, the steel wheel approach fits better within an urban setting since it can utilize existing rail infrastructure with minimal retrofitting needs, albeit at a much slower speed. In other situations where geography plays a role, the additional infrastructure requirements may produce a different analysis. For instance, some parts of the country can benefit from maglev's ability to overcome mountain passes with little need for additional infrastructure like tunnels, while the terrain in other areas can utilize steel wheel technology because of its more level geography.172 Accordingly, the country's diversity on both urban and rural settings in conjunction with its geographic variety demonstrates that neither technology provides a superior choice in all settings. Furthermore, the ROW issues will also present a hurdle to HSR projects not associated with Amtrak. Because Amtrak chose to indemnify the track owners for possible torts claims, a nongovernmental project choosing to utilize existing freight track will need to overcome this precedent while securing access and possibly the right to upgrade and maintain a better quality of rail line infrastructure. A project will also need to either obtain new ROWs where possible or share track with existing infrastructure in other locations to fulfill its high-speed mission. As such, both of these hurdles provide significant concerns towards accomplishing the HSR goal, but the financial model used to operate the HSR can resolve many of these economic issues associated with ROW. Finally, the concluded Stage 1 NEPA analysis in both the southeast and California-Nevada corridors opted for HSR instead of other choices like improving highways and airports or taking little to no action.173 The fact that

two independent macro-level studies for different projects concluded that HSR offered a better solution over the traditional highway and aviation solutions shows the strength of the overall benefits provided by HSR on both the transportation and environmental aspects. Thus, the missing element to successfully implementing HSR across the country comes from a lack of political will in Congress and at the state level to foster the appropriate setting, since most, if not all, of the identifiable obstacles can be remedied in the
comprehensive operating plan and on a financial level.

VII. Conclusion
With the foregoing in mind, none of the issues outlined are insurmountable to accomplish the goal of bringing HSR to the United States. However, HSR will not occur in this country if the different levels of government do not start to align their transportation, environmental, and economic policies into a unified direction. Unfortunately, few of the enumerated benefits will occur if transit budgets remain slashed and if states continue to lack a nexus between their transportation, environmental, and economic policies. An HSR system will not reach its potential if rail feeder buses and light and commuter rail services are abandoned. If our leaders are sincere about implementing climate change initiatives, transit should be recognized as the most essential component lending to the reduction of greenhouse gas emissions instead of treated as a mere afterthought. In practical terms, adequate funding must be preserved to promote all modes of public transportation. To this end, the foundational elements that justify HSR's existence need continued support by all levels of government. In order to successfully implement an HSR system in this nation, the many opponents will need proof that HSR is a system that not only can be built in a sustainable, responsible, and efficient manner but also follows the environmental guidelines of NEPA and relevant state laws while lowering travel times, increasing mobility, as well as reducing congestion and emissions. Hence, the Obama Administration created the initial momentum to take control of some of the many global warming issues, while pushing for a cleaner energy policy throughout the country by investing in a smarter and greener transportation infrastructure such as HSR that creates multiple benefits simultaneously.

Contention 4 is Pre-empts
War won't happen Deudney 9
(Daniel Deudney, professor of political science at Johns Hopkins, and G. John Ikenberry, professor of international affairs at Princeton, Foreign Affairs, http://www.foreignaffairs.com/articles/63721/daniel-deudney-and-gjohn-ikenberry/the-myth-of-the-autocratic-revival) This bleak outlook is based on an exaggeration of recent developments and ignores powerful countervailing factors and forces. Indeed, contrary to what the revivalists describe, the most striking features of the contemporary international landscape are the intensification of economic globalization, thickening institutions, and shared problems of interdependence. The overall structure of the international system today is quite unlike that of the nineteenth century. Compared to older orders, the contemporary liberal-centered international order provides a set of constraints and opportunities, of pushes and pulls, that reduce the likelihood of severe conflict while creating strong imperatives for cooperative problem solving. Those invoking the nineteenth century as a model for the twenty-first also fail to acknowledge the extent to which war as a path to conflict resolution and great-power expansion has become largely obsolete. Most important, nuclear weapons have transformed great-power war from a routine feature of international politics into an exercise in national suicide. With all of the great powers possessing nuclear weapons and ample means to rapidly expand their deterrent forces, warfare among these states has truly become an option of last resort. The prospect of such great losses has instilled in the great powers a level of caution and restraint that effectively precludes major revisionist efforts. Furthermore, the diffusion of small arms and the near universality of nationalism have severely limited the ability of great powers to conquer and occupy territory inhabited by resisting populations (as Algeria, Vietnam, Afghanistan, and now Iraq have demonstrated). Unlike during the days of empire building in the nineteenth century, states today cannot translate great asymmetries of power into effective territorial control; at most, they can hope for loose hegemonic relationships that require them to give something in return. Also unlike in the nineteenth century, today the density of trade, investment, and production networks across international borders raises even more the costs of war. A Chinese invasion of Taiwan, to take one of the most plausible cases of a future interstate war, would pose for the Chinese communist regime daunting economic costs, both domestic and international. Taken together, these changes in the economy of violence mean that the international system is far more primed for peace than the autocratic revivalists acknowledge.


Nukes prevent war Murdock 8
(Clark Murdock, senior advisor at CSIS, March 2008, "The Department of Defense and the Nuclear Mission in the 21st Century," CSIS, online) From a systemic perspective, nuclear deterrence suppressed the level of violence associated with major power competition: wartime fatalities consumed 2 percent of the world's population in the 1600s and 1700s, about 1 percent in the 1800s, about 1.5 percent in World War I and 2.5 percent in World War II, but about one-tenth during the Cold War (minus the Korean War, which pushed fatalities up to 0.5 percent). A leading practitioner of the art of nuclear deterrence, Sir Michael Quinlan, aptly observed: "Better a world with nuclear weapons but no major war, than one with major war but no nuclear weapons."17 Despite the close calls and the now almost inexplicable buildup of nuclear weapons by the superpowers, the fact remains: nuclear weapons kept the superpower competition from becoming a war. The violence-suppressive effect of nuclear weapons has not gone away with the end of the Cold War. Noted Cold War deterrent theorist and Nobel economics laureate Thomas Schelling told a recent World Economic Forum retreat (according to Thomas Barnett, the Pentagon's favorite futurist) that (1) no state that has developed nuclear weapons has ever been attacked by another state and (2) no state armed with nuclear weapons has ever attacked another state similarly armed.18 With his characteristic flair, Barnett observes that the United States and the Soviet Union learned that nuclear weapons are for having and not using. Due to the equalizing threats of mutually assured destruction, these devices cannot win wars but only prevent them. The same logic has held, all these decades, for powers as diverse as the United Kingdom, France, China, India, Pakistan and Israel, with North Korea stepping up to the plate and Iran on deck. Thus we have survived the democratic bomb and the totalitarian bomb, as well as the capitalist bomb and the communist bomb. In religious terms, we have survived the Christian and atheist bombs, the Confucian and Hindu bombs and the Islamic and Jewish bombs. Somehow, despite all the irrationalities ascribed to each new member, the logic of nuclear deterrence holds fast.19

Even if the worst-case scenario happens, total nuclear war won't cause extinction Nyquist, 1999
[J.R., Is Nuclear War Survivable? May 20, WorldNetDaily.com, http://www.antipas.org/protected_files/news/world/nuclear_war.html] As I write about Russia's nuclear war preparations, I get some interesting mail in response. Some correspondents imagine I am totally ignorant. They point out that nuclear war would cause "nuclear winter," and everyone would die. Since nobody wants to die, nobody would ever start a nuclear war (and nobody would ever seriously prepare for one). Other correspondents suggest I am ignorant of the world-destroying effects of nuclear radiation. I patiently reply to these correspondents that nuclear war would not be the end of the world. I then point to studies showing that "nuclear winter" has no scientific basis, that fallout from a nuclear war would not kill all life on earth. Surprisingly, few of my correspondents are convinced. They prefer apocalyptic myths created by pop scientists, movie producers and journalists. If Dr. Carl Sagan once said "nuclear winter" would follow a nuclear war, then it must be true. If radiation wipes out mankind in a movie, then that's what we can expect in real life. But Carl Sagan was wrong about nuclear winter. And the movie "On the Beach" misled American filmgoers about the effects of fallout. It is time, once and for all, to lay these myths to rest. Nuclear war would not bring about the end of the world, though it would be horribly destructive. The truth is, many prominent physicists have condemned the nuclear winter hypothesis. Nobel laureate Freeman Dyson once said of nuclear winter research, "It's an absolutely atrocious piece of science, but I quite despair of setting the public record straight." Professor Michael McElroy, a Harvard physics professor, also criticized the nuclear winter hypothesis. McElroy said that nuclear winter researchers "stacked the deck" in their study, which was titled "Nuclear Winter: Global Consequences of Multiple Nuclear Explosions" (Science, December 1983). Nuclear winter is the theory that the mass use of nuclear weapons would create enough smoke and dust to blot out the sun, causing a catastrophic drop in global temperatures. According to Carl Sagan, in this situation the earth would freeze. No crops could be grown. Humanity would die of cold and starvation. In truth, natural disasters have frequently produced smoke and dust far greater than those expected from a nuclear war. In 1883 Krakatoa exploded with a blast equivalent to 10,000 one-megaton bombs, a detonation greater than the combined nuclear arsenals of planet earth. The Krakatoa explosion had negligible weather effects. Even more disastrous, going back many thousands of years, a meteor struck Quebec
with the force of 17.5 million one-megaton bombs, creating a crater 63 kilometers in diameter. But the world did not freeze. Life on earth was not extinguished. Consider the views of Professor George Rathjens of MIT, a known antinuclear activist, who said, "Nuclear winter is the worst example of misrepresentation of science to the public in my memory." Also consider Professor Russell Seitz, at Harvard University's Center for International Affairs,
who says that the nuclear winter hypothesis has been discredited. Two researchers, Starley Thompson and Stephen Schneider, debunked the nuclear winter hypothesis in the summer 1986 issue of Foreign Affairs. Thompson and Schneider stated: "the global apocalyptic conclusions of the initial nuclear winter hypothesis can now be relegated to a vanishingly low level of probability." OK, so nuclear winter isn't going to happen. What about nuclear fallout? Wouldn't the radiation from a nuclear war contaminate the whole earth, killing everyone? The short answer is: absolutely not. Nuclear fallout is a problem, but we should not exaggerate its effects. As it happens, there are two types of fallout produced by nuclear detonations. These are: 1) delayed fallout; and 2) short-term fallout. According to researcher Peter V. Pry, "Delayed fallout will not, contrary to popular belief, gradually kill billions of people everywhere in the world." Of course, delayed fallout would increase the number of people dying of lymphatic cancer, leukemia, and cancer of the thyroid. "However," says Pry, "these deaths would probably be far fewer than deaths now resulting from ... smoking, or from automobile accidents." The real hazard in a nuclear war is the short-term fallout. This is a type of fallout created when a nuclear weapon is detonated at ground level. This type of fallout could kill millions of people, depending on the targeting strategy of the attacking country.

But short-term fallout rapidly subsides to safe levels in 13 to 18 days. It is not permanent. People who live outside of the affected areas will be fine. Those in affected areas can survive if they have access to underground shelters. In some areas, staying indoors may even suffice. Contrary to popular misconception, there were no documented deaths from short-term or delayed fallout at either Hiroshima or Nagasaki. These blasts were low airbursts, which produced minimal fallout effects. Today's thermonuclear weapons are even "cleaner." If used in airburst mode, these weapons would produce few (if any) fallout casualties.

HSR infrastructure projects won't go over budget- funding transparency, ease of expediting, and increased economic growth more than offset the costs
CNN, CNN.com staff, U.S. high-speed rail 'myths' debunked, April 13, 2011
Budget overruns? Comment: "You know that these projects (like high-speed rail) never end at or under budget." -- CNN.com user "rothana" Expert response: 'Not true' Puentes at the Brookings Institution: "It's not true to say these projects are always over budget since we have no high speed rail in the country currently." Stimulus package: "Things are different today. The federal government's stimulus package places a tremendous emphasis on making sure every dollar was spent in a transparent way. This kind of transparency is very helpful to prevent enormous overruns..." Expert response: 'The reader is correct' GOP Reps. Mica and Shuster: "The reader is correct -- in the past, many of America's transportation projects have run over cost and over budget." Bureaucracy: "The reason for this can largely be found in the cumbersome manner in which federal transportation projects are advanced. The Transportation and Infrastructure Committee has received testimony that simply adding one federal dollar to a transportation project adds 14 years to the delivery time. This is unacceptable and it inflates project costs unnecessarily." Example: "The federal government needs to learn to do more with less. ... The I-35 W bridge that collapsed over the Mississippi River in Minneapolis in 2007 was contracted to be rebuilt in just 437 days, and actually came in ahead of schedule and under budget. There is no reason we can't expedite the process for other projects around the country." Expert response: Proper spending ensured Under Secretary Kienitz, U.S. Department of Transportation: Economic engines: "These projects help build the economy. According to a study by the U.S. Conference of Mayors, a high-speed rail line to Los Angeles would create as much as $7.6 billion a year in new business sales, producing up to 55,000 new jobs and $3 billion in new wages. In Chicago, high-speed rail would produce up to $6.1 billion in yearly sales, 42,000 new jobs, and $2.5 billion in new wages for workers."


Focus on debt increases is flawed- it rests on a poor economic model and there's no risk of US collapse
STIGLITZ 12 Nobel laureate in Economics and University Professor at Columbia University
[Joseph E. Stiglitz, Stimulating the Economy in an Era of Debt and Deficit, The Economists' Voice, http://www.degruyter.com/view/j/ev March, 2012]

The first priority of the country should be a return to full employment. The underemployment of labor is a massive waste and, more than anything else, jeopardizes our country's future, as the skills of our young get wasted and alienation grows. As the work of Jayadev5 as well as the IMF6 convincingly shows, austerity in America will almost surely weaken growth. Moreover, as the work of Ferguson and Johnson7 shows, we should view with suspicion the claim (e.g. by Rogoff and Reinhart) that exceeding a certain debt-to-GDP ratio will trigger a crash. Even if this notion were true on average, the U.S. is not an average country. It is a reserve currency country, with markets responding to global instability, even when caused by the U.S., by lowering interest rates. The U.S. has managed even bigger deficits. Unlike the countries of Europe, there is no risk that we will not pay what we owe. To put it bluntly, we promise to repay dollars, and we control the printing presses. But a focus on the ratio of debt-to-GDP is simply economic nonsense. No one would judge a firm by looking at its debt alone. Anyone claiming economic expertise would want to look at the balance sheet: assets as well as liabilities. Borrowing to invest is different from borrowing for consumption. The failure of the deficit hawks to realize this is consistent with my earlier conclusion that this debate is not about the size of the deficit, but about the size of the government and the progressivity of the tax system.
