Is science necessary anymore for defining and solving true environmental problems? President Bush’s plan to semi-combat carbon dioxide is unworthy of his principled stand last year against the Kyoto Protocol, the international agreement to limit emissions by the United States of carbon dioxide and other greenhouse gases that are feared to cause global warming. The mandated limits on greenhouse gas emissions would, if implemented, devastate the U.S. economy, because they amount to steep de facto cuts in energy use. As a result, U.S. jobs would be destroyed in the coal, steel, petroleum, aluminum, cement, and other energy-intensive industries. Those jobs would flee overseas, to countries exempt from making any emission cuts, such as China, India, and Mexico.

The Energy Information Administration forecast a loss of U.S. Gross Domestic Product of roughly $300 billion per year resulting from the mandated emission cuts. Over a decade, that GDP loss would accumulate to roughly three times the economic impact of the Great Depression. Yet all this would have little meaningful effect on the concentration of carbon dioxide in the air, because of the massive worldwide exemptions from emission reductions. Worse, the loss to the U.S. economy would echo loudly around the world. Most hurt would be the struggling poor and elderly. Economically developing nations would lose a major trading partner – and with it one important avenue toward improving those economies. Moreover, U.S. aid and military support would diminish, encouraging thugs to prey further on the innocent.

Bush rejected the Protocol on the grounds of its futility in significantly reducing the air’s carbon dioxide content and the inhumanity of its guarantee of economic havoc. Most important, science rejects Kyoto’s mandates.
The new White House plan, while increasing federal outlays for research on the human causes of global climate change, encourages voluntary emission reductions, most significantly of carbon dioxide and methane. But bureaucratically institutionalizing volunteerism and slow-downs on energy use, however soft, would harm U.S. energy, national security and economic policy, especially when science says that the risk of human-made global warming remains, at present, vanishingly small.

For example, the White House plan states that targeting methane is a potentially cost-effective mitigation strategy because, although methane is a smaller component of human-made greenhouse gas emissions, its warming effect per molecule is roughly 20 times that of carbon dioxide, the most prevalent greenhouse gas emitted by human activities. In terms of emission reductions, methane may be, strictly speaking, a cost-effective target. But here is where science kicks in: while worldwide methane emissions rose over the past decade, the growth of methane’s concentration in the air dropped. Presumably, a natural sink of methane is absorbing the human-made emission. Thus, spending effort and resources to deter methane emission will do little to influence the methane content of the air – which must be governed by large, natural factors that remain a challenge to scientific understanding.

More mischievous is the program to encourage voluntary reductions in carbon dioxide emission. R. Glenn Hubbard, Chairman of the President’s Council of Economic Advisers, justifies the need for the program based on “the fact that [human-produced] climate change is a real risk,” presumably based on scientifically vetted results (New York Times, 2/15/02, p. A25). The largely nonspecific White House proposal seeks broad agreements from U.S. companies to pledge programs that “will reduce greenhouse gas intensity of the U.S. 
economy by 18% in the next ten years.” The volunteer program promises “transferable credits” under future climate policy, “as the science justifies.” That phrase, calling for reductions in greenhouse gas emissions “as the science justifies,” is strange. It should read “if the science justifies,” because today’s best climate science already rejects the need for such a program.

While initially promoted as a fuzzy limit, a target for emission cuts has now materialized, along with a promise to “protect and provide” future credits for emission reduction. Those institutionalized promises remove altruism from volunteerism, because a complex bureaucratic infrastructure is mandated to measure, report and study mitigation for an unworkable credit-trading scheme that would pervade every aspect of the United States’ energy-intensive economy. The specter of future emission reduction credits also clouds the future of coal, a major carbon-dioxide emitter that produces roughly 55% of U.S. electricity. The coal industries, and the people who rely on coal, would be the first disadvantaged.

The need to bureaucratize carbon dioxide emission and energy use arises from forecasts of global warming trends made with computer simulations. The forecasts are highly uncertain because the extremely complex simulations behind them are known to be inadequate for making forecasts – on this, climate science is confident. The forecasts fail to meet the fundamental standard of science because they incorrectly predict key trends in observed temperatures over the last several decades – the critical period during which the air’s human-made greenhouse gases have increased sharply. In the layer of air supposedly most sensitive to the added greenhouse gases, the lack of an observable, meaningful human-made warming trend over the last several decades is acute. The science is not uncertain.
Records of surface temperature of the earth, sampled worldwide, show that the 20th century was warmer than the 19th. Is that warming caused by the emission of greenhouse gases to the air from fossil-fuel burning, and is it the harbinger of catastrophic future warming? Science emphatically rejects both notions, for at least two reasons.

First, the surface temperature record of the 20th century displays three distinct phases: a strong natural warming trend prior to 1940, a cooling trend between about 1940 and the late 1970s, and a warming trend since then. Because the large majority of the increase in carbon dioxide emission from human activities took place after 1950, only the recent surface warming could possibly be human-made. That recent warming trend is observed to be on the order of one-tenth degree C per decade, which would accumulate to at most about 1 C over a century. Such variability is on par with natural climate fluctuations, and would hardly be catastrophic.

Second, key temperature measurements say that even that 1 C worldwide surface warming is at most minimally due to human effects, and therefore must have natural causes. Here is the reason: all the climate simulations say that the air’s increased carbon dioxide content must warm the layer of air between altitudes of about 5,000 and 28,000 feet. That warmed layer is then expected, in every computer simulation, to warm the surface. In other words, a human-made global warming trend must be observed in the low air as well as at the surface – and the low layer of air is actually predicted to warm more strongly than the surface. Both satellites and balloons have carried instruments aloft to sense the temperature changes of those layers. In the case of the satellite record, which begins in 1979, there is a globally averaged warming trend of only 0.04 C per decade, which projects to 0.4 C per century. But even that exceedingly small positive trend is probably not the result of human activities.
Owing to the strong pulse of natural warmth caused by the 1997-1998 El Niño, much of even that small trend is likely natural rather than human-made.
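The per-decade trends quoted above extrapolate to century totals by simple multiplication. As a minimal arithmetic check – using only the figures stated in the text, with nothing new assumed – the calculation looks like this:

```python
# Arithmetic check of the trend extrapolations quoted in the text.
# All input figures come from the article itself; nothing here is new data.

satellite_trend = 0.04  # degrees C per decade, satellite record since 1979
surface_trend = 0.1     # degrees C per decade, recent surface warming

DECADES_PER_CENTURY = 10

satellite_per_century = satellite_trend * DECADES_PER_CENTURY  # about 0.4 C
surface_per_century = surface_trend * DECADES_PER_CENTURY      # about 1.0 C

print(f"Satellite trend, extrapolated: {satellite_per_century:.1f} C per century")
print(f"Surface trend, extrapolated: {surface_per_century:.1f} C per century")
```

The extrapolation assumes the observed per-decade rate simply persists for ten decades – the same linear assumption the article itself makes when projecting 0.04 C per decade to 0.4 C per century.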