All posts by Garry Peterson

Professor of Environmental Science at the Stockholm Resilience Centre, Stockholm University, Sweden.

Ecology and Development: the MA & MDGs

The economist Jeffrey Sachs, director of the development-oriented Millennium Project, and the ecologist Walt Reid, former director of the Millennium Ecosystem Assessment, have written a joint policy forum in Science (May 19, 2006), Investments Toward Sustainable Development. They note that the two projects broadly agree on the need to integrate ecology and poverty alleviation, and they recommend that the world invest in ecological infrastructure in poor countries and establish a periodic assessment of the benefits that people obtain from ecosystems. They write:

The United Nations (U.N.) Millennium Project and the Millennium Ecosystem Assessment highlighted the centrality of environmental management for poverty reduction and general well-being. Each report emphasized the unsustainability of our current trajectory. Millions of people die each year because of their poverty and extreme vulnerability to droughts, crop failure, lack of safe drinking water, and other environmentally related ills. The desperation of the poor and heedlessness of the rich also exact a toll on future well-being in terms of habitat destruction, species extinction, and climate change.

The goal of the Millennium Project is to develop and to promote practical plans for achieving the U.N. Millennium Development Goals (MDGs) for ending poverty, eradicating hunger, achieving universal primary education, improving health, and restoring a healthy environment. The MA, in turn, examined the consequences of ecosystem change for human well-being and analyzed options for conserving ecosystems while enhancing their contributions to people. The MA and the Millennium Project reached strikingly parallel conclusions:

  1. Environmental degradation is a major barrier to the achievement of the MDGs. The MA examined 24 ecosystem services (the benefits people obtain from ecosystems) and found that productivity of only 4 had been enhanced over the last 50 years, whereas 15 (including capture fisheries, water purification, natural hazard regulation, and regional climate regulation) had been degraded. More than 70% of the 1.1 billion poor people surviving on less than $1 per day live in rural areas, where they are directly dependent on ecosystem services.
  2. Investments in environmental assets and management are vital to cost-effective and equitable strategies to achieve national goals for relief from poverty, hunger, and disease. For example, investments in improved agricultural practices to reduce water pollution can boost coastal fishing industries. Wetlands protection can meet the needs of rural communities while avoiding the costs of expensive flood control infrastructure. Yet these investments are often overlooked.
  3. Reaching environmental goals requires progress in eradicating poverty. More coherent and bolder poverty reduction strategies could ease environmental stresses by slowing population growth and enabling the poor to invest long term in their environment.

We recommend the following measures in 2006. First, we call on the rich donor countries to establish a Millennium Ecosystem Fund to give poor countries the wherewithal to incorporate environmental sustainability into national development strategies. The fund would support work that focuses on how poverty reduction can enhance environmental conservation (e.g., by giving farmers alternatives to slash and burn) and how environmental sustainability can support poverty reduction (e.g., watershed management to maintain clean water supplies). It would also support national ecosystem service assessments to help decision-makers factor the economic and health consequences of changes in ecosystem services into their planning choices.

The fund would initially need roughly $200 million over 5 years. It would enable universities and scientists in dozens of the poorest countries to incorporate the science of environmental sustainability into poverty reduction strategies. The programs would generate evidence for countries to use in setting priorities for national development and environmental investments.

Second, the United Nations should establish a cycle of global assessments modeled on the MA and similar to the climate change reports produced at 4- to 5-year intervals by the Intergovernmental Panel on Climate Change (IPCC). The MA and IPCC cost roughly $20 million, and each mobilized in-kind contributions of that magnitude. A global network of respected ecologists, economists, and social scientists working to bring scientific knowledge to decision-makers and to the public can clarify the state of scientific knowledge, help to mobilize needed research, and defeat the obfuscation led by vested interests.

Paradox of the clumps

Sean Nee and Nick Colegrave comment on the Scheffer and van Nes PNAS paper on the formation and persistence of ecological lumps (see the earlier post by Buzz Holling, and the commentary Discontinuities in ecological data by Craig Allen in PNAS).

Their commentary, Paradox of the clumps (Nature, May 25, 2006), suggests that the ecological clumpiness of species may change how we think about species. They write:

…Scheffer and van Nes have revisited a well-studied classical model of competing species and discovered something new. Even in the absence of any environmental discontinuities, they find that assemblages of species will self-organize into clumps of species with very similar niches within a clump and a large difference between clumps. So, paradoxically, species both do, and do not, organize themselves into discrete niches.

In the Origin of Species, Darwin asked: “Why, if species have descended from other species by fine gradations, do we not everywhere see innumerable transitional forms? Why is not all nature in confusion, instead of the species being, as we see them, well defined?” This evolutionary question has a closely related ecological counterpart: how similar can species be to one another and still coexist? … With a single, continuous niche axis, how many species can you pack along it? Or, is there a limit to how close the species can be along this axis?

Previous analytical results produced single species widely spaced along the niche axis. But Scheffer and van Nes find widely spaced clumps of species occupying very similar niches. Why the difference? Analytical work looks at the long-term equilibria of models, whereas a simulation study allows the system to be observed as it moves towards these equilibria. Scheffer and van Nes take the simulation approach, which starts out with a large number of species along the axis and then evolves the system according to standard equations that govern competition between species. The clumps they observe are transient, and each will ultimately be thinned out to a single species. But ‘ultimately’ can be a very long time indeed: we now know that transient phenomena can be very long-lasting and, hence, important in ecology, and such phenomena can be studied effectively only by simulation. There is also good experimental evidence for long-lasting coexistence between similar species.

The emergence of clumps of highly similar species resonates with a proposed solution to another possible problem: the coexistence of large numbers of species in environments that do not seem to allow for much niche differentiation. Plankton and tropical forest plants are the usual examples. These organisms have a simple set of requirements: light, carbon dioxide and a few nutrients. How is it possible to carve out thousands of distinct niches from so few requirements? It has been proposed that such high numbers of species can coexist precisely because their niches are so similar that exclusion takes a very long time, perhaps on the same timescale as speciation.

We can go further: on what basis did Darwin make his assertion about the discreteness of species? This question is distinct from debates about the definition of species in nature. Blackberries reproduce asexually, and it is impossible to agree on how many ‘species’ there are; but, nonetheless, we all know a ‘blackberry’ when we see one and do not wonder if it is actually a raspberry. Great tits, blue tits and coal tits are all quite distinct when considered as a set, but are surely just more-or-less continuous variants on a tit theme when compared with flamingos. Bacteria that are vastly different genetically are all called Legionella because they clump along the single niche axis that matters to us: they all cause Legionnaires’ disease.

So what is the correct or meaningful frame of reference when thinking about the ecological nature of species? As well as providing stimulating theoretical results, Scheffer and van Nes have revitalized the fundamental question of how we should look at the ecological identity of species.
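The simulation approach described in the excerpt can be illustrated with a minimal sketch of Lotka-Volterra competition along a single niche axis, where the competition coefficient between two species falls off as a Gaussian function of the distance between their niche positions. This is not Scheffer and van Nes's code; the number of species, the niche width, and the integration settings are assumptions chosen for illustration.

```python
import numpy as np

# Minimal sketch of competition along a single niche axis (illustrative assumptions,
# not Scheffer and van Nes's code). Many species start evenly spaced on the axis;
# abundances follow Lotka-Volterra competition in which the competition coefficient
# decays with niche distance (a Gaussian kernel).

n_species = 100                          # assumed number of species
mu = np.linspace(0.0, 1.0, n_species)    # niche positions along one axis
sigma = 0.15                             # assumed niche width
r, K = 0.5, 1.0                          # identical growth rate and carrying capacity

# alpha[i, j]: competitive effect of species j on species i
alpha = np.exp(-((mu[:, None] - mu[None, :]) ** 2) / (2.0 * sigma ** 2))

N = np.full(n_species, 0.01)             # small initial abundances
dt, steps = 0.1, 100_000                 # longer runs sharpen the transient clumps
for _ in range(steps):
    N = np.maximum(N + dt * r * N * (1.0 - alpha @ N / K), 0.0)

# Species that remain relatively abundant tend to cluster into clumps of similar
# niche positions, separated by gaps, before slow competitive exclusion thins them.
print(np.round(mu[N > N.mean()], 2))
```

The point of the sketch is only that clumps emerge as a long-lived transient of the standard competition equations, without any environmental discontinuity being built in.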

Pollution risks Yangtze’s ‘death’

A BBC News article, Pollution risks Yangtze’s ‘death’, briefly describes China’s fears about how large-scale eutrophication and pollution are affecting human wellbeing and economic growth prospects along the Yangtze.

The Yangtze, China’s longest river, is “cancerous” with pollution, reports in the country’s state media have said. Environmental experts fear pollution from untreated agricultural and industrial waste could turn the Yangtze into a “dead river” within five years.

That would make it unable to sustain marine life or provide drinking water to the booming cities along its banks.

The Yangtze rises in China’s western mountains and passes through some of its most densely populated areas.

The government has promised to clean up the Yangtze, which supplies water to almost 200 cities along its banks.

But experts speaking in China’s state media say that unless action is taken quickly, billions of tonnes of untreated industrial and agricultural waste and sewage are likely to kill what remains of the river’s plant and wildlife species within five years.

China’s rapid economic development means that many of the nation’s waterways are facing similar problems.

Last year the authorities announced that the country’s second-longest river, the Yellow River, was so polluted that it was not safe for drinking.

Correspondents say that 300 million people in China do not have access to safe drinking water.

They say that government efforts to clean up the country’s polluted lakes and waterways are being thwarted by lax enforcement standards.

Designing Resilient Software?

Daniel Jackson has written an article, Dependable Software by Design, on how software design tools can be used to improve the resilience of software. In Scientific American he wrote:

An architectural marvel when it opened 11 years ago, the new Denver International Airport’s high-tech jewel was to be its automated baggage handler. It would autonomously route luggage around 26 miles of conveyors for rapid, seamless delivery to planes and passengers. But software problems dogged the system, delaying the airport’s opening by 16 months and adding hundreds of millions of dollars in cost overruns. Despite years of tweaking, it never ran reliably. Last summer airport managers finally pulled the plug–reverting to traditional manually loaded baggage carts and tugs with human drivers. The mechanized handler’s designer, BAE Automated Systems, was liquidated, and United Airlines, its principal user, slipped into bankruptcy, in part because of the mess.

…Such massive failures occur because crucial design flaws are discovered too late. Only after programmers begin building the code–the instructions a computer uses to execute a program–do they discover the inadequacy of their designs. Sometimes a fatal inconsistency or omission is at fault, but more often the overall design is vague and poorly thought out. As the code grows with the addition of piecemeal fixes, a detailed design structure indeed emerges–but it is a design full of special cases and loopholes, without coherent principles. As in a building, when the software’s foundation is unsound, the resulting structure is unstable.

…Now a new generation of software design tools is emerging. Their analysis engines are similar in principle to tools that engineers increasingly use to check computer hardware designs. A developer models a software design using a high-level (summary) coding notation and then applies a tool that explores billions of possible executions of the system, looking for unusual conditions that would cause it to behave in an unexpected way. This process catches subtle flaws in the design before it is even coded, but more important, it results in a design that is precise, robust and thoroughly exercised. One example of such a tool is Alloy, which my research group and I constructed. Alloy (which is freely available on the Web) has proved useful in applications as varied as avionics software, telephony, cryptographic systems and the design of machines used in cancer therapy.
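The kind of exhaustive design analysis described above can be sketched in miniature: enumerate every reachable state of a small design model and check an invariant against each one, reporting a counterexample trace when it fails. The toy checker below is only meant to convey the idea; it is written in Python rather than Alloy, and the two-process lock design and its invariant are invented for illustration.

```python
from collections import deque

# Toy exhaustive design checker (illustrative only; this is not Alloy). The "design"
# is a naive two-process lock: each process first observes the flag as free, then sets
# it and enters. Exploring every interleaving exposes the race that spot-testing a few
# executions can miss.

# A state is (pc0, pc1, flag): pc is 0 = idle, 1 = saw flag free, 2 = in critical section.
INIT = (0, 0, False)

def step(state):
    """All states reachable by letting either process take one step."""
    successors = []
    for p in (0, 1):
        pc, flag = list(state[:2]), state[2]
        if pc[p] == 0 and not flag:
            pc[p] = 1                      # observed the flag as free
        elif pc[p] == 1:
            pc[p], flag = 2, True          # set the flag and enter (without re-checking it)
        elif pc[p] == 2:
            pc[p], flag = 0, False         # leave and clear the flag
        else:
            continue                       # idle process sees the flag held: it waits
        successors.append((pc[0], pc[1], flag))
    return successors

def check(init, step_fn, invariant):
    """Breadth-first search over every reachable state; return a violating trace, if any."""
    seen, queue = {init}, deque([[init]])
    while queue:
        trace = queue.popleft()
        if not invariant(trace[-1]):
            return trace
        for nxt in step_fn(trace[-1]):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(trace + [nxt])
    return None

mutual_exclusion = lambda s: not (s[0] == 2 and s[1] == 2)
# Prints an interleaving that ends with both processes in the critical section.
print(check(INIT, step, mutual_exclusion))
```

Real design analyzers work over far richer notations and vastly larger state spaces, but the underlying idea, checking the design itself rather than a handful of test runs, is the same.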

Daniel Jackson has also recently written a book Software Abstractions: Logic, Language, and Analysis. The book’s website describes the book:

Daniel Jackson introduces a new approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach–which Jackson calls “lightweight formal methods” or “agile modeling”–takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from “the tarpit of implementation technologies” and return them to thinking deeply about underlying concepts.

via ThreeQuarksDaily

Ecology for Transformation

David Zaks and Chad Monfreda write on Worldchanging about Steve Carpenter and Carl Folke‘s 2006 paper, Ecology for Transformation, in Trends in Ecology and Evolution (doi.org/10.1016/j.tree.2006.02.007):

Ecology has long been a descriptive science with real but limited links to the policy community. A new science of ecology, however, is emerging to forge the collaborations with social scientists and decision makers needed for a bright green future. Stephen Carpenter and Carl Folke outline a vision for the future of ecology in their recent article, Ecology for Transformation. You need a subscription to access the full article, so we’ll quote them at length:

“Scenarios with positive visions are quite different from projections of environmental disaster. Doom-and-gloom predictions are sometimes needed, and they might sell newspapers, but they do little to inspire people or to evoke proactive forward-looking steps toward a better world. Transformation requires evocative visions of better worlds to compare and evaluate the diverse alternatives available to us … Although we cannot predict the future, we have much to decide. Better decisions start from better visions, and such visions need ecological perspectives.”

Ecology for Transformation offers the perspective of resilient social-ecological systems. Simply put, it recognizes that ecosystems and human society are interdependent, and that they need the capacity to withstand and adapt to an increasingly bumpy future.

Examples of resilient social-ecological systems abound in all kinds of notoriously difficult-to-manage areas, like natural disaster response and rangeland management. Resilience sounds great, but how do we get there? Fortunately, Carpenter and Folke offer a theoretically robust three-part transformative framework:

1. Diversity
2. Environmentally sound technology
3. Adaptive governance

Diversity constitutes the raw material we can draw from to create effective technologies and institutions. It reflects the wealth of genetic and memetic resources at our disposal, in the form of biodiversity, landscapes, cultures, ideas, and economic livelihoods. We need to foster diversity as an insurance package for hard times because…

“…crisis can create opportunities for reorganizing the relationships of society to ecosystems. At such times, barriers to action might break down, if only for a short time, and new approaches have a chance to change the direction of ecosystem management. To succeed, a particular approach or vision must be well-formed by the time the crisis arises, because the opportunity for change might be short-lived.”

Environmentally sound technology ranges from incremental advancements in energy efficiency to innovative economic tools like natural capital valuation and markets for ecosystem services. Diversity and technology should sound familiar enough to WorldChanging readers. Ecology for transformation, however, goes on to challenge us to engage in adaptive governance that recognizes the reality of constant change. The authors define adaptive governance as:

“Institutional and political frameworks designed to adapt to changing relationships between society and ecosystems in ways that sustain ecosystem services; expands the focus from adaptive management of ecosystems to address the broader social contexts that enable ecosystem based management.”

Governance is much broader than what we normally think of as government and encompasses all of the actors who shape the way we work, live, and interact. Communication across various scales, from individuals to institutions, is vital for effective governance. Many of the management and governance structures currently in place are static, but an ‘adaptive’ approach promises more sustainable outcomes by negotiating uncertainty and change.

Steve Carpenter, WA Brock, and I addressed the issue of how scientists can encourage transformations by creating new management models in our 2003 paper Uncertainty and the management of multistate ecosystems: an apparently rational route to collapse (Ecology 84(6): 1403-1411). We wrote:

…scientists can contribute to broadening the worldview of ecosystem management in at least three ways.

(1) Scientists can point out that uncertainty is a property of the set of models under consideration. This set of models is a mental construct (even if it depends in part on prior observation of the ecosystem). It therefore depends on attitudes and beliefs that are unrelated to putatively objective information about the ecosystem. Despite this discomforting aspect of uncertainty, it cannot be ignored.

(2) Scientists can help to imagine novel models for how the system might change in the future. There will be cases where such novel models carry non-negligible weight in decision, for example when the costs of collapse are high. The consequences of candidate policies can be examined under models with very different implications for ecosystem behavior. Such explorations of the robustness of policies can be carried out when model uncertainty is quite high or even unknown, for example in scenario analysis.

(3) Scientists can point out the value of safe, informative experiments to test models beyond the range of available data. In the model presented here, fossilization of beliefs follows from fixation on policies that do not reveal the full dynamic potential of the ecosystem, leading to the underestimation of model uncertainty. Experimentation at scales appropriate for testing alternative models for ecosystem behavior is one way out of the trap. Of course, large-scale experiments on ecosystems that support human well-being must be approached with caution. Nevertheless, in situations where surprising and unfavorable ecosystem dynamics are possible, it may be valuable to experiment with innovative practices that could reinforce desirable ecosystem states.

I think our second point, the need for creative synthesis, is not emphasized enough in science, which tends to focus on testing existing models. Ecological governance needs new ways of thinking about nature that are useful in governance situations. The creation of novel, practical models is a vital part of connecting science to policy and action. Without practical models, people are unable to develop desirable policy or effective actions.
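That second point, examining candidate policies under structurally different models, can be sketched very simply: simulate each policy under each alternative model of the ecosystem and compare its worst-case as well as its expected outcome, rather than optimizing against a single favoured model. The two toy lake-style models, their parameter values, and the policies below are assumptions for illustration, not the models from the 2003 paper.

```python
# Sketch of checking policy robustness across alternative ecosystem models (the
# models, parameters, and policies here are illustrative assumptions, not those of
# Carpenter, Brock and Peterson 2003). Each model maps a nutrient-loading policy to
# a long-run phosphorus level; one model is smoothly reversible, the other has a
# regime-shift threshold that triggers internal recycling.

def reversible_model(load, years=50):
    P = 0.5
    for _ in range(years):
        P = P + load - 0.3 * P                       # inputs minus first-order losses
    return P

def threshold_model(load, years=50, threshold=2.0):
    P = 0.5
    for _ in range(years):
        recycling = 0.6 if P > threshold else 0.0    # internal loading after a shift
        P = P + load + recycling - 0.3 * P
    return P

models = {"reversible": reversible_model, "threshold": threshold_model}
policies = {"high loading": 0.8, "moderate loading": 0.5, "precautionary": 0.3}

# A policy that looks fine under the favoured model can fail badly under a rival one,
# so compare each policy's worst case across the whole model set.
for name, load in policies.items():
    outcomes = {m: round(f(load), 2) for m, f in models.items()}
    print(name, outcomes, "worst case:", max(outcomes.values()))
```

In this toy setting the high-loading policy looks acceptable under the reversible model but drives the threshold model into a degraded state, which is exactly the kind of contrast that scenario-style exploration of novel models is meant to reveal.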

Mapping global soil degradation

UNEP global map of soil degradation

UNEP provides access to a number of global environmental datasets via the GEO data portal. For example, the Global Assessment of Human Induced Soil Degradation database was created in 1990 by ISRIC. The database contains information on soil degradation within map units reported by a survey of soil experts from around the world. It includes the type, degree, extent, cause, and rate of soil degradation.
The main causes of degradation are (in order) overgrazing, deforestation, and agricultural mismanagement. Each of these is responsible for a bit less than a third of total degraded area. Other causes of degradation make up less than 8% of the total degraded area. Deforestation dominates in Asia, while overgrazing is the main driver in Africa and Australia.
For details of the assessment see:

Oldeman LR, Hakkeling RTA and Sombroek WG (1991). World Map of the Status of Human-Induced Soil Degradation: An Explanatory Note (rev. ed.). UNEP and ISRIC, Wageningen. (pdf)

Women’s work and the global economy

The April 12th Economist has an article on Women and the world economy

… it is misleading to talk of women’s “entry” into the workforce. Besides formal employment, women have always worked in the home, looking after children, cleaning or cooking, but because this is unpaid, it is not counted in the official statistics. To some extent, the increase in female paid employment has meant fewer hours of unpaid housework. However, the value of housework has fallen by much less than the time spent on it, because of the increased productivity afforded by dishwashers, washing machines and so forth. Paid nannies and cleaners employed by working women now also do some work that used to belong in the non-market economy.

Nevertheless, most working women are still responsible for the bulk of chores in their homes. In developed economies, women produce just under 40% of official GDP. But if the worth of housework is added (valuing the hours worked at the average wage rates of a home help or a nanny) then women probably produce slightly more than half of total output.

The increase in female employment has also accounted for a big chunk of global growth in recent decades. GDP growth can come from three sources: employing more people; using more capital per worker; or an increase in the productivity of labour and capital due to new technology, say. Since 1970 women have filled two new jobs for every one taken by a man. Back-of-the-envelope calculations suggest that the employment of extra women has not only added more to GDP than new jobs for men but has also chipped in more than either capital investment or increased productivity. Carve up the world’s economic growth a different way and another surprising conclusion emerges: over the past decade or so, the increased employment of women in developed economies has contributed much more to global growth than China has.

In particular, there is strong evidence that educating girls boosts prosperity. It is probably the single best investment that can be made in the developing world. Not only are better educated women more productive, but they raise healthier, better educated children. There is huge potential to raise income per head in developing countries, where fewer girls go to school than boys. More than two-thirds of the world’s illiterate adults are women.

More jobs, more babies (chart from the Economist, April 12, 2006)

It is sometimes argued that it is shortsighted to get more women into paid employment. The more women go out to work, it is said, the fewer children there will be and the lower growth will be in the long run. Yet the facts suggest otherwise. Chart 3 shows that countries with high female labour participation rates, such as Sweden, tend to have higher fertility rates than Germany, Italy and Japan, where fewer women work. Indeed, the decline in fertility has been greatest in several countries where female employment is low.
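The three sources of growth listed in the excerpt correspond to the standard growth-accounting decomposition. As a rough sketch (my notation, not the Economist's), with a Cobb-Douglas production function:

```latex
Y = A\,K^{\alpha} L^{1-\alpha}
\quad\Longrightarrow\quad
\frac{\Delta Y}{Y} \approx \frac{\Delta A}{A} + \alpha\,\frac{\Delta K}{K} + (1-\alpha)\,\frac{\Delta L}{L}
```

Here L is employment (the channel through which rising female participation raises output), K is the capital stock, A is total factor productivity, and alpha is capital's share of income. The article's claim is essentially that the ΔL/L term, driven by women entering paid work, has contributed more to growth than the other two.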

via Three Quarks

Hurricanes, Risk Models, and Insurance

Roger Pielke Jr has an interesting post, Are We Seeing the End of Hurricane Insurability?, on the Prometheus weblog. The insurance industry uses models of expected losses to set rates for catastrophic losses from events such as hurricanes. However, the models are proprietary and not open to public evaluation. Now consumer groups are attacking the providers of “catastrophe models”, arguing that these models’ main purpose is to justify increases in insurance rates.

In the past consumer groups have argued:

Consumers were told that, after the big price increases in the wake of Hurricane Andrew, they would see price stability. This was because the projections were not based on short-term weather history, as they had been in the past, but on very long-term data from 10,000 to 100,000 years of projected experience. The rate requests at the time were based upon the average of these long-range projections. Decades with no hurricane activity were assessed in the projections as were decades of severe hurricane activity, as most weather experts agree we are experiencing now. Small storms predominated, but there were projections of huge, category 5 hurricanes hitting Miami or New York as well, causing hundreds of billions of dollars in damage. Consumers were assured that, although hurricane activity was cyclical, they would not see significant price decreases during periods of little or no hurricane activity, nor price increases during periods of frequent activity. That promise has now been broken.

While the catastrophe modelling firms argue:

Given a constant climatological state (or if annual variations from that state are short lived and unpredictable) the activity rate in a catastrophe model can best be represented as the average of long-term history. In this situation there is no need to characterize the period over which the activity is considered to apply because, with current knowledge, it is expected that rate will continue indefinitely. The assumption that activity remains consistent breaks down, however, where there are either multi-year fluctuations in activity or persistent trends. It then becomes necessary to characterize the time period over which the activity in the Cat model is intended to apply.

Pielke argues that the disaster modellers are implying

…that the historical climatology of hurricane activity is no longer a valid basis for estimating future risks. This means that the catastrophe models that they provide are untethered from experience. Imagine if you are playing a game of poker, and the dealer tells you that the composition of the deck has been completely changed – now you don’t know whether there are 4 aces in the deck or 20. It would make gambling based on probabilities a pretty dodgy exercise. If RMS [Risk Management Solutions – a catastrophe modelling company] is correct, then it has planted the seed that has potential to completely transform its business and the modern insurance and reinsurance industries.

What happens if history is no longer a guide to the future? One answer is that you set your expectations about the future based on factors other than experience. One such approach is to ask the relevant experts what they expect. This is what RMS did last fall, convening Kerry Emanuel, Tom Knutson, Jim Elsner, and Mark Saunders in order to conduct an “expert elicitation”.

… RMS conducted its elicitation October, 2005 with the intent that it will shape its risk estimates for the next 5 years. This is wholly unrealistic in such a fast moving area of science. It is unlikely that the perspectives elicited from these 4 scientists will characterize the views of the relevant community (or even their own views!) over the next five years as further research is published and hurricane seasons unfold. Because RMS has changed from a historical approach to defining risk, which changes very, very slowly, if at all over time, to an expert-focused approach, it should fully expect to see very large changes in expert views as science evolves. This is a recipe for price instability, exactly the opposite from what the consumer groups, and insurance commissioners, want.

From the perspective of the basic functioning of the insurance and reinsurance industries, the change in approach by RMS is an admission that the future is far more uncertain than has been the norm for this community. Such uncertainty may call into question the very basis of hurricane insurance and reinsurance which lies in an ability to quantify and anticipate risks. If the industry can’t anticipate risks, or simply come to a consensus on how to calculate risks (even if inaccurate), then this removes one of the key characteristics of successful insurance. Debate on this issue has only just begun.

Hedging one’s bets with insurance is a good strategy for dealing with risk, where known outcomes are expected to occur with some known probability. However, when confronting more uncertain situations, other approaches are needed, such as building resilience to potential classes of shock, engaging in experimental management to decrease uncertainty, and accelerating learning by integrating sources of knowledge across a wider variety of domains (e.g. meteorology, ecology, and urban planning) and regions (e.g. Sri Lanka, the Netherlands, and New Orleans).
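To make the pricing tension concrete, here is a toy expected-loss calculation contrasting a long-run historical event rate with a higher rate assumed for a presumed active period. The frequencies and loss parameters are invented for illustration and have nothing to do with RMS's actual model.

```python
import numpy as np

# Toy expected-annual-loss calculation for catastrophe pricing (all numbers invented).
# A rate averaged over the full historical record gives one premium basis; a higher
# rate assumed for a presumed "active period" gives another.

rng = np.random.default_rng(0)

def expected_annual_loss(rate_per_year, n_years=50_000):
    """Simulate Poisson landfall counts and lognormal per-event losses, then average."""
    counts = rng.poisson(rate_per_year, n_years)
    yearly = np.array([rng.lognormal(mean=2.0, sigma=1.0, size=c).sum() for c in counts])
    return yearly.mean()

long_run_rate = 0.6   # landfalls per year over the whole record (assumed)
active_rate = 0.9     # rate assumed to hold for the next five years (assumed)

print("long-run premium basis:     ", round(expected_annual_loss(long_run_rate), 2))
print("active-period premium basis:", round(expected_annual_loss(active_rate), 2))
```

Because the expected loss scales with the assumed event rate, switching the rate basis from long-run history to a shorter expert-elicited window moves premiums directly, which is why the choice of time horizon is at the heart of the dispute Pielke describes.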

ATEAM: Modelling ecosystem services

Worldchanging guest writers David Zaks and Chad Monfreda, from the Center for Sustainability and the Global Environment at the University of Wisconsin, have a post, ATEAM: Mr. T takes on ecosystem services, about a project to model ecosystem services in Europe.

The ATEAM (Advanced Terrestrial Ecosystem Analysis and Modelling) project (also here and here) is not made up of rogue soldiers of fortune, but of academics in Europe. The scientific assessment correlates changes in human well-being with future changes in climate and land use. Researchers combined global climate models and land-use scenarios using innovative interdisciplinary methods to show how ecosystem goods and services are likely to change through the 21st century in Europe. ATEAM paints a mixed picture of a continent divided into a vulnerable south and an adaptive north. The results are freely available online as a downloadable (PC only) mapping tool that displays the vulnerability of six key sectors: agriculture, forestry, carbon storage, energy, water, and biodiversity.

Stakeholder input helped to quantify regional adaptive capacity, while climate and land-use models estimated potential impacts. Adaptive capacity and potential impacts together define the overall vulnerability of individual ecosystem services. Even when ‘potential impacts’ are fixed, differential vulnerability across Europe indicates an opportunity to boost ‘adaptive capacity’. Emphasis on adaptation certainly doesn’t condone inaction on climate change and environmental degradation. Rather, it stresses resilience in a world that must prepare for surprise threats that are increasingly the norm.

ATEAM is a wonderful example of sustainability science that lets people imagine the possible futures being shaped through decisions taken today. Integrated assessments like ATEAM and the MA (also here) have a huge potential to create a sustainable biosphere by offering solutions that are at once technical and social. Combined with many ideas that WC readers are already familiar with—planetary extension of real-time monitoring networks, open source scenario building, and pervasive citizen participation—the next generation of assessments could help tip the meaning of ‘global change’ from gloomy to bright green.
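A minimal sketch of the vulnerability framing described above, in which potential impacts are discounted by adaptive capacity; the regions, scores, and combination rule are assumptions for illustration, not ATEAM's actual indicators.

```python
# Sketch of the vulnerability framing used by assessments like ATEAM (the regions,
# scores, and the combination rule are illustrative assumptions, not ATEAM's method).

def vulnerability(potential_impact, adaptive_capacity):
    """Higher potential impact raises vulnerability; higher adaptive capacity lowers it (0-1 scales)."""
    return potential_impact * (1.0 - adaptive_capacity)

regions = {
    # region: (potential impact of climate and land-use change, adaptive capacity)
    "Mediterranean south": (0.8, 0.4),
    "Atlantic north": (0.6, 0.8),
}

for name, (impact, capacity) in regions.items():
    print(name, "vulnerability:", round(vulnerability(impact, capacity), 2))
```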

Climate change, smoking, and gaming mental models

The environmental journalist Mark Hertsgaard has an article about climate change politics and journalism in Vanity Fair (May 2006) that shows how climate denial involved many of the same people who worked to deny the health impacts of smoking.

Although scientists apply the neutral term “climate change” to all of these phenomena, “climate chaos” better conveys the abrupt, interconnected, wide-ranging consequences that lie in store. “It’s a very appropriate term for the layperson,” says Schellnhuber, a physicist who specializes in chaos theory. “I keep telling politicians that I’m not so concerned about a gradual climate change that may force farmers in Great Britain to plant different crops. I’m worried about triggering positive feedbacks that, in the worst case, could kick off some type of runaway greenhouse dynamics.”…

No one pretends that phasing out carbon-based fuels will be easy. The momentum of the climate system means that “a certain amount of pain is inevitable,” says Michael Oppenheimer. “But we still have a choice between pain and disaster.”

Unfortunately, we are getting a late start, which is something of a puzzle. The threat of global warming has been recognized at the highest levels of government for more than 25 years. Former president Jimmy Carter highlighted it in 1980, and Al Gore championed it in Congress throughout the 1980s. Margaret Thatcher, the arch-conservative prime minister of Britain from 1979 to 1990, delivered some of the hardest-hitting speeches ever given on climate change. But progress stalled in the 1990s, even as Gore was elected vice president and the scientific case grew definitive. It turned out there were powerful pockets of resistance to tackling this problem, and they put up a hell of a fight.

In the 1970s and 1980s, R. J. Reynolds Industries, Inc. funded medical research with the help of Dr. Frederick Seitz, a former president of the National Academy of Sciences, to the tune of $45 million. However, the research focused on issues other than the health effects of smoking, which was the central health concern for Reynolds. Seitz admitted that they were not allowed to study the health effects of smoking. Despite this, the tobacco industry used the multi-million-dollar research program as evidence of its commitment to science, and to argue that the evidence on the health effects of smoking was inconclusive.

In the 1990s, Seitz began arguing that the science behind global warming was likewise inconclusive and certainly didn’t warrant imposing mandatory limits on greenhouse-gas emissions. He made his case vocally, trashing the integrity of a 1995 I.P.C.C. report on the op-ed page of The Wall Street Journal, signing a letter to the Clinton administration accusing it of misrepresenting the science, and authoring a paper which said that global warming and ozone depletion were exaggerated threats devised by environmentalists and unscrupulous scientists pushing a political agenda. In that same paper, Seitz asserted that secondhand smoke posed no real health risks, an opinion he repeats in our interview. “I just can’t believe it’s that bad,” he says.

Al Gore and others have said, but generally without offering evidence, that the people who deny the dangers of climate change are like the tobacco executives who denied the dangers of smoking. The example of Frederick Seitz, described here in full for the first time, shows that the two camps overlap in ways that are quite literal—and lucrative. Seitz earned approximately $585,000 for his consulting work for R. J. Reynolds, according to company documents unearthed by researchers for the Greenpeace Web site ExxonSecrets.org and confirmed by Seitz. Meanwhile, during the years he consulted for Reynolds, Seitz continued to draw a salary as president emeritus at Rockefeller University, an institution founded in 1901 and subsidized with profits from Standard Oil, the predecessor corporation of ExxonMobil.

Seitz was the highest-ranking scientist among a band of doubters who, beginning in the early 1990s, resolutely disputed suggestions that climate change was a real and present danger. As a former president of the National Academy of Sciences (from 1962 to 1969) and a winner of the National Medal of Science, Seitz gave such objections instant credibility. Richard Lindzen, a professor of meteorology at M.I.T., was another high-profile scientist who consistently denigrated the case for global warming. But most of the public argument was carried by lesser scientists and, above all, by lobbyists and paid spokesmen for the Global Climate Coalition. Created and funded by the energy and auto industries, the Coalition spent millions of dollars spreading the message that global warming was an uncertain threat. Journalist Ross Gelbspan exposed the corporate campaign in his 1997 book, The Heat Is On, which quoted a 1991 strategy memo: the goal was to “reposition global warming as theory rather than fact.”

“Not trivial” is how Seitz reckons the influence he and fellow skeptics have had, and their critics agree. The effect on media coverage was striking, according to Bill McKibben, who in 1989 published the first major popular book on global warming, The End of Nature. Introducing the 10th-anniversary edition, in 1999, McKibben noted that virtually every week over the past decade studies had appeared in scientific publications painting an ever more alarming picture of the global-warming threat. Most news reports, on the other hand, “seem to be coming from some other planet.”

The deniers’ arguments were frequently cited in Washington policy debates. Their most important legislative victory was the Senate’s 95-to-0 vote in 1997 to oppose U.S. participation in any international agreement—i.e., the Kyoto Protocol—that imposed mandatory greenhouse-gas reductions on the U.S.

Tim Lambert on Deltoid provides some further background on the funding of smoking and climate denialists, as documented in tobacco industry documents.