All posts by Victor Galaz

Visualizations of Adaptive Governance

Inspired by a recent course I taught, and my colleague Garry Peterson’s search for visualizations of social-ecological systems (also here), I found myself looking for illustrations of adaptive governance – that is, modes of governance that play out at multiple levels, and that are able to link institutional dynamics with ecosystem dynamics (see Folke et al. 2005 [PDF]). Here are a few examples of how this has been illustrated in the literature. If you have other examples, please add them in the comment field!

This first one is from Andersson and Ostrom (2008), and their analysis of decentralization of natural resource management, and the need to link these initiatives in a wider polycentric setting.

This second one is from Berkes (2007), and explores institutional linkages at multiple levels, for a conservation project in Guyana.

This illustration is from Hahn et al. (2006), and builds on several articles published about Kristianstad Vattenrike (Sweden).

This beautiful visualization is based on a network analysis by Ernstson et al. (2010) of network governance of urban ecosystems in Stockholm.

And lastly, one illustration from a report [PDF] from the recently completed European project Governance and Ecosystems Management for the CONservation of BIOdiversity (GEM-CON-BIO). The figure shows an analytical framework applied to a range of case studies recently published in PNAS.

Scanning the Internet for Ecological Early Warnings

If Google Flu Trends can, why can’t we? The possibility of mining large amounts of individual reports and local news posted on the Internet for early warning signs of pending epidemic outbreaks has been part of global epidemic governance for quite some time. The question is: could we do the same for ecological crises? A few years ago, some colleagues and I wrote a conceptual piece in Frontiers entitled “Can webcrawlers revolutionize ecological monitoring?” where we elaborated on the issue. Until now, however, the idea hasn’t moved much beyond its conceptual phase. Luckily, analysts and GIS experts at the USDA Forest Service have now begun to test the concept with real-world data. In a new paper entitled “Internet Map Services: New portal for global ecological monitoring, or geodata junkyard?”, Alan Ager and colleagues present initial results from runs with a geodata webcrawler. They report:
At the USDA Forest Service’s Western Wildland Environmental Threat Assessment Center (WWETAC), we are exploring webcrawlers to facilitate wildland threat assessments. The Threat Center was established by Congress in 2005 to facilitate the development of tools and methods for the assessment of multiple interacting threats (wildfire, insects, disease, invasive species, climate change, land use change)
The Threat News Explorer (see image) visualizes some of the results.

However, they also note that
much of the online data is stored in large institutional data warehouses (Natureserve, Geodata.gov, etc.) that have their own catalog and searching systems and are not open to webcrawlers like ours.  In fact, most federal land management agencies do not allow services to their data, but allow downloading and in-house viewers (i.e. FHTET 2006). This policy does not simplify the problem of integrated threat assessments for federal land management agencies.
The group is now developing a more powerful webcrawler. You can find and search the database for geospatial data and maps here. Still a long way to go, it seems, but a very important first step!
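To make the idea a bit more concrete, here is a minimal sketch (entirely my own illustration, not the WWETAC crawler described in the paper) of one heuristic such a geodata webcrawler might use: scanning harvested URLs for patterns that conventionally indicate public map services, such as OGC WMS/WFS requests or ArcGIS REST endpoints.

```python
# Illustrative sketch: classify candidate URLs as likely geospatial map
# services. The URL patterns below are common service conventions
# (OGC WMS/WFS, Esri ArcGIS REST), not endpoints taken from the paper.

GEOSERVICE_HINTS = (
    "service=wms",            # OGC Web Map Service request
    "service=wfs",            # OGC Web Feature Service request
    "/arcgis/rest/services",  # Esri ArcGIS REST services directory
)

def looks_like_map_service(url: str) -> bool:
    """Heuristically decide whether a URL exposes a geodata service."""
    u = url.lower()
    return any(hint in u for hint in GEOSERVICE_HINTS)

def filter_geoservices(urls):
    """Keep only the URLs that appear to be geospatial services."""
    return [u for u in urls if looks_like_map_service(u)]

if __name__ == "__main__":
    candidates = [
        "http://example.org/wms?SERVICE=WMS&REQUEST=GetCapabilities",
        "http://example.org/about.html",
        "http://example.org/ArcGIS/rest/services/fire_risk/MapServer",
    ]
    print(filter_geoservices(candidates))
```

A real crawler would of course also fetch and parse pages to discover these links in the first place, and then probe each candidate service; the point here is only the classification step that separates usable services from the “geodata junkyard”.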

Information and communication technologies in the Anthropocene

UPDATED: Slides from the talks at the end of this blogpost

The use of social media for political mobilization during the political uprisings in Northern Africa and the Middle East in 2010 and 2011; the digital coordination of climate-skeptic networks during “Climategate” in 2010; and the repercussions of hacker attacks on carbon markets in recent years. These are all examples of intriguing phenomena that take place at the interface between rapid information-technological change and the emergence of globally spanning virtual networks.

Exactly how information and communication technologies affect the behavior of actors at multiple scales is, of course, widely debated. The question is: how do we make sense of these changes from a wider resilience perspective?

Some of these discussions took place at the 2011 Resilience conference in Arizona, in a panel convened by us at the Stockholm Resilience Centre with generous support from the International Development Research Centre (IDRC, Canada). Ola Tjornbo from Social Innovation Generation (SIG) at the University of Waterloo explored some of the opportunities, but also the profound challenges, involved in trying to design effective virtual deliberation processes. Ola noted that while several success stories related to crowdsourcing (Wikipedia) and collective intelligence (e.g. Polymath) do exist, we have surprisingly little systematic knowledge of how to design digital decision-making processes that help overcome conflicts of interest related to issues of sustainability. Some of these issues are elaborated by SiG, and you can find videos from an interesting panel on “Open Source Democracy” here.

Richard Taylor from SEI-Oxford presented a rapidly evolving platform for integration and dissemination of knowledge on climate adaptation – weADAPT. This platform combines the strengths of a growing community of climate adaptation experts, a database of ongoing local climate adaptation projects, semantic web technologies, and a Google Earth interface. The visualizations are stunning, and provide an interesting example of how ICTs can be used for scientific communication.

Angelica Ospina from the Centre for Development Informatics at the University of Manchester showcased some ongoing work on mobile technologies and climate adaptation resilience. As Ospina noted, ICTs can provide very tangible support for various features of resilience, ranging from self-organization to learning and flexibility. You can find a working paper by Angelica here.

To summarize: three very different yet complementary perspectives on how ICTs could be harnessed in the Anthropocene: by building new types of virtually supported decision making and collective intelligence processes; linking expert communities and local natural resource management experimentation together; and by exploring the resilience building strengths of decentralized mobile technologies.

Slides from the talks

Victor Galaz (intro)

Ola Tjornbo

Richard Taylor

Controversies around the Social Cost of Carbon

What is the social cost of carbon? That is, the monetary value of the long-term damages done by greenhouse gas emissions? Frank Ackerman from the Stockholm Environment Institute U.S. Center recently gave a fascinating talk at the Stockholm Resilience Centre where he presented the widely used FUND model, an integrated assessment model of climate change that links climate science with economics. According to Ackerman, the interesting aspect of this model is not only that it is commonly cited by policy-makers in the US, but also that some of its basic assumptions lead to quite bizarre results. The policy implications can hardly be overstated.

As Ackerman notes in the TripleCrisis blog:

True or false: Risks of a climate catastrophe can be ignored, even as temperatures rise? The economic impact of climate change is no greater than the increased cost of air conditioning in a warmer future? The ideal temperature for agriculture could be 17°C above historical levels?

All true, according to the increasingly popular FUND model of climate economics. It is one of three models used by the federal government’s Interagency Working Group to estimate the “social cost of carbon” – that is, the monetary value of the long-term damages done by greenhouse gas emissions. According to FUND, as used by the Working Group, the social cost of carbon is a mere $6 per ton of CO2. That translates into $0.06 per gallon of gasoline. Do you believe that a tax of $0.06 per gallon at the gas pump (and equivalent taxes on other fossil fuels) would solve the climate problem and pay for all future climate damages?

I didn’t believe it, either. But the FUND model is growing in acceptance as a standard for evaluation of climate economics. To explain the model’s apparent dismissal of potential harm, I undertook a study of the inner workings of FUND (with the help of an expert in the relevant software language) for E3 Network. Having looked under the hood, I’d say the model needs to be towed back to the shop for a major overhaul.

A working paper that teases out the critique in detail can be found here. To summarize the conclusions for non-economists: the social cost of carbon is way higher than $6 per ton of CO2…
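The per-ton-to-per-gallon conversion quoted above is easy to check. Using a standard emission factor of roughly 8.9 kg of CO2 per US gallon of gasoline burned (the factor is my assumption, not a figure from the post), a quick calculation recovers the cited number:

```python
# Convert FUND's $6 per metric ton of CO2 into dollars per gallon of gasoline.
# Assumed emission factor (not from the post): ~8.9 kg CO2 per US gallon.
SCC_DOLLARS_PER_TON = 6.0   # social cost of carbon, $ per metric ton CO2
KG_CO2_PER_GALLON = 8.9     # kg CO2 emitted per gallon of gasoline burned

dollars_per_kg = SCC_DOLLARS_PER_TON / 1000.0
cost_per_gallon = dollars_per_kg * KG_CO2_PER_GALLON
print(f"${cost_per_gallon:.3f} per gallon")  # prints $0.053 per gallon
```

which, rounded, is the $0.06 per gallon figure cited in the post.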

Does an increased awareness of catastrophic “tipping points”, really trigger political action?

This critical question relates to a suite of resilience-related research fields, ranging from early warnings of catastrophic shifts in ecosystems and non-linear planetary boundaries, to the role of perceived crises as triggers of transformations towards more adaptive forms of ecosystem governance.

The answer might seem quite straightforward: “yes!”. Why wouldn’t political actors try to steer away from potentially devastating tipping points? Political philosopher Stephen M. Gardiner elaborates the opposite position in a very thought-provoking article in the Journal of Social Philosophy (2009) about the moral implications of abrupt climate change.

Planetary Boundaries


According to Gardiner, several economic, psychological and intergenerational dilemmas make it likely that an increased awareness of devastating “tipping points” undermines political actors’ work towards effective climate change mitigation. Instead, it induces them to focus on adaptation measures, and to engage in what Gardiner calls an “Intergenerational Arms Race”.

Suppose, for example, that a given generation knew that it would be hit with a catastrophic abrupt change no matter what it did. Might it not be inclined to fatalism? If so, then the temporal proximity of abrupt change would actually enhance political inertia, rather than undercut it. (Why bother?)

In addition, according to Gardiner, in facing abrupt climate change, there will be other more urgent concerns than climate change mitigation, again creating greater risks for future generations.

[T]he proximity of the abrupt change may actually provide an incentive for increasing current emissions above the amount that even a completely self-interested generation would normally choose. What I have in mind is this. Suppose that a generation could increase its own ability to cope with an impending abrupt change by increasing its emissions beyond their existing level. (For example, suppose that it could boost economic output to enhance adaptation efforts by relaxing existing emissions standards.) Then, it would have a generation-relative reason to do so, and it would have this even if the net costs of the additional emissions to future generations far exceed the short-term benefits. Given this, it is conceivable that the impending presence of a given abrupt change may actually exacerbate the PIBP [“the problem of intergenerational buck passing”], leaving future generations worse off than under the gradualist paradigm.

So what are the ways to get out of this dilemma? Gardiner suggests:

In my view, if we are to solve this problem, we will need to look beyond people’s generation-relative preferences. Moreover, the prevalence of the intergenerational problem suggests that one set of motivations that we need to think hard about engaging is those connected to moral beliefs about our obligations to those only recently, or not yet, born. This leaves us with one final question. Can the abrupt paradigm assist us in this last task? Perhaps so: for one intriguing possibility is that abrupt change will help us to engage intergenerational motivations.

(Thanks to Simon Birnbaum for passing on Gardiner’s article.)

Impacts of the 2010 tsunami in Chile

UPDATE: Here is a link to a video of Prof. Castilla’s talk (via @sthlmresilience)

03:34 a.m., February 27th, 2010. Suddenly, a devastating earthquake and a series of tsunamis hit the central–south coast of Chile. An earthquake so powerful (8.8 on the moment magnitude scale) that it was not only the fifth largest ever recorded on Earth, but also moved the city of Buenos Aires in Argentina 10 feet (!) to the west.

Juan Carlos Castilla from the Pontificia Universidad Católica de Chile recently visited Stockholm and gave an update on the tsunamis’ impact on coastal communities. The effects of the tsunami were devastating, and the death toll from the 2-3 tsunamis alone was between 170 and 200 in the coastal areas of regions VI, VII and VIII. The most noticeable biophysical impact in the region is the elevation of the whole coastal area, ranging from 1.5 to 3 meters. This has obviously had big impacts on the composition of species and vegetation on the coast. The impacts on coastal ecosystems and fisheries are, however, still unclear.

Based on extensive field studies two months after the disaster, Castilla and his research team noted that only 8-12 (about 6%) of the 200 deceased were from fishing families. According to Castilla, this low figure can be explained by the existence of strong social networks, and by local knowledge passed on from generation to generation. As an artisanal fisherman in the study summarized one shared local saying:

“if an earthquake is so strong you can not stand up: run to the hills”

Luckily, February 27th was a night of full moon, which allowed people to more easily run for protection in the hills. According to Castilla, the combination of full moon, local knowledge, and strong bonds between neighbors made it possible for members of the fishing communities to act rapidly on the first warning signal: the earthquake itself. The fact that locals had also been taught not to leave the hills until at least a couple of hours after an earthquake helped them avoid the devastating tsunamis that followed. Unfortunately, visitors and tourists in the tsunami-affected coastal areas were not.

Read more:

Marín, A. et al. (2010) ”The 2010 tsunami in Chile: Devastation and survival of coastal small-scale fishing communities”, Marine Policy, 34:1381-1384.

Gelcich, S. et al. “Navigating transformations in governance of Chilean marine coastal resources”, PNAS, 107(39): 16794-16799.

See Henrik’s post just the days after the Chilean earthquake here.

A Moratorium on Geoengineering? Really?

At the end of October 2010, participants in the international Convention on Biological Diversity (CBD) included a moratorium on geoengineering in their agreement to protect biodiversity. The CBD moratorium came at a timely moment, as the international debate around geoengineering had virtually exploded, with several high-profile reports published by, amongst others, the British Royal Society and the U.S. Congress. The IPCC has announced that it will organize several expert meetings in 2011 focused on geoengineering, to help prepare the next review of climate science, due for completion in 2014.

But what does this “moratorium” really imply? This is not a trivial question, considering the often-noted fragmentation of global environmental governance, and the fact that most geoengineering schemes would have impacts on additional planetary boundaries such as land-use change and biodiversity. Two main (and, of course, highly simplified) interpretations seem to exist in a quite complicated legal debate.

One is that the CBD moratorium places a considerable limit on geoengineering experimentation and attempts. The only exceptions are “small-scale” controlled experiments that meet specific requirements, i.e. that they are conducted in controlled settings and for explicit scientific purposes, are subject to prior environmental impact assessment, and have no impacts beyond national jurisdiction. Proponents of this position note that even if the CBD moratorium is not legally binding, governments launching large geoengineering experiments would “risk their credibility and diplomatic reputations”, a strong enough disincentive that effectively “blocks risky climate techno-fixes”. The Canadian NGO ETC Group elaborates this point here.

The second position instead highlights several points that undermine the strength of the CBD moratorium. The first is that the agreement has no legally binding power, and that formal sanctioning mechanisms are absent. The CBD moratorium is “soft law”, which implies that States could still launch geoengineering schemes unilaterally. Note also that the United States has not formally ratified the CBD convention.

Second, even though the CBD moratorium might be seen as defining an upper limit on the scale of geoengineering experiments, key definitional questions remain to be teased out. What is to be defined as “small-scale” and as an “experiment”? And what is the moratorium’s status compared to other related pieces of international law, such as the UN Convention on the Law of the Sea, the London Convention, and the Convention on the Prohibition of Military or Other Hostile Use of Environmental Modification Techniques, to mention just a few?

Third, as the US Congressional Research Service notes in its report, international agreements are best equipped to deal with disputes between countries, not necessarily between a country and a private actor, or between private actors that may shift locations to suit their interests (p. 29). And major private or semi-private actors and funders are out there, including the Bill Gates and Richard Branson $4.6 million Fund for Innovative Climate and Energy Resources, Ice911, Intellectual Ventures (see the WSJ article “Global warming might be solved with a helium balloon and a few miles of garden hose”), Carbon Engineering, the Planktos Foundation, and GreenSea Ventures (featured in Nature here).

So, do we really have an effective global moratorium on geoengineering? Far from it, it seems. Feel free to disagree in the comment field below.

Originally posted in adaptiveness.wordpress.com

The Politics of Cascading Ecological Crises

Twitter | @vgalaz

How do we make sense of ecological crises that cascade across spatial scales and propagate from ecological to social and economic systems? Considering a number of recent crisis events with clear ecological dimensions – ranging from the 2008 food crisis (video below) to the spread of the plant disease Ug99 in East Africa and parts of the Middle East – there is actually quite little research on the sociopolitical dimensions of ecological crisis events.

During 2008-2009, we organized several small workshops with political science and media scholars from the Swedish National Center for Crisis Management Research and Training (CRISMART). Our ambition was to bring together insights from the crisis management research community, and insights from resilience theory, especially the notion of “tipping points” and ecological surprise.

The results of our work have just been published in the journal Public Administration in an article entitled “Institutional and Political Leadership Dimensions of Cascading Ecological Crises”. Here we elaborate a range of difficult political challenges that emerge through the different phases of a complex crisis: early warning, sense-making, response and post-crisis learning. As we elaborate, even though there are several examples of successful governance of ecological stresses and crises, cascading ecological crises:

• are notoriously hard to detect in advance, due to their underlying complexities and poor monitoring systems;
• challenge the decision-making and coordinating capacities of actors at multiple levels of societal organization, due to their cascading and recombining dynamics;
• are prone to blame games, which hinder post-crisis learning and reform.

Also posted in Adaptiveness and Innovation in Earth System Science

Visualizing Planetary Boundaries

Seems like Christmas comes early this year! Visualizing.org just announced the results of the Visualizing Marathon 2010. One of the challenges was to visualize planetary boundaries, i.e. the concept of multiple, non-linear earth system processes presented by Johan Rockström and colleagues last year.

The winner: MICA Team #3 and the project One Day Cause + Effect: A look at energy emissions and water usage over the course of one day (by Christina Beard, Christopher Clark, Chris McCampbell, Supisa Wattanasansanee). Congratulations! The other visualizations are also well worth a look – and a few clicks as many of them are interactive.

2010 Honorable Mention: SVA Team #1: Pushing the Boundaries: A Visualization of Our Footprint on Earth. Submitted by: Clint Beharry, David Bellona, Colleen Miller, Erin Moore, Tina Ye

2010 Honorable Mention: MICA Team #1: What Kind of World Do You Want?: A visualization of planetary boundaries. Submitted by: Melissa Barat, Bryan Connor, Ann Liu, Isabel Uria

What Kind of World Do You Want?

Resilience meets architecture and urban planning

by Matteo Giusti [contact: matteo.giusti [at] gmail.com]
Do resilience thinking and architecture really mix? The answer is a clear “yes” if you ask urban planner Marco Miglioranzi and Matteo Giusti, a Master’s student at the Stockholm Resilience Centre. Together with the German-based architecture firm N2M, they have developed two projects guided by resilience concepts. Their first work, based on social-ecological systems, was preselected in the EuroPan10 competition. The second one, “A Resilient Social-Ecological Urbanity: A Case Study of Henna, Finland”, with an emphasis on urban resilience, has been published by the German Academy for Urban and Regional Spatial Planning (DASL) and also featured by HOK – a renowned global architecture firm.
The project proposes a wide range of theoretical solutions based on urban resilience, which find practical application in the urban area of Henna, Finland. Governance networks, social dynamics, metabolic flows and the built environment are analyzed separately, with the aim of restoring, and maintaining over time, the equilibrium between human demands and ecological lifecycles.
But the project also challenges current urban planning practices, as it treats the city’s future requirements as unknown. As a result, it identifies “the development-process as a dynamic flow instead of a static state”. The time scale for urban planning is thereby included within an evolving spatial design.
Diagram of the parametric cell structure: reversible space layer (upper left) and reversible building layer (right)

The project description elaborates: “As a result, the planning is not static anymore. It is not a blueprint, not a collection of architectural elements to create an hypothetic Henna out of the current mindsets and needs, but a multitude of tools, methods, opportunities, options, to define a sustainable developing strategy to meet future’s demands. We keep an eye on time, its complexity and we humbly admit we cannot foresee future; we can only provide guiding principles from current scientific understanding to define a social ecological urbanity capable of sustainably moving on with unique identity.”
All these theoretical premises come together in Henna’s planning. This includes an energy smart grid based primarily on Enhanced Geothermal Systems (EGS); community-managed greenhouse areas to enhance local food self-reliance; a low-dilution sewage system to reduce water consumption; efficient reuse of municipal solid waste to reach the zero-waste goal; and a problem-solving centre to analyze ever-changing social-ecological demands. Time is included in space, people in their natural environment, urban services in ecological processes. A harmonious cycle of growth and decay.