Stewart Brand on How Cities Learn

On WorldChanging, Chris Coldeway discusses a recent talk by Stewart Brand on how cities learn. Brand wrote the excellent 1994 book How Buildings Learn: What Happens After They’re Built:

The redoubtable Stewart Brand gave a talk at GBN last night on global urbanization, expanding the “City Planet” material he first outlined at a Long Now talk and [WorldChanging] covered in detail. As we stand in 2006 at a point where the world’s population tips from mostly rural to mostly urban, Stewart considers this a good time to ruminate on the nature of cities and the causes and implications of a rapidly urbanizing world.

In typical Brand form, the talk swept from the beginnings of civilization — with a view of one of the oldest continuously occupied areas and discussion of how Jerusalem has been sacked or taken over 36 times — to the future of the world, with a look at the largest megacities of 2015. While the largest cities one hundred years ago were primarily in the US and Europe, these 21st century megacities are profoundly global. With cities such as Mumbai, Sao Paulo, and Karachi dominating the list, Stewart noted the similarity to another era of international cities — 1000 AD.

In asking himself how cities “learn” over time in the way that buildings do, Stewart found that while cities do learn, they also teach: they teach civilization how to be civilized. He discussed Levittown as a counterintuitive example, with its lenient do-it-yourself home customization policies actually facilitating the development of community. Squatter cities in the developing world were another example, with the view that squatter cities are what a population getting out of poverty as fast as possible looks like: self-constructed, self-organizing, and vibrant.

Stewart sees cities playing out the same patterns of “pace layering” that he sees in civilization overall. Nature changes the slowest, with culture, governance, infrastructure, commerce, and fashion as progressively faster changing “layers.” Cities specialize in acceleration, in the faster cycles of commerce and fashion, but must balance those with the slower layers at the risk of collapse.

I previously mentioned Stewart Brand on cities in Feb and Sept 2005.

Alternative Strategies of Ecosystem Management and Institutional Interplay

Henry Regier explained conflict over management of the Great Lakes when he received a lifetime achievement award for important and continued contributions to the field of Great Lakes research (from Post-normal Times):

Two strategies have been used within our Great Lakes Basin’s governance institutions in recent decades to cope with adverse interrelationships between humans and the rest of nature. Important features of each strategy can be traced back to different emphases within Darwinism a century ago. T. H. Huxley emphasized the role of agonistic or combative interactions within natural selection while P. Kropotkin emphasized mutualistic or cooperative interactions. Capitalists invoked Huxley’s Mutual Harm version for legitimation of their practices while communitarians invoked Kropotkin’s Mutual Aid version.

Implicitly the more legalistic regulatory strategies that now dominate within governance in our Basin presuppose Mutual Harm dynamics and seek to temper such harm through pre-cast technocratic capabilities. Participatory democratic programs, now sub-dominant, seek to foster Mutual Aid dynamics less formally. Old Rational Management tries to Temper Mutual Harm Technocratically, TMHT. Drama-of-the-Commons Governance tries to Foster Mutual Aid Democratically, FMAD.

Currently, the higher the level of governance in which action on some environmental issue is centred, the more likely that TMHT will dominate, and vice versa. This asymmetry creates problems in hybrid cross-level Adaptive Co-Management and vertical inter-agency partnerships.

New Orleans and the ecology of the Mississippi River

Richard Sparks writes about the ecological and geological context in which New Orleans exists, how people have changed that context, and what rebuilders should consider. His article, Rethinking, Then Rebuilding New Orleans, appears in the Winter 2006 Issues in Science and Technology.

His article focuses on the natural forces that have shaped the Mississippi and how humans have altered those forces. One of the most interesting points he raises is how land-cover change and river management have radically reduced the sediment load of the Mississippi, shifting the balance between land building and subsidence in the delta. In other words, flood protection higher in the river has made lower portions of the river more vulnerable to flooding.

Figure: The sediment loads carried by the Mississippi River to the Gulf of Mexico (1700 and 1980-1990) have decreased by half since 1700, so less sediment is available to build up the delta and counteract subsidence and sea-level rise. The greatest decrease occurred after 1950, when newly constructed large reservoirs trapped most of the sediment entering them. Part of the water and sediment from the Mississippi River below Vicksburg is now diverted through the Corps of Engineers’ Old River Outflow Channel and the Atchafalaya River. Without the controlling works, the Mississippi would have shifted most of its water and sediment from its present course to the Atchafalaya, as part of the natural delta-switching process. The widths of the rivers in the diagram are proportional to the estimated (1700) or measured (1980-1990) suspended sediment loads (in millions of metric tons per year).

When are economic models of human behaviour reasonable?

Colin Camerer and Ernst Fehr have a great review paper, When Does ‘Economic Man’ Dominate Social Behavior? (Science, 6 January 2006: 47-52), on how social context interacts with individual differences. The article begins by laying out the ‘rational actor’ assumption that underpins most economic theory, along with recent challenges to this view. Populations seem fairly consistently to contain both self-regarding and cooperative individuals. Self-regarding individuals approximate ‘Economic Man’, while cooperative actors reward cooperation and are willing to bear a cost to punish unfair behaviour.

The rationality assumption consists of two components: first, individuals are assumed to form, on average, correct beliefs about events in their environment and about other people’s behavior; second, given their beliefs, individuals choose those actions that best satisfy their preferences. If individuals exhibit, however, systematically biased beliefs about external events or other people’s behavior or if they systematically deviate from the action that best satisfies their preferences, we speak of bounded rationality. Preferences are considered to be self-regarding if an individual does not care per se for the outcomes and behaviors of other individuals. Self-regarding preferences may, therefore, be considered to be amoral preferences because a self-regarding person neither likes nor dislikes others’ outcomes or behaviors as long as they do not affect his or her economic well-being. In contrast, people with other-regarding preferences value per se the outcomes or behaviors of other persons either positively or negatively. A large body of evidence accumulated over the last three decades shows that many people violate the rationality and preference assumptions that are routinely made in economics. Among other things, people frequently do not form rational beliefs, objectively irrelevant contextual details affect their behavior in systematic ways, they prefer to be treated fairly and resist unfair outcomes, and they do not always choose what seems to be in their best interest.

The interesting part of this review is how the behaviour of ‘cooperative’ and ‘economic’ actors changes based upon the context in which they interact and their perceptions of the composition of the population they are interacting with. The presence of cooperators can cause ‘economic’ actors to behave cooperatively, and the presence of ‘economic’ actors can cause cooperative actors to behave in a more self-regarding fashion.

To show how the interactions between strong reciprocators and self-regarding individuals shape bargaining behavior, we consider the ultimatum game, in which a buyer offers a price p to a seller, who can sell an indivisible good. For simplicity, assume that the buyer values the good at 100 and the seller values it at 0. The buyer can make exactly one offer to the seller, which the latter can accept or reject. Trade takes place only if the seller accepts the offer. If the seller is self-regarding, she accepts even a price of 1 because 1 is better than nothing. Thus, a self-regarding buyer will offer p=1 so that the seller earns almost nothing from the trade. Strong reciprocators reject such unfair offers, however, preferring no trade to trading at an unfair price. In fact, a large share of experimental subjects reject low offers in this game, across a wide variety of different cultures, even when facing high monetary stakes. This fact induces many self-regarding buyers to make relatively fair offers that strong reciprocators will accept. Often the average offers are around p=40, and between 50% and 70% of the buyers propose offers between p=40 and p=50. The behavior of both buyers and sellers changes dramatically, however, if we introduce just a little bit of competition on the seller’s side. Assume, for example, that instead of one there are two sellers who both want to sell their good. Again the buyer can make only one offer which, if accepted by one of the sellers, leads to trade. If both sellers reject, no trade takes place; if both sellers accept, one seller is randomly chosen to sell the good at the offered price. Almost all buyers make much lower offers in this situation, and almost all sellers accept much lower offers. In fact, if one introduces five competing sellers into this game, prices and rejection rates converge to very low levels such that the trading seller earns only slightly more than 10% of the available gains from trade.
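The seller-competition effect described in the excerpt can be sketched in a small simulation. This is an illustrative toy model, not the authors’ experimental design: the 30-point fairness threshold, the 50/50 population mix, and the all-or-nothing acceptance rules are assumptions chosen for the sketch.

```python
import random

def seller_accepts(offer, reciprocator, threshold=30):
    # A self-regarding seller accepts any positive offer; a strong
    # reciprocator rejects offers below a fairness threshold.
    if reciprocator:
        return offer >= threshold
    return offer > 0

def trade_rate(offer, n_sellers, p_reciprocator=0.5, trials=10_000, seed=1):
    # Fraction of rounds in which at least one of n competing
    # sellers accepts the buyer's offer.
    rng = random.Random(seed)
    accepted = 0
    for _ in range(trials):
        sellers = [rng.random() < p_reciprocator for _ in range(n_sellers)]
        if any(seller_accepts(offer, r) for r in sellers):
            accepted += 1
    return accepted / trials

# With one seller, a low offer of 10 is rejected whenever that seller
# happens to be a reciprocator; with five competing sellers it is almost
# always accepted, since all five must be reciprocators to block the trade.
print(trade_rate(10, 1))   # roughly 0.5
print(trade_rate(10, 5))   # roughly 0.97, i.e. 1 - 0.5**5
```

Even this crude sketch reproduces the qualitative result: competition among sellers makes low offers far more likely to be accepted, so self-regarding buyers can safely offer less.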

Climate Tipping Points?

A short excerpt from a talk by James Hansen at this year’s AGU meeting appears in The New York Review of Books (53:1):

The Earth’s climate is nearing, but has not passed, a tipping point beyond which it will be impossible to avoid climate change with far-ranging undesirable consequences. These include not only the loss of the Arctic as we know it, with all that implies for wildlife and indigenous peoples, but losses on a much vaster scale due to rising seas.

Ocean levels will increase slowly at first, as losses at the fringes of Greenland and Antarctica due to accelerating ice streams are nearly balanced by increased snowfall and ice sheet thickening in the ice sheet interiors.

But as Greenland and West Antarctic ice is softened and lubricated by meltwater, and as buttressing ice shelves disappear because of a warming ocean, the balance will tip toward the rapid disintegration of ice sheets.

The Earth’s history suggests that with warming of two to three degrees, the new sea level will include not only most of the ice from Greenland and West Antarctica, but a portion of East Antarctica, raising the sea level by twenty-five meters, or eighty feet. Within a century, coastal dwellers will be faced with irregular flooding associated with storms. They will have to continually rebuild above a transient water level.

This grim scenario can be halted if the growth of greenhouse gas emissions is slowed in the first quarter of this century.

—From a presentation to the American Geophysical Union, December 6, 2005

Millennium Ecosystem Assessment Wins Environmental Prize

The Millennium Ecosystem Assessment recently won the Zayed International Prize for the Environment.

A BBC article reports:

UN secretary-general Kofi Annan has been given one of the most prestigious environmental awards, the Zayed Prize.

The citation noted his “personal leadership” on sustainable development.

The 1,360 scientists whose research contributed to the Millennium Ecosystem Assessment were also honoured, as were activists from Trinidad and Indonesia.

The winners of the prize, which honours former UAE President Sheikh Zayed, share $1m (£564,000); previous awards have gone to Jimmy Carter and the BBC.

Among the instances given of the UN chief’s leadership was his decision to set up the Millennium Ecosystem Assessment, a global research project aimed at producing a definitive snapshot of the planet’s environmental health.

The scientists who contributed share the second element of the Zayed prize worth $300,000, for Scientific and Technological Achievement.

The jury described it as a “landmark study” which “demonstrates that the degradation of ecosystems is progressing at an alarming and unsustainable rate”.

MA Wetlands and Health Synthesis Report

covers of MA health and well-being syntheses

The final two synthesis volumes of the Millennium Ecosystem Assessment have now been released. The first is Ecosystems & Human Well-being: Wetlands & Water Synthesis, a volume aimed specifically at the Ramsar Convention and more generally at wetland issues. The second is Ecosystems and Human Well-being: Health Synthesis, produced in cooperation with the World Health Organization. The technical volumes should be released sometime early in 2006.

Visualization of Complex Networks


Flight density during one week between international airports. From SD Magazine (Japan).

VisualComplexity is a website that collects visualizations of complex networks. The project aims to display the results of visualization methods used in different disciplines to stimulate the creation of new visualizations and new visualization approaches.

Example categories include food webs, knowledge networks, social networks, and art.

An online tool for visualizing networks on the internet or in Amazon.com’s database is TouchGraph. For example, see the related-sales network of Panarchy, edited by Gunderson and Holling, or the Google network of Resilience Science.

Inequality of Climate Change Impacts

Jonathan Patz et al. have recently published a review paper, Impact of Regional Climate Change on Human Health, in a special feature on regional climate change in the Nov 16th issue of Nature.

The article shows that climate change is already a substantial factor shortening people’s lives. The authors estimate that climate change causes an excess 154,000 deaths per year. This mortality compares with 6 million deaths/yr caused by childhood and maternal malnutrition (the largest proportion of mortality) and with 109,000 deaths/yr from carcinogen exposure (data from Rodgers et al 2004, Distribution of Major Health Risks: Findings from the Global Burden of Disease Study, PLoS Medicine pdf).

Climate change deaths are estimated to occur primarily due to increases in malnutrition (77,000 deaths), diarrhoea (47,000 deaths), and malaria (27,000 deaths). However, the health impacts of climate change vary greatly across the world. In general, the areas least responsible for changing the climate are suffering the most deaths from climate change. These deaths are concentrated in poor countries, with about half occurring in poor countries in S and SE Asia (specifically Bangladesh, Bhutan, Democratic People’s Republic of Korea, India, Maldives, Myanmar, Nepal), which are home to 1.2 billion people.

The mismatch between the countries most responsible for producing climate change and those suffering its impacts is shown in the two maps below. The first map shows CO2 emissions per capita in 1998, from WRI data, while the second shows the estimated number of deaths per million people that could be attributed to global climate change in the year 2000 (from Patz et al). The mismatch would be further exaggerated if the cumulative CO2 emissions per capita of nations, a better indicator of national responsibility for climate change, were shown.

Maps: National CO2 emissions per capita, 1998. The second map, drawing on data from the World Health Organization, was also created by Patz’s team. Maps courtesy of the Center for Sustainability and the Global Environment.

Faculty of 1000 & Resilience Science

Discovering interesting articles within the sea of scientific publications can be difficult. BioMedCentral produces Faculty of 1000, an internet-based research filtering service that highlights and reviews papers published in the biological sciences, based on the rankings and recommendations of a faculty of well over 1000 selected researchers.

Along with many other ecologists from diverse backgrounds, a number of resilience researchers, including Carl Folke, Terry Chapin and Ann Kinzig, participate in the Faculty of 1000, but none of them have recommended papers yet. Resilience Alliance program director Brian Walker is also a member, and he recently recommended Marty Anderies’ new paper on how deforestation produced a soil-moisture regime shift in south-eastern Australia:

Minimal models and agroecological policy at the regional scale: an application to salinity problems in southeastern Australia. Regional Environmental Change 2005, 5:1-17.
