
Stewart Brand on How Cities Learn

On WorldChanging, Chris Coldeway discusses a recent talk by Stewart Brand on how cities learn. Brand is the author of the excellent 1994 book How Buildings Learn: What Happens After They’re Built:

The redoubtable Stewart Brand gave a talk at GBN last night on global urbanization, expanding the “City Planet” material he first outlined at a Long Now talk and [WorldChanging] covered in detail. As we stand in 2006 at a point where the world’s population tips from mostly rural to mostly urban, Stewart considers this a good time to ruminate on the nature of cities and the causes and implications of a rapidly urbanizing world.

In typical Brand form, the talk swept from the beginnings of civilization — with a view of one of the oldest continuously occupied areas and discussion of how Jerusalem has been sacked or taken over 36 times — to the future of the world, with a look at the largest megacities of 2015. While the largest cities one hundred years ago were primarily in the US and Europe, these 21st century megacities are profoundly global. With cities such as Mumbai, Sao Paulo, and Karachi dominating the list, Stewart noted the similarity to another era of international cities — 1000 AD.

In asking himself how cities “learn” over time in the way that buildings do, Stewart found that while cities do learn, they also teach: they teach civilization how to be civilized. He discussed Levittown as a counterintuitive example, with its lenient do-it-yourself home customization policies actually facilitating the development of community. Squatter cities in the developing world were another example, with the view that squatter cities are what a population getting out of poverty ASAP looks like: self-constructed, and self-organizing, and vibrant.

Stewart sees cities playing out the same patterns of “pace layering” that he sees in civilization overall. Nature changes the slowest, with culture, governance, infrastructure, commerce, and fashion as progressively faster changing “layers.” Cities specialize in acceleration, in the faster cycles of commerce and fashion, but must balance those with the slower layers at the risk of collapse.

I previously mentioned Stewart Brand on cities in Feb and Sept 2005.

Alternative Strategies of Ecosystem Management and Institutional Interplay

Henry Regier explained conflict over management of the Great Lakes when he received a lifetime achievement award for important and continued contributions to the field of Great Lakes research (from Post-normal Times):

Two strategies have been used within our Great Lakes Basin’s governance institutions in recent decades to cope with adverse interrelationships between humans and the rest of nature. Important features of each strategy can be traced back to different emphases within Darwinism a century ago. T. H. Huxley emphasized the role of agonistic or combative interactions within natural selection while P. Kropotkin emphasized mutualistic or cooperative interactions. Capitalists invoked Huxley’s Mutual Harm version for legitimation of their practices while communitarians invoked Kropotkin’s Mutual Aid version. Implicitly the more legalistic regulatory strategies that now dominate within governance in our Basin presuppose Mutual Harm dynamics and seek to temper such harm through pre-cast technocratic capabilities. Participatory democratic programs, now sub-dominant, seek to foster Mutual Aid dynamics less formally. Old Rational Management tries to Temper Mutual Harm Technocratically, TMHT. Drama-of-the-Commons Governance tries to Foster Mutual Aid Democratically, FMAD. Currently, the higher the level of governance in which action on some environmental issue is centred, the more likely that TMHT will dominate, and vice versa. This asymmetry creates problems in hybrid cross-level Adaptive Co-Management and vertical inter-agency partnerships.

New Orleans and the ecology of the Mississippi River

Richard Sparks writes about the ecological and geological context in which New Orleans exists, how people have changed it, and what rebuilders should consider. His article, Rethinking, Then Rebuilding New Orleans, appears in the Winter 2006 Issues in Science and Technology.

His article focuses on the natural forces that have shaped the Mississippi and how humans have altered those forces. One of the most interesting points he raises is how land cover change and river management have radically reduced the sediment load of the Mississippi, shifting the balance between land building and subsidence in the delta. In other words, flood protection higher in the river has made lower portions of the river more vulnerable to flooding.

Figure: The sediment loads carried by the Mississippi River to the Gulf of Mexico have decreased by half since 1700, so less sediment is available to build up the Delta and counteract subsidence and sea level rise. The greatest decrease occurred after 1950, when large newly constructed reservoirs trapped most of the sediment entering them. Part of the water and sediment from the Mississippi River below Vicksburg is now diverted through the Corps of Engineers’ Old River Outflow Channel and the Atchafalaya River. Without the controlling works, the Mississippi would have shifted most of its water and sediment from its present course to the Atchafalaya, as part of the natural delta switching process. The widths of the rivers in the diagram are proportional to the estimated (1700) or measured (1980–1990) suspended sediment loads (in millions of metric tons per year).


When are economics models of human behaviour reasonable?

Colin Camerer and Ernst Fehr have a great review paper, When Does ‘Economic Man’ Dominate Social Behavior? (Science, 6 January 2006: 47-52), on how social context interacts with individual differences. The article begins by laying out the ‘rational actor’ assumption that underpins most economic theory, along with recent challenges to this view. Populations seem fairly consistently to contain both self-regarding and cooperative individuals. Self-regarding individuals approximate ‘Economic Man’, while cooperative actors reward cooperation and are willing to bear a cost to punish unfair behaviour.

The rationality assumption consists of two components: first, individuals are assumed to form, on average, correct beliefs about events in their environment and about other people’s behavior; second, given their beliefs, individuals choose those actions that best satisfy their preferences. If individuals exhibit, however, systematically biased beliefs about external events or other people’s behavior or if they systematically deviate from the action that best satisfies their preferences, we speak of bounded rationality. Preferences are considered to be self-regarding if an individual does not care per se for the outcomes and behaviors of other individuals. Self-regarding preferences may, therefore, be considered to be amoral preferences because a self-regarding person neither likes nor dislikes others’ outcomes or behaviors as long as they do not affect his or her economic well-being. In contrast, people with other-regarding preferences value per se the outcomes or behaviors of other persons either positively or negatively. A large body of evidence accumulated over the last three decades shows that many people violate the rationality and preference assumptions that are routinely made in economics. Among other things, people frequently do not form rational beliefs, objectively irrelevant contextual details affect their behavior in systematic ways, they prefer to be treated fairly and resist unfair outcomes, and they do not always choose what seems to be in their best interest.

The interesting part of this review is how the behaviour of ‘cooperative’ and ‘economic’ actors changes based on the context in which they interact and their perceptions of the composition of the population they are interacting with. The presence of cooperators can cause ‘economic’ actors to behave cooperatively, and the presence of ‘economic’ actors can cause cooperative actors to behave in a more self-regarding fashion.

To show how the interactions between strong reciprocators and self-regarding individuals shape bargaining behavior, we consider the ultimatum game, in which a buyer offers a price p to a seller, who can sell an indivisible good. For simplicity, assume that the buyer values the good at 100 and the seller values it at 0. The buyer can make exactly one offer to the seller, which the latter can accept or reject. Trade takes place only if the seller accepts the offer. If the seller is self-regarding, she accepts even a price of 1 because 1 is better than nothing. Thus, a self-regarding buyer will offer p=1 so that the seller earns almost nothing from the trade. Strong reciprocators reject such unfair offers, however, preferring no trade to trading at an unfair price. In fact, a large share of experimental subjects reject low offers in this game, across a wide variety of different cultures, even when facing high monetary stakes. This fact induces many self-regarding buyers to make relatively fair offers that strong reciprocators will accept. Often the average offers are around p=40, and between 50% and 70% of the buyers propose offers between p=40 and p=50. The behavior of both buyers and sellers changes dramatically, however, if we introduce just a little bit of competition on the seller’s side. Assume, for example, that instead of one there are two sellers who both want to sell their good. Again the buyer can make only one offer which, if accepted by one of the sellers, leads to trade. If both sellers reject, no trade takes place; if both sellers accept, one seller is randomly chosen to sell the good at the offered price. Almost all buyers make much lower offers in this situation, and almost all sellers accept much lower offers. In fact, if one introduces five competing sellers into this game, prices and rejection rates converge to very low levels such that the trading seller earns only slightly more than 10% of the available gains from trade.
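The bargaining dynamics described above can be sketched in a small simulation. This is an illustrative toy model, not the authors’ code: the fairness threshold, population mix, and trial count are all assumptions made for the example.

```python
import random

def accepts(offer, kind, threshold=30):
    """Self-regarding sellers take any positive offer; strong
    reciprocators reject offers below a fairness threshold."""
    return offer >= 1 if kind == "self" else offer >= threshold

def expected_buyer_payoff(offer, n_sellers, frac_recip, trials=2000):
    """Monte Carlo estimate of the buyer's payoff (the good is worth 100
    to the buyer, 0 to sellers) with n_sellers from a mixed population."""
    total = 0
    for _ in range(trials):
        kinds = ["recip" if random.random() < frac_recip else "self"
                 for _ in range(n_sellers)]
        if any(accepts(offer, k) for k in kinds):  # one acceptance suffices
            total += 100 - offer
    return total / trials

def best_offer(n_sellers, frac_recip):
    """The offer (1..50) that maximizes the buyer's expected payoff."""
    return max(range(1, 51),
               key=lambda p: expected_buyer_payoff(p, n_sellers, frac_recip))

# With a single seller, the risk that a reciprocator rejects a stingy
# offer pushes the buyer toward fair offers; with five competing sellers
# only one acceptance is needed, so offers collapse toward the minimum.
```

In this toy model, with half the population reciprocators, the single-seller optimum sits at the fairness threshold, while with several competing sellers it drops to near the minimum, mirroring the experimental pattern described in the quoted passage.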


Climate Tipping Points?

A short excerpt from a talk by James Hansen at this year’s AGU meeting appears in The New York Review of Books (53:1):

The Earth’s climate is nearing, but has not passed, a tipping point beyond which it will be impossible to avoid climate change with far-ranging undesirable consequences. These include not only the loss of the Arctic as we know it, with all that implies for wildlife and indigenous peoples, but losses on a much vaster scale due to rising seas.

Ocean levels will increase slowly at first, as losses at the fringes of Greenland and Antarctica due to accelerating ice streams are nearly balanced by increased snowfall and ice sheet thickening in the ice sheet interiors.

But as Greenland and West Antarctic ice is softened and lubricated by meltwater, and as buttressing ice shelves disappear because of a warming ocean, the balance will tip toward the rapid disintegration of ice sheets.

The Earth’s history suggests that with warming of two to three degrees, the new sea level will include not only most of the ice from Greenland and West Antarctica, but a portion of East Antarctica, raising the sea level by twenty-five meters, or eighty feet. Within a century, coastal dwellers will be faced with irregular flooding associated with storms. They will have to continually rebuild above a transient water level.

This grim scenario can be halted if the growth of greenhouse gas emissions is slowed in the first quarter of this century.

—From a presentation to the American Geophysical Union, December 6, 2005


Millennium Ecosystem Assessment Wins Environmental Prize

The Millennium Ecosystem Assessment recently won the Zayed international environmental prize.

A BBC article writes:

UN secretary-general Kofi Annan has been given one of the most prestigious environmental awards, the Zayed Prize.

The citation noted his “personal leadership” on sustainable development.

The 1,360 scientists whose research contributed to the Millennium Ecosystem Assessment were also honoured, as were activists from Trinidad and Indonesia.

The winners of the prize, which honours former UAE President Sheikh Zayed, share $1m (£564,000); previous awards have gone to Jimmy Carter and the BBC.

Among the instances given of the UN chief’s leadership was his decision to set up the Millennium Ecosystem Assessment, a global research project aimed at producing a definitive snapshot of the planet’s environmental health.

The scientists who contributed share the second element of the Zayed prize worth $300,000, for Scientific and Technological Achievement.

The jury described it as a “landmark study” which “demonstrates that the degradation of ecosystems is progressing at an alarming and unsustainable rate”.

MA Wetlands and Health Synthesis Report

Figure: Covers of the MA health and well-being syntheses.

The final two synthesis volumes of the Millennium Ecosystem Assessment have now been released. The first is the Ecosystems & Human Well-being: Wetlands & Water Synthesis, a synthesis volume aimed specifically at the Ramsar Convention, and more generally at wetland issues. The second is the Ecosystems and Human Well-being: Health Synthesis, produced in cooperation with the World Health Organization. The technical volumes should be released sometime early in 2006.


The Greening of Sahel: Passive recovery or active adaptation?

The drought years in the Sahel in the early 1970s, which resulted in large-scale famine, gave rise to scientific and policy discussions about land degradation and desertification. A popular belief was that the limited resource base of the Sahel, with its vulnerable soils and highly variable, scarce rainfall, could not sustain the growing population. The droughts were seen as a stress on a system already struggling with a rapidly shrinking resource base (e.g. deforestation of woodlands for agricultural expansion, shortening of fallow times, and soil nutrient depletion) and poor land management practices, leading to increased poverty and out-migration.

Figure: Sahel greening. Overall trends in vegetation greenness throughout the period 1982–2003, based on monthly AVHRR NDVI time series. Percentages express changes in average NDVI between 1982 and 2003. From Hermann et al. 2005.

New analyses of satellite data, by Olsson et al. among others, showing a greening trend in the Sahel since 1983 thus come as a surprise to many people. They have also triggered a scientific discussion of whether this greening is merely a recovery of vegetation due to increasing rainfall, or whether the trend can at least partially be explained by widespread changes in land management by farmers in the region. Hutchins et al., in the introduction to a recent special issue of the Journal of Arid Environments, suggest that there is increasing evidence that farmers adapted during the droughts and made a transition from degrading land use trajectories to more sustainable and productive production systems, so that in many places the recovery is actually an active adaptation by the farmers in the region.
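The greening signal in these studies is essentially a per-pixel linear trend fitted to a long monthly NDVI series. A minimal sketch with synthetic data (the trend size, seasonal amplitude, and noise level here are invented for illustration, not taken from the papers):

```python
import math
import random

def ndvi_trend(series):
    """Ordinary least-squares slope of NDVI against time (per time step)."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# Synthetic pixel: a slow greening of ~0.002 NDVI per month,
# plus a seasonal cycle and observation noise.
random.seed(0)
series = [0.3 + 0.002 * t
          + 0.05 * math.sin(2 * math.pi * t / 12)
          + random.gauss(0, 0.01)
          for t in range(264)]  # 264 months = 22 years, as in 1982-2003
slope = ndvi_trend(series)  # recovers a slope close to 0.002
```

Because the regression averages over many full seasonal cycles, the seasonal term contributes almost nothing to the fitted slope, and the long-term greening trend emerges cleanly from the noisy monthly values.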


Robustness of the Internet

Figure: Router-level topology of Abilene. Each vertex represents a router, and each link represents a physical connection; however, each physical connection can support many virtual connections, giving the appearance of greater connectivity at higher layers of the IP stack. End-user networks are shown in white, peering networks are shown in blue, and high-degree routers can only be found at the network periphery (not shown).

John Doyle and his colleagues published a very interesting paper on the structure of the Internet and its implications for robustness. A popular belief is that the connectivity of the Internet follows a scale-free distribution, which would make the network sensitive to targeted attacks on its hubs. Doyle et al. dig deeper into the real structure of the Internet and falsify this myth. The number of connections does follow a scale-free distribution, but there are various ways to generate such a distribution, and Doyle et al. find that the most-connected components of the Internet are not its crucial hubs.

Doyle et al. define an alternative model for generating Internet-like network structures (an alternative to preferential attachment models). This alternative is based on the highly optimized tolerance (HOT) concept and incorporates specific technological (bandwidth) and economic (cost) constraints. The resulting model generates statistics more in line with the real Internet, and an important finding is that this structure is robust to targeted attacks on highly connected nodes.
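The distinction between a degree distribution and a topology can be illustrated with a toy comparison (my own sketch, not the paper’s model): a preferential-attachment network puts its hubs in the core, while a HOT-flavoured network puts its high-degree aggregation routers at the periphery, behind a low-degree core mesh.

```python
import random
from collections import defaultdict

def giant_component(adj, removed):
    """Size of the largest connected component, ignoring `removed` nodes."""
    seen, best = set(), 0
    for start in list(adj):
        if start in removed or start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            comp += 1
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, comp)
    return best

def preferential_attachment(n):
    """Barabasi-Albert-style growth: each new node attaches to an existing
    node with probability proportional to degree, so hubs sit in the core."""
    adj = defaultdict(set)
    adj[0].add(1); adj[1].add(0)
    targets = [0, 1]          # degree-weighted attachment pool
    for new in range(2, n):
        t = random.choice(targets)
        adj[new].add(t); adj[t].add(new)
        targets += [new, t]
    return adj

def hot_like(n_core=5, hosts_per_edge_router=20):
    """HOT-flavoured sketch: a low-degree core mesh, with high-degree
    edge routers each aggregating many end hosts at the periphery."""
    adj = defaultdict(set)
    core = list(range(n_core))
    for i in core:            # fully meshed low-degree core
        for j in core:
            if i < j:
                adj[i].add(j); adj[j].add(i)
    nxt = n_core
    for c in core:
        edge = nxt; nxt += 1  # one edge router hangs off each core node
        adj[c].add(edge); adj[edge].add(c)
        for _ in range(hosts_per_edge_router):
            host = nxt; nxt += 1
            adj[edge].add(host); adj[host].add(edge)
    return adj

def attack_hubs(adj, k):
    """Remove the k highest-degree nodes; report the surviving giant component."""
    hubs = sorted(adj, key=lambda u: len(adj[u]), reverse=True)[:k]
    return giant_component(adj, set(hubs))
```

Removing the highest-degree nodes tends to fragment the preferential-attachment graph, but in the HOT-like graph it only strands the hosts behind each edge router: the core mesh stays connected, which is the qualitative point Doyle et al. make about the real router-level Internet.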

John C. Doyle, David L. Alderson, Lun Li, Steven Low, Matthew Roughan, Stanislav Shalunov, Reiko Tanaka, and Walter Willinger (2005) The “robust yet fragile” nature of the Internet, Proceedings of the National Academy of Sciences of the United States of America 102: 14497-14502