Category Archives: Ideas

Steven Johnson on the source of good ideas

Two short videos by science writer Steven Johnson on his book Where good ideas come from: the natural history of innovation.

An animated promotional video for his book:

And him giving a TED talk.

Steven Johnson has posted some of the responses to his ideas on his blog.

I haven’t read the book, but complex systems scientist Cosma Shalizi has a rich review that addresses many of the book’s strengths and weaknesses.  He introduces the book as:

This is 100-proof American evolutionist, naturalistic liberalism, which is to say, Pragmatism. It is a celebration of the virtues of openness, experimentation (including failed experiments), giving “slow hunches” chances to develop, to serendipitously blending ideas from diverse intellectual backgrounds and disciplines, and the continuity of human culture and thought with processes in the natural world. It’s a view of the social life of the mind, illustrated by engagingly-told anecdotes from the history of science and technology; apt references to a wide range of scholarly studies; long, admiring quotations from Darwin; the natural history of coral reefs and the evolution of sexual reproduction. (The broader history of culture, especially the fine arts, is occasionally alluded to, and there are abundantly merited plugs for his old teacher Franco Moretti’s studies on the evolution of genres and “distant reading”; but mostly it’s a science-and-technology book.) Johnson has painted a crowd scene: good ideas hardly ever come from isolated individuals thinking very hard and having flashes of inspiration; they come from people who are immersed in communities of inquiry, and especially from those who bridge multiple communities. The picture is an attractive one, which I actually think (or perhaps “fervently pray”) has a lot of truth to it.

Green Growth vs. No Growth – a debate on CBC’s Ideas

CBC’s radio show Ideas recently hosted, and then released as a podcast, a debate on Green Growth or No Growth at the University of Ottawa.  The debate starts from the premise that humanity faces serious environmental problems.  The debaters then argue the resolution: Be it resolved that building an environmentally sustainable society will require an end to economic growth.

I disagree with the idea framing the debate that human impact on the natural world is always a problem. While reducing the environmental impact per unit of human wellbeing is good, we can also work to shift human impacts from negative to positive. In other words, we can choose to invest in building, enhancing, and restoring the Earth rather than only reducing our impact on it.

The program was released as a downloadable podcast on February 28, 2011, and will be available until the end of March.  The podcast can be found at http://www.cbc.ca/podcasting/index.html?newsandcurrent: follow the link, scroll about halfway down the page, and click the “Best of Ideas” link.

The ‘no-growth’ side was:

Peter Victor
Author of Managing Without Growth: Slower By Design, Not Disaster, professor (and former Dean) at York University, and former Assistant Deputy Minister in the Ontario government.

Tim Jackson
Economics commissioner with the UK Sustainable Development Commission, professor at the University of Surrey (UK), and author of Prosperity Without Growth: Economics for a Finite Planet.

The ‘green-growth’ side was:

Richard Lipsey
One of Canada’s pre-eminent economists, professor emeritus at Simon Fraser University, and author of Economic Transformations: General Purpose Technologies and Long Term Economic Growth.

Paul Ekins
Author of Economic Growth and Environmental Sustainability: The Prospects for Green Growth, professor at University College London, and Director of the UK Green Fiscal Commission.

Living in the Anthropocene

On Yale Environment 360, Paul Crutzen and Christian Schwägerl write in their essay Living in the Anthropocene:

Living up to the Anthropocene means building a culture that grows with Earth’s biological wealth instead of depleting it. Remember, in this new era, nature is us.

In the March 2011 National Geographic, environmental journalist Elizabeth Kolbert writes Enter the Anthropocene—Age of Man, which describes the idea and the geological changes being produced by humanity.  The article looks at the Anthropocene more from the point of view of damage to the biosphere than of what we can do to reduce that damage and increase human wellbeing. It is illustrated by photos, two of which are shown above.

For more on living in the Anthropocene, see our 2009 post resilience as an operating system for the anthropocene, on Chris Turner‘s Walrus article Age of Breathing Underwater, as well as our recent article on the Environmentalist’s Paradox.

Mapping impact of snow and ice feedbacks on climate

NASA Earth Observatory’s Image of the Day has some powerful figures created with data from a new paper in Nature Geoscience by Mark Flanner and others, Radiative forcing and albedo feedback from the Northern Hemisphere cryosphere between 1979 and 2008. They use satellite data to estimate how changes in snow and ice in the Northern Hemisphere have contributed to rising temperatures over the last 30 years. They found that these changes in albedo have warmed the planet more than models predicted.

NASA Earth Observatory writes:

The left image shows how much energy the Northern Hemisphere’s snow and ice—called the cryosphere—reflected on average between 1979 and 2008. Dark blue indicates more reflected energy, in Watts per square meter, and thus more cooling. The Greenland ice sheet reflects more energy than any other single location in the Northern Hemisphere. The second-largest contributor to cooling is the cap of sea ice over the Arctic Ocean.

The right image shows how the energy being reflected from the cryosphere has changed between 1979 and 2008. When snow and ice disappear, they are replaced by dark land or ocean, both of which absorb energy. The image shows that the Northern Hemisphere is absorbing more energy, particularly along the outer edges of the Arctic Ocean, where sea ice has disappeared, and in the mountains of Central Asia.

“On average, the Northern Hemisphere now absorbs about 100 PetaWatts more solar energy because of changes in snow and ice cover,” says Flanner. “To put it in perspective, 100 PetaWatts is seven-fold greater than all the energy humans use in a year.” Changes in the extent and timing of snow cover account for about half of the change, while melting sea ice accounts for the other half.

Flanner and his colleagues made both calculations by compiling field measurements and satellite observations from the Moderate Resolution Imaging Spectroradiometer (MODIS), the Advanced Very High Resolution Radiometer, and Nimbus-7 and DMSP SSM/I passive microwave data. The analysis is the first calculation of how much energy the entire cryosphere reflects. It is also the first observation of how the energy reflected by the entire cryosphere has changed.
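To give a feel for the scale of the albedo effect being described, here is a minimal back-of-the-envelope sketch in Python. It is not the method used by Flanner and colleagues; the insolation, albedo values, and converted area below are rough, illustrative assumptions only.

```python
# Back-of-the-envelope sketch of the albedo effect (not the method of Flanner et al.).
# Estimates the extra solar power absorbed when reflective snow or ice is replaced
# by darker land or ocean. All input values are rough, illustrative assumptions.

insolation = 170.0          # W/m^2, assumed annual-mean solar flux at high northern latitudes
albedo_snow_ice = 0.6       # assumed reflectivity of snow or sea ice
albedo_dark_surface = 0.1   # assumed reflectivity of open ocean or bare land
converted_area_m2 = 1.5e12  # m^2, assumed area that changed from snow/ice to dark surface

# Extra absorbed flux per square metre of converted surface
delta_flux = insolation * (albedo_snow_ice - albedo_dark_surface)   # W/m^2

# Total extra power absorbed over the converted area
extra_power = delta_flux * converted_area_m2                        # W

print(f"Extra absorbed flux: {delta_flux:.0f} W per m^2 of converted surface")
print(f"Total extra absorbed power: {extra_power / 1e12:.0f} TW")
```

Even with these crude assumptions, the sketch shows why swapping bright snow and ice for dark land or ocean over a large area adds up to a substantial change in absorbed solar energy.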

Energy intensity convergence

In climate change discussions, energy intensity is the amount of energy required to produce a dollar’s worth of GDP.  There are big differences in energy intensity around the world: generally, poor countries are more energy intensive than rich ones, and the US, Canada and Australia are more energy intensive than Europe and Japan.  A recent graph from the Economist illustrates how energy intensities are falling and converging, unfortunately at a slower rate than economic growth, meaning that energy use, and hence CO2 emissions, continue to grow.
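To see why falling energy intensity does not by itself shrink energy use, here is a minimal sketch with made-up growth rates; the 3% GDP growth and 1.5% annual intensity decline are illustrative assumptions, not figures from the Economist.

```python
# Minimal sketch with made-up numbers: energy use = GDP x energy intensity.
# If GDP grows faster than intensity falls, total energy use (and hence CO2) still rises.

gdp = 100.0                # arbitrary GDP units in year 0
intensity = 8.0            # assumed energy units per GDP unit in year 0
gdp_growth = 0.03          # assumed 3% GDP growth per year
intensity_decline = 0.015  # assumed 1.5% fall in energy intensity per year

energy_start = gdp * intensity
for year in range(30):
    gdp *= 1 + gdp_growth
    intensity *= 1 - intensity_decline
energy_end = gdp * intensity

print(f"Energy use after 30 years changes by a factor of {energy_end / energy_start:.2f}")
# With these assumed rates, energy use grows by roughly 1.5x despite falling intensity.
```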

From the Economist:

Reading through computer eyes

by Juan Carlos Rocha (PhD student at Stockholm Resilience Centre working on Regime Shifts)

A 1-gram is a string of characters uninterrupted by a space in a text (it may be a word, a number or a combination of both), and an N-gram is a sequence of N such 1-grams. The concept of N-grams simplifies the application of statistical methods for assessing the frequency of a word or a phrase in a body of text. N-gram statistical analyses have been around for years, but recently Jean-Baptiste Michel and collaborators had the opportunity to apply N-gram text analysis techniques to the massive Google Books collection of digitized books. They analyzed over 5 million documents, which they estimate to be about 4% of all books ever published, and published their work in Science [doi].
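As a concrete illustration of the idea (not of Google’s actual pipeline), here is a minimal Python sketch that counts N-grams in a plain-text string, treating a 1-gram as a whitespace-separated token.

```python
# Minimal sketch of N-gram counting on plain text (not Google's actual pipeline).
from collections import Counter

def ngram_counts(text, n=1):
    """Count n-grams, where a 1-gram is a whitespace-separated token."""
    tokens = text.lower().split()
    ngrams = zip(*(tokens[i:] for i in range(n)))   # sliding window of length n
    return Counter(" ".join(gram) for gram in ngrams)

sample = "regime shift in lake ecosystems regime shift in coral reefs"
print(ngram_counts(sample, n=2).most_common(3))
# [('regime shift', 2), ('shift in', 2), ('in lake', 1)]
```

The same counting, applied book by book and year by year over millions of volumes, is what makes the frequency curves discussed below possible.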

The ability to explore huge amounts of text, more than any single person could read, provides the opportunity to trace the use of words over time. This allows researchers to track the impact of events on word use and even the evolution of language, grammar and culture. For example, by counting the words used in English books, the team found that in the year 2000 the English lexicon contained over one million words, and that it has been growing by about 8,500 words per year. Similarly, they were able to track word fads, for example changes between regular and irregular verb conjugations over time (e.g. burned vs burnt). More interestingly, based on particular events and famous names, they identified that our collective memory, as recorded in books, has both a short-term and a long-term component; we are forgetting our past faster than before, but we are also learning faster when it comes to, for example, the adoption of technologies.

The options for reading books with machine eyes do not end there. Censorship during the German Nazi regime was identified by comparing the frequency of authors’ names in the German and English corpora. The researchers could detect the fingerprint of the suppression of a person’s ideas in the language corpus.

The researchers term this quantitative analysis of our historical knowledge and culture, based on this huge amount of data, culturomics. They plan for further research to incorporate newspapers, manuscripts, artwork, maps and other human creations. Possible future applications are the development of methods for historical epidemiology (e.g. influenza peaks), the analysis of conflicts and wars, the evolution of ideas (e.g. feminism), and, I think, why not ecological regime shifts?

Above you can see the frequency in the English corpus of some of the regime shifts we are working with. Soil salinization and lake eutrophication appear in the 1940s and 1960s respectively, probably with the first descriptions of such shifts. Coral bleaching takes off during the 1980s, when reef degradation in the Caribbean basin began to be documented. Similarly, the concept of a regime shift has been used more and more since the 1980s, probably not only to describe ecological shifts but also political and managerial transitions.

Although the data may be noisy, the frequency of shock events can be tracked as well. Here, for example, we plot oil spill and see the peak corresponding to the January 1989 spill in Floreffe, Pennsylvania. Note that it does not show the oil spill in the Gulf of Mexico last year, because the corpus only extends to 2008.
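For readers curious how curves like these are computed, here is a small sketch with entirely hypothetical counts: the relative frequency of a phrase in a given year is simply its number of occurrences divided by the total number of 1-grams published that year.

```python
# Sketch of computing yearly relative frequency of a phrase; all counts are hypothetical.
# relative frequency(year) = occurrences of the phrase that year / total 1-grams that year

phrase_counts = {1985: 12, 1990: 55, 1995: 140, 2000: 310, 2005: 620, 2008: 800}                 # hypothetical
total_1grams  = {1985: 2.1e9, 1990: 2.4e9, 1995: 2.9e9, 2000: 3.8e9, 2005: 4.6e9, 2008: 5.0e9}   # hypothetical

for year in sorted(phrase_counts):
    freq = phrase_counts[year] / total_1grams[year]
    print(f"{year}: {freq:.2e}")   # normalization makes the values comparable across years
```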

If you want to play around with your favorite words or theme of interest, have a look at the Ngram Viewer at Google Labs and have fun!

Impacts of the 2010 tsunami in Chile

UPDATE: Here is a link to a video of Prof. Castilla’s talk (via @sthlmresilience)

03:34 a.m., February 27th, 2010. Suddenly, a devastating earthquake and a series of tsunamis hit the central–south coast of Chile. An earthquake so powerful (8.8 on the moment magnitude scale) that it is not only the fifth largest ever recorded on Earth, but it also moves the city of Buenos Aires, Argentina, 10 feet (!) to the west.

Juan Carlos Castilla from the Pontificia Universidad Católica de Chile recently visited Stockholm and gave an update on the tsunamis’ impact on coastal communities. The effects of the tsunami were devastating, and the death toll from the 2–3 tsunamis alone was between 170 and 200 in the coastal areas of regions VI, VII and VIII. The most noticeable biophysical impact in the region is the elevation of the whole coastal area, ranging from 1.5 to 3 meters. This has obviously had big impacts on the composition of species and vegetation on the coast. The impacts on coastal ecosystems and fisheries are, however, still unclear.

Based on extensive field studies two months after the disaster, Castilla and his research team noted that only 8–12 (about 6%) of the 200 deceased were from fishing families. According to Castilla, this low figure can be explained by the existence of strong social networks and local knowledge passed on from generation to generation. As one artisanal fisherman in the study summarized a shared local saying:

“if an earthquake is so strong you can not stand up: run to the hills”

Luckily, February 27th was a night with a full moon, which made it easier for people to run for protection in the hills. According to Castilla, the combination of the full moon, local knowledge, and strong bonds between neighbors made it possible for members of the fishing communities to act rapidly on the first warning signal: the earthquake itself. The fact that locals had also been taught not to leave the hills until at least a couple of hours after an earthquake helped them avoid the devastating tsunamis that followed. Unfortunately, visitors and tourists in the tsunami-affected coastal areas were not.

Read more:

Marín, A. et al. (2010) “The 2010 tsunami in Chile: Devastation and survival of coastal small-scale fishing communities”, Marine Policy 34: 1381–1384.

Gelcich, S. et al. (2010) “Navigating transformations in governance of Chilean marine coastal resources”, PNAS 107(39): 16794–16799.

See Henrik’s post from just days after the Chilean earthquake here.

OECD global shock reports

The OECD’s Risk Management project has commissioned a number of reports to examine possible future global shocks and how society can become resilient to them.  They write:

The Project … recognises that shocks can provide opportunities for progress, not just negative consequences. Amongst the inputs from which the final report will draw are six background papers and case studies on the following themes: Systemic Financial Risk; Pandemics; Cyber Risks; Geomagnetic Storms; Social Unrest and Anticipating Extreme Events.

I haven’t read these reports (which are available through the links above), but they look interesting.  For example, prolific complexity scientist John Casti wrote the report on Anticipating Extreme Events.

Thanks to Victor Galaz for the tip.

Floods in Brisbane and Brazil

The near-simultaneous floods in Brazil and Brisbane provide a contrast in terms of their impact (and media coverage).  Brisbane is experiencing huge property damage but relatively little loss of life, while Brazil is experiencing large loss of life without as much property damage.

Brazil experienced a much smaller flooded area, but due to the speed of the floods, the terrain, and the vulnerability of the people, many more deaths. Recent reports state the death toll exceeds 500 people, making it Brazil’s deadliest natural disaster (see also BBC). The Christian Science Monitor writes:

Less than a year ago, just a few miles from where this week’s devastation occurred, 160 people died when houses built on top of a hillside garbage dump gave way. Another 250 were killed by mudslides in other parts of the state.

In São Paulo, the two rivers that ring the city routinely burst their banks causing traffic chaos and some neighborhoods spent several weeks under water last year.

Government officials vowed they would review the current procedures that ensure much more money is spent on cleaning up disasters rather than stopping them from happening, with leading Civil Defense official Humberto Vianna telling the government news agency: “[Our] logic needs to be inverted. We are going to prioritize prevention.”

Meanwhile, in Brisbane, Dan Hill of the architecture and urbanism blog City of Sound has written a long, reflective post about the details and feel of the flood:

Part of all this is just Queensland. It comes with the territory, as they say. Comes with the terrain might be a better way of putting it, as Brisbane is basically built in a flood plain. You can’t help but consider the folly of building Australia’s third largest city in a flood plain, but then Melbourne is built on a big old swamp too, so that’s two of them. And Sydney will hardly be immune to rising sea levels.

Brisbane is characterised, like perhaps no other city on earth, by a particular kind of domestic architecture: the Queenslander. This is typically a wooden house with a pitched tin roof overhanging a wrap-around veranda, a cruciform internal layout to enable air flow, and elevated high on stilts to catch the breeze and avoid the bugs. Designed to create good air flows under and through the building, and originally enable people to sleep outside, you see them everywhere across the city. It’s uniquely identified with the city. Over time, they’ve become both coveted and replaced, with good examples being preserved and becoming expensive, and yet many demolished in favour of new builds done in the cheaper ‘slab on ground’ model of building, which is the easiest way of doing it. But guess which is most appropriate for these conditions? Those wooden houses on stilts are often sitting pretty above the rising water at the moment.

There will be much finger-pointing after this, from insurance companies refusing to pay up due to the releases from dams not technically being floods (what on earth else are they then?); from those who point out that, as memory of the ’74 floods faded, developers were allowed to build in flood plains earmarked for further dams; from those pointing out that the floods are a result of climate change (even if these ones aren’t, future ones will be); from those pointing out that the entire fragile mode of suburban development of Australian cities is particularly unsuited to the resilience required of the near-future; that development should not have been allowed on the riversides and basins of floodplains, and so on.

There will be a time for discussing how to achieve more resilient patterns of settlement in Australia. I’m not at all convinced that Australians have the appetite for genuinely addressing this, even despite the floods. Most people are apparently incapable of thinking about the future on the scale required for investment in things like urban resilience, even accepting we need to get better at communicating all this. I’m not sure people see the connection between devastating flooding and a culture where property developers call the shots, where cost drives aspiration in building and infrastructure, and where a car-based fabric of dispersed tarmac’ed low-density communities is virtually the Australian dream. But if it’s not events like this, I’m not sure what else it would take to make this clear and force the issue.


Haiti a year after the quake

The powerful 2010 Haiti earthquake had its epicentre near Port-au-Prince, Haiti’s capital. It killed about 230,000 people, injured another 300,000, and made another 1,000,000 homeless: a huge impact on a country of 10 million. The earthquake also caused an estimated $10 billion worth of damage, more than Haiti’s annual GDP, an enormous burden for a small, poor country.

The Big Picture photoblog has a great collection of photos from a year after the quake:

Soccer players from Haiti's Zaryen team (in blue) and the national amputee team fight for the ball during a friendly match at the national stadium in Port-au-Prince January 10, 2011. Sprinting on their crutches at breakneck speed, the young soccer players who lost legs in Haiti's earthquake last year project a symbol of hope and resilience in a land where so much is broken. (REUTERS/Kena Betancur)

The New York Times has a collection of aerial photos that show Haiti before the quake, immediately after, and now.  They also have the stories of six Haitians in the year after the quake.

NPR has a collection of stories on the post-quake recovery.

Michael K. Lindell writes in Nature Geoscience on the need for earthquake-resilient buildings:

Usually, the poorest suffer the most in disasters that hit developing countries, but this may not have been so in Haiti. The lowest quality housing experienced less damage than many higher quality structures. Specifically, shanty housing made of mixed wood and corrugated metal fared well, as did concrete masonry unit structures made of concrete blocks and corrugated metal roofs. These inexpensive shacks probably had a very low incidence of failure because they are such light structures. At the other extreme, the most expensive seismically designed structures also seem to have performed well, but for quite different reasons. Although they were heavier, they had designs that avoided well-known problems, and the materials used in building were of adequate quality and quantity. The problem seems to have been the moderately expensive structures, built with concrete columns and slabs that were reinforced, but concrete block walls that were not. Such structures frequently experienced severe damage or collapse because their builders cut costs with inadequate designs, materials and construction methods.

The relationship between building cost and seismic safety thus seems to be not just non-linear, but non-monotonic. That is, people can spend their way into hazard vulnerability, not just out of it. To avoid this problem, three main requirements must be met. First, earthquake risk maps are needed to identify the areas where seismic-resistant construction is required. Second, building codes must then be adopted, implemented and enforced. Finally, insurance is required to fund rebuilding after an earthquake in which building codes have saved lives but not buildings.

Today, mitigation of earthquake hazards is not held back primarily by a lack of engineering solutions: architects had access to manuals for seismic-resistant design for nearly 20 years at the time of the Haiti earthquake. But substantial further research is needed to examine how people can be convinced to make use of existing options for achieving physical and financial safety — especially in areas, such as the Central United States New Madrid seismic zone, that have earthquake recurrence intervals of hundreds of years. Implementing risk-management strategies for coping with such low-probability, high-consequence events will require innovative public/private partnerships.

Ultimately, even the poorest countries must regard building codes as necessities, not luxuries. Moreover, even relatively wealthy countries need to develop more effective strategies for managing seismic risks. This will require collaboration among earth scientists, social scientists, earthquake engineers and urban planners.