Category Archives: Visualization

Plutocracy in the USA: four views

1) From State of Working America, an interactive graph of changes in average income and in the top percentiles of US income.

From 1939 to 1973, during the 34 years from the start of WW2 to the first oil crisis, the US’s average income grew by $30,500. 72% of this growth in incomes went to the poorest 90%, and 28% went to the richest 10%.

Over the following 34 years, from 1974 to 2008, the US’s average income grew by $11,000; the average income of the poorest 90% declined, and all of the growth went to the richest 10%, mostly to the richest 1%.

The data on the website comes from economist Emmanuel Saez’s work.

2) Economist Daron Acemoglu of MIT is interviewed by EconTalk about the role income inequality may have played in creating the financial crisis.

… Acemoglu suggests a simpler story where the financial sector through its political influence distorted the rules of the game, benefiting executives in the industry, which in turn led to outsized rewards and ultimate instability in the financial industry.

3) From the Washington Post, Federal investigators expose vast web of insider trading:

“Given the scope of the allegations to date, we are not talking simply about the occasional corrupt individual; we’re talking about something verging on a corrupt business model,” Preet Bharara, the U.S. attorney for the Southern District of New York, said in a statement.

The heightened focus on insider trading by the Justice Department and the Securities and Exchange Commission comes as the financial crisis has shaken confidence in the honesty of the financial markets.

Far from being a victimless crime, insider trading takes advantage of honest investors. In a series of cases, financiers are accused of gaining – or avoiding the loss of – more than $100 million trading such familiar stocks as Google, IBM, Hilton and Intel.

“What’s at stake is the credibility of our markets,” said former Sen. Ted Kaufman (D-Del.), chairman of a panel Congress created to review the Treasury Department’s $700 billion bailout of financial companies. Insider trading, he said, “sends a clear message to people who want to invest in the United States that . . . I’m not going to get a fair shake in the market. And that’s very dangerous.”

4) Economist Robert Shiller is an expert on speculative bubbles. He was recently interviewed by the Browser, recommending five books about human behaviour, inequality and the financial crisis. Among other books he recommends Winner-Take-All Politics (mentioned earlier on Resilience Science). He says:

This is a new book – it just came out. It’s about rising inequality and it traces back to fundamental causes.

… In the US, we’ve seen a rapid concentration of wealth at the extreme high end. The top tenth of a per cent or the top hundredth of a per cent of the population is getting wealthy very fast. They point out that this is not true in Europe, and yet the economies are very similar and growing at similar rates. If the technology is the same, why would there be a difference at the extreme high end? And they argue that the answer is really political. There have been political changes in the US that allow the extreme high end to garner more wealth. Ultimately, it represents a failure of our society to take account of the fact that the extreme high end can lobby and can organise for its own interests, and we’ve let it happen.

Modernist agricultural diversity

Agricultural Fields near Perdizes, Minas Gerais, Brazil from NASA EOS

The visual diversity of the field forms is matched by the variety of crops: sunflowers, wheat, potatoes, coffee, rice, soybeans, and corn are among the products of the region. While the Northern Hemisphere is still in the grip of winter, crops are growing in the Southern Hemisphere, as indicated by the many green fields. Fallow fields—not in active agricultural use—display the violet, reddish, and light tan soils common to this part of Brazil. Darker soils are often rich in iron and aluminum oxides, and are typical of highly weathered soil that forms in hot, humid climates.

Mapping Greenland’s melt

The same Arctic weather patterns that have been cooling northern Europe and the eastern USA have been warming Greenland, as shown in NASA’s image of the day, Record Melting in Greenland during 2010:

2010 was an exceptional year for Greenland’s ice cap. Melting started early and stretched later in the year than usual. Little snow fell to replenish the losses. By the end of the season, much of southern Greenland had set a new record, with melting that lasted 50 days longer than average.

This image was assembled from microwave data from the Special Sensor Microwave/Imager (SSM/I) of the Defense Meteorological Satellites Program. Snow and ice emit microwaves, but the signal is different for wet, melting snow than for dry snow. Marco Tedesco, a professor at the City College of New York, uses this difference to chart the number of days that snow is melting every year. The image above shows 2010 compared to the average number of melt days per year between 1979 and 2009.

When snow melts, the fine, bright powder turns to larger-grained, gravelly snow. These large grains reflect less light, which means that they can absorb more energy and melt even faster. When the annual snow is melted away, parts of the ice cap are exposed. The surface of the ice is also darker than snow. Since dark ice was exposed earlier and longer in 2010, it absorbed more energy, leading to a longer melt season. A fresh coat of summer snow would have protected the ice sheet, but little snow fell.
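The counting idea behind the map can be sketched simply. The grid, data, and threshold below are invented for illustration; Tedesco’s actual classification of wet versus dry snow from SSM/I brightness temperatures is more sophisticated:

```python
import numpy as np

# Invented daily brightness temperatures for one year over a small grid
# (days, rows, cols); the real analysis uses SSM/I passive microwave data.
rng = np.random.default_rng(0)
daily_tb = rng.normal(240.0, 15.0, size=(365, 4, 4))

MELT_THRESHOLD = 255.0  # made-up cutoff: wet, melting snow emits more strongly

melting = daily_tb > MELT_THRESHOLD  # True on days a pixel looks wet
melt_days = melting.sum(axis=0)      # melt days per pixel for the year
print(melt_days)  # the published map is essentially this count for 2010,
                  # shown relative to the 1979-2009 average
```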

Energy intensity convergence

In climate change discussions, energy intensity is the amount of energy required to produce a dollar’s worth of GDP. There are big differences in energy intensity around the world: generally, poor countries are more energy intensive than rich ones, and the US, Canada and Australia are more energy intensive than Europe and Japan. A recent graph from the Economist illustrates how energy intensities are falling and converging, unfortunately at a slower rate than economic growth, meaning that energy use, and hence CO2 emissions, continue to grow.

From the Economist: [graph of falling and converging energy intensities across countries]
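The arithmetic behind this is simple. A minimal sketch, with invented growth rates, of why total energy use keeps rising when intensity falls more slowly than GDP grows:

```python
# Energy intensity: energy used per dollar of GDP.
gdp = 100.0               # arbitrary starting GDP
intensity = 1.0           # arbitrary starting energy per unit of GDP
gdp_growth = 0.03         # hypothetical 3% annual GDP growth
intensity_decline = 0.01  # hypothetical 1% annual decline in intensity

energy0 = gdp * intensity
for year in range(10):
    gdp *= 1 + gdp_growth
    intensity *= 1 - intensity_decline

print(f"Energy use after 10 years: {gdp * intensity / energy0:.2f}x the start")
# -> about 1.22x: energy use (and CO2) still grows despite falling intensity
```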

Reading through computer eyes

by Juan Carlos Rocha (PhD student at Stockholm Resilience Centre working on Regime Shifts)

An n-gram is a sequence of tokens, where a token is a string of characters set off by spaces in a text; a token may be a word, a number or a combination of both. The concept of n-grams simplifies the application of statistical methods for assessing the frequency of a word or phrase in a body of text. N-gram statistical analyses have been around for years, but recently Jean-Baptiste Michel and collaborators had the opportunity to apply n-gram text-analysis techniques to the massive Google Books collection of digitized books. They analyzed over 5 million books, which they estimate to be about 4% of all books ever published, and published their work in Science [doi].
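As a minimal sketch of the underlying idea (not the authors’ actual pipeline), n-gram frequencies can be counted in a few lines of Python; the sample text here is made up:

```python
from collections import Counter

def ngram_counts(text, n):
    """Count n-grams: contiguous runs of n space-separated tokens."""
    tokens = text.split()
    return Counter(" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

# Toy corpus; Michel et al. apply the same idea to ~5 million books.
text = "the cat sat on the mat and the cat slept"
print(ngram_counts(text, 1).most_common(2))  # [('the', 3), ('cat', 2)]
print(ngram_counts(text, 2).most_common(1))  # [('the cat', 2)]
```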

Exploring huge amounts of text, more than any single person could read, makes it possible to trace the use of words over time. This allows researchers to track the impact of events on word use, and even the evolution of language, grammar and culture. For example, by counting the words used in English books, the team found that in the year 2000 the English lexicon had over one million words, and that it has been growing by about 8,500 words per year. They were also able to track word fads, for example changes between regular and irregular verb conjugations over time (e.g. burned vs. burnt). More interestingly, based on particular events and famous names, they found that our collective memory, as recorded in books, has both a short-term and a long-term component: we are forgetting our past faster than before, but we are also learning faster when it comes to, for example, the adoption of new technologies.

The options for reading books with machine eyes do not end there. Censorship during the Nazi regime in Germany was identified by comparing the frequency of authors’ names in the German and English corpora. The researchers could detect the fingerprint of the suppression of a person’s ideas in the language corpus.

The researchers term this quantitative analysis of our historic knowledge and culture, through the analysis of huge amounts of data, ‘culturomics’. They plan for further research to incorporate newspapers, manuscripts, artwork, maps and other human creations. Possible future applications include methods for historical epidemiology (e.g. influenza peaks), the analysis of conflicts and wars, and the evolution of ideas (e.g. feminism). And, I think, why not ecological regime shifts?

Above you can see the frequency of some of the regime shifts we are working with in the English corpus. Soil salinization and lake eutrophication appear in the 1940s and 1960s respectively, probably with the first descriptions of such shifts. Coral bleaching takes off during the 1980s, when reef degradation in the Caribbean basin began to be documented. The concept of a regime shift has likewise been used more and more since the 1980s, probably not only to describe ecological shifts but also political and managerial transitions.

Although the data may be noisy, the frequency of shock events can be tracked as well. Here, for example, we plot ‘oil spill’ and see the peak corresponding to the January 1988 spill in Floreffe, Pennsylvania. Note that it does not show last year’s oil spill in the Gulf of Mexico, because the corpus only extends to 2008.

If you want to play around with your favorite words or theme of interest, have a look at the n-gram viewer at Google Labs and have fun!

Brisbane floods: before and after

From the Australian Broadcasting Corporation, the Brisbane floods: before and after:

High-resolution aerial photos taken over Brisbane last week have revealed the scale of devastation across dozens of suburbs and tens of thousands of homes and businesses.

The aerial photos of the Brisbane floods were taken in flyovers on January 13 and January 14.

See part one and part two.

Haiti a year after the quake

The strong 2010 Haiti earthquake had its epicentre near Port-au-Prince, Haiti’s capital. It killed about 230,000 people, injured another 300,000, and made another 1,000,000 homeless, a huge toll for a country of 10 million. The earthquake also caused an estimated $10 billion worth of damage, more than Haiti’s annual GDP, an enormous blow to a small, poor country.

The Big Picture photoblog has a great collection of photos from a year after the quake:

Soccer players from Haiti’s Zaryen team (in blue) and the national amputee team fight for the ball during a friendly match at the national stadium in Port-au-Prince, January 10, 2011. Sprinting on their crutches at breakneck speed, the young soccer players who lost legs in Haiti’s earthquake last year project a symbol of hope and resilience in a land where so much is broken. (REUTERS/Kena Betancur)

The New York Times has a collection of aerial photos that show Haiti before the quake, immediately after, and now. They also have the stories of six Haitians in the year after the quake.

NPR has a collection of stories on the post-quake recovery.

Michael K. Lindell, writing in Nature Geoscience on the need for earthquake-resilient buildings, notes:

Usually, the poorest suffer the most in disasters that hit developing countries, but this may not have been so in Haiti. The lowest quality housing experienced less damage than many higher quality structures. Specifically, shanty housing made of mixed wood and corrugated metal fared well, as did concrete masonry unit structures made of concrete blocks and corrugated metal roofs. These inexpensive shacks probably had a very low incidence of failure because they are such light structures. At the other extreme, the most expensive seismically designed structures also seem to have performed well, but for quite different reasons. Although they were heavier, they had designs that avoided well-known problems, and the materials used in building were of adequate quality and quantity. It seems to have been the moderately expensive structures that fared worst: those built with concrete columns and slabs that were reinforced, but concrete block walls that were not. Such structures frequently experienced severe damage or collapse because their builders cut costs with inadequate designs, materials and construction methods.

The relationship between building cost and seismic safety thus seems to be not just non-linear, but non-monotonic. That is, people can spend their way into hazard vulnerability, not just out of it. To avoid this problem, three main requirements must be met. First, earthquake risk maps are needed to identify the areas where seismic-resistant construction is required. Second, building codes must then be adopted, implemented and enforced. Finally, insurance is required to fund rebuilding after an earthquake in which building codes have saved lives but not buildings.

Today, mitigation of earthquake hazards is not held back primarily by a lack of engineering solutions: architects had had access to manuals for seismic-resistant design for nearly 20 years by the time of the Haiti earthquake. But substantial further research is needed to examine how people can be convinced to make use of existing options for achieving physical and financial safety — especially in areas, such as the New Madrid seismic zone in the central United States, that have earthquake recurrence intervals of hundreds of years. Implementing risk-management strategies for coping with such low-probability, high-consequence events will require innovative public/private partnerships.

Ultimately, even the poorest countries must regard building codes as necessities, not luxuries. Moreover, even relatively wealthy countries need to develop more effective strategies for managing seismic risks. This will require collaboration among earth scientists, social scientists, earthquake engineers and urban planners.

Peak Travel?

A new paper in Transport Reviews by Adam Millard-Ball and Lee Schipper asks Are We Reaching Peak Travel? Trends in Passenger Transport in Eight Industrialized Countries.

Millard-Ball and Schipper looked at data from 1970-2008 for the United States, Canada, Sweden, France, Germany, the United Kingdom, Japan and Australia. They show that increases in passenger activity have driven energy use in transport, because growth in activity has swamped increases in efficiency. But the relationship between travel and GDP changed during the last decade. Previously, increases in GDP led to increases in travel, but in the last decade travel seems to have plateaued, and this halting of growth does not appear to be due to increases in gas prices. This is shown in Figure 2 of their paper.


One of the challenges in planning for the future is anticipating inflection points in ongoing trends. The paper could have made this point more strongly by comparing predicted vehicle use against actual vehicle use, but that was not its main point.

They write:

As with total travel activity, the recent decline in car and light truck use is difficult to attribute solely to higher fuel prices, as it is far in excess of what recent estimates of fuel price elasticities would suggest. For example, Hughes et al. (2006) estimate the short-run fuel price elasticity in the U.S. to range from -0.034 to -0.077, which corresponds to a reduction in fuel consumption by just over 1% in response to the 15% increase in gasoline prices between 2007 and 2008. In reality, per capita energy use for light-duty vehicles fell by 4.3% over this period.

…[in these countries’ transportation sectors] the major factor behind increasing energy use and CO2 emissions since the 1970s – activity – has ceased its rise, at least for the time being. Should this plateau continue, it is possible that accelerated decline in the energy intensity of car travel, some shifts back to rail and bus modes, and at least somewhat less carbon per unit of energy might leave absolute levels of emissions in 2020 or 2030 lower than today.
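As a rough check on the elasticity arithmetic in the first quoted passage: the predicted change in consumption is simply the elasticity times the percentage price change. A minimal sketch using the quoted numbers:

```python
# Numbers from the quoted passage: Hughes et al.'s short-run elasticities
# and the ~15% gasoline price increase between 2007 and 2008.
price_change = 0.15
for elasticity in (-0.034, -0.077):
    predicted = elasticity * price_change
    print(f"elasticity {elasticity}: predicted change {predicted:+.1%}")
# -> about -0.5% to -1.2%, far short of the observed -4.3% drop in
#    per capita energy use, which is why prices alone can't explain it
```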

via Miller-McCune

Resilience and Regime Shift videos

Kit Hill, a Master’s student at the Stockholm Resilience Centre, made the three short videos below to introduce the concept of resilience and the idea of a regime shift.

Resilience from Kit Hill on Vimeo.

Beating Down Resilience from Kit Hill on Vimeo.

Regime Shift from Kit Hill on Vimeo.

This video was made to help people understand the concept of a social-ecological regime shift.

Kit’s efforts are part of a larger project to create a set of communication and teaching resources that can be used to communicate different aspects of resilience thinking.

For more information on resilience see: the Resilience Alliance, the Stockholm Resilience Centre, Brian Walker and David Salt’s book Resilience Thinking, or the collection of resilience papers on Mendeley.

Hans Rosling animates 200 years of human development

Hans Rosling shows how visualizing public health statistics can communicate development and inequality on the BBC show The Joy of Stats. The BBC writes:

Despite its light and witty touch, the film nonetheless has a serious message – without statistics we are cast adrift on an ocean of confusion, but armed with stats we can take control of our lives, hold our rulers to account and see the world as it really is. What’s more, Hans concludes, we can now collect and analyse such huge quantities of data and at such speeds that scientific method itself seems to be changing.

Resilience Science has featured Hans Rosling’s great work with Gapminder many times before: Hans Rosling at TED, Google gapminder, has the world become a better place?, and visualizing development. Furthermore, this visualization of the huge growth in health and wealth over the past centuries illustrates the point that my co-authors and I made in our Environmentalist’s paradox paper.