Category Archives: Ideas

How to deal with uncertainty in climate change economics

From the economist Martin L. Weitzman's website, a new draft paper, On Modeling and Interpreting the Economics of Catastrophic Climate Change (pdf). He proposes a method for including unlikely but extreme events (fat tails), such as the uncertainty surrounding climate sensitivity, in cost-benefit analyses. Considering the possibility of such events can completely change the results of an analysis, and favour action as a type of catastrophe insurance.

Abstract: Using climate change as a prototype example, this paper analyzes the implications of structural uncertainty for the economics of low-probability high-impact catastrophes. The paper is an application of the idea that having an uncertain multiplicative parameter, which scales or amplifies exogenous shocks and is updated by Bayesian learning, induces a critical tail fattening of posterior-predictive distributions. These fattened tails can have very strong implications for situations (like climate change) where a catastrophe is theoretically possible because prior knowledge cannot place sufficiently narrow bounds on overall damages. The essence of the problem is the difficulty of learning extreme-impact tail behavior from finite data alone. At least potentially, the influence on cost-benefit analysis of fat-tailed uncertainty about climate change, coupled with extreme unsureness about high-temperature damages, can outweigh the influence of discounting or anything else.
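The tail-fattening mechanism Weitzman describes can be illustrated with a small simulation. The sketch below is my own toy example, not his model: a normal likelihood with an uncertain variance and a conjugate inverse-gamma prior yields a Student-t posterior predictive, whose extreme tails carry far more probability than a normal distribution fitted with the point estimate of the variance. All parameter values are arbitrary illustrations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=20)  # a finite sample; the true variance is unknown

# Conjugate inverse-gamma prior on the variance (normal likelihood, mean known = 0)
a0, b0 = 1.0, 1.0
a_n = a0 + len(data) / 2
b_n = b0 + np.sum(data ** 2) / 2

# Posterior predictive: Student-t with 2*a_n degrees of freedom (fat tails)
predictive = stats.t(df=2 * a_n, scale=np.sqrt(b_n / a_n))
# Naive alternative: plug the point estimate of sigma into a normal,
# ignoring parameter uncertainty
plug_in = stats.norm(scale=data.std(ddof=1))

catastrophe = 6.0  # an outcome many standard deviations out in the tail
tail_ratio = predictive.sf(catastrophe) / plug_in.sf(catastrophe)
print(tail_ratio)  # the t predictive puts vastly more mass on the extreme tail
```

The point of the toy example is Weitzman's: no finite sample can rule out a large scale parameter, so the learned predictive distribution never loses its fat tail, and the probability assigned to catastrophic outcomes stays orders of magnitude above the thin-tailed plug-in estimate.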

The paper concludes:

In principle, what might be called the catastrophe-insurance aspect of such a fat-tailed unlimited-exposure situation, which can never be fully learned away, can dominate the social-discounting aspect, the pure-risk aspect, or the consumption-smoothing aspect. Even if this principle in and of itself does not provide an easy answer to questions about how much catastrophe insurance to buy (or even an easy answer in practical terms to the question of what exactly is catastrophe insurance buying for climate change or other applications), I believe it still might provide a useful way of framing the economic analysis of catastrophes.

Arctic sea ice: is it tipped yet?

RealClimate reports from the AGU about Arctic sea ice: is it tipped yet?

The summer of 2007 was apocalyptic for Arctic sea ice. The coverage and thickness of sea ice in the Arctic has been declining steadily over the past few decades, but this year the ice lost an area about the size of Texas, reaching its minimum on about the 16th of September. Arctic sea ice seems to me the best and most imminent example of a tipping point in the climate system. A series of talks aimed to explain the reason for the meltdown.

The disappearance of the ice was set up by warming surface waters and loss of the thicker multi-year ice in favor of thinner single-year ice. But the collapse of ice coverage this year was also something of a random event. This change was much more abrupt than the averaged results of the multiple IPCC AR4 models, but if you look at individual model runs, you can find sudden decreases in ice cover such as this. In the particular model run which looks most like 2007, the ice subsequently recovered somewhat, although never regaining the coverage before the meltback event.

So what is the implication of the meltback, the prognosis for the future? Has the tipping point tipped yet? When ice melts, it allows the surface ocean to begin absorbing sunlight, potentially locking in the ice-free condition. Instead of making his own prognosis, Overland allowed the audience to vote on it. The options were

* A The meltback is permanent
* B Ice coverage will partially recover but continue to decrease
* C Ice coverage will recover to 1980s levels but then continue to decline over the coming century

Options A and B had significant audience support, while only one brave soul voted for the most conservative option C. No one remarked that the “skeptic” possibility, that Arctic sea ice is not melting back at all, was not even offered or asked for. Climate scientists have moved beyond that.

For more coverage see Nature’s Great Beyond.

David Quammen on Emerging Infectious Disease

From National Geographic

David Quammen writes about emerging infectious diseases in National Geographic (Oct 2007):

Infectious disease is all around us. Infectious disease is a kind of natural mortar binding one creature to another, one species to another, within the elaborate edifices we call ecosystems. It’s one of the basic processes that ecologists study, including also predation, competition, and photosynthesis. Predators are relatively big beasts that eat their prey from outside. Pathogens (disease-causing agents, such as viruses) are relatively small beasts that eat their prey from within. Although infectious disease can seem grisly and dreadful, under ordinary conditions it’s every bit as natural as what lions do to wildebeests, zebras, and gazelles.

But conditions aren’t always ordinary.

Just as predators have their accustomed prey species, their favored targets, so do pathogens. And just as a lion might occasionally depart from its normal behavior—to kill a cow instead of a wildebeest, a human instead of a zebra—so can a pathogen shift to a new target. Accidents happen. Aberrations occur. Circumstances change and, with them, opportunities and exigencies also change. When a pathogen leaps from some nonhuman animal into a person, and succeeds there in making trouble, the result is what’s known as a zoonosis.

The word zoonosis is unfamiliar to most people. But it helps clarify the biological reality behind the scary headlines about bird flu, SARS, other forms of nasty new disease, and the threat of a coming pandemic. It says something essential about the origin of HIV. It’s a word of the future, destined for heavy use in the 21st century.

Close contact between humans and other species can occur in various ways: through killing and eating of wild animals (as in Mayibout II), through caregiving to domestic animals (as in Hendra), through fondling of pets (as with monkeypox, brought into the American pet trade by way of imported African rodents), through taming enticements (feeding bananas to the monkeys at a Balinese temple), through intensive animal husbandry combined with habitat destruction (as on Malaysian pig farms), and through any other sort of disruptive penetration of humans into wild landscape—of which, needless to say, there’s plenty happening around the world. Once the contact has occurred and the pathogen has crossed over, two other factors contribute to the possibility of cataclysmic consequences: the sheer abundance of humans on Earth, all available for infection, and the speed of our travel from one place to another. When a bad new disease catches hold, one that manages to be transmissible from person to person by a handshake, a kiss, or a sneeze, it might easily circle the world and kill millions of people before medical science can find a way to control it.

But our safety, our health, isn’t the only issue. Another thing worth remembering is that disease can go both ways: from humans to other species as well as from them to us. Measles, polio, scabies, influenza, tuberculosis, and other human diseases are considered threats to non-human primates. The label for those infections is anthropozoonotic. Any of them might be carried by a tourist, a researcher, or a local person, with potentially devastating impacts on a tiny, isolated population of great apes with a relatively small gene pool, such as the mountain gorillas of Rwanda or the chimps of Gombe. 

Food prices rising due to increases in meat consumption and biofuels

The Economist (Dec 6th 2007) writes about how global agricultural prices are Cheap no more:


…what is most remarkable about the present bout of “agflation” is that record prices are being achieved at a time not of scarcity but of abundance. According to the International Grains Council, a trade body based in London, this year’s total cereals crop will be 1.66 billion tonnes, the largest on record and 89m tonnes more than last year’s harvest, another bumper crop. That the biggest grain harvest the world has ever seen is not enough to forestall scarcity prices tells you that something fundamental is affecting the world’s demand for cereals.

Two things, in fact. One is increasing wealth in China and India. This is stoking demand for meat in those countries, in turn boosting the demand for cereals to feed to animals. The use of grains for bread, tortillas and chapattis is linked to the growth of the world’s population. It has been flat for decades, reflecting the slowing of population growth. But demand for meat is tied to economic growth (see chart 1) and global GDP is now in its fifth successive year of expansion at a rate of 4%-plus.

Higher incomes in India and China have made hundreds of millions of people rich enough to afford meat and other foods. In 1985 the average Chinese consumer ate 20kg (44lb) of meat a year; now he eats more than 50kg. China’s appetite for meat may be nearing satiation, but other countries are following behind: in developing countries as a whole, consumption of cereals has been flat since 1980, but demand for meat has doubled.

Not surprisingly, farmers are switching, too: they now feed about 200m-250m more tonnes of grain to their animals than they did 20 years ago. That increase alone accounts for a significant share of the world’s total cereals crop. Calorie for calorie, you need more grain if you eat it transformed into meat than if you eat it as bread: it takes three kilograms of cereals to produce a kilo of pork, eight for a kilo of beef. So a shift in diet is multiplied many times over in the grain markets. Since the late 1980s an inexorable annual increase of 1-2% in the demand for feedgrains has ratcheted up the overall demand for cereals and pushed up prices.
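To see how strongly the diet shift multiplies through to grain demand, here is a back-of-envelope sketch using the article's own conversion ratios (three kilograms of cereals per kilo of pork, eight per kilo of beef); the 30 kg diet shift is a hypothetical illustration, not a figure from the article:

```python
# Cereal required per kg of each food, from the article's figures
CEREAL_PER_KG = {"bread": 1.0, "pork": 3.0, "beef": 8.0}

def grain_demand(diet_kg: dict) -> float:
    """Cereal (kg) needed to supply a diet given in kg of each food."""
    return sum(kg * CEREAL_PER_KG[food] for food, kg in diet_kg.items())

# A hypothetical consumer replacing 30 kg of direct cereal consumption with meat
grain_as_bread = grain_demand({"bread": 30})  # 30 kg of cereals
grain_as_pork = grain_demand({"pork": 30})    # 90 kg of cereals
grain_as_beef = grain_demand({"beef": 30})    # 240 kg of cereals
print(grain_as_bread, grain_as_pork, grain_as_beef)
```

The same quantity of food on the plate draws three to eight times as much grain out of the market, which is why a slow shift in diet shows up so strongly in cereal demand.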

Because this change in diet has been slow and incremental, it cannot explain the dramatic price movements of the past year. The second change can: the rampant demand for ethanol as fuel for American cars. In 2000 around 15m tonnes of America’s maize crop was turned into ethanol; this year the quantity is likely to be around 85m tonnes. America is easily the world’s largest maize exporter—and it now uses more of its maize crop for ethanol than it sells abroad.

Ethanol is the dominant reason for this year’s increase in grain prices. It accounts for the rise in the price of maize because the federal government has in practice waded into the market to mop up about one-third of America’s corn harvest. A big expansion of the ethanol programme in 2005 explains why maize prices started rising in the first place.

Ethanol accounts for some of the rise in the prices of other crops and foods too. Partly this is because maize is fed to animals, which are now more expensive to rear. Partly it is because America’s farmers, eager to take advantage of the biofuels bonanza, went all out to produce maize this year, planting it on land previously devoted to wheat and soyabeans. This year America’s maize harvest will be a jaw-dropping 335m tonnes, beating last year’s by more than a quarter. The increase has been achieved partly at the expense of other food crops.

Guess who loses
According to the World Bank, 3 billion people live in rural areas in developing countries, of whom 2.5 billion are involved in farming. That 3 billion includes three-quarters of the world’s poorest people. So in principle the poor overall should gain from higher farm incomes. In practice many will not. There are large numbers of people who lose more from higher food bills than they gain from higher farm incomes. Exactly how many varies widely from place to place.

Among the losers from higher food prices are big importers. … some of the poorest places in Asia (Bangladesh and Nepal) and Africa (Benin and Niger) also face higher food bills. Developing countries as a whole will spend over $50 billion importing cereals this year, 10% more than last.

In every country, the least well-off consumers are hardest hit when food prices rise. This is true in rich and poor countries alike but the scale in the latter is altogether different. As Gary Becker, a Nobel economics laureate at the University of Chicago, points out, if food prices rise by one-third, they will reduce living standards in rich countries by about 3%, but in very poor ones by over 20%.
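Becker's figures follow from simple budget-share arithmetic. In the sketch below, the food shares (roughly 10% of spending in rich countries, 60% among the very poor) are my own illustrative assumptions, chosen to reproduce his numbers; to first order, the hit to living standards is the food share times the price rise:

```python
def living_standard_hit(food_share: float, price_rise: float) -> float:
    """First-order fall in real living standards when food prices rise,
    holding nominal income and consumption quantities fixed."""
    return food_share * price_rise

price_rise = 1 / 3  # food prices up by one-third
rich = living_standard_hit(0.10, price_rise)  # food ~10% of a rich-country budget
poor = living_standard_hit(0.60, price_rise)  # food ~60% of the poorest budgets
print(rich, poor)  # roughly 3% versus 20%, matching Becker's figures
```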

Stern: In Bali the rich must pay

Nicholas Stern, former chief economist of the World Bank, who led the Stern review on the economics of climate change, writes in the Guardian (Nov 30, 2007) that in Bali the rich must pay to produce a system to tackle climate change that is effective, efficient and equitable. He argues that a fair and global effort to tackle climate change needs wealthy states to take the lead in CO2 cuts:

The Bali summit on climate change, which starts next week, will seek to lay the foundations for a new global agreement on reducing the greenhouse gas emissions that cause rising temperatures and climate change. Ambitious targets for emission reduction must be at the heart of that agreement, together with effective market mechanisms that encourage emission trading between countries, rich and poor. The problem of climate change involves a fundamental failure of markets: those who damage others by emitting greenhouse gases generally do not pay. Climate change is a result of the greatest market failure the world has seen. The evidence on the seriousness of the risks from inaction is now overwhelming. We risk damage on a scale larger than the two world wars of the past century. The problem is global and the response must be collaboration on a global scale. The rich countries must lead the way in taking action. And in thinking about global action to reduce greenhouse gas emissions, we must invoke three basic criteria.

The first is effectiveness: the scale of the response must be commensurate with the challenge. This means setting a target for emission reduction that can keep the risks at acceptable levels.

The overall targets of 50% reductions in emissions by 2050 (relative to 1990) agreed at the G8 summit in Heiligendamm last June are essential if we are to have a reasonable chance of keeping temperature increases below 2C or 3C. While these targets involve strong action, they are not overambitious relative to the risk of failing to achieve them.

The second criterion is efficiency: we must keep down the costs of emission reduction, using prices or taxes wherever possible. Emission trading between countries must be a central part of the story. And helping poor countries cover their costs of emission reduction gives them an incentive to join a global deal.

Third, we should be concerned about equity. Our starting point is deeply inequitable with poor countries certain to be hit earliest and hardest by climate change. But rich countries are responsible for the bulk of past emissions: US emissions are currently more than 20 tonnes of CO2 equivalent per annum, Europe’s are 10-15 tonnes, China’s five or more tonnes, India’s around one tonne, and most of Africa much less than one.

For a 50% reduction in global emissions by 2050, the world average per capita must drop from seven tonnes to two or three. Within these global targets, even a minimal view of equity demands that the rich countries’ reductions should be at least 80% – either made directly or purchased. An 80% target for rich countries would bring equality of only the flow of current emissions – around the two to three tonnes per capita level. In fact, they will have consumed the big majority of the available space in the atmosphere.
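The arithmetic behind these targets is straightforward. In the sketch below, the starting per-capita figures come from the article (I take 12 tonnes as a midpoint of Stern's 10-15 tonne range for Europe, my own assumption), and the 80% cut is applied uniformly:

```python
cut = 0.80
per_capita_now = {"US": 20.0, "Europe": 12.0}  # tonnes CO2e per person per year
after = {region: round(t * (1 - cut), 2) for region, t in per_capita_now.items()}
print(after)  # Europe lands in the 2-3 tonne band; the US, starting higher, does not

# Cut actually required to reach a 2.5-tonne per-capita flow
required = {region: 1 - 2.5 / t for region, t in per_capita_now.items()}
print(required)  # US needs ~87.5%, Europe ~79%, hence "at least 80%"
```

A uniform 80% cut brings Europe to about 2.4 tonnes but leaves the US at 4, which is why Stern frames 80% as a minimum for rich countries rather than a sufficient target.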

Rich countries also need to provide funding for three more key elements of a global deal. First, there should be an international programme to combat deforestation, which contributes 15-20% of emissions. For $10bn-$15bn per year, half the deforestation could be stopped.

Second, there needs to be promotion of rapid technological advance to mitigate the effects of climate change. The development of technologies must be accelerated and methods found to promote their sharing. Carbon capture and storage for coal (CCS) is particularly urgent since coal-fired electric power is currently the dominant technology around the world, and emerging nations will be investing heavily in these technologies. For $5bn a year, it should be possible to create 30 commercial-scale coal-fired CCS stations within seven or eight years.

Finally, rich countries should honour their commitment to 0.7% of GDP in aid by 2015. This would yield increases in flows of $150bn-$200bn per year. The extra costs that developing countries face as a result of climate change are likely to be upwards of $80bn a year, and it is vital that extra resources are available. This proposed programme of action can be built if rich countries take a lead in Bali on their targets, the promotion of trading mechanisms and funding for deforestation and technology. With leadership and the right incentives, developing countries will join.

The building of the deal, and its enforcement, will come from the willing participation of countries driven by the understanding that action is vital. It will not be a wait-and-see game as in World Trade Organisation talks, where nothing is done until everything is settled.

The necessary commitments are increasingly being demonstrated by political action and elections around the world. A clear idea of where we are going as a world will make action at the individual, community and country level much easier and more coherent.

These commitments must, of course, be translated into action. There is a solution in our hands. It will not be easy to build. But the alternative is too destructive to accept. Bali is an opportunity to draw the outline of a common understanding, which will both guide action now and build towards the deal.

via Globalization and Environment

Climate change: What to do in Bali? Avoid rearranging the deckchairs

Soon the international climate policy community will meet under the UN's framework convention on climate change in Bali, where representatives of the world's nations will attempt to forge an effective international strategy to succeed the Kyoto protocol when it expires in 2012. There has been a lot of thinking in recent years on what form this agreement should take, and strong statements from the world's scientific community that the world requires immediate reductions in CO2 emissions. The head of the IPCC, Rajendra Pachauri, said "If there's no action before 2012, that's too late. What we do in the next two to three years will determine our future. This is the defining moment."

British social scientists Gwyn Prins and Steve Rayner recently wrote a commentary in Nature, Time to ditch Kyoto (Oct 25 2007):

The Kyoto Protocol is a symbolically important expression of governments’ concern about climate change. But as an instrument for achieving emissions reductions, it has failed. It has produced no demonstrable reductions in emissions or even in anticipated emissions growth. And it pays no more than token attention to the needs of societies to adapt to existing climate change. The impending United Nations Climate Change Conference being held in Bali in December — to decide international policy after 2012 — needs to radically rethink climate policy.

Influenced by three major policy initiatives of the 1980s, the Kyoto strategy is elegant but misguided. Ozone depletion, acid rain and nuclear arms control are difficult problems, but compared to climate change they are relatively simple. Ozone depletion could be prevented by controlling a small suite of artificial gases, for which technical substitutes could be found. Acid rain was mainly caused by a single activity in a single industrial sector (power generation) and nuclear arms reductions were achieved by governments agreeing to a timetable for mutually verifiable reductions in warheads. None of this applies to global warming.

In practice, Kyoto depends on the top-down creation of a global market in carbon dioxide by allowing countries to buy and sell their agreed allowances of emissions. But there is little sign of a stable global carbon price emerging in the next 5–10 years. Even if such a price were to be established, it is likely to be modest — sufficient only to stimulate efficiency gains. Without a significant increase in publicly funded research and development (R&D) for clean energy technology and changes to innovation policies, there will be considerable delay before innovation catches up with this modest price signal.

Sometimes the best line of attack is not head-on. Indirect measures can deliver much more: these range from informational instruments, such as labelling of consumer products; market instruments, such as emissions trading; and market stimuli, such as procurement programmes for clean technologies; to a few command-and-control mechanisms, such as technology standards. The benefit of this approach is that it focuses on what governments, firms and households actually do to reduce their emissions, in contrast to the directive target setting that has characterized international discussions since the late 1980s.

Because no one can know beforehand the exact consequences of any portfolio of policy measures, with a bottom-up approach, governments would focus on navigation, on maintaining course and momentum towards the goal of fundamental technological change, rather than on compliance with precise targets for emissions reductions. The flexibility of this inelegant approach would allow early mitigation efforts to serve as policy experiments from which lessons could be learned about what works, when and where. Thus cooperation, competition and control could all be brought to bear on the problem.

Does the Kyoto bandwagon have too much political momentum? We hope not. It will take courage for a policy community that has invested much in boosting Kyoto to radically rethink climate policy and adopt a bottom-up ‘social learning’ approach. But finding a face-saving way to do so is imperative. Not least, this is because today there is strong public support for climate action; but continued policy failure ‘spun’ as a story of success could lead to public withdrawal of trust and consent for action, whatever form it takes.

Nature has a follow-up discussion on this commentary on their climate blog ClimateFeedback.

A recent issue of Nature (15 November 2007) includes a letter from German climate scientist and policy advisor John Schellnhuber in which he responds. In Kyoto: no time to rearrange deckchairs on the Titanic, he writes:

Gwyn Prins and Steve Rayner … manage to be perfectly right and utterly wrong at the same time. Their criticism of the bureaucratic Kyoto Protocol is justified on many crucial points (although they don’t mention that the physical impact of the protocol on the climate system would be negligible even if it worked). The novelty of this summary of well-known deficiencies in the treaty is that the list comes from independent European scientists rather than White House mandarins. Is there anything substantially new beyond that provocation?

Yes, in the sense that Prins and Rayner boldly propagate a “bottom-up ‘social learning’ ” approach to climate policy that aspires to “put public investment in energy R&D on a wartime footing”. I agree with the importance of both elements to twenty-first century climate protection, but doubt whether there is a solid causal chain linking them. Fine-scale measures and movements towards sustainability, as well as technological and institutional innovation strategies, are needed to decarbonize our industrial metabolism and to force policy-makers to face the challenges ahead. …

Time is crucial, however. It is unlikely that a bottom-up, multi-option approach alone will be able to mobilize war-level climate-protection efforts by all the major emitters (including Russia, China and India) within the one or two decades left to avert an unmanageable planetary crisis. Without a ‘global deal’ — designed for effectiveness, efficiency and fairness and providing a framework to accommodate every nation — there will be neither sufficient pressure nor appropriate orientation towards the climate solutions we desperately need. The bottom-up and top-down approaches are complementary and must be pursued interactively.

Kyoto is simply a miserable precursor of the global regime intended to deliver genuine climate stabilization — and was never expected to be more. "Ditching" it now would render all the agonies involved completely meaningless after the event, denying the entire process of policy evolution the slightest chance to succeed. So, instead of rearranging the deckchairs on the Titanic through social learning, let us ditch pusillanimity.

Discussion of Scott’s Seeing Like a State

In the late 1990s James Scott wrote a very interesting book, Seeing Like a State: How certain schemes to improve the human condition have failed, about the failure of bureaucratic planning to accommodate local, tacit knowledge that doesn't easily fit within bureaucratic systems.

The failure of bureaucratic management to cope with social-ecological diversity is a strong theme in studies of common property and human ecology. I read the book from this perspective, informed by Holling's pathology of natural resource management, and found much in the book that was congruent with the pathology. One example is its description of scientific forestry in Germany, which simplified the forest to such an extent that foresters had to encourage local school children to raise bees for pollination. Others have read the book from different perspectives, and their responses are interesting. Below I quote from the comments of an economist and a political scientist who noticed different parts of the book.

Economist Brad DeLong criticizes the book's lack of engagement with economic thought on the collective problem solving ability of individuals. He writes:

The key fault of what Scott calls “high modernism” is its belief that details don’t matter–that planners decree from on high, people obey, and utopia results. Note that Scott’s conclusion is not just that attempts at high-modernist centrally-planned social-engineering have failed. It is–as von Mises argued 70 years ago–they are always overwhelmingly likely to fail. As Scott puts it:

… [the] larger point [is that]… [i]n each case, the necessarily thin, schematic model of social organization and production animating the planning was inadequate as a set of instructions for creating a successful social order. By themselves, the simplified rules can never generate a functioning community, city, or economy. Formal order, to be more explicit, is always and to some degree parasitic on informal processes, which the formal scheme does not recognize, without which it could not exist, and which it alone cannot create or maintain (p. 310).

Yet even as he makes his central points, Scott appears unable to make contact with his intellectual roots–thus he is unable to draw on pieces of the Austrian argument as it has been developed over the past seventy years. Just as seeing like a state means that you cannot see the local details of what is going on, so seeing like James Scott seems to me that you cannot see your intellectual predecessors.

That the conclusion is so strong where the evidence is so weak is, I think, evidence of profound subconscious anxiety: subconscious fear that recognizing that one’s book is in the tradition of the Austrian critique of the twentieth century state will commit one to becoming a right-wing inequality-loving Thatcher-worshiping libertarian (even though there are intermediate positions: you can endorse the Austrian critique of central planning without rejecting the mixed economy and the social insurance state).

And when the chips are down, this recognition is something James Scott cannot do. At some level he wishes–no matter what his reason tells him–to take his stand on the side of the barricades with the revolutionaries and their tools to build utopia. He ends the penultimate chapter of his book with what can only be called a political pledge-of-allegiance:

Revolutionaries have had every reason to despise the feudal, poverty-stricken, inegalitarian past that they hoped to banish forever, and sometimes they have also had a reason to suspect that immediate democracy would simply bring back the old order. Postindependence leaders in the nonindustrial world (occasionally revolutionary leaders themselves) could not be faulted for hating their past of colonial domination and economic stagnation, nor could they be faulted for wasting no time or democratic sentimentality on creating a people that they could be proud of (p. 341).

But then comes the chapter’s final sentence: “Understanding the history and logic of their commitment to high-modernist goals, however, does not permit us to overlook the enormous damage that their convictions entailed when combined with authoritarian state power” (p. 341).

Political Scientist Henry Farrell responds by arguing that “rational planning” that ignores local conditions is not just a problem of state planning:

What Scott argues, as I understand it is as follows. First – that processes of rationalization lead to the destruction of metis, or local knowledge if you would prefer, and the prioritization of codifiable, quantifiable, epistemic knowledge. Second, that this process involves obvious and (sometimes quite important) trade-offs, but may often be worth it – e.g. there is no point in idealizing serf-like conditions that preserve local knowledge at the expense of human freedom. Third, that the real problem is when the creation of epistemic knowledge is combined with high modernist attempts to engage in social engineering. This arrives at similar conclusions to Hayek etc about how terrible collectivization processes are, but from different premises. Specifically, what Hayek etc would see as the result of state planning, Scott sees as the result of broader forms of rationalization (hence, perhaps, the linkages to Foucault that Brad worries about) when they coincide with a certain kind of state hubris (the hubris doesn’t necessarily follow from the creation of codifiable knowledge).

Thus, I think there is an argument against the Hayekians which is not very far from the surface of Seeing Like a State and which can be drawn out quite easily. First – Scott makes it clear that the processes of market development and of state imposition of standards go hand in hand. Brad talks about how the very first example that Scott draws on – German scientific forestry in the nineteenth century – is intended to show the failures of state planning. But as Scott makes clear, the relevant failures are driven as much by the market as by the state – Scott writes about how the “utilitarian state could not see the real, existing forest for the (commercial trees)” and about how the forest as a habitat disappears and is replaced by the forest as an economic resource to be managed efficiently and profitably. Here, fiscal and commercial logics coincide; they are both resolutely fixed on the bottom line.

This is an important sub-theme of the book, and indeed of our understanding of how states and markets have developed hand-in-hand. Sometimes, the state has sought to impose its view for reasons of its own interest and survival (whether this be the promotion of ‘public order,’ the increase of fiscal revenues or whatever), sometimes at the behest of market actors who are interested in standardization, and sometimes for rationales that blur these two together.

This leads on to the second point – that a lot of what Scott argues is correct. His claim, as I read it is less about the specific problems of state-created institutions, than the ways in which a large variety of abstracting institutions or standards miss out on, and perhaps undermine important forms of local knowledge. As I understand him, any standards sufficient for impersonal exchange are likely to abstract away the actual relationships that people have with their environment. Here, Scott is less a closet-Hayekian than a more-or-less-overt Polanyian, who develops some of Polanyi’s arguments (especially his claims about the institutional consequences of long distance trade, and the economy as an instituted process) to make them sharper and more interesting.

Another, more homely example is food. Brad criticizes Scott’s discussion of the much-cited tasteless tomato, arguing that it is an example of market success rather than failure – people bought tasteless tomatoes because they were cheap. This seems to me to have a bit of a flavor of a revealed preferences argument, and also to miss the point. I lived in Florence for three years, a city which has cheap and delicious tomatoes, despite being some distance from the parts of Italy where tomatoes are grown. While I can’t prove it, I strongly suspect that the deliciousness of the tomatoes had a lot to do with informal relationships between the small shops where you bought the tomatoes, the small companies that delivered them, and the small farms from which they were bought. Certainly, this would be consonant with the research that I and many others have done on the Italian political economy and how it works. Italy protects small businesses and local communities in a lot of ways. This means that it misses out badly on certain economies of scale. It also means that certain kinds of high quality production are possible in Italy that are difficult or impossible to replicate elsewhere – a myriad of small firms cooperating to produce final goods through purely informal means. Hence the success, for example, of Italian sunglasses, shoes, and (the rather unglamorous topic of my own research) packaging machinery. All of these build on forms of informal knowledge that would likely be damaged in a more standard market economy, where collaboration happened (to the extent that it did) within the hierarchy of the firm, or through arms-length contracts.

Thus, there are trade-offs. Italian firms in small-firm districts are excellent at gradual innovation and refinement of knowledge – in part because of their reliance on metis. They are not so good at producing profound, industry-changing forms of innovation. They also tend to stick closer to home than their equivalents in other countries (somewhat ironically, they replicate the logic of Avner Greif’s mediaeval Maghribi merchants far more than the behaviour of his Genoese traders).

…This allows me to come back to the roots of my disagreement with Brad. Brad is a fan of markets, and believes that they contribute in very important ways to human freedom. I agree with him on this. But I think that Brad sometimes underemphasizes the real trade-offs that markets may involve, and overstates his criticisms of people who are concerned with these trade-offs. Sometimes, perhaps often, these trade-offs are relatively slight – as Brad says, many forms of redundant local knowledge can be discarded without compunction. Sometimes, these trade-offs are real, but still worthwhile – while we should acknowledge the costs of markets, we should acknowledge that the benefits of introducing them are higher. And sometimes they are not worth paying – there are areas of social life where marketization has more downsides than advantages.

Climate Change Escapism

In Spain, Greenpeace has published a short photo book, Photoclima, which uses IPCC estimates and photomontages to show six Spanish landscapes under a changed climate. The book is bilingual in Spanish and English.

By Pedro Armestre and Mario Gómez. La Manga del Mar Menor, Murcia, now and after a few decades of climate change.

On BLDGBLOG, Geoff Manaugh comments on this project, and on how not to envision the future, in Climate Change Escapism:

The basic idea here is that these visions of flooded resort hotels, parched farmlands, and abandoned villages, half-buried in sand, will inspire us to take action against climate change. Seeing these pictures, such logic goes, will traumatize people into changing how they live, vote, consume, and think. You can visually shock them into action, in other words: one or two glimpses of pictures like these and you’ll never think the same way about climate change again.
But I’m not at all convinced that that’s what these images really do.

In fact, these and other visions of altered planetary conditions might inadvertently be stimulating people’s interest in experiencing the earth’s unearthly future. Why travel to alien landscapes when you can simply hang around, driving your Hummer…?

Climate change is the adventure tour of a lifetime – and all it requires is that you wait. Then all the flooded hotels of Spain and south Florida will be yours for the taking.
Given images like these, the future looks exciting again.

Of course, such thinking is absurd; thinking that flooded cities and continent-spanning droughts and forest fires will simply be a convenient way to escape your mortgage payments is ridiculous. Viewing famine, mass extinction, and global human displacement into diarrhea-wracked refugee camps as some sort of Outward Bound holiday – on the scale of a planet – overlooks some rather obvious downsides to the potentially catastrophic impact of uncontrolled climate alteration.

Whether you’re talking about infant mortality, skin cancer, mass violence and rape, waterborne diseases, vermin, blindness, drowning, and so on, climate change entails radically negative effects that aren’t being factored into these escapist thought processes.

But none of those things are depicted in these images.

These images, and images like them, don’t show us identifiable human suffering.

Building Transformation: CO2 emissions and change

According to the US government’s new report North American Carbon Budget and Implications for the Global Carbon Cycle, buildings in North America contribute 37% of the continent’s total CO2 emissions, while US buildings alone correspond to 10% of all global emissions (for more see Andrew Revkin’s weblog). This means that improving the environmental efficiency (in terms of carbon intensity) of US buildings has great potential to reduce global emissions. The summary of Chapter 9 of the report states:

The buildings sector of North America was responsible for annual carbon dioxide emissions of 671 million tons of carbon in 2003, which is 37% of total North American carbon dioxide emissions and 10% of global emissions. United States buildings alone are responsible for more carbon dioxide emissions than total carbon dioxide emissions of any other country in the world, except China.

USA CO2 emission sources

Carbon dioxide emissions from energy use in buildings in the United States and Canada increased by 30% from 1990 to 2003, an annual growth rate of 2.1% per year. Carbon dioxide emissions from buildings have grown with energy consumption, which in turn is increasing with population and income. Rising incomes have led to larger residential buildings and increased household appliance ownership.

These trends are likely to continue in the future, with increased energy efficiency of building materials and equipment and slowing population growth, especially in Mexico, only partially offsetting the general growth in population and income.

Options for reducing the carbon dioxide emissions of new and existing buildings include increasing the efficiency of equipment and implementing insulation and passive design measures to provide thermal comfort and lighting with reduced energy. Current best practices can reduce emissions from buildings by at least 60% for offices and 70% for homes. Technology options could be supported by a portfolio of policy options that take advantage of cooperative activities, avoid unduly burdening certain sectors, and are cost effective.
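As a back-of-envelope check, the growth figures quoted above are internally consistent. The sketch below (Python; the variable names are mine, the numbers are from the report's summary) verifies that 2.1% a year compounds to roughly 30% over 1990–2003, and backs out total North American emissions from the 37% share:

```python
# Figures quoted from the Chapter 9 summary above.
years = 2003 - 1990                 # 13 years
annual_rate = 0.021                 # 2.1% per year

# (1 + 0.021)^13 - 1 ~= 0.31, consistent with the reported "30%" growth.
total_growth = (1 + annual_rate) ** years - 1

# Buildings emitted 671 Mt C in 2003, which was 37% of the North American total,
# implying total North American emissions of roughly 1,800 Mt C.
buildings_emissions = 671.0         # million tons of carbon, 2003
buildings_share = 0.37
total_north_america = buildings_emissions / buildings_share

print(f"compound growth: {total_growth:.1%}")
print(f"implied NA total: {total_north_america:.0f} Mt C")
```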

On WorldChanging, Patrick Rollens writes about the scale of expected construction in the USA. While construction contributes to CO2 emissions, new infrastructure that is CO2 neutral or negative can substantially reduce emissions. Rollens reports on estimates suggesting that about half of all buildings existing in 25 years will be new. This offers a great opportunity not only for green building but also for building greener urban areas. In Remaking the Built Environment by 2030 he writes:

By 2030, about half of the buildings in America will have been built after 2000. This statistic, courtesy of Professor Arthur C. Nelson’s report for the Brookings Institution, means that over the next 25 years, we will be responsible for re-creating half the volume of our built environment.

The report has been around since 2004, but Nelson re-examined his own findings last year to see if the housing market’s downturn impacted the forecast. The sheer volume was essentially unchanged, and the mainstreaming of the green movement that’s occurred in the last two years presents a colossal challenge–and a magnificent opportunity–for the burgeoning sustainable building industry.

Nelson’s report states that the country will need about 427 billion square feet of space (up from 2000’s total volume of just 300 billion). Moreover, only a small portion of this space can be acquired by renovating existing real estate. We’re already well on our way; the U.S. Green Building Council estimates that we’re developing about twice as fast as the associated population growth. Every new building built between now and 2030 should be seen as an opportunity to push the envelope and transform our structured world.
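Taking Nelson's figures at face value, here is a rough sketch of what "half the 2030 stock built after 2000" implies for new construction and building turnover (the variable names and the decomposition are mine, not Nelson's):

```python
# Figures quoted from Nelson's Brookings report (billion square feet).
stock_2000 = 300.0      # total built space in 2000
needed_2030 = 427.0     # projected space needed by 2030

# "About half of the buildings in America will have been built after 2000":
built_after_2000 = needed_2030 / 2              # ~213.5 billion sq ft of new construction
surviving_pre_2000 = needed_2030 - built_after_2000

# The pre-2000 stock that does not survive to 2030 is demolished or replaced.
replaced_or_lost = stock_2000 - surviving_pre_2000   # ~86.5 billion sq ft turned over

print(f"new construction by 2030: {built_after_2000:.1f}B sq ft")
print(f"pre-2000 stock replaced:  {replaced_or_lost:.1f}B sq ft")
```

On this arithmetic, roughly 213 billion square feet would be built new over three decades, which is why every one of those buildings is an opportunity for green design.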

Taleb on the failures of financial economics

Nassim Nicholas Taleb writes in the Financial Times that because financial economics focuses on normal and marginal behaviour at the expense of shocks and market reorganizations, it is a pseudo-science that hurts markets:

I was a trader and risk manager for almost 20 years (before experiencing battle fatigue). There is no way my and my colleagues’ accumulated knowledge of market risks can be passed on to the next generation. Business schools block the transmission of our practical know-how and empirical tricks and the knowledge dies with us. We learn from crisis to crisis that MPT [modern portfolio theory] has the empirical and scientific validity of astrology (without the aesthetics), yet the lessons are ignored in what is taught to 150,000 business school students worldwide.

Academic economists are no more self-serving than other professions. You should blame those in the real world who give them the means to be taken seriously: those awarding that “Nobel” prize.

In 1990 William Sharpe and Harry Markowitz won the prize three years after the stock market crash of 1987, an event that, if anything, completely demolished the laureates’ ideas on portfolio construction. Further, the crash of 1987 was no exception: the great mathematical scientist Benoît Mandelbrot showed in the 1960s that these wild variations play a cumulative role in markets – they are “unexpected” only by the fools of economic theories.

Then, in 1997, the Royal Swedish Academy of Sciences awarded the prize to Robert Merton and Myron Scholes for their option pricing formula. I (and many traders) find the prize offensive: many, such as the mathematician and trader Ed Thorp, used a more realistic approach to the formula years before. What Mr Merton and Mr Scholes did was to make it compatible with financial economic theory, by “re-deriving” it assuming “dynamic hedging”, a method of continuous adjustment of portfolios by buying and selling securities in response to price variations.

Dynamic hedging assumes no jumps – it fails miserably in all markets and did so catastrophically in 1987 (failures textbooks do not like to mention).

Later, Robert Engle received the prize for “Arch”, a complicated method of prediction of volatility that does not predict better than simple rules – it was “successful” academically, even though it underperformed simple volatility forecasts that my colleagues and I used to make a living.

The environment in financial economics is reminiscent of medieval medicine, which refused to incorporate the observations and experiences of the plebeian barbers and surgeons. Medicine used to kill more patients than it saved – just as financial economics endangers the system by creating, not reducing, risk. But how did financial economics take on the appearance of a science? Not by experiments (perhaps the only true scientist who got the prize was Daniel Kahneman, who happens to be a psychologist, not an economist). It did so by drowning us in mathematics with abstract “theorems”. Prof Merton’s book Continuous Time Finance contains 339 mentions of the word “theorem” (or equivalent). An average physics book of the same length has 25 such mentions. Yet while economic models, it has been shown, work hardly better than random guesses or the intuition of cab drivers, physics can predict a wide range of phenomena with a tenth decimal precision.

via 3quarks daily.

For more see Taleb’s home page – Fooled by Randomness.
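The dynamic-hedging point Taleb makes can be illustrated with a minimal sketch: under the Black–Scholes assumptions a delta hedge tracks small, continuous price moves almost perfectly, but a jump leaves the hedger exposed to the option's convexity. The numbers below are illustrative only, not taken from Taleb's article:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def bs_delta(S, K, T, r, sigma):
    """Black-Scholes hedge ratio (shares held against a short call)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return norm_cdf(d1)

# An at-the-money call; illustrative parameters.
S, K, T, r, sigma = 100.0, 100.0, 0.5, 0.0, 0.2
delta = bs_delta(S, K, T, r, sigma)

# Small move: the hedging error (option change minus hedge change) is tiny.
err_small = bs_call(S + 1.0, K, T, r, sigma) - bs_call(S, K, T, r, sigma) - delta * 1.0

# A 1987-style jump of -20: the hedge misses by the option's convexity,
# a large loss for the short, "delta-neutral" side.
err_jump = bs_call(S - 20.0, K, T, r, sigma) - bs_call(S, K, T, r, sigma) - delta * (-20.0)

print(f"hedging error, 1-point move:  {err_small:.4f}")
print(f"hedging error, 20-point jump: {err_jump:.2f}")
```

The small-move error is a few hundredths of a point, while the jump error is several points per share: continuous rebalancing never gets a chance to catch up with a discontinuity, which is the failure mode Taleb says the textbooks do not like to mention.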