Enhancing ecosystem services in agricultural lands

Farmers are the stewards of a third of the world’s terrestrial surface, the area covered by croplands and grazing lands. Although land use in these areas may be the dominant driver behind the loss of ecosystem services globally, a change in focus and management here offers an enormous opportunity in terms of restoring some of the ecosystem services that have already been degraded (see e.g. the Science review by Foley et al. 2005, or the results from the MA 2005). Besides being economically very important for food production, agricultural systems, like all other ecosystems, can also provide other services, including carbon sequestration, erosion control, habitat for pests or pollinators, and water regulation.

Peter Kareiva and collaborators, in a review paper in this week’s Science, argue for refocusing ecosystem management from preserving natural areas to shaping the ecological processes of domesticated land to enhance human well-being.

The human footprint on Earth. Human impact is expressed as the percentage of human influence relative to the maximum influence recorded for each biome.

They argue that:

if one accepts that virtually all of nature is now domesticated, the key scientific and social questions concern future options for the type of domesticated nature humans impose upon the world

Last week, a policy forum paper in the same journal by N. Jordan and colleagues, Sustainable Development of the Agricultural Bio-Economy, argued that major gains may result from a “working landscape” approach to ecosystem management. This approach focuses on improving the ecosystem processes of farmlands by rewarding farmers for delivering environmental benefits as well as food and biomass. They particularly stress the potential of multifunctional agriculture to enhance the many synergies that can take place in systems managed for multiple services rather than optimized for the production of one thing. For example, the inclusion of more perennial species in agricultural production has been found to reduce soil and nitrogen losses, to sequester more greenhouse gases than annual-based systems, and to support more species of conservation concern.

Multifunctional production systems can be highly valuable. The 34-million-acre Conservation Reserve Program (CRP) has been estimated to produce $500 million/year in benefits from reduced erosion and $737 million/year in wildlife viewing and hunting benefits, at a cost of ~$1.8 billion/year. If benefits such as carbon sequestration are added, CRP likely produces a net gain in many areas, if not for the entire nation.
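
Running the quoted numbers, with the carbon term left explicit, makes the “net gain” claim easy to inspect. A minimal sketch in Python; the sequestration rate and carbon price below are illustrative assumptions of mine, not figures from the CRP studies:

```python
# Rough cost-benefit arithmetic for the CRP figures quoted above (all $/year).
erosion_benefit = 500e6      # reduced erosion
recreation_benefit = 737e6   # wildlife viewing and hunting
cost = 1.8e9                 # approximate annual program cost

print(erosion_benefit + recreation_benefit - cost)   # ~ -$563 million/year

# Whether carbon sequestration closes the gap depends on assumed rates and
# prices. Illustrative assumption only: 34 million acres sequestering
# ~0.5 tC/acre/year, valued at ~$40/tC, adds ~$680 million/year.
acres = 34e6
carbon_rate_tC = 0.5         # assumed tC/acre/year
carbon_price = 40.0          # assumed $/tC
carbon_benefit = acres * carbon_rate_tC * carbon_price

print(erosion_benefit + recreation_benefit + carbon_benefit - cost)  # > 0
```

Under those assumed carbon numbers the program flips to a net gain, which is exactly the sensitivity the authors point to.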

Kareiva et al. caution against the romantic glorification of natural, or wild, ecosystems by stressing that

some paths of domestication will result in improved ecosystems both for people and for other species; other paths of domestication will result in ecosystems that are clearly better for humans but not for other species; and some paths of domestication will result in ecosystems that are too degraded to benefit people or other species. The key scientific goals for the study of domesticated nature are to understand what tradeoffs exist between the promotion or selection of different ecosystem services and to determine to what extent we can change a negative tradeoff to a positive one by altering the details of our domestication process

To become better at managing agriculture for multiple ecosystem services, they therefore argue, we need to become better at assessing trade-offs in these human-dominated lands. The need for improved tools of trade-off analysis has also been emphasized by Elena Bennett and Patricia Balvanera in their recent paper in Frontiers in Ecology, as well as by Carpenter et al. in their analysis of research gaps from the Millennium Ecosystem Assessment. Bennett, Balvanera, and Carpenter et al. argue that while we are relatively good at assessing trade-offs between two or a few ecosystem services, we need to develop tools to assess how whole bundles of ecosystem services relate to one another. While this might seem relatively straightforward, it is actually very complicated.
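
To see why bundles are harder than pairs, consider a minimal sketch (the landscape data are simulated for illustration; nothing here comes from the papers cited). With n services there are n(n-1)/2 pairwise trade-offs to examine, and even a full correlation matrix is only a crude summary of how whole bundles co-vary:

```python
# Illustrative sketch: pairwise trade-offs among simulated ecosystem services.
import numpy as np

rng = np.random.default_rng(0)
n_sites = 200
services = ["crops", "carbon", "water", "pollination", "habitat"]

# Fake landscape data in which crop production trades off against several
# other services at once (noise levels are arbitrary).
crops = rng.uniform(0, 1, n_sites)
data = np.column_stack([
    crops,
    1 - crops + rng.normal(0, 0.2, n_sites),   # carbon falls as cropping rises
    1 - crops + rng.normal(0, 0.3, n_sites),   # water quality falls too
    rng.uniform(0, 1, n_sites),                # pollination ~ independent here
    1 - crops + rng.normal(0, 0.1, n_sites),   # habitat falls with cropping
])

n = len(services)
print(f"{n} services -> {n * (n - 1) // 2} pairwise trade-offs to examine")
corr = np.corrcoef(data, rowvar=False)         # crude bundle-level summary
for i in range(n):
    for j in range(i + 1, n):
        print(f"{services[i]:>11} vs {services[j]:<11} r = {corr[i, j]:+.2f}")
```

Even in this toy case, the pairwise view misses the essential structure, namely that four of the five services respond to a single shared driver, which is the kind of bundle-level pattern the proposed tools would need to detect.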

Figure shows a conceptual framework for analysing trade-offs among bundles of ecosystem services, from Foley et al. 2005.

One of the problems that Jordan et al. point to is that we need better experimental experience at scales that match the relevant ecosystem processes:

Multifunctional systems have been tested only at relatively small scales. We propose creation of a network of research and demonstration projects to establish and evaluate economic enterprises based on multifunctional production systems. … These projects must be sufficiently scaled to address the complexity inherent in landscape-scale multifunctionality and in the feedback loops connecting natural, human, and social resources. They should be established in medium-sized watersheds (~5000 km²) and should be managed by groups that encompass multiple stakeholders and levels of government.

Additional aspects of the need for analysing trade-offs, some highlighted by Bennett and Balvanera, include:
• Increased understanding of how trade-offs are altered across spatial and temporal scales.
• Improved capacity to evaluate uncertainty in dealing with trade-offs. Several of the uncertainties are linked to non-linear ecological processes, thresholds and resilience of ecosystems.
• Developing tools for trade-off analysis will not be enough; that is just when the hard part starts. We need better processes for, and understanding of, multilevel negotiations among stakeholders, power plays, and multi-stakeholder processes of learning, deliberation, and experimentation.
• How do we deal with ecosystem services that people have not yet developed preferences for, simply because we don’t understand how these services contribute to our well-being? There is a need to strongly emphasize the ‘pre-analytic vision’ of assessments, to ensure that we at least try to address issues that are important even though we might not yet have realized it.

The how and why of linking future scenarios across scales

A group of young scholars from a variety of disciplines, many of whom have been involved in important scenario development exercises including those of the Millennium Ecosystem Assessment (MA), published a paper last week based on an online dialogue they had about linking scenarios across scales. During their month-long online discussion, the authors reviewed a variety of scenario studies at various scales to explore how scenarios can be linked across scales and importantly, what is to be gained (or lost) in both the process and outcome by connecting scenarios at multiple scales.

Scenarios are essentially stories about the future that draw on information about the past and present, often involving qualitative and/or quantitative models, in order to explore future outcomes under a variety of different criteria (e.g. policies, practices, or social values).

Scenarios, according to the authors, “allow us to envision alternative future development pathways by taking a systems perspective and accounting for critical uncertainties such as far-reaching technological changes or changes in social values. By envisioning alternative futures, scenarios can help decision makers identify ecosystem management policies and actions that will be robust across a range of potential future outcomes, or that promote desired outcomes or characteristics, such as ecosystem resilience (Shearer 2005, Carpenter and Folke 2006).”

Multi-scale scenarios involve storylines that are developed and connected at more than one scale (e.g., local, regional, national, and global). The authors suggest such multi-scale scenarios “make it easier to examine the impacts of mismatches between the scale at which ecological processes occur and the scale at which management occurs (Folke et al. 1998, Brown 2003).” In the paper they characterize scenarios in three categories (single-scale scenarios, loosely linked multiscale scenarios, and tightly coupled cross-scale scenarios) and summarize the costs and benefits of each type in the excerpt below.

“The advantage of multiscale scenarios are that they can, at least to some extent, take account of cross-scale feedbacks and differences in drivers and stakeholder perspectives at different scales. Based on our assessment of multiscale scenarios, we suggest that, if the aim is to engage stakeholders, loosely linked scenarios are generally more appropriate. Loosely linked multiscale scenarios tend to allow more freedom to explore the issues of concern to the stakeholders at each scale. In this case, any of the linking options identified above may serve as a bridging mechanism between stakeholders at different scales to understand the impact of decisions made at one scale on other scales. A major disadvantage of loosely linked scenarios is that the storylines are often inconsistent across scales and cross-scale interactions are not well accounted for. Tightly coupled cross-scale scenario exercises are more appropriate when the aim is to evaluate cross-scale processes and potential responses. We therefore suggest that tightly coupled cross-scale scenarios are most appropriate if the main objective is to further scientific understanding or to inform policy making with respect to an issue that has differential effects at different scales or has strong cross-scale interactions or feedbacks. Such fully coupled scenarios can include processes and perspectives necessary to allow an in-depth cross-scale analysis and the development of cross-scale institutional links. However, developing tightly coupled cross-scale scenarios requires a very large input of time, technical expertise, and financial resources, which should not be underestimated.”

Transforming Universities

All my career, my work was launched from a disciplinary base but grew by developing an interdisciplinary character. And now some of the best of the natural and social sciences is just that: complexity theory, for example, is a lovely mix of just about any discipline imaginable, infused with the idea of complex adaptive systems theory. And the practice of living in our world now is infused with the same spirit and the recognition of the power of the uncertain and unknown. That simply is delightful.

I was always in a situation where I could be interdisciplinary, but I carefully nurtured the need to maintain disciplinary roots. And my courses drew grad students from just about any discipline imaginable, to their benefit and my own.

I once asked the President of the University of Florida, in a public meeting, what his image of a future university was. His answer, basically, was “just the same as it has always been”. I had been hoping for an answer closer to what this Nature editorial – The university of the future – presents. This editorial speaks very much to the future I see, one very much being attempted at ASU. Our Resilience Alliance has one of those interdisciplinary teams as a member, and that enriches us all.

The American research university is a remarkable institution, long a source of admiration and wonder. …

Seen from the inside, however, everything is not quite so rosy. … the structure of these institutions is straightforward and consistent. The bedrock of each university is a system of discipline-specific departments. The strength of these departments determines the success and prestige of the institution as a whole.

This structure raises a few obvious questions. One is the relevance of the department-based structure to the way scientific research is done. Many argue that in a host of areas — ranging from computational biology and materials science to pharmacology and climate science — much of the most important research is now interdisciplinary in nature. And there is a sense that, notwithstanding years of efforts to adapt to this change by encouraging interdisciplinary collaboration, the department-based structure of the university is essentially at odds with such collaboration.


Seven Ways to Improve Environmental Education

An essay in PLOS Biology, The Failure of Environmental Education (and How We Can Fix It) by Daniel Blumstein & Charlie Saylan, proposes seven ways to improve environmental education. The proposal has some good points, but it is US-centric, and some of its points rest on the assumptions that everyone agrees on which environmental outcomes are desirable (1, 5) and that we know what is needed to create a sustainable society (2, 6). Their seven proposals are:

  1. Design environmental education programs that can be properly evaluated, for example, with before-after, treatment-control designs. Such approaches represent a sea change from programs today, and we expect considerable resistance from environmental educators. But the environmental community at large must stop rejecting criticism as negative and must embark on a policy of continuing self-evaluation and assessment. To be deemed effective, environmental education and the funding process that supports it must also work backward from specific environmental problems by evaluating the degree of actual impact on a specific issue versus the amount of money and energy spent on public education. …
  2. Many environmental issues facing us today are caused by over-consumption — primarily by developed countries. Changing consumption patterns is not generally a targeted outcome of environmental education, but we believe it is one of the most important lessons that must be taught. The magnitude of our impact, as first proposed in 1971 by Paul Ehrlich and John Holdren, can be viewed as dependent upon population size, affluence (specifically, per capita economic output), and technology (specifically, the environmental impact per unit of economic output) [this relationship is sketched in code after this list]. As countries develop, their environmental footprint may expand, and consumption control may become more important. … Thus, we need to radically overhaul curricula to teach the conservation of consumable products. Teaching where and how resources come from—that food, clean water, and energy do not originate from supermarkets, taps, and power points—may be an important first step.
  3. We need to teach that nature is filled with nonlinear relationships, which are characterized by “tipping points” (called “phase shifts”): there may be little change in something of interest across a range of values, but above a particular threshold in a causal factor, change is rapid [a threshold response of this kind is sketched in code after this list]. For instance, ecology, which focuses on understanding the distribution and abundance of life on Earth, is a complex, nonlinear science. If environmental education is linear—in other words, if you teach that recycling one beer bottle will save “x” gallons of water—people will not have the foundation to think about linkages or nonlinear relationships. …
  4. We need to teach a world view. … A greater appreciation of the diversity of cultures and peoples in the world should help us realize the selfish consequences of our consumption. “Not in my backyard” is not a sustainable rallying cry in an interconnected world when we are faced with global climate change. We are too late for “think globally and act locally” to work. …
  5. We must teach how governments work and how to effect change within a given socio-political structure. We suspect that many individuals will be offended by the thought that large industries have so much sway over the wording of state and federal legislation. We all suffer from polluted water and greenhouse gases, but lobbyists are very effective in diluting potentially costly legislation meant to safeguard our water supplies or prevent rampant climate change. Understanding how the system works will empower subsequent generations to change it.
  6. We must teach that conservation-minded legislation may deprive us of some of the goods and services that we previously enjoyed. Self-sacrifice will be necessary to some degree if we are to avoid or minimize adverse effects of imminent environmental threats with truly global consequences.
  7. Finally, we must teach critical thinking. Environmentally aware citizens must be able to evaluate complex information and make decisions about things that we can’t currently envision. True scientific literacy means that people have a conceptual tool kit that can be applied to a variety of questions. Unfortunately, much science education is not inspired, and students are required to learn facts without being given the ability to manipulate and analyze those facts. …
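
Points 2 and 3 lend themselves to a few lines of code. The sketch below (all numbers are invented for illustration and are not taken from the essay) shows the Ehrlich-Holdren I = P × A × T relationship and how a threshold response differs from a linear one:

```python
# Illustrative sketch of points 2 and 3 above; every value is invented.
import numpy as np

# Point 2: Ehrlich-Holdren impact identity, I = P * A * T.
population = 300e6     # people
affluence = 40_000.0   # economic output per capita ($/person/year)
technology = 5e-10     # environmental impact per $ of output (assumed units)
impact = population * affluence * technology
print(f"I = P*A*T = {impact:.2e} impact units/year")

# Point 3: a linear response declines gradually with the driver, while a
# threshold (tipping point) response barely changes and then collapses.
driver = np.linspace(0, 1, 11)
linear = 1 - driver
threshold = np.where(driver < 0.6, 1.0, 0.1)
for d, lin, thr in zip(driver, linear, threshold):
    print(f"driver={d:.1f}  linear={lin:.2f}  threshold={thr:.2f}")
```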

Pathological overproduction

Michael Pollan writes on potential reform of the US’s pathological farm bill in the New York Times Magazine:

A few years ago, an obesity researcher at the University of Washington named Adam Drewnowski ventured into the supermarket to solve a mystery. He wanted to figure out why it is that the most reliable predictor of obesity in America today is a person’s wealth. For most of history, after all, the poor have typically suffered from a shortage of calories, not a surfeit. So how is it that today the people with the least amount of money to spend on food are the ones most likely to be overweight?

Drewnowski gave himself a hypothetical dollar to spend, using it to purchase as many calories as he possibly could. He discovered that he could buy the most calories per dollar in the middle aisles of the supermarket, among the towering canyons of processed food and soft drink. (In the typical American supermarket, the fresh foods — dairy, meat, fish and produce — line the perimeter walls, while the imperishable packaged goods dominate the center.) Drewnowski found that a dollar could buy 1,200 calories of cookies or potato chips but only 250 calories of carrots. Looking for something to wash down those chips, he discovered that his dollar bought 875 calories of soda but only 170 calories of orange juice.

As a rule, processed foods are more “energy dense” than fresh foods: they contain less water and fiber but more added fat and sugar, which makes them both less filling and more fattening. These particular calories also happen to be the least healthful ones in the marketplace, which is why we call the foods that contain them “junk.” Drewnowski concluded that the rules of the food game in America are organized in such a way that if you are eating on a budget, the most rational economic strategy is to eat badly — and get fat.

This perverse state of affairs is not, as you might think, the inevitable result of the free market. Compared with a bunch of carrots, a package of Twinkies, to take one iconic processed foodlike substance as an example, is a highly complicated, high-tech piece of manufacture, involving no fewer than 39 ingredients, many themselves elaborately manufactured, as well as the packaging and a hefty marketing budget. So how can the supermarket possibly sell a pair of these synthetic cream-filled pseudocakes for less than a bunch of roots?

For the answer, you need look no farther than the farm bill. This resolutely unglamorous and head-hurtingly complicated piece of legislation, which comes around roughly every five years and is about to do so again, sets the rules for the American food system — indeed, to a considerable extent, for the world’s food system. Among other things, it determines which crops will be subsidized and which will not, and in the case of the carrot and the Twinkie, the farm bill as currently written offers a lot more support to the cake than to the root. Like most processed foods, the Twinkie is basically a clever arrangement of carbohydrates and fats teased out of corn, soybeans and wheat — three of the five commodity crops that the farm bill supports, to the tune of some $25 billion a year. (Rice and cotton are the others.) For the last several decades — indeed, for about as long as the American waistline has been ballooning — U.S. agricultural policy has been designed in such a way as to promote the overproduction of these five commodities, especially corn and soy.

Great Lakes hemorrhagic fish virus surprise

Viral hemorrhagic septicemia (V.H.S.) is an invasive virus that causes internal bleeding and organ failure in most of the sport and commercial fish species of the Great Lakes. It has already killed tens of thousands of fish in the eastern Great Lakes and is now spreading through the rest of the lakes. It is likely to indirectly change the Great Lakes’ already unstable ecological structure. The New York Times article Fish-Killing Virus Spreading in the Great Lakes and the Toronto Star article Pathogen stalks fish report on the spread of the virus:

One of Dr. Casey’s colleagues researching the virus, Dr. Paul Bowser, a professor of aquatic animal medicine, added, “This is a new pathogen and for the first number of years — 4, 5 or 10 years — things are going to be pretty rough, then the animals will become more immune and resistant and the mortalities will decline.”

No one is sure where the virus came from or how it got to the Great Lakes. In the late 1980s, scientists saw a version of V.H.S. in salmon in the Pacific Northwest, which was the first sighting anywhere in North America. V.H.S. is also present in the Atlantic Ocean. But the genesis of a new, highly aggressive mutated strain concentrating on the Great Lakes is a biological mystery.

“We really don’t know how it got there,” said Jill Roland, a fish pathologist and assistant director for aquaculture at the U.S. Department of Agriculture. “People’s awareness of V.H.S. in the lakes was unknown until 2005. But archived samples showed the virus was there as early as 2003.”

Scientists pointed to likely suspects, mainly oceangoing vessels that dump ballast water from around the world into the Great Lakes. (Ships carry ballast water to help provide stability, but it is often contaminated and provides a home for foreign species. The water is loaded and discharged as needed for balance.)

Fish migrate naturally, but also move with people as they cast nets for sport, for instance, or move contaminated water on pleasure boats from lake to lake.

The United States Department of Agriculture issued an emergency order in October to prohibit the movement of live fish that are susceptible to the virus out of the Great Lakes or bordering states. The order was later amended to allow limited movement of fish that tested negative for the virus.

“Getting rid of it is extremely hard to foresee,” said Henry Henderson, director of the Natural Resources Defense Council’s Midwest office in Chicago. “These species spread, and reproduce. It is a living pollution.”

From the Toronto Star:

The deaths to date are just a small fraction of the millions of fish in the lakes. Even so, governments around the lakes are worried enough to try unprecedented steps to contain the virus.

VHS is suspected to be the latest on a growing list of destructive species – including zebra mussels and round gobies – brought into the lakes from Europe and Asia, usually in the ballast water of ocean-going ships.

The potential impact on fish isn’t the only concern. VHS doesn’t harm humans, but that doesn’t mean others that follow will be so benign, says Jennifer Nalbone, of Great Lakes United, a cross-border advocacy group based in Buffalo that for years has demanded strict controls on ballast.

“It’s a wake-up call that the lakes are vulnerable to any pathogen getting in here. We need to try to slow the spread but also to close the door.”


How to write consistently boring scientific literature

Danish biology professor Kaj Sand-Jensen has a new Oikos paper (2007, 116: 723-727) which provides advice on How to write consistently boring scientific literature:

A Scandinavian professor has told me an interesting story. The first English manuscript prepared by one of his PhD students had been written in a personal style, slightly verbose but with a humoristic tone and thoughtful side-tracks. There was absolutely no chance, however, that it would meet the strict demands of brevity, clarity and impersonality of a standard article. With great difficulty, this student eventually learned the standard style of producing technical, boring and impersonal scientific writing, thus enabling him to write and defend his thesis successfully.

I recalled the irony in this story from many discussions with colleagues, who have been forced to restrict their humor, satire and wisdom to the tyranny of jargon and impersonal style that dominates scientific writing. Personally, I have felt it increasingly difficult to consume the steeply growing number of hardly digestible original articles. It has been a great relief from time to time to read and write essays and books instead.

Because science ought to be fun and attractive, particularly when many months of hard work with grant applications, data collections and calculations are over and everything is ready for publishing the wonderful results, it is most unfortunate that the final reading and writing phases are so tiresome.

I have therefore tried to identify what characteristics make so much of our scientific writing unbearably boring, and I have come up with a top-10 list of recommendations for writing consistently boring publications.

  • Avoid focus
  • Avoid originality and personality
  • Write long contributions
  • Remove implications and speculations
  • Leave out illustrations
  • Omit necessary steps of reasoning
  • Use many abbreviations and terms
  • Suppress humor and flowery language
  • Degrade biology to statistics
  • Quote numerous papers for trivial statements

Via Erik Andersson.

Sandstorms and Land degradation in China

Gaoming Jiang, a professor at the Chinese Academy of Sciences’ Institute of Botany, writes about China’s failure to restore degraded arid land in a China Dialogue article Stopping the Sandstorms:

In Beijing, the weather forecast says that more sandstorms are on the way. The capital was hit by four sandstorms in March, and even Shanghai was recently smothered by dust clouds from the north. Television reports now describe these events as “sandy weather”, rather than “sandstorms”. But whatever you call them, they are becoming ever more frequent visitors to Beijing in springtime.

While everyone is cursing the weather, I find myself worrying: how many tonnes of soil are being lost? And how long will it be before there is nowhere in China for plants to take root? Academics argue to what extent these sandstorms are “imports” from Mongolia and the former Soviet Republics, or whether they are the “domestic” products of the arid deserts and damaged grasslands of China’s west. But either way, there is no denying the degree of environmental degradation in western China over the last three decades. Regardless of whether the capital’s weather comes from beyond its borders, China needs to put measures in place to restore the grasslands and reduce the risk of sandstorms.

Sixty billion yuan has been invested in projects to control the sandstorms that are hitting northeastern China. Tree-planting projects have also been running for 30 years across north China. But why haven’t they worked? And more importantly – what will?

Gelman’s notes on Black Swans

Noted Bayesian statistician Andrew Gelman writes up his notes on Nassim Taleb‘s book The Black Swan:

As I noted earlier, reading the book with pen in hand jogged loose various thoughts. . . . The book is about unexpected events (“black swans”) and the problems with statistical models such as the normal distribution that don’t allow for these rarities. From a statistical point of view, let me say that multilevel models (often built from Gaussian components) can model various black swan behavior. In particular, self-similar models can be constructed by combining scaled pieces (such as wavelets or image components) and then assigning a probability distribution over the scalings, sort of like what is done in classical spectrum analysis of 1/f noise in time series. For some interesting discussion in the context of “texture models” for images, see the chapter by Yingnian Wu in my book with Xiao-Li on applied Bayesian modeling and causal inference. (Actually, I recommend this book more generally; it has lots of great chapters in it.)

That said, I admit that my two books on statistical methods are almost entirely devoted to modeling “white swans.” My only defense here is that Bayesian methods allow us to fully explore the implications of a model, the better to improve it when we find discrepancies with data. Just as a chicken is an egg’s way of making another egg, Bayesian inference is just a theory’s way of uncovering problems that can lead to a better theory. I firmly believe that what makes Bayesian inference really work is a willingness (if not eagerness) to check fit with data and abandon and improve models often.

update: Gelman follows up on his comments with:

Dan Goldstein and Nassim Taleb’s paper states: “Finance professionals, who are regularly exposed to notions of volatility, seem to confuse mean absolute deviation with standard deviation, causing an underestimation of 25% with theoretical Gaussian variables. In some fat tailed markets the underestimation can be up to 90%. The mental substitution of the two measures is consequential for decision making and the perception of market variability.”

This interests me, partly because I’ve recently been thinking about summarizing variation by the mean absolute difference between two randomly sampled units (in mathematical notation, E(|x_i - x_j|)), because that seems like the clearest thing to visualize. Fred Mosteller liked the interquartile range but that’s a little too complicated for me, also I like to do some actual averaging, not just medians which miss some important information. I agree with Goldstein and Taleb that there’s not necessarily any good reason for using sd (except for mathematical convenience in the Gaussian model).
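
Both quantities are easy to check by simulation. A minimal sketch assuming Gaussian data (the variable names are mine):

```python
# Check: mean absolute deviation (MAD) vs standard deviation (SD), and
# Gelman's E(|x_i - x_j|) summary, for a Gaussian.
import numpy as np

rng = np.random.default_rng(42)
sigma = 1.0
x = rng.normal(0.0, sigma, size=1_000_000)

sd = x.std()                          # standard deviation
mad = np.abs(x - x.mean()).mean()     # mean absolute deviation

# For a Gaussian, SD / MAD = sqrt(pi/2) ~ 1.25, so reading a MAD as if it
# were an SD understates volatility by roughly 25%, as Goldstein and Taleb say.
print(sd / mad, np.sqrt(np.pi / 2))

# E(|x_i - x_j|) for two randomly sampled units: for a Gaussian this equals
# 2*sigma/sqrt(pi) ~ 1.128*sigma.
pairs = rng.normal(0.0, sigma, size=(1_000_000, 2))
print(np.abs(pairs[:, 0] - pairs[:, 1]).mean(), 2 * sigma / np.sqrt(np.pi))
```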


Climate change and Tipping Points in the Amazon

Most of the talks from a recent conference on Climate change and the fate of the Amazon at the University of Oxford are available online as slides and podcasts. Some of the interesting points from the conference:

  • Intact forests may be more resistant to drought than climate-vegetation models usually assume (deep roots, large soil water reserves, hydraulic uplift)
  • The interaction of drought with forest fragmentation and fire ignition points can trigger tipping to savanna forest with less biodiversity and biomass.
  • Global demand for soybeans and biofuels could drive substantial land clearing.
  • Substantial opportunities for land use change feedbacks exist in Amazonia. Climatic drying could allow the expansion of soy and sugarcane cultivation, which would feed back to stimulate further drying (a toy version of this feedback is sketched after this list).
  • There is a need to increase the resilience of the Amazon, because models estimate a non-trivial chance of severe drought and forest dieback over the 21st century. Resilience can be enhanced by enhancing the recycling of water vapour that maintains mesic forests in the Amazon.
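
The drying and land-clearing feedback in the last two bullets can be caricatured in a few lines of code. This is a toy model with invented parameters, not one of the models presented at the conference:

```python
# Toy moisture-recycling feedback: forest cover adds rainfall, rainfall
# sustains forest growth, and a fixed clearing rate can tip the system.
def step(forest, clearing_rate, recycling=0.6, base_rain=0.5):
    rain = base_rain + recycling * forest     # recycled moisture adds rainfall
    growth = 0.1 * forest * (rain - 0.7)      # growth only when wet enough
    return max(0.0, min(1.0, forest + growth - clearing_rate))

for clearing in (0.00, 0.03, 0.05):
    forest = 0.9                              # initial forest cover fraction
    for _ in range(100):
        forest = step(forest, clearing)
    print(f"clearing_rate={clearing:.2f} -> forest after 100 steps: {forest:.2f}")
```

A small increase in the clearing rate (from 0.03 to 0.05 here) flips the outcome from full forest cover to collapse. That is the qualitative point about tipping behaviour, not a prediction.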

David Oswald works on Amazonia forest resilience in my lab. He attended the conference and has these recommendations on the talks:

Carlos Nobre – Dr. Nobre is very well known internationally, and especially in Brazil. He is a climate scientist by training but is involved in the leadership of scientific research projects such as IGBP, CPTEC, and the LBA project. He alludes to the importance of ecological resilience and stability in his talk, but more detail and a conceptual framework are required (that is what I am working on).

Peter Cox – Dr. Cox is a well-known global climate modeller who first published a paper in 2000 about the “dieback” of the Amazon. That paper was very controversial when it came out and inspired many people to look at the problem from different perspectives and with different global climate models. The follow-up work to the 2000 paper has produced similar results, and unfortunately one of the outcomes of the conference was a general consensus that increasing greenhouse gas emissions and the corresponding climate change could have very serious effects on the Amazon. Again, research projects at this scale carry a high degree of uncertainty, but the presenters, who are all experts, came to similar conclusions. Check it out for yourself.

Chris Huntingford – Dr. Huntingford’s presentation was a follow-up to Cox’s work, basically testing the hypothesis and the strength of the results.

Luiz Aragao – Dr. Aragao and his collaborators did some interesting work with remote sensing, similar to the type of approach I am taking. Very solid work.

Michael Keller – Dr. Keller is with the US Forest Service and has been involved with the LBA project in a leadership position since the early 1990s. He has a broad historical as well as sound scientific perspective on things.

Dan Nepstad – Dr. Nepstad is extremely well known in Amazonian research and is at the Woods Hole Research Center. He has done some very interesting work on water availability and ecosystem health in the Amazon and has designed some very cool experiments. Increasingly, his work is focused on the interaction between science and development policy in this region. His presentation speaks to that. He is a progressive thinker, and also very active on the ground in the Amazon.

Juan Carlos Riveros – Dr. Riveros gave a very interesting talk on conservation strategies in the Amazon. I was blown away by the extent of the research they have done and continue to do with respect to conservation strategies. They have done some very interesting spatial analytical work. Good for a geography-oriented person.

Diogenes Alves – Dr. Alves is an interesting person. By training, he is a computational mathematician. He has been involved extensively with the design and planning of the LBA project. His presentation outlined the epistemological framework they used and some of the challenges they initially faced in structuring an international scientific research project that was clearly embedded in a complex social and economic situation. He alluded to systems theory in his talk, which really appealed to me, so I am including this one for those interested in the links between social science and natural science and the practical realities one faces when doing this type of research.

Kevin Conrad – Mr. Conrad is with a group called the Rainforest Coalition. He presented a strategy for rainforest conservation that uses the Clean Development Mechanism of the Kyoto Protocol to attach economic value, via the carbon market, to rainforests that are preserved rather than degraded. I did not understand this strategy in depth, but it seems to have merit. Personally, I am not 100% sold on exclusively using market solutions, but I think they do play an important role. For more detail you can check out his presentation and come to your own conclusions.

Dr. Yadvinder Malhi provides a summary of the conference, drawing out the key points and overall conclusions.