Tag Archives: Science

Eric Idle on “Who Wrote Shakespeare?” : The New Yorker

Writing in the New Yorker, former Python Eric Idle asks “Who Wrote Shakespeare?”. He explains:

While it is perfectly obvious to everyone that Ben Jonson wrote all of Shakespeare’s plays, it is less known that Ben Jonson’s plays were written by a teen-age girl in Sunderland, who mysteriously disappeared, leaving no trace of her existence, which is clear proof that she wrote them. The plays of Marlowe were actually written by a chambermaid named Marlene, who faked her own orgasm, and then her own death in a Deptford tavern brawl. Queen Elizabeth, who was obviously a man, conspired to have Shakespeare named as the author of his plays, because how could a man who had only a grammar-school education and spoke Latin and a little Greek possibly have written something as bad as “All’s Well That Ends Well”? It makes no sense. It was obviously an upper-class twit who wished to disguise his identity so that Vanessa Redgrave could get a job in her old age.

Mere lack of evidence, of course, is no reason to denounce a theory. Look at intelligent design. The fact that it is bollocks hasn’t stopped a good many people from believing in it. Darwinism itself is only supported by tons of evidence, which is a clear indication that Darwin didn’t write his books himself. They were most likely written by Jack the Ripper, who was probably King Edward VII, since all evidence concerning this has been destroyed.

Paranoia? Of course not. It’s alternative scholarship. What’s wrong with teaching alternative theories in our schools? What are liberals so afraid of? Can’t children make up their own minds about things like killing and carrying automatic weapons on the playground? Bush was right: no child left unarmed. Why this dictatorial approach to learning, anyway? What gives teachers the right to say what things are? Who’s to say that flat-earthers are wrong? Or that the Church wasn’t right to silence Galileo, with his absurd theory (actually written by his proctologist) that the earth moves around the sun. Citing “evidence” is so snobbish and élitist. I think we all know what lawyers can do with evidence. Look at Shakespeare. Poor bloke. Wrote thirty-seven plays, none of them his.

And of course there is that great competitor to the theory of gravity – the theory of intelligent falling.

BP wins ‘2010 Accidental Earth Experiment’ Prize

Bill Chameides, Dean of the Nicholas School of the Environment at Duke, awards BP his ‘2010 Accidental Earth Experiment’ Prize on his blog, the Green Grok. The award recognizes that BP’s incompetence created a disaster whose novel conditions allowed scientists to learn how the Earth works. He writes:

For the Environmental Scientist, the Ultimate Lab Is Earth

Science is at its core an empirical endeavor. You can come up with all the clever and compelling theories you want, but data gathered from experiments are and will always be the ultimate arbiters of truth. That presents a problem for environmental and Earth scientists. The only laboratory that accurately replicates the thing we study is our little blue planet.

As a result, environmental scientists are forever looking for real-world events that, like a chemist’s laboratory experiments, directly test specific aspects of the Earth system. For example, volcanoes that spew tons of small particles into the upper atmosphere and variations in sunspots provide unique experiments to test the accuracy of climate models built on the basis of our understanding of climate.

The Accidental Experiments

But natural events are not the only sources of environmental experiments. Humanity is now arguably the greatest driver of environmental change on the globe, and as a result is increasingly and inadvertently causing events that double as experiments for inquisitive environmental scientists.

Unfortunately these “accidental experiments” often carry devastating consequences, but nevertheless provide a kind of consolation prize in the form of unique data to learn about the Earth with.

Case in Point: The Oil Rig Blowout in the Gulf of Mexico Last Spring

We can all agree the Deepwater Horizon disaster was a mess. But let’s not forget it’s also a grand experiment. How else could we learn what happens when you dump billions of barrels of oil into the gulf roughly a mile below the surface?

For example, we’ve learned that some bugs that inhabit the gulf’s waters have been effective in gobbling up the stuff the blown wellhead spewed into their home turf. A paper published last year in the journal Science by Terry Hazen of Lawrence Berkeley National Laboratory and colleagues reported on the discovery of a heretofore unknown voracious hydrocarbon-eating microbe.

Just last week came another paper in Science, this one by John Kessler of Texas A&M University and colleagues, which showed that other microbes had also made short work of most of the natural gas released from the blowout.

This is a great example of the natural system’s adaptability and ingenuity. Put a bunch of oil and gas in the ocean, and native bug populations swell to take advantage of it. I should note that we were somewhat lucky in this regard. The Gulf of Mexico was the beneficiary of an in situ population of bugs due to natural gas and oil seeps. Without these microbes the environmental consequences of the disaster (still the largest in marine history) would no doubt be worse.

An Algorithm for Discovery

A decade ago in Science, Paydarfar and Schwartz, neurologists at the University of Massachusetts, wrote “An Algorithm for Discovery” (DOI:10.1126/science.292.5514.13). They suggest a useful five-step algorithm for creating new knowledge:

1. Slow down to explore. Discovery is facilitated by an unhurried attitude. We favor a relaxed yet attentive and prepared state of mind that is free of the checklists, deadlines, and other exigencies of the workday schedule. Resist the temptation to settle for quick closure and instead actively search for deviations, inconsistencies, and peculiarities that don’t quite fit. Often hidden among these anomalies are the clues that might challenge prevailing thinking and conventional explanations.

2. Read, but not too much. It is important to master what others have already written. Published works are the forum for scientific discourse and embody the accumulated experience of the research community. But the influence of experts can be powerful and might quash a nascent idea before it can take root. Fledgling ideas need nurturing until their viability can be tested without bias. So think again before abandoning an investigation merely because someone else says it can’t be done or is unimportant.

3. Pursue quality for its own sake. Time spent refining methods and design is almost always rewarded. Rigorous attention to such details helps to avert the premature rejection or acceptance of hypotheses. Sometimes, in the process of perfecting one’s approach, unexpected discoveries can be made. An example of this is the background radiation attributed to the Big Bang, which was identified by Penzias and Wilson while they were pursuing the source of a noisy signal from a radio telescope. Meticulous testing is a key to generating the kind of reliable information that can lead to new breakthroughs.

4. Look at the raw data. There is no substitute for viewing the data at first hand. Take a seat at the bedside and interview the patient yourself; watch the oscilloscope trace; inspect the gel while still wet. Of course, there is no question that further processing of data is essential for their management, analysis, and presentation. The problem is that most of us don’t really understand how automated packaging tools work. Looking at the raw data provides a check against the automated averaging of unusual, subtle, or contradictory phenomena.

5. Cultivate smart friends. Sharing with a buddy can sharpen critical thinking and spark new insights. Finding the right colleague is in itself a process of discovery and requires some luck. Sheer intelligence is not enough; seek a pal whose attributes are also complementary to your own, and you may be rewarded with a new perspective on your work. Being this kind of friend to another is the secret to winning this kind of friendship in return.

Although most of us already know these five precepts in one form or another, we have noticed some difficulty in putting them into practice. Many obligations appear to erode time for discovery. We hope that this essay can serve as an inspiration for reclaiming the process of discovery and making it a part of the daily routine. In 1936, in Physics and Reality, Einstein wrote, “The whole of science is nothing more than a refinement of everyday thinking.” Practicing this art does not require elaborate instrumentation, generous funding, or prolonged sabbaticals. What it does require is a commitment to exercising one’s creative spirit—for curiosity’s sake.

Mapping Science

A nice 2008 PNAS paper, Maps of random walks on complex networks reveal community structure (PNAS 105, 1118) [pdf], by Martin Rosvall and Carl T. Bergstrom creates beautiful and informative visualizations of citation networks in science (from 2004 ISI data) using a neat method for visualizing and analyzing complex networks. Martin Rosvall has created a website that enables the creation of similar maps from network data.
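Rosvall and Bergstrom cluster the citation network by compressing descriptions of random walks on it (their “map equation”). As a rough illustration of the general idea — this is not their algorithm; it uses networkx’s modularity-based clustering as a stand-in, on a toy citation network I made up — a sketch might look like:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy directed "citation" network: two clusters of journals that mostly
# cite within their own field, plus one rare cross-field citation.
edges = [
    ("EcolLett", "Ecology"), ("Ecology", "AmNat"), ("AmNat", "EcolLett"),
    ("Econometrica", "AER"), ("AER", "JPE"), ("JPE", "Econometrica"),
    ("Ecology", "AER"),  # the lone cross-disciplinary citation
]
G = nx.DiGraph(edges)

# Stand-in for the map-equation clustering: modularity-based community
# detection on the undirected projection recovers the two fields.
communities = greedy_modularity_communities(G.to_undirected())
modules = [set(c) for c in communities]
print(modules)
```

A real reproduction would use the authors’ own Infomap software, which works directly on the directed, weighted flow of random walkers rather than on modularity.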

Figure 3 in Rosvall and Bergstrom 2008. A map of science based on citation patterns: 6,128 journals connected by 6,434,916 citations were clustered into 88 modules joined by 3,024 directed and weighted links.

Figure 4. A map of the social sciences. The journals listed in the 2004 social science edition of Journal Citation Reports are a subset of those illustrated in Fig. 3, totaling 1,431 journals and 217,287 citations.

Naomi Oreskes on Merchants of Doubt

Historian of science Naomi Oreskes recently gave a talk at Brown University based on her new book, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. She describes how a handful of right-wing scientists founded the George Marshall Institute, which became a key hub for successfully spreading fear, uncertainty, and doubt about climate change and other environmental issues, and how myths about science enable these political strategies to work. Below is a video of her talk.

Below is a related 2007 talk of hers from the University of California, The American Denial of Global Warming, which provides more detail on environmental denial.

Economics as a complex systems science

A Fistful of Euros has an interesting interview with Paul Krugman, conducted by Edward Hugh:

E.H.: The late Sir Karl Popper used to contrast what he regarded as science with ideologies like Marxism and Psychoanalysis, because there seemed to be no way whatever of consensually agreeing with their practitioners a series of simple tests which would enable their theories to be falsified. Some critics of neoclassical economics – including Popper’s heir Imre Lakatos – have expressed similar frustrations. Do you think we economists are, as a profession, up to the challenge of formulating testable hypotheses in such a way that the public at large might come to have more confidence in what we are up to, or are we a lost cause?

P.K.: I really don’t think that’s a helpful way to pose this question. Economics is about modeling complex systems, and as such the models are always less than fully accurate. What economists do need, however, is some demonstrated ability to get big things right. They had that after the Great Depression, when Keynesian economics clearly made sense of both the depression and the wartime recovery. But now the profession needs to get back on track.

Clickstreams to map scientific knowledge production

Johan Bollen and colleagues (2009) use “clickstreams” to map interactions among the sciences in their latest PLoS article. They find (Figure 5) that “Ecology” acts as a broker between the social sciences and the environmental/biological sciences.

The network universe of scientific knowledge production

The other authors of the article are Herbert Van de Sompel, Aric Hagberg, Luis Bettencourt, Ryan Chute, Marko A. Rodriguez, and Lyudmila Balakireva.

The article is discussed further by Kevin Kelly on his blog The Technium:

Previous maps of the relationship between branches of modern science were done by mapping the citations among journal articles. [...] Instead of mapping links, [the new method by Bollen et al 2009] maps clicks. The program reads the logs of the servers offering online journals (the most popular way to get articles today) and records the clickstream of a researcher as they hop from one article to the next. Then these clickstreams (1 billion interactions in this case) are mapped to sort out the relationships generated by users. [...] According to the authors of the paper, the advantages of the clickstream method over the citation method are that clickstreams give you a real-time picture and are broader in scope. They note that “the number of logged interactions now greatly surpasses the volume of all existing citations.”

I’ve been wondering about the future of Google and search engines in general. [...] Wouldn’t it be smart to also incorporate the wisdom of crowds of people clicking on sites as well? Mining the clickstream as well as the link graph? I wondered if Google was already doing this? [which they do, according to Kevin Kelly...] The number of clicks will continue to outpace the number of links, so I expect that in the future more and more of the web’s structure will be determined by clickage rather than linkage.
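The core of the clickstream method is simple to sketch: treat each pair of consecutively viewed articles in a session as a weighted, directed edge. A minimal illustration — the session logs, journal names, and counts below are invented for the example, not taken from the paper:

```python
from collections import Counter

# Hypothetical session logs: each entry is the ordered list of articles
# (identified here by journal) that one researcher viewed in a session.
sessions = [
    ["Nature", "Science", "PNAS"],
    ["Nature", "Science", "Cell"],
    ["Science", "PNAS"],
]

# Count click transitions between consecutive articles in each session;
# the counts become the weighted, directed edges of the clickstream map.
clicks = Counter()
for session in sessions:
    for src, dst in zip(session, session[1:]):
        clicks[(src, dst)] += 1

print(clicks[("Nature", "Science")])  # this transition occurs in two sessions
```

The resulting weighted network can then be clustered and visualized with the same tools used for citation networks.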

Caribbean reef fish decline in wake of coral collapse

A recent paper by Paddack et al. in Current Biology (doi:10.1016/j.cub.2009.02.041) shows that observed declines in fish populations are consistent across all subregions of the Caribbean basin (2.7% to 6.0% loss per year) and appear to be linked to coral reef collapse. In Science, Jackie Grom reports on the paper in Reef Fish Threatened by Coral Loss:

Ecologist Michelle Paddack, a postdoctoral student at Simon Fraser University in Burnaby, Canada, teamed up with an international group of scientists to find out. They analyzed data from 48 studies, including peer-reviewed papers, government and university research reports, and unpublished data sets, that covered trends on 318 Caribbean coral reefs and 273 species of reef fish over a 53-year period. Today in Current Biology, the team reports that reef fish populations were relatively constant from 1955 through 1995 but then plunged by about 3% to 6% each year through 2007. The declines occurred in three of six dietary groups, including those that fed primarily on algae, invertebrates, or a combination of fish and invertebrates. The loss of algae-eating fish, such as parrotfish and surgeonfish, is worrying, says Paddack, because they help the reefs thrive by clearing away algae.

The declines don’t appear to be caused by overfishing, because the losses were similar for fished and nonfished species. Paddack says that doesn’t mean fishing doesn’t have an impact but that something even bigger is influencing the entire sea. The researchers suggest that the culprit is unprecedented loss of coral reefs over the past 3 decades. Even though the reduction in fish populations lags nearly 20 years behind the coral loss, the consistency in fish declines across a wide range of species points to the loss of coral as the cause, they say.

“We’ve known that corals are declining and fish are declining, but boy, I think it’s powerful just to see the patterns at the regional scale,” says marine ecologist John Bruno of the University of North Carolina, Chapel Hill. Biologist Richard Aronson of the Florida Institute of Technology in Melbourne says that the suggestion that coral reef loss is behind the declines in reef fish is intriguing. But to nail down the link, he’s hoping to see studies that relate fish declines to the time it takes for the reefs to structurally deteriorate after they die. “I liked this paper a lot; it got me excited [about coral reefs] all over again,” he says.
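The reported per-year losses compound over the 1995–2007 window the team studied. As a back-of-the-envelope calculation — the 12-year span and the assumption of a constant compounding rate are my own simplification, not from the paper — a constant annual decline at the low and high reported rates implies:

```python
# Fraction of a population remaining after a constant annual
# proportional decline, compounded over a number of years.
def remaining_fraction(annual_loss, years):
    return (1 - annual_loss) ** years

low = remaining_fraction(0.027, 12)   # 2.7% per year, 1995-2007
high = remaining_fraction(0.060, 12)  # 6.0% per year, 1995-2007
print(round(low, 2), round(high, 2))  # roughly 0.72 and 0.48
```

In other words, even the low end of the reported range implies losing over a quarter of a population in a little more than a decade, and the high end implies losing about half.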

Ecology and Wikipedia pt 2

To follow up on my post Wikipedia and Ecology, the ESA blog EcoTone has posted an interview with the authors of the recent TREE paper on Wikipedia (DOI:10.1016/j.tree.2009.01.003):

Why don’t you think more scientists contribute to Wikipedia?

EB: I know exactly why they don’t contribute. It’s because they don’t get any credit for it. We get credit for certain things to get promoted in science, and writing Wikipedia entries isn’t one of them. We work on an incentive system, and the incentive isn’t there.

Touché. What are some incentives that could be added?

EB: At a university, the ways to get credit would enhance your publication record, enhance your teaching program, and — if you’re at a land-grant university — enhance your extension program. Incorporating revision of Wikipedia entries into classes is a really creative way to get these entries revised. Students are on the cutting edge in terms of knowledge of the literature and they can further practice their writing by editing entries. Assigning them as projects hits all those goals we have as teachers: writing, critical thinking, and revisions of the literature. And it also gets that quality of thinking and writing out there for everyone else to see.

Kristine, as a student, what was your most valuable experience with this project?

KC: We were much more motivated to do a good job than if we were just turning this assignment in to a professor. This was going to go out to everybody, so we wanted to triple-check everything and make sure that it was exactly the way that we wanted it. If you’re just doing a term paper, sure, you do a good job, but it only influences your reputation with the professor. Not only did we learn something, but we also gave back to society. Also, we didn’t just learn how to publish, but we learned how to publish collaboratively. It’s very easy when there are just two or three authors on a paper, but …how many did we have, twelve authors?

EB: Fourteen. There were fourteen student authors on the paper, besides me.

KC: It’s a whole new ball game when you have fifteen different authors trying to agree on things.

So, given your experience, how would you convince scientists that they should contribute to Wikipedia?

KC: No matter where you publish, even if you’re publishing in Science or Nature, you’re not getting your research out to as many people as you will through Wikipedia. And it’s so important today because so much of the general public doesn’t understand or appreciate the science that goes on. Disseminating knowledge can really help motivate more appreciation and more funding for the sciences. If we continue to publish only in journals that scientists read, the public will continue to be in the dark.

EB: It’s a way to do the things that we want to do as teachers while also doing the things that our universities want us to do for the public.

Wikipedia and Ecology

Journal Watch Online reports, in Open Source Ecology, on a recent TREE paper by Callis et al., Improving Wikipedia: educational opportunity and professional responsibility (DOI:10.1016/j.tree.2009.01.003):

A University of Florida professor directed those energies towards a more noble cause: surveying and improving Wikipedia entries on ecological topics. The graduate students, enrolled in a seminar on plant-animal interactions, found the entries on frugivory, herbivory, pollination, granivory and seed dispersal to be lacking in breadth, and sometimes sidetracked by irrelevant topics (they were especially piqued by a long discourse about fruitarians – humans who choose a fruit diet — in the frugivory entry).

In Trends in Ecology and Evolution, the class reports that, although occasionally frustrated by other authors determined to repeatedly delete their changes, improving the entries was a valuable educational experience not too much different than writing a term paper.

They argue that updating Wikipedia, an increasingly influential public information source, is among the civic duties of scientists and should be an activity incorporated into student coursework, professional meetings, and even the peer-review publication process.